WorldWideScience

Sample records for instrumental variables approach

  1. The productivity of mental health care: an instrumental variable approach.

    Science.gov (United States)

    Lu, Mingshan

    1999-06-01

BACKGROUND: As for many other medical technologies and treatments, reliable evidence on the effectiveness of mental health care is lacking. Increasingly, data from non-experimental settings are being used to study the effect of treatment. However, as in a number of studies using non-experimental data, a simple regression of outcome on treatment shows a puzzling negative and significant impact of mental health care on the improvement of mental health status, even after a large number of potential control variables are included. The central problem in interpreting evidence from real-world or non-experimental settings is therefore the potential "selection bias" in observational data sets: the choice and quantity of mental health care may be correlated with other variables, particularly unobserved variables, that influence outcomes, which can bias the estimate of the effect of care in conventional models. AIMS OF THE STUDY: This paper addresses the issue of estimating treatment effects from an observational data set, exploring a mental health data set obtained from two waves of data collected in Puerto Rico. Results from conventional models - in which the potential selection bias is not controlled - are compared with results from instrumental variable (IV) models, which this study proposes in order to correct the contaminated estimates of the conventional models. METHODS: Treatment effectiveness is estimated in a production function framework, with effectiveness measured as the improvement in mental health status. To control for the potential selection bias, IV approaches are employed. The essence of the IV method is to use one or more instruments - observable factors that influence treatment but do not directly affect patient outcomes - to isolate the effect of treatment variation that is independent of unobserved patient characteristics. 
The data used in this study are the first (1992

  2. A review of instrumental variable estimators for Mendelian randomization.

    Science.gov (United States)

    Burgess, Stephen; Small, Dylan S; Thompson, Simon G

    2017-10-01

    Instrumental variable analysis is an approach for obtaining causal inferences on the effect of an exposure (risk factor) on an outcome from observational data. It has gained in popularity over the past decade with the use of genetic variants as instrumental variables, known as Mendelian randomization. An instrumental variable is associated with the exposure, but not associated with any confounder of the exposure-outcome association, nor is there any causal pathway from the instrumental variable to the outcome other than via the exposure. Under the assumption that a single instrumental variable or a set of instrumental variables for the exposure is available, the causal effect of the exposure on the outcome can be estimated. There are several methods available for instrumental variable estimation; we consider the ratio method, two-stage methods, likelihood-based methods, and semi-parametric methods. Techniques for obtaining statistical inferences and confidence intervals are presented. The statistical properties of estimates from these methods are compared, and practical advice is given about choosing a suitable analysis method. In particular, bias and coverage properties of estimators are considered, especially with weak instruments. Settings particularly relevant to Mendelian randomization are prioritized in the paper, notably the scenario of a continuous exposure and a continuous or binary outcome.
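    Two of the estimators surveyed here, the ratio (Wald) estimator and two-stage least squares, can be sketched on simulated data. The data-generating process and all parameter values below are invented for illustration and are not taken from the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 10_000
    u = rng.normal(size=n)                    # unmeasured confounder
    g = rng.binomial(2, 0.3, size=n)          # genetic variant used as instrument
    x = 0.5 * g + u + rng.normal(size=n)      # exposure (risk factor)
    y = 0.3 * x + u + rng.normal(size=n)      # outcome; true causal effect = 0.3

    # Naive OLS is biased upward because u confounds x and y
    beta_ols = np.polyfit(x, y, 1)[0]

    # Ratio (Wald) estimator: instrument-outcome over instrument-exposure association
    beta_ratio = np.cov(y, g)[0, 1] / np.cov(x, g)[0, 1]

    # Two-stage least squares: replace x by its projection on the instrument
    G = np.column_stack([np.ones(n), g])
    x_hat = G @ np.linalg.lstsq(G, x, rcond=None)[0]
    beta_2sls = np.polyfit(x_hat, y, 1)[0]
    ```

    Both IV estimates land near the true effect of 0.3 while the naive regression does not; with a single instrument, the ratio and two-stage estimates coincide up to numerical error, which is why the review treats them as the baseline methods.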

  3. Social interactions and college enrollment: A combined school fixed effects/instrumental variables approach.

    Science.gov (United States)

    Fletcher, Jason M

    2015-07-01

    This paper provides some of the first evidence of peer effects in college enrollment decisions. There are several empirical challenges in assessing the influences of peers in this context, including the endogeneity of high school, shared group-level unobservables, and identifying policy-relevant parameters of social interactions models. This paper addresses these issues by using an instrumental variables/fixed effects approach that compares students in the same school but different grade-levels who are thus exposed to different sets of classmates. In particular, plausibly exogenous variation in peers' parents' college expectations are used as an instrument for peers' college choices. Preferred specifications indicate that increasing a student's exposure to college-going peers by ten percentage points is predicted to raise the student's probability of enrolling in college by 4 percentage points. This effect is roughly half the magnitude of growing up in a household with married parents (vs. an unmarried household). Copyright © 2015 Elsevier Inc. All rights reserved.

  4. Sensitivity analysis and power for instrumental variable studies.

    Science.gov (United States)

    Wang, Xuran; Jiang, Yang; Zhang, Nancy R; Small, Dylan S

    2018-03-31

    In observational studies to estimate treatment effects, unmeasured confounding is often a concern. The instrumental variable (IV) method can control for unmeasured confounding when there is a valid IV. To be a valid IV, a variable needs to be independent of unmeasured confounders and only affect the outcome through affecting the treatment. When applying the IV method, there is often concern that a putative IV is invalid to some degree. We present an approach to sensitivity analysis for the IV method which examines the sensitivity of inferences to violations of IV validity. Specifically, we consider sensitivity when the magnitude of association between the putative IV and the unmeasured confounders and the direct effect of the IV on the outcome are limited in magnitude by a sensitivity parameter. Our approach is based on extending the Anderson-Rubin test and is valid regardless of the strength of the instrument. A power formula for this sensitivity analysis is presented. We illustrate its usage via examples about Mendelian randomization studies and its implications via a comparison of using rare versus common genetic variants as instruments. © 2018, The International Biometric Society.
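    The classical Anderson-Rubin test that this approach extends can be sketched as follows; this minimal simulation (variable names and parameter values are invented, not the authors' code) tests a hypothesized effect size by checking whether the implied residual is associated with the instrument.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    n = 5_000
    u = rng.normal(size=n)                  # unmeasured confounder
    z = rng.normal(size=n)                  # instrument
    x = z + u + rng.normal(size=n)          # exposure
    y = 0.5 * x + u + rng.normal(size=n)    # outcome; true effect = 0.5

    def ar_pvalue(beta0):
        """Anderson-Rubin test of H0: effect = beta0.

        Under H0 and a valid instrument, y - beta0 * x is independent of z,
        so the regression of this residual on z should have a zero slope.
        """
        resid = y - beta0 * x
        return stats.linregress(z, resid).pvalue

    p_at_true = ar_pvalue(0.5)   # large: H0 not rejected at the true effect
    p_at_zero = ar_pvalue(0.0)   # small: the causal null is rejected
    ```

    Because the test only involves a reduced-form regression, its validity does not depend on instrument strength, which is the property the sensitivity analysis described above builds on.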

  5. The effect of patient satisfaction with pharmacist consultation on medication adherence: an instrumental variable approach

    Directory of Open Access Journals (Sweden)

    Gu NY

    2008-12-01

There are limited studies quantifying the impact of patient satisfaction with pharmacist consultation on patient medication adherence. Objectives: The objective of this study is to evaluate the effect of patient satisfaction with pharmacist consultation services on medication adherence in a large managed care organization. Methods: We analyzed data from a patient satisfaction survey of 6,916 patients who had used pharmacist consultation services in Kaiser Permanente Southern California from 1993 to 1996. We compared treating patient satisfaction as exogenous, in a single-equation probit model, with a bivariate probit model in which patient satisfaction was treated as endogenous. Different sets of instrumental variables were employed, including measures of patients' emotional well-being and patients' propensity to fill their prescriptions at a non-Kaiser Permanente (KP) pharmacy. The Smith-Blundell test was used to test whether patient satisfaction was endogenous. Over-identification tests were used to test the validity of the instrumental variables. The Staiger-Stock weak instrument test was used to evaluate the explanatory power of the instrumental variables. Results: All tests indicated that the instrumental variables method was valid and that the instrumental variables used have significant explanatory power. The single-equation probit model indicated that the effect of patient satisfaction with pharmacist consultation was significant (p<0.01). However, the bivariate probit models revealed that the marginal effect of pharmacist consultation on medication adherence was significantly greater than in the single-equation probit: the effect increased from 7% to 30% (p<0.01) after controlling for endogeneity bias. Conclusion: After appropriate adjustment for endogeneity bias, patients satisfied with their pharmacy services are substantially more likely to adhere to their medication. 
The results have important policy implications given the increasing focus

  6. Instrumental Variables in the Long Run

    DEFF Research Database (Denmark)

    Casey, Gregory; Klemp, Marc Patrick Brag

    2017-01-01

In the study of long-run economic growth, it is common to use historical or geographical variables as instruments for contemporary endogenous regressors. We study the interpretation of these conventional instrumental variable (IV) regressions in a general, yet simple, framework. Our aim is to estimate the long-run causal effect of changes in the endogenous explanatory variable. We find that conventional IV regressions generally cannot recover this parameter of interest. To estimate this parameter, therefore, we develop an augmented IV estimator that combines the conventional regression...... We also use our framework to examine related empirical techniques. We find that two prominent regression methodologies - using gravity-based instruments for trade and including ancestry-adjusted variables in linear regression models - have...... quantitative implications for the field of long-run economic growth.

  7. Power calculator for instrumental variable analysis in pharmacoepidemiology.

    Science.gov (United States)

    Walker, Venexia M; Davies, Neil M; Windmeijer, Frank; Burgess, Stephen; Martin, Richard M

    2017-10-01

    Instrumental variable analysis, for example with physicians' prescribing preferences as an instrument for medications issued in primary care, is an increasingly popular method in the field of pharmacoepidemiology. Existing power calculators for studies using instrumental variable analysis, such as Mendelian randomization power calculators, do not allow for the structure of research questions in this field. This is because the analysis in pharmacoepidemiology will typically have stronger instruments and detect larger causal effects than in other fields. Consequently, there is a need for dedicated power calculators for pharmacoepidemiological research. The formula for calculating the power of a study using instrumental variable analysis in the context of pharmacoepidemiology is derived before being validated by a simulation study. The formula is applicable for studies using a single binary instrument to analyse the causal effect of a binary exposure on a continuous outcome. An online calculator, as well as packages in both R and Stata, are provided for the implementation of the formula by others. The statistical power of instrumental variable analysis in pharmacoepidemiological studies to detect a clinically meaningful treatment effect is an important consideration. Research questions in this field have distinct structures that must be accounted for when calculating power. The formula presented differs from existing instrumental variable power formulae due to its parametrization, which is designed specifically for ease of use by pharmacoepidemiologists. © The Author 2017. Published by Oxford University Press on behalf of the International Epidemiological Association
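    The closed-form formula itself is derived in the paper; as a generic, hedged cross-check (not the authors' formula), the same quantity can be approximated by simulation for a single binary instrument, a binary exposure and a continuous outcome. All parameter values below are invented.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)

    def sim_power(n, n_reps=500, alpha=0.05):
        """Monte Carlo power of an IV study with a single binary instrument."""
        rejections = 0
        for _ in range(n_reps):
            z = rng.binomial(1, 0.5, size=n)          # binary instrument
            u = rng.normal(size=n)                    # unmeasured confounder
            x = (1.2 * z + 0.5 * u + rng.normal(size=n) > 0.5).astype(int)
            y = 0.2 * x + 0.5 * u + rng.normal(size=n)   # true effect = 0.2
            # With a valid instrument, the reduced-form (instrument-outcome)
            # test has the same rejection behaviour as the IV test of no effect.
            if stats.linregress(z, y).pvalue < alpha:
                rejections += 1
        return rejections / n_reps

    power_small = sim_power(500)
    power_large = sim_power(5000)   # larger samples give higher power
    ```

    As the abstract notes, pharmacoepidemiological instruments such as prescribing preference are typically strong, so realistic simulations of this kind reach adequate power at much smaller sample sizes than Mendelian randomization settings.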

  8. Eliminating Survivor Bias in Two-stage Instrumental Variable Estimators.

    Science.gov (United States)

    Vansteelandt, Stijn; Walter, Stefan; Tchetgen Tchetgen, Eric

    2018-07-01

    Mendelian randomization studies commonly focus on elderly populations. This makes the instrumental variables analysis of such studies sensitive to survivor bias, a type of selection bias. A particular concern is that the instrumental variable conditions, even when valid for the source population, may be violated for the selective population of individuals who survive the onset of the study. This is potentially very damaging because Mendelian randomization studies are known to be sensitive to bias due to even minor violations of the instrumental variable conditions. Interestingly, the instrumental variable conditions continue to hold within certain risk sets of individuals who are still alive at a given age when the instrument and unmeasured confounders exert additive effects on the exposure, and moreover, the exposure and unmeasured confounders exert additive effects on the hazard of death. In this article, we will exploit this property to derive a two-stage instrumental variable estimator for the effect of exposure on mortality, which is insulated against the above described selection bias under these additivity assumptions.

  9. Falsification Testing of Instrumental Variables Methods for Comparative Effectiveness Research.

    Science.gov (United States)

    Pizer, Steven D

    2016-04-01

    To demonstrate how falsification tests can be used to evaluate instrumental variables methods applicable to a wide variety of comparative effectiveness research questions. Brief conceptual review of instrumental variables and falsification testing principles and techniques accompanied by an empirical application. Sample STATA code related to the empirical application is provided in the Appendix. Comparative long-term risks of sulfonylureas and thiazolidinediones for management of type 2 diabetes. Outcomes include mortality and hospitalization for an ambulatory care-sensitive condition. Prescribing pattern variations are used as instrumental variables. Falsification testing is an easily computed and powerful way to evaluate the validity of the key assumption underlying instrumental variables analysis. If falsification tests are used, instrumental variables techniques can help answer a multitude of important clinical questions. © Health Research and Educational Trust.
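    One common falsification strategy can be sketched as follows: in a subgroup whose treatment the instrument could not have influenced, a valid instrument should show no association with the outcome. The simulated setup below is purely illustrative and is not the paper's STATA code.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(5)
    n = 5_000
    z = rng.binomial(1, 0.5, size=n)       # instrument, e.g. prescribing pattern
    u = rng.normal(size=n)                 # unmeasured confounder
    treat = np.ones(n, dtype=int)          # falsification subgroup: all treated

    # Valid instrument: no pathway to the outcome except through treatment
    y_valid = 1.0 * treat + u + rng.normal(size=n)
    # Invalid instrument: a direct effect on the outcome (coefficient 0.3)
    y_invalid = 1.0 * treat + 0.3 * z + u + rng.normal(size=n)

    p_valid = stats.linregress(z, y_valid).pvalue      # expect no association
    p_invalid = stats.linregress(z, y_invalid).pvalue  # association detected
    ```

    A small p-value in such a subgroup falsifies the exclusion restriction, which is the "key assumption" the abstract refers to.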

  10. Instrumental variable methods in comparative safety and effectiveness research.

    Science.gov (United States)

    Brookhart, M Alan; Rassen, Jeremy A; Schneeweiss, Sebastian

    2010-06-01

    Instrumental variable (IV) methods have been proposed as a potential approach to the common problem of uncontrolled confounding in comparative studies of medical interventions, but IV methods are unfamiliar to many researchers. The goal of this article is to provide a non-technical, practical introduction to IV methods for comparative safety and effectiveness research. We outline the principles and basic assumptions necessary for valid IV estimation, discuss how to interpret the results of an IV study, provide a review of instruments that have been used in comparative effectiveness research, and suggest some minimal reporting standards for an IV analysis. Finally, we offer our perspective of the role of IV estimation vis-à-vis more traditional approaches based on statistical modeling of the exposure or outcome. We anticipate that IV methods will be often underpowered for drug safety studies of very rare outcomes, but may be potentially useful in studies of intended effects where uncontrolled confounding may be substantial.


  12. Instrumental variables I: instrumental variables exploit natural variation in nonexperimental data to estimate causal relationships.

    Science.gov (United States)

    Rassen, Jeremy A; Brookhart, M Alan; Glynn, Robert J; Mittleman, Murray A; Schneeweiss, Sebastian

    2009-12-01

    The gold standard of study design for treatment evaluation is widely acknowledged to be the randomized controlled trial (RCT). Trials allow for the estimation of causal effect by randomly assigning participants either to an intervention or comparison group; through the assumption of "exchangeability" between groups, comparing the outcomes will yield an estimate of causal effect. In the many cases where RCTs are impractical or unethical, instrumental variable (IV) analysis offers a nonexperimental alternative based on many of the same principles. IV analysis relies on finding a naturally varying phenomenon, related to treatment but not to outcome except through the effect of treatment itself, and then using this phenomenon as a proxy for the confounded treatment variable. This article demonstrates how IV analysis arises from an analogous but potentially impossible RCT design, and outlines the assumptions necessary for valid estimation. It gives examples of instruments used in clinical epidemiology and concludes with an outline on estimation of effects.
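    The estimation logic outlined above can be sketched with the Wald estimator for a binary instrument, mirroring an RCT with noncompliance; the data-generating process below is hypothetical.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n = 20_000
    z = rng.binomial(1, 0.5, size=n)        # instrument, e.g. provider preference
    u = rng.normal(size=n)                  # unmeasured confounder
    treat = (0.9 * z + u + rng.normal(size=n) > 0.5).astype(int)
    y = 1.0 * treat + u + rng.normal(size=n)   # true treatment effect = 1.0

    # Naive comparison of treated vs untreated is confounded by u
    naive_diff = y[treat == 1].mean() - y[treat == 0].mean()

    # Wald estimator: instrument-outcome contrast scaled by the
    # instrument-treatment contrast, as in an RCT analyzed by assignment
    itt_y = y[z == 1].mean() - y[z == 0].mean()
    itt_x = treat[z == 1].mean() - treat[z == 0].mean()
    beta_wald = itt_y / itt_x
    ```

    The instrument here plays exactly the role the abstract describes: a naturally varying phenomenon related to treatment but not to outcome except through treatment, used as a proxy for the confounded treatment variable.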

  13. On the Interpretation of Instrumental Variables in the Presence of Specification Errors

    Directory of Open Access Journals (Sweden)

    P.A.V.B. Swamy

    2015-01-01

The method of instrumental variables (IV) and the generalized method of moments (GMM), and their applications to the estimation of errors-in-variables and simultaneous equations models in econometrics, require data on a sufficient number of instrumental variables that are both exogenous and relevant. We argue that, in general, such instruments (weak or strong) cannot exist.

  14. Management Approach for Earth Venture Instrument

    Science.gov (United States)

    Hope, Diane L.; Dutta, Sanghamitra

    2013-01-01

    The Earth Venture Instrument (EVI) element of the Earth Venture Program calls for developing instruments for participation on a NASA-arranged spaceflight mission of opportunity to conduct innovative, integrated, hypothesis or scientific question-driven approaches to pressing Earth system science issues. This paper discusses the EVI element and the management approach being used to manage both an instrument development activity as well as the host accommodations activity. In particular the focus will be on the approach being used for the first EVI (EVI-1) selected instrument, Tropospheric Emissions: Monitoring of Pollution (TEMPO), which will be hosted on a commercial GEO satellite and some of the challenges encountered to date and corresponding mitigations that are associated with the management structure for the TEMPO Mission and the architecture of EVI.

  15. Econometrics in outcomes research: the use of instrumental variables.

    Science.gov (United States)

    Newhouse, J P; McClellan, M

    1998-01-01

    We describe an econometric technique, instrumental variables, that can be useful in estimating the effectiveness of clinical treatments in situations when a controlled trial has not or cannot be done. This technique relies upon the existence of one or more variables that induce substantial variation in the treatment variable but have no direct effect on the outcome variable of interest. We illustrate the use of the technique with an application to aggressive treatment of acute myocardial infarction in the elderly.

  16. Optimal Inference for Instrumental Variables Regression with non-Gaussian Errors

    DEFF Research Database (Denmark)

    Cattaneo, Matias D.; Crump, Richard K.; Jansson, Michael

    This paper is concerned with inference on the coefficient on the endogenous regressor in a linear instrumental variables model with a single endogenous regressor, nonrandom exogenous regressors and instruments, and i.i.d. errors whose distribution is unknown. It is shown that under mild smoothness...

  17. Invited Commentary: Using Financial Credits as Instrumental Variables for Estimating the Causal Relationship Between Income and Health.

    Science.gov (United States)

    Pega, Frank

    2016-05-01

    Social epidemiologists are interested in determining the causal relationship between income and health. Natural experiments in which individuals or groups receive income randomly or quasi-randomly from financial credits (e.g., tax credits or cash transfers) are increasingly being analyzed using instrumental variable analysis. For example, in this issue of the Journal, Hamad and Rehkopf (Am J Epidemiol. 2016;183(9):775-784) used an in-work tax credit called the Earned Income Tax Credit as an instrument to estimate the association between income and child development. However, under certain conditions, the use of financial credits as instruments could violate 2 key instrumental variable analytic assumptions. First, some financial credits may directly influence health, for example, through increasing a psychological sense of welfare security. Second, financial credits and health may have several unmeasured common causes, such as politics, other social policies, and the motivation to maximize the credit. If epidemiologists pursue such instrumental variable analyses, using the amount of an unconditional, universal credit that an individual or group has received as the instrument may produce the most conceptually convincing and generalizable evidence. However, other natural income experiments (e.g., lottery winnings) and other methods that allow better adjustment for confounding might be more promising approaches for estimating the causal relationship between income and health. © The Author 2016. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  18. Is foreign direct investment good for health in low and middle income countries? An instrumental variable approach.

    Science.gov (United States)

    Burns, Darren K; Jones, Andrew P; Goryakin, Yevgeniy; Suhrcke, Marc

    2017-05-01

    There is a scarcity of quantitative research into the effect of FDI on population health in low and middle income countries (LMICs). This paper investigates the relationship using annual panel data from 85 LMICs between 1974 and 2012. When controlling for time trends, country fixed effects, correlation between repeated observations, relevant covariates, and endogeneity via a novel instrumental variable approach, we find FDI to have a beneficial effect on overall health, proxied by life expectancy. When investigating age-specific mortality rates, we find a stronger beneficial effect of FDI on adult mortality, yet no association with either infant or child mortality. Notably, FDI effects on health remain undetected in all models which do not control for endogeneity. Exploring the effect of sector-specific FDI on health in LMICs, we provide preliminary evidence of a weak inverse association between secondary (i.e. manufacturing) sector FDI and overall life expectancy. Our results thus suggest that FDI has provided an overall benefit to population health in LMICs, particularly in adults, yet investments into the secondary sector could be harmful to health. Copyright © 2017 Elsevier Ltd. All rights reserved.

  19. Instrumental variables estimation under a structural Cox model

    DEFF Research Database (Denmark)

    Martinussen, Torben; Nørbo Sørensen, Ditte; Vansteelandt, Stijn

    2017-01-01

    Instrumental variable (IV) analysis is an increasingly popular tool for inferring the effect of an exposure on an outcome, as witnessed by the growing number of IV applications in epidemiology, for instance. The majority of IV analyses of time-to-event endpoints are, however, dominated by heurist...

  20. Bias and Bias Correction in Multi-Site Instrumental Variables Analysis of Heterogeneous Mediator Effects

    Science.gov (United States)

    Reardon, Sean F.; Unlu, Faith; Zhu, Pei; Bloom, Howard

    2013-01-01

We explore the use of instrumental variables (IV) analysis with a multi-site randomized trial to estimate the effect of a mediating variable on an outcome in cases where it can be assumed that the observed mediator is the only mechanism linking treatment assignment to outcomes, an assumption known in the instrumental variables literature as the…

  1. Causal null hypotheses of sustained treatment strategies: What can be tested with an instrumental variable?

    Science.gov (United States)

    Swanson, Sonja A; Labrecque, Jeremy; Hernán, Miguel A

    2018-05-02

    Sometimes instrumental variable methods are used to test whether a causal effect is null rather than to estimate the magnitude of a causal effect. However, when instrumental variable methods are applied to time-varying exposures, as in many Mendelian randomization studies, it is unclear what causal null hypothesis is tested. Here, we consider different versions of causal null hypotheses for time-varying exposures, show that the instrumental variable conditions alone are insufficient to test some of them, and describe additional assumptions that can be made to test a wider range of causal null hypotheses, including both sharp and average causal null hypotheses. Implications for interpretation and reporting of instrumental variable results are discussed.

  2. A statistical approach to instrument calibration

    Science.gov (United States)

    Robert R. Ziemer; David Strauss

    1978-01-01

    Summary - It has been found that two instruments will yield different numerical values when used to measure identical points. A statistical approach is presented that can be used to approximate the error associated with the calibration of instruments. Included are standard statistical tests that can be used to determine if a number of successive calibrations of the...

  3. Censored Quantile Instrumental Variable Estimates of the Price Elasticity of Expenditure on Medical Care.

    Science.gov (United States)

    Kowalski, Amanda

    2016-01-02

    Efforts to control medical care costs depend critically on how individuals respond to prices. I estimate the price elasticity of expenditure on medical care using a censored quantile instrumental variable (CQIV) estimator. CQIV allows estimates to vary across the conditional expenditure distribution, relaxes traditional censored model assumptions, and addresses endogeneity with an instrumental variable. My instrumental variable strategy uses a family member's injury to induce variation in an individual's own price. Across the conditional deciles of the expenditure distribution, I find elasticities that vary from -0.76 to -1.49, which are an order of magnitude larger than previous estimates.

  4. Instrumental variable estimation of treatment effects for duration outcomes

    NARCIS (Netherlands)

    G.E. Bijwaard (Govert)

    2007-01-01

In this article we propose and implement an instrumental variable estimation procedure to obtain treatment effects on duration outcomes. The method can handle the typical complications that arise with duration data: time-varying treatment and censoring. The treatment effect we

  5. Combining fixed effects and instrumental variable approaches for estimating the effect of psychosocial job quality on mental health: evidence from 13 waves of a nationally representative cohort study.

    Science.gov (United States)

    Milner, Allison; Aitken, Zoe; Kavanagh, Anne; LaMontagne, Anthony D; Pega, Frank; Petrie, Dennis

    2017-06-23

Previous studies suggest that poor psychosocial job quality is a risk factor for mental health problems, but they use conventional regression analytic methods that cannot rule out reverse causation, unmeasured time-invariant confounding and reporting bias. This study combines two quasi-experimental approaches to improve causal inference by better accounting for these biases: (i) linear fixed effects regression analysis and (ii) linear instrumental variable analysis. We extract 13 annual waves of national cohort data including 13 260 working-age (18-64 years) employees. The exposure variable is self-reported level of psychosocial job quality. The instruments used are two common workplace entitlements. The outcome variable is the Mental Health Inventory (MHI-5). We adjust for measured time-varying confounders. In the fixed effects regression analysis adjusted for time-varying confounders, a 1-point increase in psychosocial job quality is associated with a 1.28-point improvement in mental health on the MHI-5 scale (95% CI: 1.17, 1.40). In the instrumental variable analysis, a 1-point increase in psychosocial job quality is related to a 1.62-point improvement on the MHI-5 scale (95% CI: -0.24, 3.48; P = 0.088). Our quasi-experimental results provide evidence to confirm job stressors as risk factors for mental ill health using methods that improve causal inference. © The Author 2017. Published by Oxford University Press on behalf of Faculty of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com
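    The fixed-effects step of such a design can be sketched via the within transformation; the simulated panel below (names, sizes and coefficients invented) shows how demeaning each person's series removes a time-invariant confounder that biases pooled regression.

    ```python
    import numpy as np

    rng = np.random.default_rng(6)
    n_people, n_waves = 1_000, 13
    ability = rng.normal(size=(n_people, 1))    # time-invariant unobserved confounder
    job_quality = ability + rng.normal(size=(n_people, n_waves))
    mhi5 = 1.3 * job_quality + 2.0 * ability + rng.normal(size=(n_people, n_waves))

    # Pooled OLS is biased upward because `ability` affects both variables
    beta_pooled = np.polyfit(job_quality.ravel(), mhi5.ravel(), 1)[0]

    # Within (fixed effects) transformation: subtract each person's mean
    jq_w = job_quality - job_quality.mean(axis=1, keepdims=True)
    mh_w = mhi5 - mhi5.mean(axis=1, keepdims=True)
    beta_fe = (jq_w * mh_w).sum() / (jq_w ** 2).sum()   # recovers the true 1.3
    ```

    The within transformation only addresses time-invariant confounding; the instrumental variable step of the study targets the remaining time-varying confounding and reporting bias.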

  6. Instrumented Impact Testing: Influence of Machine Variables and Specimen Position

    International Nuclear Information System (INIS)

    Lucon, E.; McCowan, C. N.; Santoyo, R. A.

    2008-01-01

    An investigation has been conducted on the influence of impact machine variables and specimen positioning on characteristic forces and absorbed energies from instrumented Charpy tests. Brittle and ductile fracture behavior has been investigated by testing NIST reference samples of low, high and super-high energy levels. Test machine variables included tightness of foundation, anvil and striker bolts, and the position of the center of percussion with respect to the center of strike. For specimen positioning, we tested samples which had been moved away or sideways with respect to the anvils. In order to assess the influence of the various factors, we compared mean values in the reference (unaltered) and altered conditions; for machine variables, t-test analyses were also performed in order to evaluate the statistical significance of the observed differences. Our results indicate that the only circumstance which resulted in variations larger than 5 percent for both brittle and ductile specimens is when the sample is not in contact with the anvils. These findings should be taken into account in future revisions of instrumented Charpy test standards.
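    The t-test comparison described above can be sketched as follows; the energy values are fabricated for illustration and are not NIST reference data.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(4)
    # Absorbed energy (J) in the reference condition vs. an altered condition
    # (e.g. specimen not in contact with the anvils); values are made up.
    reference = rng.normal(100.0, 3.0, size=15)
    altered = rng.normal(106.0, 3.0, size=15)

    # Welch two-sample t-test for a statistically significant mean shift
    t_stat, p_value = stats.ttest_ind(reference, altered, equal_var=False)
    pct_change = 100 * (altered.mean() - reference.mean()) / reference.mean()
    ```

    In the study's terms, a condition would be flagged when the mean shift is both statistically significant and larger than the 5 percent threshold.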


  8. Comparison of variance estimators for metaanalysis of instrumental variable estimates

    NARCIS (Netherlands)

    Schmidt, A. F.; Hingorani, A. D.; Jefferis, B. J.; White, J.; Groenwold, R. H H; Dudbridge, F.; Ben-Shlomo, Y.; Chaturvedi, N.; Engmann, J.; Hughes, A.; Humphries, S.; Hypponen, E.; Kivimaki, M.; Kuh, D.; Kumari, M.; Menon, U.; Morris, R.; Power, C.; Price, J.; Wannamethee, G.; Whincup, P.

    2016-01-01

    Background: Mendelian randomization studies perform instrumental variable (IV) analysis using genetic IVs. Results of individual Mendelian randomization studies can be pooled through meta-analysis. We explored how different variance estimators influence the meta-analysed IV estimate. Methods: Two
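
The pooling step that such variance estimators feed into can be sketched as a fixed-effect inverse-variance meta-analysis; the per-study IV estimates and standard errors below are invented for illustration:

```python
import numpy as np

# Fixed-effect inverse-variance pooling of per-study IV estimates.
beta = np.array([0.42, 0.55, 0.30, 0.61])   # hypothetical per-study IV estimates
se = np.array([0.10, 0.15, 0.08, 0.20])     # their standard errors

w = 1.0 / se**2                              # inverse-variance weights
pooled = np.sum(w * beta) / np.sum(w)        # pooled IV estimate
pooled_se = 1.0 / np.sqrt(np.sum(w))         # its standard error

print(f"pooled IV estimate: {pooled:.3f} +/- {pooled_se:.3f}")
```

How the per-study variances (and hence the weights) are estimated is exactly what the abstract's comparison is about; the pooled standard error is always smaller than any single study's.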

  9. The XRF spectrometer and the selection of analysis conditions (instrumental variables)

    International Nuclear Information System (INIS)

    Willis, J.P.

    2002-01-01

    Full text: This presentation will begin with a brief discussion of EDXRF and flat- and curved-crystal WDXRF spectrometers, contrasting the major differences between the three types. The remainder of the presentation will contain a detailed overview of the choice and settings of the many instrumental variables contained in a modern WDXRF spectrometer, and will discuss critically the choices facing the analyst in setting up a WDXRF spectrometer for different elements and applications. In particular it will discuss the choice of tube target (when a choice is possible), the kV and mA settings, tube filters, collimator masks, collimators, analyzing crystals, secondary collimators, detectors, pulse height selection, X-ray path medium (air, nitrogen, vacuum or helium), counting times for peak and background positions and their effect on counting statistics and lower limit of detection (LLD). The use of Figure of Merit (FOM) calculations to objectively choose the best combination of instrumental variables also will be discussed. This presentation will be followed by a shorter session on a subsequent day entitled - A Selection of XRF Conditions - Practical Session, where participants will be given the opportunity to discuss in groups the selection of the best instrumental variables for three very diverse applications. Copyright (2002) Australian X-ray Analytical Association Inc

  10. Finite-sample instrumental variables inference using an asymptotically pivotal statistic

    NARCIS (Netherlands)

    Bekker, P; Kleibergen, F

    2003-01-01

    We consider the K-statistic, Kleibergen's (2002, Econometrica 70, 1781-1803) adaptation of the Anderson-Rubin (AR) statistic in instrumental variables regression. Whereas Kleibergen (2002) especially analyzes the asymptotic behavior of the statistic, we focus on finite-sample properties in, a

  11. Finite-sample instrumental variables Inference using an Asymptotically Pivotal Statistic

    NARCIS (Netherlands)

    Bekker, P.; Kleibergen, F.R.

    2001-01-01

    The paper considers the K-statistic, Kleibergen’s (2000) adaptation of the Anderson-Rubin (AR) statistic in instrumental variables regression. Compared to the AR-statistic this K-statistic shows improved asymptotic efficiency in terms of degrees of freedom in overidentified models and yet it shares,

  12. Testing concordance of instrumental variable effects in generalized linear models with application to Mendelian randomization

    Science.gov (United States)

    Dai, James Y.; Chan, Kwun Chuen Gary; Hsu, Li

    2014-01-01

    Instrumental variable regression is one way to overcome unmeasured confounding and estimate causal effect in observational studies. Built on structural mean models, considerable work has recently been developed for consistent estimation of causal relative risk and causal odds ratio. Such models can sometimes suffer from identification issues for weak instruments. This has hampered the applicability of Mendelian randomization analysis in genetic epidemiology. When there are multiple genetic variants available as instrumental variables, and causal effect is defined in a generalized linear model in the presence of unmeasured confounders, we propose to test concordance between instrumental variable effects on the intermediate exposure and instrumental variable effects on the disease outcome, as a means to test the causal effect. We show that a class of generalized least squares estimators provide valid and consistent tests of causality. For causal effect of a continuous exposure on a dichotomous outcome in logistic models, the proposed estimators are shown to be asymptotically conservative. When the disease outcome is rare, such estimators are consistent due to the log-linear approximation of the logistic function. Optimality of such estimators relative to the well-known two-stage least squares estimator and the double-logistic structural mean model is further discussed. PMID:24863158

  13. Finite-sample instrumental variables inference using an asymptotically pivotal statistic

    NARCIS (Netherlands)

    Bekker, Paul A.; Kleibergen, Frank

    2001-01-01

    The paper considers the K-statistic, Kleibergen’s (2000) adaptation of the Anderson-Rubin (AR) statistic in instrumental variables regression. Compared to the AR-statistic this K-statistic shows improved asymptotic efficiency in terms of degrees of freedom in overidentified models and yet it shares,

  14. Impact of instrumental response on observed ozonesonde profiles: First-order estimates and implications for measures of variability

    Science.gov (United States)

    Clifton, G. T.; Merrill, J. T.; Johnson, B. J.; Oltmans, S. J.

    2009-12-01

    Ozonesondes provide information on the ozone distribution up to the middle stratosphere. Ozone profiles often feature layers, with vertically discrete maxima and minima in the mixing ratio. Layers are especially common in the UT/LS regions and originate from wave breaking, shearing and other transport processes. ECC sondes, however, have a moderate response time to significant changes in ozone. A sonde can ascend over 350 meters before it responds fully to a step change in ozone. This results in an overestimate of the altitude assigned to layers and an underestimate of the underlying variability in the amount of ozone. An estimate of the response time is made for each instrument during the preparation for flight, but the profile data are typically not processed to account for the response. Here we present a method of categorizing the response time of ECC instruments and an analysis of a low-pass filter approximation to the effects on profile data. Exponential functions were fit to the step-up and step-down responses using laboratory data. The resulting response time estimates were consistent with results from standard procedures, with the up-step response time exceeding the down-step value somewhat. A single-pole Butterworth filter that approximates the instrumental effect was used with synthetic layered profiles to make first-order estimates of the impact of the finite response time. Using a layer analysis program previously applied to observed profiles we find that instrumental effects can attenuate ozone variability by 20-45% in individual layers, but that the vertical offset in layer altitudes is moderate, up to about 150 meters. 
We will present results obtained using this approach, coupled with data on the distribution of layer characteristics found using the layer analysis procedure on profiles from Narragansett, Rhode Island and other US sites to quantify the impact on overall variability estimates given ambient distributions of layer occurrence, thickness
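
The filtering idea in this record can be sketched by passing a synthetic layered profile through a single-pole Butterworth low-pass filter, as a stand-in for the ECC sonde's finite response. The ascent rate, response distance and layer shape below are illustrative assumptions, not the paper's values:

```python
import numpy as np
from scipy.signal import butter, lfilter

dz = 5.0          # vertical sample spacing in m (~5 m/s ascent, 1 Hz sampling)
tau = 125.0       # assumed e-folding response distance in m

z = np.arange(0.0, 4000.0, dz)
# Synthetic profile: 40 ppbv background with one Gaussian layer near 1750 m.
profile = 40.0 + 40.0 * np.exp(-((z - 1750.0) / 100.0) ** 2)

# Single-pole Butterworth with cutoff at the reciprocal response distance,
# normalized by the Nyquist frequency 1/(2*dz).
b, a = butter(1, (1.0 / tau) / (0.5 / dz))
measured = lfilter(b, a, profile)

# The filtered layer peak is attenuated and displaced to a higher altitude.
peak_shift = z[np.argmax(measured)] - z[np.argmax(profile)]
attenuation = 100.0 * (1.0 - (measured.max() - 40.0) / 40.0)
print(f"peak shift: {peak_shift:.0f} m, attenuation: {attenuation:.0f}%")
```

This reproduces the qualitative effects the record describes: the layer maximum is damped and assigned too high an altitude, with magnitudes that depend on the assumed response distance and layer width.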

  15. Risk exposure mitigation: Approaches and recognised instruments (5)

    Directory of Open Access Journals (Sweden)

    Matić Vesna

    2014-01-01

    Full Text Available The risk management function development in banks, along with the development of tools that banks can use throughout this process, has had the strong support in international standards, not only in the recommended approaches for calculating economic capital requirements, but also in the qualitatively new treatment of risk exposure mitigation instruments (Basel Accord II. The array of eligible instruments for exposure mitigation under the recommended approaches for their treatment becomes the essential element of economic capital requirements calculation, both in relation to certain types of risk, and in relation to aggregate exposure.

  16. Risk exposure mitigation: Approaches and recognised instruments (3)

    Directory of Open Access Journals (Sweden)

    Matić Vesna

    2014-01-01

    Full Text Available The risk management function development in banks, along with the development of tools that banks can use throughout this process, has had the strong support in international standards, not only in the recommended approaches for calculating economic capital requirements, but also in the qualitatively new treatment of risk exposure mitigation instruments (Basel Accord II. The array of eligible instruments for exposure mitigation under the recommended approaches for their treatment becomes the essential element of economic capital requirements calculation, both in relation to certain types of risk, and in relation to aggregate exposure.

  17. Risk exposure mitigation: Approaches and recognised instruments (6)

    Directory of Open Access Journals (Sweden)

    Matić Vesna

    2015-01-01

    Full Text Available The risk management function development in banks, along with the development of tools that banks can use throughout this process, has had the strong support in international standards, not only in the recommended approaches for calculating economic capital requirements, but also in the qualitatively new treatment of risk exposure mitigation instruments (Basel Accord II. The array of eligible instruments for exposure mitigation under the recommended approaches for their treatment becomes the essential element of economic capital requirements calculation, both in relation to certain types of risk, and in relation to aggregate exposure.

  18. Instrumental variables estimation of exposure effects on a time-to-event endpoint using structural cumulative survival models

    DEFF Research Database (Denmark)

    Martinussen, Torben; Vansteelandt, Stijn; Tchetgen Tchetgen, Eric J.

    2017-01-01

    The use of instrumental variables for estimating the effect of an exposure on an outcome is popular in econometrics, and increasingly so in epidemiology. This increasing popularity may be attributed to the natural occurrence of instrumental variables in observational studies that incorporate elem...

  19. Bayesian methods for meta-analysis of causal relationships estimated using genetic instrumental variables

    DEFF Research Database (Denmark)

    Burgess, Stephen; Thompson, Simon G; Thompson, Grahame

    2010-01-01

    Genetic markers can be used as instrumental variables, in an analogous way to randomization in a clinical trial, to estimate the causal relationship between a phenotype and an outcome variable. Our purpose is to extend the existing methods for such Mendelian randomization studies to the context o...

  20. The Structure of Character Strengths: Variable- and Person-Centered Approaches

    Directory of Open Access Journals (Sweden)

    Małgorzata Najderska

    2018-02-01

    Full Text Available This article examines the structure of character strengths (Peterson and Seligman, 2004) following both variable-centered and person-centered approaches. We used the International Personality Item Pool-Values in Action (IPIP-VIA) questionnaire. The IPIP-VIA measures 24 character strengths and consists of 213 direct and reversed items. The present study was conducted in a heterogeneous group of N = 908 Poles (aged 18–78, M = 28.58). It was part of a validation project of a Polish version of the IPIP-VIA questionnaire. The variable-centered approach was used to examine the structure of character strengths on both the scale and item levels. The scale-level results indicated a four-factor structure that can be interpreted based on four of the five personality traits from the Big Five theory (excluding neuroticism). The item-level analysis suggested a slightly different and limited set of character strengths (17, not 24). After conducting a second-order analysis, a four-factor structure emerged, and three of the factors could be interpreted as being consistent with the scale-level factors. Three character strength profiles were found using the person-centered approach. Two of them were consistent with alpha and beta personality metatraits. The structure of character strengths can be described by using categories from the Five Factor Model of personality and metatraits. They form factors similar to some personality traits and occur in similar constellations as metatraits. The main contributions of this paper are: (1) the validation of the IPIP-VIA conducted with a variable-centered approach in a new research group (Poles) using a different measurement instrument; (2) introducing the person-centered approach to the study of the structure of character strengths.

  1. The Effect of Birth Weight on Academic Performance: Instrumental Variable Analysis.

    Science.gov (United States)

    Lin, Shi Lin; Leung, Gabriel Matthew; Schooling, C Mary

    2017-05-01

    Observationally, lower birth weight is usually associated with poorer academic performance; whether this association is causal or the result of confounding is unknown. To investigate this question, we obtained an effect estimate, which can have a causal interpretation under specific assumptions, of birth weight on educational attainment using instrumental variable analysis based on single nucleotide polymorphisms determining birth weight combined with results from the Social Science Genetic Association Consortium study of 126,559 Caucasians. We similarly obtained an estimate of the effect of birth weight on academic performance in 4,067 adolescents from Hong Kong's (Chinese) Children of 1997 birth cohort (1997-2016), using twin status as an instrumental variable. Birth weight was not associated with years of schooling (per 100-g increase in birth weight, -0.006 years, 95% confidence interval (CI): -0.02, 0.01) or college completion (odds ratio = 1.00, 95% CI: 0.96, 1.03). Birth weight was also unrelated to academic performance in adolescents (per 100-g increase in birth weight, -0.004 grade, 95% CI: -0.04, 0.04) using instrumental variable analysis, although conventional regression gave a small positive association (0.02 higher grade, 95% CI: 0.01, 0.03). Observed associations of birth weight with academic performance may not be causal, suggesting that interventions should focus on the contextual factors generating this correlation. © The Author 2017. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
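
The summary-data instrumental variable estimate this record relies on is, in its simplest form, a Wald ratio with a delta-method standard error. The SNP-exposure and SNP-outcome coefficients below are made up for the example, not the study's data:

```python
import numpy as np

# Hypothetical genetic-association summary statistics (per allele).
beta_gx, se_gx = 0.030, 0.004    # SNP effect on birth weight (100 g units)
beta_gy, se_gy = 0.0002, 0.0050  # SNP effect on years of schooling

# Wald ratio: causal effect of the exposure on the outcome.
beta_iv = beta_gy / beta_gx

# First-order (delta-method) standard error of the ratio.
se_iv = abs(beta_iv) * np.sqrt((se_gy / beta_gy) ** 2 + (se_gx / beta_gx) ** 2)

ci_low, ci_high = beta_iv - 1.96 * se_iv, beta_iv + 1.96 * se_iv
print(f"IV estimate: {beta_iv:.3f} (95% CI {ci_low:.3f}, {ci_high:.3f})")
```

With a near-null SNP-outcome coefficient the confidence interval straddles zero, the pattern the abstract reports; note the delta-method approximation is poor when the ratio's numerator is close to zero, which is one reason multi-variant methods are preferred in practice.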

  2. Statistical Analysis for Multisite Trials Using Instrumental Variables with Random Coefficients

    Science.gov (United States)

    Raudenbush, Stephen W.; Reardon, Sean F.; Nomi, Takako

    2012-01-01

    Multisite trials can clarify the average impact of a new program and the heterogeneity of impacts across sites. Unfortunately, in many applications, compliance with treatment assignment is imperfect. For these applications, we propose an instrumental variable (IV) model with person-specific and site-specific random coefficients. Site-specific IV…

  3. Bias and Bias Correction in Multisite Instrumental Variables Analysis of Heterogeneous Mediator Effects

    Science.gov (United States)

    Reardon, Sean F.; Unlu, Fatih; Zhu, Pei; Bloom, Howard S.

    2014-01-01

    We explore the use of instrumental variables (IV) analysis with a multisite randomized trial to estimate the effect of a mediating variable on an outcome in cases where it can be assumed that the observed mediator is the only mechanism linking treatment assignment to outcomes, an assumption known in the IV literature as the exclusion restriction.…

  4. Instrumental variables estimation of exposure effects on a time-to-event endpoint using structural cumulative survival models.

    Science.gov (United States)

    Martinussen, Torben; Vansteelandt, Stijn; Tchetgen Tchetgen, Eric J; Zucker, David M

    2017-12-01

    The use of instrumental variables for estimating the effect of an exposure on an outcome is popular in econometrics, and increasingly so in epidemiology. This increasing popularity may be attributed to the natural occurrence of instrumental variables in observational studies that incorporate elements of randomization, either by design or by nature (e.g., random inheritance of genes). Instrumental variables estimation of exposure effects is well established for continuous outcomes and to some extent for binary outcomes. It is, however, largely lacking for time-to-event outcomes because of complications due to censoring and survivorship bias. In this article, we make a novel proposal under a class of structural cumulative survival models which parameterize time-varying effects of a point exposure directly on the scale of the survival function; these models are essentially equivalent with a semi-parametric variant of the instrumental variables additive hazards model. We propose a class of recursive instrumental variable estimators for these exposure effects, and derive their large sample properties along with inferential tools. We examine the performance of the proposed method in simulation studies and illustrate it in a Mendelian randomization study to evaluate the effect of diabetes on mortality using data from the Health and Retirement Study. We further use the proposed method to investigate potential benefit from breast cancer screening on subsequent breast cancer mortality based on the HIP-study. © 2017, The International Biometric Society.

  5. Important variables for parents' postnatal sense of security: evaluating a new Swedish instrument (the PPSS instrument).

    Science.gov (United States)

    Persson, Eva K; Dykes, Anna-Karin

    2009-08-01

    To evaluate dimensions of both parents' postnatal sense of security the first week after childbirth, and to determine associations between the PPSS instrument and different sociodemographic and situational background variables. Evaluative, cross-sectional design. 113 mothers and 99 fathers with children live born at term, from five hospitals in southern Sweden. Mothers and fathers had similar feelings concerning postnatal sense of security. Of the dimensions in the PPSS instrument, a sense of midwives'/nurses' empowering behaviour, a sense of one's own general well-being and a sense of the mother's well-being as experienced by the father were the most important dimensions for parents' experienced security. A sense of affinity within the family (for both parents) and a sense of manageable breast feeding (for mothers) were not significantly associated with their experienced security. A sense of participation during pregnancy and general anxiety were significantly associated background variables for postnatal sense of security for both parents. For the mothers, parity and a sense that the father was participating during pregnancy were also significantly associated. More focus on parents' participation during pregnancy as well as midwives'/nurses' empowering behaviour during the postnatal period will be beneficial for both parents' postnatal sense of security.

  6. A selective review of the first 20 years of instrumental variables models in health-services research and medicine.

    Science.gov (United States)

    Cawley, John

    2015-01-01

    The method of instrumental variables (IV) is useful for estimating causal effects. Intuitively, it exploits exogenous variation in the treatment, sometimes called natural experiments or instruments. This study reviews the literature in health-services research and medical research that applies the method of instrumental variables, documents trends in its use, and offers examples of various types of instruments. A literature search of the PubMed and EconLit research databases for English-language journal articles published after 1990 yielded a total of 522 original research articles. Citations counts for each article were derived from the Web of Science. A selective review was conducted, with articles prioritized based on number of citations, validity and power of the instrument, and type of instrument. The average annual number of papers in health services research and medical research that apply the method of instrumental variables rose from 1.2 in 1991-1995 to 41.8 in 2006-2010. Commonly-used instruments (natural experiments) in health and medicine are relative distance to a medical care provider offering the treatment and the medical care provider's historic tendency to administer the treatment. Less common but still noteworthy instruments include randomization of treatment for reasons other than research, randomized encouragement to undertake the treatment, day of week of admission as an instrument for waiting time for surgery, and genes as an instrument for whether the respondent has a heritable condition. The use of the method of IV has increased dramatically in the past 20 years, and a wide range of instruments have been used. Applications of the method of IV have in several cases upended conventional wisdom that was based on correlations and led to important insights about health and healthcare. Future research should pursue new applications of existing instruments and search for new instruments that are powerful and valid.

  7. Association of Body Mass Index with Depression, Anxiety and Suicide-An Instrumental Variable Analysis of the HUNT Study.

    Directory of Open Access Journals (Sweden)

    Johan Håkon Bjørngaard

    Full Text Available While high body mass index is associated with an increased risk of depression and anxiety, cumulative evidence indicates that it is a protective factor for suicide. The associations from conventional observational studies of body mass index with mental health outcomes are likely to be influenced by reverse causality or confounding by ill-health. In the present study, we investigated the associations between offspring body mass index and parental anxiety, depression and suicide in order to avoid problems with reverse causality and confounding by ill-health. We used data from 32,457 mother-offspring and 27,753 father-offspring pairs from the Norwegian HUNT-study. Anxiety and depression were assessed using the Hospital Anxiety and Depression Scale and suicide death from national registers. Associations between offspring and own body mass index and symptoms of anxiety and depression and suicide mortality were estimated using logistic and Cox regression. Causal effect estimates were obtained with a two-sample instrumental variable approach using offspring body mass index as an instrument for parental body mass index. Both own and offspring body mass index were positively associated with depression, while the results did not indicate any substantial association between body mass index and anxiety. Although precision was low, suicide mortality was inversely associated with own body mass index and the results from the analysis using offspring body mass index supported these results. Adjusted odds ratios per standard deviation body mass index from the instrumental variable analysis were 1.22 (95% CI: 1.05, 1.43) for depression and 1.10 (95% CI: 0.95, 1.27) for anxiety, and the instrumental variable estimated hazard ratio for suicide was 0.69 (95% CI: 0.30, 1.63). The present study's results indicate that suicide mortality is inversely associated with body mass index.
We also found support for a positive association between body mass index and depression, but not

  8. The contextual effects of social capital on health: a cross-national instrumental variable analysis.

    Science.gov (United States)

    Kim, Daniel; Baum, Christopher F; Ganz, Michael L; Subramanian, S V; Kawachi, Ichiro

    2011-12-01

    Past research on the associations between area-level/contextual social capital and health has produced conflicting evidence. However, interpreting this rapidly growing literature is difficult because estimates using conventional regression are prone to major sources of bias including residual confounding and reverse causation. Instrumental variable (IV) analysis can reduce such bias. Using data on up to 167,344 adults in 64 nations in the European and World Values Surveys and applying IV and ordinary least squares (OLS) regression, we estimated the contextual effects of country-level social trust on individual self-rated health. We further explored whether these associations varied by gender and individual levels of trust. Using OLS regression, we found higher average country-level trust to be associated with better self-rated health in both women and men. Instrumental variable analysis yielded qualitatively similar results, although the estimates were more than double in size in both sexes when country population density and corruption were used as instruments. The estimated health effects of raising the percentage of a country's population that trusts others by 10 percentage points were at least as large as the estimated health effects of an individual developing trust in others. These findings were robust to alternative model specifications and instruments. Conventional regression and to a lesser extent IV analysis suggested that these associations are more salient in women and in women reporting social trust. In a large cross-national study, our findings, including those using instrumental variables, support the presence of beneficial effects of higher country-level trust on self-rated health. Previous findings for contextual social capital using traditional regression may have underestimated the true associations. Given the close linkages between self-rated health and all-cause mortality, the public health gains from raising social capital within and across
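
The OLS-versus-IV comparison this record turns on can be sketched with two-stage least squares (2SLS) on simulated data. The variable names echo the study, but everything below is synthetic and nothing reproduces the paper's analysis:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50_000
instrument = rng.normal(size=n)   # e.g. population density (assumed exogenous)
confounder = rng.normal(size=n)   # unobserved, affects both trust and health
trust = 0.8 * instrument + confounder + rng.normal(size=n)
health = 0.5 * trust - 1.0 * confounder + rng.normal(size=n)  # true effect 0.5

# Naive OLS: biased because the confounder enters both equations.
X = np.column_stack([np.ones(n), trust])
ols = np.linalg.lstsq(X, health, rcond=None)[0][1]

# Stage 1: project the endogenous regressor on the instrument.
Z = np.column_stack([np.ones(n), instrument])
trust_hat = Z @ np.linalg.lstsq(Z, trust, rcond=None)[0]

# Stage 2: regress the outcome on the fitted values.
X2 = np.column_stack([np.ones(n), trust_hat])
tsls = np.linalg.lstsq(X2, health, rcond=None)[0][1]

print(f"OLS: {ols:.3f}  2SLS: {tsls:.3f}  (truth: 0.5)")
```

In this simulation the confounder pulls the OLS slope well below the true effect, while 2SLS recovers it; the study's larger IV-than-OLS estimates are consistent with the same mechanism operating in the opposite direction.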

  9. Reconstructing pre-instrumental streamflow in Eastern Australia using a water balance approach

    Science.gov (United States)

    Tozer, C. R.; Kiem, A. S.; Vance, T. R.; Roberts, J. L.; Curran, M. A. J.; Moy, A. D.

    2018-03-01

    Streamflow reconstructions based on paleoclimate proxies provide much longer records than the short instrumental period records on which water resource management plans are currently based. In Australia there is a lack of in-situ high resolution paleoclimate proxy records, but remote proxies with teleconnections to Australian climate have utility in producing streamflow reconstructions. Here we investigate, via a case study for a catchment in eastern Australia, the novel use of an Antarctic ice-core based rainfall reconstruction within a Budyko-framework to reconstruct ∼1000 years of annual streamflow. The resulting streamflow reconstruction captures interannual to decadal variability in the instrumental streamflow, validating both the use of the ice core rainfall proxy record and the Budyko-framework method. In the preinstrumental era the streamflow reconstruction shows longer wet and dry epochs and periods of streamflow variability that are higher than observed in the instrumental era. Importantly, for both the instrumental record and preinstrumental reconstructions, the wet (dry) epochs in the rainfall record are shorter (longer) in the streamflow record and this non-linearity must be considered when inferring hydroclimatic risk or historical water availability directly from rainfall proxy records alone. These insights provide a better understanding of present infrastructure vulnerability in the context of past climate variability for eastern Australia. The streamflow reconstruction presented here also provides a better understanding of the range of hydroclimatic variability possible, and therefore represents a more realistic baseline on which to quantify the potential impacts of anthropogenic climate change on water security.
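
The water-balance step in this approach can be sketched with Budyko's (1974) curve: annual runoff is precipitation minus evapotranspiration, with the evaporative fraction determined by the aridity index. The rainfall series and potential evapotranspiration below are illustrative stand-ins, not the ice-core reconstruction:

```python
import numpy as np

def budyko_streamflow(precip_mm, pet_mm):
    """Annual runoff Q = P - E, with E/P taken from Budyko's curve."""
    phi = pet_mm / precip_mm                      # aridity index PET/P
    evap_ratio = np.sqrt(phi * np.tanh(1.0 / phi) * (1.0 - np.exp(-phi)))
    return precip_mm * (1.0 - evap_ratio)         # runoff in mm/yr

rainfall = np.array([650.0, 900.0, 480.0, 1100.0, 720.0])  # mm/yr, illustrative
pet = 1400.0                                               # mm/yr, assumed constant

runoff = budyko_streamflow(rainfall, pet)
print(np.round(runoff, 1))
```

The curve makes runoff a strongly non-linear function of rainfall: the fraction of rain that becomes streamflow is several times larger in wet years than in dry years, which is exactly why the abstract warns against inferring hydroclimatic risk directly from rainfall proxies.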

  10. Cross-cultural adaptation of instruments assessing breastfeeding determinants: a multi-step approach

    Science.gov (United States)

    2014-01-01

    Background Cross-cultural adaptation is a necessary process to effectively use existing instruments in other cultural and language settings. The process of cross-culturally adapting existing instruments, including translation, is considered a critical step in establishing a meaningful instrument for use in another setting. Using a multi-step approach is considered best practice in achieving cultural and semantic equivalence of the adapted version. We aimed to ensure the content validity of our instruments in the cultural context of KwaZulu-Natal, South Africa. Methods The Iowa Infant Feeding Attitudes Scale, Breastfeeding Self-Efficacy Scale-Short Form and additional items comprise our consolidated instrument, which was cross-culturally adapted utilizing a multi-step approach during August 2012. Cross-cultural adaptation was achieved through steps to maintain content validity and attain semantic equivalence in the target version. Specifically, Lynn’s recommendation to apply an item-level content validity index score was followed. The revised instrument was translated and back-translated. To ensure semantic equivalence, Brislin’s back-translation approach was utilized followed by the committee review to address any discrepancies that emerged from translation. Results Our consolidated instrument was adapted to be culturally relevant and translated to yield more reliable and valid results for use in our larger research study to measure infant feeding determinants effectively in our target cultural context. Conclusions Undertaking rigorous steps to effectively ensure cross-cultural adaptation increases our confidence that the conclusions we make based on our self-report instrument(s) will be stronger. In this way, our aim to achieve strong cross-cultural adaptation of our consolidated instruments was achieved while also providing a clear framework for other researchers choosing to utilize existing instruments for work in other cultural, geographic and population

  11. XML for nuclear instrument control and monitoring: an approach towards standardisation

    International Nuclear Information System (INIS)

    Bharade, S.K.; Ananthakrishnan, T.S.; Kataria, S.K.; Singh, S.K.

    2004-01-01

    Communication among heterogeneous systems, with applications running under different operating systems and developed on different platforms, has undergone rapid changes due to the adoption of XML standards. These are being developed for different industries, such as the chemical, medical and commercial sectors. The High Energy Physics community already has a standard for the exchange of data among different applications under heterogeneous distributed systems, such as the CMS Data Acquisition System. A large number of Nuclear Instruments supplied by different manufacturers are increasingly getting connected. This approach is gaining wider acceptance in instruments at reactor sites, accelerator sites and complex nuclear experiments, especially at centres like CERN. So that these instruments can describe the data available from them in a platform-independent manner, an XML approach has been developed. This paper is the first attempt at the Electronics Division to propose an XML standard for control, monitoring, data acquisition and analysis of data generated by Nuclear Instruments at accelerator sites, nuclear reactor plants and laboratories. The gamut of Nuclear Instruments includes Multichannel Analysers, Health Physics Instruments, Accelerator Control Systems, Reactor Regulating Systems, Flux Mapping Systems etc. (author)
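
The kind of platform-independent description the record argues for can be sketched with Python's standard xml.etree module. The element names and attributes below are hypothetical stand-ins, since the paper does not publish a schema:

```python
import xml.etree.ElementTree as ET

# Build a hypothetical XML description of a multichannel analyser's state.
root = ET.Element("instrument", attrib={"type": "MCA", "id": "ED-MCA-01"})
status = ET.SubElement(root, "status")
ET.SubElement(status, "hv", unit="V").text = "750"
ET.SubElement(status, "livetime", unit="s").text = "300"
spectrum = ET.SubElement(root, "spectrum", channels="4")
spectrum.text = "12 57 103 48"   # counts per channel, space-separated

xml_doc = ET.tostring(root, encoding="unicode")
print(xml_doc)

# Any consumer, on any platform, can recover the data without knowing the
# instrument's native binary format.
counts = [int(c) for c in ET.fromstring(xml_doc).find("spectrum").text.split()]
```

This is the essential point of the proposal: the textual, self-describing document replaces vendor-specific binary formats as the interchange layer between instruments and monitoring applications.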

  12. Instrumental variables estimates of peer effects in social networks.

    Science.gov (United States)

    An, Weihua

    2015-03-01

    Estimating peer effects with observational data is very difficult because of contextual confounding, peer selection, simultaneity bias, measurement error, etc. In this paper, I show that instrumental variables (IVs) can help to address these problems in order to provide causal estimates of peer effects. Based on data collected from over 4000 students in six middle schools in China, I use IV methods to estimate peer effects on smoking. My design-based IV approach differs from previous ones in that it helps to construct potentially strong IVs and to directly test possible violations of the exogeneity of the IVs. I show that measurement error in smoking can lead to both underestimation and imprecise estimation of peer effects. Based on a refined measure of smoking, I find consistent evidence for peer effects on smoking. If a student's best friend smoked within the past 30 days, the student was about one fifth (as indicated by the OLS estimate) or 40 percentage points (as indicated by the IV estimate) more likely to smoke in the same time period. The findings hold up under a variety of robustness checks. I also show that sharing cigarettes may be a mechanism for peer effects on smoking. A 10% increase in the number of cigarettes smoked by a student's best friend is associated with about a 4% increase in the number of cigarettes smoked by the student in the same time period. Copyright © 2014 Elsevier Inc. All rights reserved.
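    The core IV logic in this abstract, using only the variation in the endogenous regressor that is driven by the instrument, can be sketched with a generic simulation (this is not the paper's design-based network IV; all variables and coefficients below are invented for illustration):

    ```python
    import random

    random.seed(42)

    def cov(a, b):
        """Sample covariance of two equal-length lists."""
        ma, mb = sum(a) / len(a), sum(b) / len(b)
        return sum((p - ma) * (q - mb) for p, q in zip(a, b)) / len(a)

    n, beta = 50_000, 0.5
    u = [random.gauss(0, 1) for _ in range(n)]   # unobserved confounder
    z = [random.gauss(0, 1) for _ in range(n)]   # instrument: shifts x, excluded from y
    x = [0.8 * zi + ui + random.gauss(0, 1) for zi, ui in zip(z, u)]
    y = [beta * xi + ui + random.gauss(0, 1) for xi, ui in zip(x, u)]

    ols = cov(x, y) / cov(x, x)  # biased upward: x is correlated with u
    iv = cov(z, y) / cov(z, x)   # Wald/IV estimator: uses only z-driven variation in x
    print(f"OLS {ols:.2f}  IV {iv:.2f}  true {beta}")
    ```

    With a valid instrument, the IV ratio recovers the true coefficient while OLS absorbs the confounder's contribution.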

  13. LARF: Instrumental Variable Estimation of Causal Effects through Local Average Response Functions

    Directory of Open Access Journals (Sweden)

    Weihua An

    2016-07-01

    Full Text Available LARF is an R package that provides instrumental variable estimation of treatment effects when both the endogenous treatment and its instrument (i.e., the treatment inducement) are binary. The method (Abadie 2003) involves two steps. First, pseudo-weights are constructed from the probability of receiving the treatment inducement. By default LARF estimates the probability by a probit regression. It also provides semiparametric power series estimation of the probability and allows users to employ other external methods to estimate the probability. Second, the pseudo-weights are used to estimate the local average response function conditional on treatment and covariates. LARF provides both least squares and maximum likelihood estimates of the conditional treatment effects.
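    Abadie's (2003) first step builds pseudo-weights from τ(X) = P(Z = 1 | X), the probability of receiving the treatment inducement; the weight is commonly written κ = 1 − D(1 − Z)/(1 − τ(X)) − (1 − D)Z/τ(X). A minimal sketch of that weight (the function name is illustrative, and this is not the LARF package's own R code):

    ```python
    def abadie_kappa(d, z, tau):
        """Abadie (2003) pseudo-weight for one observation.
        d: binary treatment, z: binary instrument (inducement),
        tau: estimated P(z = 1 | covariates), e.g. from a probit."""
        return 1.0 - d * (1 - z) / (1 - tau) - (1 - d) * z / tau

    # Observations whose treatment agrees with their inducement get weight 1;
    # apparent always-takers (d=1, z=0) and never-takers (d=0, z=1) get
    # negative weights that cancel their contribution in expectation.
    print(abadie_kappa(1, 1, 0.5))  # 1.0
    print(abadie_kappa(0, 0, 0.5))  # 1.0
    print(abadie_kappa(1, 0, 0.5))  # -1.0
    ```

    The second step then runs a weighted least squares or maximum likelihood fit of the outcome on treatment and covariates using these weights.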

  14. Fasting Glucose and the Risk of Depressive Symptoms: Instrumental-Variable Regression in the Cardiovascular Risk in Young Finns Study.

    Science.gov (United States)

    Wesołowska, Karolina; Elovainio, Marko; Hintsa, Taina; Jokela, Markus; Pulkki-Råback, Laura; Pitkänen, Niina; Lipsanen, Jari; Tukiainen, Janne; Lyytikäinen, Leo-Pekka; Lehtimäki, Terho; Juonala, Markus; Raitakari, Olli; Keltikangas-Järvinen, Liisa

    2017-12-01

    Type 2 diabetes (T2D) has been associated with depressive symptoms, but the causal direction of this association and the underlying mechanisms, such as increased glucose levels, remain unclear. We used instrumental-variable regression with a genetic instrument (Mendelian randomization) to examine a causal role of increased glucose concentrations in the development of depressive symptoms. Data were from the population-based Cardiovascular Risk in Young Finns Study (n = 1217). Depressive symptoms were assessed in 2012 using a modified Beck Depression Inventory (BDI-I). Fasting glucose was measured concurrently with depressive symptoms. A genetic risk score for fasting glucose (with 35 single nucleotide polymorphisms) was used as an instrumental variable for glucose. Glucose was not associated with depressive symptoms in the standard linear regression (B = -0.04, 95% CI [-0.12, 0.04], p = .34), but the instrumental-variable regression showed an inverse association between glucose and depressive symptoms (B = -0.43, 95% CI [-0.79, -0.07], p = .020). The difference between the estimates of the standard linear regression and the instrumental-variable regression was significant (p = .026). Our results suggest that the association between T2D and depressive symptoms is unlikely to be caused by increased glucose concentrations. It seems possible that T2D might be linked to depressive symptoms due to low glucose levels.

  15. Instrument Variables for Reducing Noise in Parallel MRI Reconstruction

    Directory of Open Access Journals (Sweden)

    Yuchou Chang

    2017-01-01

    Full Text Available Generalized autocalibrating partially parallel acquisition (GRAPPA) has been a widely used parallel MRI technique. However, noise deteriorates the reconstructed image when the reduction factor increases, or even at low reduction factors for some noisy datasets. Noise, initially generated in the scanner, propagates noise-related errors through the fitting and interpolation procedures of GRAPPA, distorting the final reconstructed image quality. The basic idea we propose to improve GRAPPA is to remove noise from a system identification perspective. In this paper, we first analyze the GRAPPA noise problem from a noisy input-output system perspective; then, a new framework based on the errors-in-variables (EIV) model is developed for analyzing the noise generation mechanism in GRAPPA and designing a concrete method, instrument variables (IV) GRAPPA, to remove noise. The proposed EIV framework opens the possibility that noiseless GRAPPA reconstruction could be achieved by existing methods that solve the EIV problem other than the IV method. Experimental results show that the proposed reconstruction algorithm removes noise better than conventional GRAPPA, as validated with both phantom and in vivo brain data.

  16. Robust best linear estimation for regression analysis using surrogate and instrumental variables.

    Science.gov (United States)

    Wang, C Y

    2012-04-01

    We investigate methods for regression analysis when covariates are measured with errors. In a subset of the whole cohort, a surrogate variable is available for the true unobserved exposure variable. The surrogate variable satisfies the classical measurement error model, but it may not have repeated measurements. In addition to the surrogate variables that are available among the subjects in the calibration sample, we assume that there is an instrumental variable (IV) that is available for all study subjects. An IV is correlated with the unobserved true exposure variable and hence can be useful in the estimation of the regression coefficients. We propose a robust best linear estimator that uses all the available data, which is the most efficient among a class of consistent estimators. The proposed estimator is shown to be consistent and asymptotically normal under very weak distributional assumptions. For Poisson or linear regression, the proposed estimator is consistent even if the measurement error from the surrogate or IV is heteroscedastic. Finite-sample performance of the proposed estimator is examined and compared with other estimators via intensive simulation studies. The proposed method and other methods are applied to a bladder cancer case-control study.
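    One standard construction behind this kind of estimator: under the classical measurement error model, a surrogate W = X + error makes least squares of Y on W attenuated toward zero, while an instrument V (correlated with the true exposure X, with error independent of W's error) restores consistency. A hedged sketch of that attenuation-and-correction, not the paper's robust best linear estimator itself (all names and coefficients are invented):

    ```python
    import random

    random.seed(7)

    def cov(a, b):
        """Sample covariance of two equal-length lists."""
        ma, mb = sum(a) / len(a), sum(b) / len(b)
        return sum((p - ma) * (q - mb) for p, q in zip(a, b)) / len(a)

    n, beta = 50_000, 1.0
    x = [random.gauss(0, 1) for _ in range(n)]       # true exposure (unobserved)
    w = [xi + random.gauss(0, 1) for xi in x]        # surrogate with classical error
    v = [0.8 * xi + random.gauss(0, 1) for xi in x]  # instrument: its error is
                                                     # independent of w's error
    y = [beta * xi + random.gauss(0, 1) for xi in x]

    naive = cov(w, y) / cov(w, w)  # attenuated toward 0 (here roughly beta/2)
    iv = cov(v, y) / cov(v, w)     # consistent for beta
    print(f"naive {naive:.2f}  IV {iv:.2f}")
    ```

    The regression of Y on the error-prone surrogate shrinks the slope by the reliability ratio, while the IV ratio cancels the independent measurement errors.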

  17. Eutrophication Modeling Using Variable Chlorophyll Approach

    International Nuclear Information System (INIS)

    Abdolabadi, H.; Sarang, A.; Ardestani, M.; Mahjoobi, E.

    2016-01-01

    In this study, eutrophication was investigated in Lake Ontario to identify the interactions among effective drivers. The complexity of this phenomenon was modeled using a system dynamics approach based on a consideration of constant and variable stoichiometric ratios. The system dynamics approach is a powerful tool for developing object-oriented models to simulate complex phenomena that involve feedback effects. Utilizing stoichiometric ratios is a method for converting the concentrations of state variables. For the physical segmentation of the model, Lake Ontario was divided into two layers, i.e., the epilimnion and hypolimnion, and differential equations were developed for each layer. The model structure included 16 state variables related to phytoplankton, herbivorous zooplankton, carnivorous zooplankton, ammonium, nitrate, dissolved phosphorus, and particulate and dissolved carbon in the epilimnion and hypolimnion over a time horizon of one year. Several verification tests, a Nash-Sutcliffe coefficient close to 1 (0.98), a high data correlation coefficient (0.98), and low standard errors (0.96), indicated that the model performs well. The results revealed significant differences in the concentrations of the state variables between the constant and variable stoichiometry simulations. Consequently, the consideration of variable stoichiometric ratios in algae and nutrient concentration simulations may be applied in future modeling studies to enhance the accuracy of the results and reduce the likelihood of inefficient control policies.
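    The Nash-Sutcliffe coefficient used to verify the model compares the model's squared errors to the variance of the observations about their mean. A minimal sketch of the standard formula (the sample series is invented for illustration):

    ```python
    def nash_sutcliffe(observed, simulated):
        """Nash-Sutcliffe efficiency: 1 - SSE / total variance of the
        observations about their mean. 1.0 is a perfect fit; values at or
        below 0 mean the model is no better than predicting the mean."""
        mean_obs = sum(observed) / len(observed)
        sse = sum((o - s) ** 2 for o, s in zip(observed, simulated))
        sst = sum((o - mean_obs) ** 2 for o in observed)
        return 1 - sse / sst

    obs = [2.0, 3.5, 4.0, 5.5, 6.0]
    sim = [2.1, 3.4, 4.2, 5.3, 6.1]
    print(round(nash_sutcliffe(obs, sim), 3))  # -> 0.989
    ```

    Values near 1, like the 0.98 reported in the abstract, indicate that simulated concentrations track the observations closely.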

  18. [Severe idiopathic scoliosis. Does the approach and the instruments used modify the results?].

    Science.gov (United States)

    Sánchez-Márquez, J M; Sánchez Pérez-Grueso, F J; Pérez Martín-Buitrago, M; Fernández-Baíllo, N; García-Fernández, A; Quintáns-Rodríguez, J

    2014-01-01

    The aim of this work is to evaluate and compare the radiographic results and complications of the surgical treatment of adolescents with idiopathic scoliosis greater than 75 degrees, using a double approach (DA) or an isolated posterior approach with hybrid instruments (posterior hybrid [PH]), or with «all-pedicle screws» (posterior screws [PS]). A retrospective review was performed on 69 patients with idiopathic scoliosis greater than 75°, with a follow-up of more than 2 years, to analyze the flexibility of the curves, the correction obtained, and the complications depending on the type of surgery. The Kruskal-Wallis test for non-parametric variables was used for the statistical analysis. There were no statistically significant differences between the 3 patient groups in the pre-surgical Cobb angle values (DA=89°, PH=83°, PS=83°), in the immediate post-surgical (DA=34°, PH=33°, PS=30°), nor at the end of follow-up (DA=36°, PH=36°, PS=33°) (P>.05). The percentage correction (DA=60%, PH=57%, PS=60%) was similar between groups (P>.05). The percentage of complications associated with the procedure was 20.8% in DA, 10% in PH and 20% in PS. Two patients in the PS group showed changes, with no neurological lesions, in the spinal cord monitoring, and one patient in the same group suffered a delayed and transient incomplete lesion. No significant differences were observed in the correction of severe idiopathic scoliosis between patients operated using the double or isolated posterior approach, regardless of the type of instrumentation used. Copyright © 2013 SECOT. Published by Elsevier Espana. All rights reserved.

  19. Decoupling Solar Variability and Instrument Trends Using the Multiple Same-Irradiance-Level (MuSIL) Analysis Technique

    Science.gov (United States)

    Woods, Thomas N.; Eparvier, Francis G.; Harder, Jerald; Snow, Martin

    2018-05-01

    The solar spectral irradiance (SSI) dataset is a key record for studying and understanding the energetics and radiation balance in Earth's environment. Understanding the long-term variations of the SSI over timescales of the 11-year solar activity cycle and longer is critical for many Sun-Earth research topics. Satellite measurements of the SSI have been made since the 1970s, most of them in the ultraviolet, but recently also in the visible and near-infrared. A limiting factor for the accuracy of previous solar variability results is the uncertainty in the instrument degradation corrections, which are fairly large relative to the amount of solar cycle variability at some wavelengths. The primary objective of this investigation has been to separate solar cycle variability from any residual uncorrected instrumental trends in the SSI measurements from the Solar Radiation and Climate Experiment (SORCE) mission and the Thermosphere, Ionosphere, Mesosphere, Energetics, and Dynamics (TIMED) mission. A new technique called the Multiple Same-Irradiance-Level (MuSIL) analysis has been developed, which examines an SSI time series at different levels of solar activity to extract long-term trends in an SSI record; the most common result is a downward trend that most likely stems from uncorrected instrument degradation. This technique has been applied to each wavelength in the SSI records from SORCE (2003 - present) and TIMED (2002 - present) to provide new solar cycle variability results between 27 nm and 1600 nm with a resolution of about 1 nm at most wavelengths. The technique, which was validated against the highly accurate total solar irradiance (TSI) record, has an estimated relative uncertainty of about 5% of the measured solar cycle variability. The MuSIL results are further validated by comparing the new solar cycle variability results from different solar cycles.

  20. 8 years of Solar Spectral Irradiance Variability Observed from the ISS with the SOLAR/SOLSPEC Instrument

    Science.gov (United States)

    Damé, Luc; Bolsée, David; Meftah, Mustapha; Irbah, Abdenour; Hauchecorne, Alain; Bekki, Slimane; Pereira, Nuno; Cessateur, Gaël; Marchand, Marion; et al.

    2016-10-01

    Accurate measurements of Solar Spectral Irradiance (SSI) are of primary importance for a better understanding of solar physics and of the impact of solar variability on climate (via Earth's atmospheric photochemistry). The acquisition of a top-of-atmosphere reference solar spectrum and of its temporal and spectral variability during the unusual solar cycle 24 is of prime interest for these studies. These measurements have been performed since April 2008 with the SOLSPEC spectro-radiometer, from the far ultraviolet to the infrared (166 nm to 3088 nm). This instrument, developed under a fruitful LATMOS/BIRA-IASB collaboration, is part of the Solar Monitoring Observatory (SOLAR) payload, externally mounted on the Columbus module of the International Space Station (ISS). The SOLAR mission, now 8 years in duration, will cover almost the entire solar cycle 24. We present here the in-flight operations and performance of the SOLSPEC instrument, including the engineering corrections, calibrations and improved procedures for aging corrections. Accordingly, an SSI reference spectrum from the UV to the NIR will be presented, together with its variability in the UV, as measured by SOLAR/SOLSPEC over 8 years. Uncertainties on these measurements and comparisons with other instruments will be briefly discussed.

  1. Job demands and job strain as risk factors for employee wellbeing in elderly care: an instrumental-variables analysis.

    Science.gov (United States)

    Elovainio, Marko; Heponiemi, Tarja; Kuusio, Hannamaria; Jokela, Markus; Aalto, Anna-Mari; Pekkarinen, Laura; Noro, Anja; Finne-Soveri, Harriet; Kivimäki, Mika; Sinervo, Timo

    2015-02-01

    The association between psychosocial work environment and employee wellbeing has repeatedly been shown. However, as environmental evaluations have typically been self-reported, the observed associations may be attributable to reporting bias. Applying instrumental-variable regression, we used staffing level (the ratio of staff to residents) as an unconfounded instrument for self-reported job demands and job strain to predict various indicators of wellbeing (perceived stress, psychological distress and sleeping problems) among 1525 registered nurses, practical nurses and nursing assistants working in elderly care wards. In ordinary regression, higher self-reported job demands and job strain were associated with increased risk of perceived stress, psychological distress and sleeping problems. The effect estimates for the associations of these psychosocial factors with perceived stress and psychological distress were greater, but less precisely estimated, in an instrumental-variables analysis which took into account only the variation in self-reported job demands and job strain that was explained by staffing level. No association between psychosocial factors and sleeping problems was observed with the instrumental-variable analysis. These results support a causal interpretation of high self-reported job demands and job strain being risk factors for employee wellbeing. © The Author 2014. Published by Oxford University Press on behalf of the European Public Health Association. All rights reserved.

  2. Approaches of High School Instrumental Music Educators in Response to Student Challenges

    Science.gov (United States)

    Edgar, Scott N.

    2016-01-01

    The purpose of this multiple instrumental case study was to explore approaches of four high school instrumental music educators assuming the role of facilitative teacher in responding to challenges affecting the social and emotional well-being of their students. This study utilized the framework of social emotional learning as a lens to view the…

  3. A pilot's opinion - VTOL control design requirements for the instrument approach task.

    Science.gov (United States)

    Patton, J. M., Jr.

    1972-01-01

    This paper presents pilot opinion, supported by test data, concerning flight control and display concepts and control system design requirements for VTOL aircraft in the instrument approach task. The material presented is drawn from research flights in the following aircraft: Dornier DO-31, Short SC-1, LTV XC-142A, and Boeing-Vertol CH-46. The control system concepts and mechanizations employed in the above aircraft are discussed, and the effect of control system augmentation on performance is shown. Operational procedures required in the instrument approach task are described, with comments on the need for automation and the combining of control functions.

  4. Evaluating disease management programme effectiveness: an introduction to instrumental variables.

    Science.gov (United States)

    Linden, Ariel; Adams, John L

    2006-04-01

    This paper introduces the concept of instrumental variables (IVs) as a means of providing an unbiased estimate of treatment effects in evaluating disease management (DM) programme effectiveness. Model development is described using zip codes as the IV. Three diabetes DM outcomes were evaluated: annual diabetes costs, emergency department (ED) visits and hospital days. Both ordinary least squares (OLS) and IV estimates showed a significant treatment effect for diabetes costs (P = 0.011) but neither model produced a significant treatment effect for ED visits. However, the IV estimate showed a significant treatment effect for hospital days (P = 0.006) whereas the OLS model did not. These results illustrate the utility of IV estimation when the OLS model is sensitive to the confounding effect of hidden bias.

  5. Institution, Financial Sector, and Economic Growth: Use The Institutions As An Instrument Variable

    OpenAIRE

    Albertus Girik Allo

    2016-01-01

    Institutions have been found to play an indirect role in economic growth. This paper aims to evaluate whether the quality of institutions matters for economic growth. Applying institutions as an instrumental variable for Foreign Direct Investment (FDI), the quality of institutions significantly influences economic growth. This study applies two data periods, namely 1985-2013 and 2000-2013, available online from the World Bank (WB). The first data set, 1985-2013, is used to estimate the role of fin...

  6. Health insurance and the demand for medical care: Instrumental variable estimates using health insurer claims data.

    Science.gov (United States)

    Dunn, Abe

    2016-07-01

    This paper takes a different approach to estimating demand for medical care that uses the negotiated prices between insurers and providers as an instrument. The instrument is viewed as a textbook "cost shifting" instrument that impacts plan offerings, but is unobserved by consumers. The paper finds a price elasticity of demand of around -0.20, matching the elasticity found in the RAND Health Insurance Experiment. The paper also studies within-market variation in demand for prescription drugs and other medical care services and obtains comparable price elasticity estimates. Published by Elsevier B.V.

  7. Disability as deprivation of capabilities: Estimation using a large-scale survey in Morocco and Tunisia and an instrumental variable approach.

    Science.gov (United States)

    Trani, Jean-Francois; Bakhshi, Parul; Brown, Derek; Lopez, Dominique; Gall, Fiona

    2018-05-25

    The capability approach pioneered by Amartya Sen and Martha Nussbaum offers a new paradigm to examine disability, poverty and their complex associations. Disability is hence defined as a situation in which a person with an impairment faces various forms of restrictions in functionings and capabilities. Additionally, poverty is not the mere absence of income but a lack of ability to achieve essential functionings; disability is consequently the poverty of capabilities of persons with impairment. It is the lack of opportunities in a given context and agency that leads to persons with disabilities being poorer than other social groups. Consequently, the poverty of people with disabilities comprises complex processes of social exclusion and disempowerment. Despite growing evidence that persons with disabilities face higher levels of poverty, the literature from low- and middle-income countries that analyzes the causal link between disability and poverty remains limited. Drawing on data from a large case-control field survey carried out between December 24th, 2013 and February 16th, 2014 in Tunisia and between November 4th, 2013 and June 12th, 2014 in Morocco, we examined the effect of impairment on various basic capabilities, health-related quality of life and multidimensional poverty (indicators of poor wellbeing) in Morocco and Tunisia. To demonstrate a causal link between impairment and deprivation of capabilities, we used instrumental variable regression analyses. In both countries, we found lower access to jobs for persons with impairment. Health-related quality of life was also lower for this group, who also faced a higher risk of multidimensional poverty. There was no significant direct effect of impairment on access to school or acquiring literacy in either country, or on access to health care and expenses in Tunisia, while having an impairment reduced access to healthcare facilities and out-of-pocket expenditures in Morocco. These results suggest that

  8. Gene Variants Associated with Antisocial Behaviour: A Latent Variable Approach

    Science.gov (United States)

    Bentley, Mary Jane; Lin, Haiqun; Fernandez, Thomas V.; Lee, Maria; Yrigollen, Carolyn M.; Pakstis, Andrew J.; Katsovich, Liliya; Olds, David L.; Grigorenko, Elena L.; Leckman, James F.

    2013-01-01

    Objective: The aim of this study was to determine if a latent variable approach might be useful in identifying shared variance across genetic risk alleles that is associated with antisocial behaviour at age 15 years. Methods: Using a conventional latent variable approach, we derived an antisocial phenotype in 328 adolescents utilizing data from a…

  9. DESIGNING AFFECTIVE INSTRUMENT BASED ON SCIENTIFIC APPROACH FOR ENGLISH LANGUAGE LEARNING

    Directory of Open Access Journals (Sweden)

    Maisarah Ira

    2018-01-01

    Full Text Available This research describes the design of an instrument for affective assessment in English language teaching. The design focused only on the observation sheet to be used by English teachers during the teaching and learning process. The instrument was designed on the basis of the scientific approach, which has five stages, namely observing, questioning, experimenting, associating, and communicating. In the design process, the ADDIE model was used as the research method. The design considered the gap between current practice and the teachers’ needs. The result shows that the design also attends to the affective taxonomy of receiving, responding, valuing, organization, and characterization. Three key phrases are used as indicators of the five levels of the affective taxonomy: seriously, volunteer, and without being asked by the teacher. Furthermore, eighteen affective traits, namely religiosity, honesty, responsibility, discipline, hard work, self-confidence, logical thinking, critical thinking, creativity, innovativeness, independence, curiosity, love of knowledge, respect, politeness, democracy, emotional intelligence, and pluralism, were mapped onto each stage of the scientific approach. It is hoped that the instrument can be implemented in all contexts of English language teaching at schools and can assess students’ affective development comprehensively.

  10. Pilot dynamics for instrument approach tasks: Full panel multiloop and flight director operations

    Science.gov (United States)

    Weir, D. H.; Mcruer, D. T.

    1972-01-01

    Measurements and interpretations of single and multiloop pilot response properties during simulated instrument approach are presented. Pilot subjects flew Category 2-like ILS approaches in a fixed-base DC-8 simulation. A conventional instrument panel and controls were used, with simulated vertical gust and glide slope beam bend forcing functions. Reduced and interpreted pilot describing functions and remnant are given for pitch attitude, flight director, and multiloop (longitudinal) control tasks. The response data are correlated with simultaneously recorded eye scanning statistics, previously reported in NASA CR-1535. The resulting combined response and scanning data and their interpretations provide a basis for validating and extending the theory of manual control displays.

  11. Institution, Financial Sector, and Economic Growth: Use The Institutions As An Instrument Variable

    Directory of Open Access Journals (Sweden)

    Albertus Girik Allo

    2016-06-01

    Full Text Available Institutions have been found to play an indirect role in economic growth. This paper aims to evaluate whether the quality of institutions matters for economic growth. Applying institutions as an instrumental variable for Foreign Direct Investment (FDI), the quality of institutions significantly influences economic growth. This study applies two data periods, namely 1985-2013 and 2000-2013, available online from the World Bank (WB). The first data set, 1985-2013, is used to estimate the role of the financial sector in economic growth, and focuses on 67 countries. The second data set, 2000-2013, determines the role of institutions in the financial sector and economic growth by applying the 2SLS estimation method. We define the institutional variables as a set of indicators, Control of Corruption, Political Stability and Absence of Violence, and Voice and Accountability; these indicators show a declining impact of FDI on economic growth.
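    The 2SLS procedure mentioned in the abstract can be sketched as two explicit regressions: the first stage regresses the endogenous regressor on the instrument, and the second stage regresses the outcome on the fitted values. A toy simulation follows (the variable names echo the abstract, but all data and coefficients are invented; the actual study uses several institutional indicators and covariates):

    ```python
    import random

    random.seed(1)

    def slope_intercept(x, y):
        """Simple least-squares fit of y on x; returns (slope, intercept)."""
        mx, my = sum(x) / len(x), sum(y) / len(y)
        b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
        return b, my - b * mx

    n, true_effect = 20_000, 0.3
    inst_quality = [random.gauss(0, 1) for _ in range(n)]  # instrument
    shock = [random.gauss(0, 1) for _ in range(n)]         # unobserved shock
    fdi = [0.7 * q + s + random.gauss(0, 1)
           for q, s in zip(inst_quality, shock)]           # endogenous regressor
    growth = [true_effect * f + s + random.gauss(0, 1)
              for f, s in zip(fdi, shock)]                 # outcome

    # Stage 1: regress the endogenous regressor on the instrument.
    b1, a1 = slope_intercept(inst_quality, fdi)
    fdi_hat = [a1 + b1 * q for q in inst_quality]
    # Stage 2: regress the outcome on the fitted values.
    b2, _ = slope_intercept(fdi_hat, growth)
    print(f"2SLS estimate {b2:.2f} (true {true_effect})")
    ```

    Because the fitted values contain only instrument-driven variation, the second-stage slope is purged of the correlation between FDI and the unobserved shock.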

  12. An Instrumental Variable Probit (IVP Analysis on Depressed Mood in Korea: The Impact of Gender Differences and Other Socio-Economic Factors

    Directory of Open Access Journals (Sweden)

    Lara Gitto

    2015-08-01

    Full Text Available Background Depression is a mental health state whose frequency has been increasing in modern societies. It imposes a great burden, because of its strong impact on people’s quality of life and happiness. Depression can be reliably diagnosed and treated in primary care: if more people could get effective treatment earlier, the costs related to depression would be reversed. The aim of this study was to examine the influence of socio-economic factors and gender on depressed mood, focusing on Korea. In fact, in spite of the great number of empirical studies carried out for other countries, few epidemiological studies have examined the socio-economic determinants of depression in Korea, and they were either limited to samples of employed women or did not control for individual health status. Moreover, as likely data endogeneity (i.e. the possibility of correlation between the dependent variable and the error term as a result of autocorrelation or simultaneity; in this case, depressed mood due to health factors that, in turn, might be caused by depression) might bias the results, the present study proposes an empirical approach, based on instrumental variables, to deal with this problem. Methods Data for the year 2008 from the Korea National Health and Nutrition Examination Survey (KNHANES) were employed. About seven thousand people (N = 6,751, of which 43% were males and 57% females), aged from 19 to 75 years old, were included in the sample considered in the analysis. In order to take into account the possible endogeneity of some explanatory variables, two Instrumental Variables Probit (IVP) regressions were estimated; the variables for which instrumental equations were estimated were related to the participation of women in the workforce and to good health, as reported by people in the sample. Explanatory variables were related to age, gender, family factors (such as the number of family members and marital status) and socio

  13. An Instrumental Variable Probit (IVP) analysis on depressed mood in Korea: the impact of gender differences and other socio-economic factors.

    Science.gov (United States)

    Gitto, Lara; Noh, Yong-Hwan; Andrés, Antonio Rodríguez

    2015-04-16

    Depression is a mental health state whose frequency has been increasing in modern societies. It imposes a great burden, because of its strong impact on people's quality of life and happiness. Depression can be reliably diagnosed and treated in primary care: if more people could get effective treatment earlier, the costs related to depression would be reversed. The aim of this study was to examine the influence of socio-economic factors and gender on depressed mood, focusing on Korea. In fact, in spite of the great number of empirical studies carried out for other countries, few epidemiological studies have examined the socio-economic determinants of depression in Korea, and they were either limited to samples of employed women or did not control for individual health status. Moreover, as likely data endogeneity (i.e. the possibility of correlation between the dependent variable and the error term as a result of autocorrelation or simultaneity; in this case, depressed mood due to health factors that, in turn, might be caused by depression) might bias the results, the present study proposes an empirical approach, based on instrumental variables, to deal with this problem. Data for the year 2008 from the Korea National Health and Nutrition Examination Survey (KNHANES) were employed. About seven thousand people (N = 6,751, of which 43% were males and 57% females), aged from 19 to 75 years old, were included in the sample considered in the analysis. In order to take into account the possible endogeneity of some explanatory variables, two Instrumental Variables Probit (IVP) regressions were estimated; the variables for which instrumental equations were estimated were related to the participation of women in the workforce and to good health, as reported by people in the sample. Explanatory variables were related to age, gender, family factors (such as the number of family members and marital status) and socio-economic factors (such as education

  14. ExoMars Trace Gas Orbiter Instrument Modelling Approach to Streamline Science Operations

    Science.gov (United States)

    Munoz Fernandez, Michela; Frew, David; Ashman, Michael; Cardesin Moinelo, Alejandro; Garcia Beteta, Juan Jose; Geiger, Bernhard; Metcalfe, Leo; Nespoli, Federico; Muniz Solaz, Carlos

    2018-05-01

    ExoMars Trace Gas Orbiter (TGO) science operations activities are centralised at ESAC's Science Operations Centre (SOC). The SOC receives the inputs from the principal investigators (PIs) in order to implement and deliver the spacecraft pointing requests and instrument timelines to the Mission Operations Centre (MOC). The high number of orbits per planning cycle has made it necessary to abstract the planning interactions between the SOC and the PI teams at the observation level. This paper describes the modelling approach we have applied to TGO's instruments to streamline science operations. We have created dynamic observation types that scale to adapt to the conditions specified by the PI teams, including observation timing, and pointing block parameters calculated from observation geometry. This approach is considered an improvement with respect to previous missions, where the generation of the observation pointing and commanding requests was performed manually by the instrument teams. Automation software helps us to effectively handle the high density of planned orbits with an increasing volume of scientific data and to successfully meet opportunistic scientific goals and objectives. Our planning tool combines the instrument observation definition files provided by the PIs with the flight dynamics products to generate the Pointing Requests and the instrument timeline (ITL). The ITL contains all the validated commands at the TC-sequence level and computes the resource envelopes (data rate, power, data volume) within the constraints. At the SOC, our main goal is to maximise the science output while minimising the number of iterations among the teams, ensuring that the timeline does not violate the state transitions allowed in the Mission Operations Rules and Constraints Document.

  15. Gene variants associated with antisocial behaviour: a latent variable approach.

    Science.gov (United States)

    Bentley, Mary Jane; Lin, Haiqun; Fernandez, Thomas V; Lee, Maria; Yrigollen, Carolyn M; Pakstis, Andrew J; Katsovich, Liliya; Olds, David L; Grigorenko, Elena L; Leckman, James F

    2013-10-01

    The aim of this study was to determine if a latent variable approach might be useful in identifying shared variance across genetic risk alleles that is associated with antisocial behaviour at age 15 years. Using a conventional latent variable approach, we derived an antisocial phenotype in 328 adolescents utilizing data from a 15-year follow-up of a randomized trial of a prenatal and infancy nurse-home visitation programme in Elmira, New York. We then investigated, via a novel latent variable approach, 450 informative genetic polymorphisms in 71 genes previously associated with antisocial behaviour, drug use, affiliative behaviours and stress response in 241 consenting individuals for whom DNA was available. Haplotype and Pathway analyses were also performed. Eight single-nucleotide polymorphisms (SNPs) from eight genes contributed to the latent genetic variable that in turn accounted for 16.0% of the variance within the latent antisocial phenotype. The number of risk alleles was linearly related to the latent antisocial variable scores. Haplotypes that included the putative risk alleles for all eight genes were also associated with higher latent antisocial variable scores. In addition, 33 SNPs from 63 of the remaining genes were also significant when added to the final model. Many of these genes interact on a molecular level, forming molecular networks. The results support a role for genes related to dopamine, norepinephrine, serotonin, glutamate, opioid and cholinergic signalling as well as stress response pathways in mediating susceptibility to antisocial behaviour. This preliminary study supports use of relevant behavioural indicators and latent variable approaches to study the potential 'co-action' of gene variants associated with antisocial behaviour. It also underscores the cumulative relevance of common genetic variants for understanding the aetiology of complex behaviour. If replicated in future studies, this approach may allow the identification of a

  16. One-stage posterior approaches for treatment of thoracic spinal infection: Transforaminal and costotransversectomy, compared with anterior approach with posterior instrumentation.

    Science.gov (United States)

    Kao, Fu-Cheng; Tsai, Tsung-Ting; Niu, Chi-Chien; Lai, Po-Liang; Chen, Lih-Huei; Chen, Wen-Jer

    2017-10-01

    Treating thoracic infective spondylodiscitis with anterior surgical approaches carries a relatively high risk of perioperative and postoperative complications. Posterior approaches have been reported to result in lower complication rates than anterior procedures, but more evidence is needed to demonstrate the safety and efficacy of one-stage posterior approaches for treating infectious thoracic spondylodiscitis. Preoperative and postoperative clinical data of 18 patients who underwent two types of one-stage posterior procedures (costotransversectomy, and transforaminal thoracic interbody debridement and fusion) and of 7 patients who underwent anterior debridement and reconstruction with posterior instrumentation were retrospectively assessed. The clinical outcomes of patients treated with one-stage posterior approaches were generally good, with good infection control, back pain relief, kyphotic angle correction, and either partial or solid union for fusion status. Furthermore, these patients had shorter surgical times, fewer postoperative complications, and shorter hospital stays than the patients who underwent anterior debridement with posterior instrumentation. The results suggest that treating thoracic spondylodiscitis with a single-stage posterior approach might prevent postoperative complications and avoid the respiratory problems associated with anterior approaches. Single-stage posterior approaches would therefore be recommended for thoracic spine infection, especially for patients with medical comorbidities.

  17. Modern spinal instrumentation. Part 2: Multimodality imaging approach for assessment of complications

    International Nuclear Information System (INIS)

    Allouni, A.K.; Davis, W.; Mankad, K.; Rankine, J.; Davagnanam, I.

    2013-01-01

    Radiologists frequently encounter studies demonstrating spinal instrumentation, either as part of the patient's postoperative evaluation, or as incidental to a study performed for another purpose. It is important for the reporting radiologist to identify potential complications of commonly used spinal implants. Part 1 of this review examined both the surgical approaches used and the normal appearances of these spinal implants and bone grafting techniques. This second part of the review will focus on the multimodal imaging strategy adopted in the assessment of the instrumented spine and the demonstration of imaging findings of common postoperative complications.

  18. Solving the Omitted Variables Problem of Regression Analysis Using the Relative Vertical Position of Observations

    Directory of Open Access Journals (Sweden)

    Jonathan E. Leightner

    2012-01-01

    The omitted variables problem is one of regression analysis’ most serious problems. The standard approach to the omitted variables problem is to find instruments, or proxies, for the omitted variables, but this approach makes strong assumptions that are rarely met in practice. This paper introduces best projection reiterative truncated projected least squares (BP-RTPLS), the third generation of a technique that solves the omitted variables problem without using proxies or instruments. This paper presents a theoretical argument that BP-RTPLS produces unbiased reduced-form estimates when there are omitted variables. It also provides simulation evidence showing that OLS produces between 250% and 2450% more errors than BP-RTPLS when there are omitted variables and when measurement and round-off error is 1 percent or less. In an example, the government spending multiplier is estimated using annual data for the USA between 1929 and 2010.

  19. Variable Stars in the Field of V729 Aql

    Science.gov (United States)

    Cagaš, P.

    2017-04-01

    Wide-field instruments can be used to acquire light curves of tens or even hundreds of variable stars per night, which increases the probability of new discoveries of interesting variable stars and generally increases the efficiency of observations. At the same time, wide-field instruments produce a large amount of data, which must be processed using advanced software. The traditional approach, typically used by amateur astronomers, requires an unacceptable amount of time to process each data set. New functionality built into the SIPS software package can shorten the time needed to obtain light curves by several orders of magnitude. The newly introduced SILICUPS software is intended for post-processing of stored light curves: it can be used to visualize observations from many nights, to find variable star periods, to evaluate types of variability, etc. This work provides an overview of the tools used to process data from the large field of view around the variable star V729 Aql and demonstrates the results.

  20. Approaches for modeling within subject variability in pharmacometric count data analysis: dynamic inter-occasion variability and stochastic differential equations.

    Science.gov (United States)

    Deng, Chenhui; Plan, Elodie L; Karlsson, Mats O

    2016-06-01

    Parameter variation in pharmacometric analysis studies can be characterized as within subject parameter variability (WSV) in pharmacometric models. WSV has previously been successfully modeled using inter-occasion variability (IOV), but also stochastic differential equations (SDEs). In this study, two approaches, dynamic inter-occasion variability (dIOV) and adapted stochastic differential equations, were proposed to investigate WSV in pharmacometric count data analysis. These approaches were applied to published count models for seizure counts and Likert pain scores. Both approaches improved the model fits significantly. In addition, stochastic simulation and estimation were used to explore further the capability of the two approaches to diagnose and improve models where existing WSV is not recognized. The results of simulations confirmed the gain in introducing WSV as dIOV and SDEs when parameters vary randomly over time. Further, the approaches were also informative as diagnostics of model misspecification, when parameters changed systematically over time but this was not recognized in the structural model. The proposed approaches in this study offer strategies to characterize WSV and are not restricted to count data.

  1. Using and Developing Measurement Instruments in Science Education: A Rasch Modeling Approach. Science & Engineering Education Sources

    Science.gov (United States)

    Liu, Xiufeng

    2010-01-01

    This book meets a demand in the science education community for a comprehensive and introductory measurement book in science education. It describes measurement instruments reported in refereed science education research journals, and introduces the Rasch modeling approach to developing measurement instruments in common science assessment domains,…

  2. Control approach development for variable recruitment artificial muscles

    Science.gov (United States)

    Jenkins, Tyler E.; Chapman, Edward M.; Bryant, Matthew

    2016-04-01

    This study characterizes hybrid control approaches for the variable recruitment of fluidic artificial muscles with double acting (antagonistic) actuation. Fluidic artificial muscle actuators have been explored by researchers due to their natural compliance, high force-to-weight ratio, and low cost of fabrication. Previous studies have attempted to improve system efficiency of the actuators through variable recruitment, i.e. using discrete changes in the number of active actuators. While current variable recruitment research utilizes manual valve switching, this paper details the current development of an online variable recruitment control scheme. By continuously controlling applied pressure and discretely controlling the number of active actuators, operation in the lowest possible recruitment state is ensured and working fluid consumption is minimized. Results provide insight into switching control scheme effects on working fluids, fabrication material choices, actuator modeling, and controller development decisions.
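
    The recruitment logic described above (discrete choice of how many actuators are active, continuous control of applied pressure) can be sketched in a few lines. The following controller is an illustrative assumption, not the authors' implementation: it models identical actuators whose force is proportional to pressure, and picks the lowest recruitment state that covers the force demand.

```python
import math

def recruit(force_demand, n_actuators=4, f_max=100.0, p_max=500.0):
    """Hybrid variable-recruitment controller (illustrative model: identical
    actuators, force proportional to pressure). Returns the lowest recruitment
    state able to meet the demand and the continuous pressure command for the
    active actuators."""
    n_active = max(1, math.ceil(force_demand / f_max))
    if n_active > n_actuators:
        raise ValueError("demand exceeds the capacity of the actuator bank")
    # Continuous pressure command shared by the active actuators.
    pressure = p_max * force_demand / (n_active * f_max)
    return n_active, pressure
```

For example, `recruit(150.0)` activates two actuators at partial pressure rather than driving one actuator beyond its maximum, which is the sense in which operation stays in the lowest feasible recruitment state and fluid consumption is minimized.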

  3. Virtual instrumentation: a new approach for control and instrumentation - application in containment studies facility

    International Nuclear Information System (INIS)

    Gole, N.V.; Shanware, V.M.; Sebastian, A.; Subramaniam, K.

    2001-01-01

    PC based data-acquisition has emerged as a rapidly developing area particularly with respect to process instrumentation. Computer based data acquisition in process instrumentation combined with Supervisory Control and Data Acquisition (SCADA) software has introduced extensive possibilities with respect to formats for presentation of information. The concept of presenting data using any instrument format with the help of software tools to simulate the instrument on screen, needs to be understood, in order to be able to make use of its vast potential. The purpose of this paper is to present the significant features of the Virtual Instrumentation concept and discuss its application in the instrumentation and control system of containment studies facility (CSF). Factors involved in the development of the virtual instrumentation based I and C system for CSF are detailed and a functional overview of the system configuration is given. (author)

  4. A novel variable selection approach that iteratively optimizes variable space using weighted binary matrix sampling.

    Science.gov (United States)

    Deng, Bai-chuan; Yun, Yong-huan; Liang, Yi-zeng; Yi, Lun-zhao

    2014-10-07

    In this study, a new optimization algorithm called the Variable Iterative Space Shrinkage Approach (VISSA) that is based on the idea of model population analysis (MPA) is proposed for variable selection. Unlike most of the existing optimization methods for variable selection, VISSA statistically evaluates the performance of variable space in each step of optimization. Weighted binary matrix sampling (WBMS) is proposed to generate sub-models that span the variable subspace. Two rules are highlighted during the optimization procedure. First, the variable space shrinks in each step. Second, the new variable space outperforms the previous one. The second rule, which is rarely satisfied in most of the existing methods, is the core of the VISSA strategy. Compared with some promising variable selection methods such as competitive adaptive reweighted sampling (CARS), Monte Carlo uninformative variable elimination (MCUVE) and iteratively retaining informative variables (IRIV), VISSA showed better prediction ability for the calibration of NIR data. In addition, VISSA is user-friendly; only a few insensitive parameters are needed, and the program terminates automatically without any additional conditions. The Matlab codes for implementing VISSA are freely available on the website: https://sourceforge.net/projects/multivariateanalysis/files/VISSA/.
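
    The core loop (sample sub-models with a weighted binary matrix, score them, shrink the variable space toward the better-performing subspace) can be sketched as follows. This is a simplified illustration, not the authors' VISSA code: the toy data, the BIC-style scoring, and all parameter values are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy calibration data: only the first 3 of 20 variables are informative.
n, p = 100, 20
X = rng.normal(size=(n, p))
y = X[:, 0] + X[:, 1] - X[:, 2] + 0.1 * rng.normal(size=n)

def score(cols):
    """BIC-style score of an OLS fit restricted to one sampled sub-model
    (lower is better); the penalty discourages uninformative variables."""
    k = int(cols.sum())
    if k == 0:
        return np.inf
    Xs = X[:, cols]
    beta = np.linalg.lstsq(Xs, y, rcond=None)[0]
    mse = float(np.mean((y - Xs @ beta) ** 2))
    return n * np.log(mse) + k * np.log(n)

weights = np.full(p, 0.5)                  # inclusion probabilities
for _ in range(10):                        # iteratively shrink the space
    M = rng.random((500, p)) < weights     # weighted binary sampling matrix
    scores = np.array([score(row) for row in M])
    best = M[np.argsort(scores)[:50]]      # keep the best 10% of sub-models
    weights = best.mean(axis=0)            # new inclusion frequencies
```

After a few iterations the inclusion weights of the informative variables approach 1 while the noise variables fade, which mirrors the two rules highlighted in the abstract: the space shrinks, and each new space outperforms the previous one.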

  5. 26 CFR 1.1275-5 - Variable rate debt instruments.

    Science.gov (United States)

    2010-04-01

    ... nonpublicly traded property. A debt instrument (other than a tax-exempt obligation) that would otherwise... variations in the cost of newly borrowed funds in the currency in which the debt instrument is denominated... on the yield of actively traded personal property (within the meaning of section 1092(d)(1)). (ii...

  6. Confirming theoretical pay constructs of a variable pay scheme

    Directory of Open Access Journals (Sweden)

    Sibangilizwe Ncube

    2013-05-01

    Orientation: Return on the investment in variable pay programmes remains controversial because their cost versus contribution cannot be empirically justified. Research purpose: This study validates the findings of the model developed by De Swardt on the factors related to successful variable pay programmes. Motivation for the study: Many organisations blindly implement variable pay programmes without any means to assess the impact these programmes have on the company’s performance. This study was necessary to validate the findings of an existing instrument that validates the contribution of variable pay schemes. Research design, approach and method: The study was conducted using quantitative research. A total of 300 completed questionnaires from a non-purposive sample of 3000 participants in schemes across all South African industries were returned and analysed. Main findings: Using exploratory and confirmatory factor analysis, it was found that the validation instrument developed by De Swardt is still largely valid in evaluating variable pay schemes. The differences between the study and the model were reported. Practical/managerial implications: The study confirmed the robustness of an existing model that enables practitioners to empirically validate the use of variable pay plans. This model assists in the design and implementation of variable pay programmes that meet critical success factors. Contribution/value-add: The study contributed to the development of a measurement instrument that will assess whether a variable pay plan contributes to an organisation’s success.

  7. Instrumental variable estimation of the causal effect of plasma 25-hydroxy-vitamin D on colorectal cancer risk: a mendelian randomization analysis.

    Directory of Open Access Journals (Sweden)

    Evropi Theodoratou

    Vitamin D deficiency has been associated with several common diseases, including cancer, and is being investigated as a possible risk factor for these conditions. We reported the striking prevalence of vitamin D deficiency in Scotland. Previous epidemiological studies have reported an association between low dietary vitamin D and colorectal cancer (CRC). Using a case-control study design, we tested the association between plasma 25-hydroxy-vitamin D (25-OHD) and CRC (2,001 cases, 2,237 controls). To determine whether plasma 25-OHD levels are causally linked to CRC risk, we applied the control function instrumental variable (IV) method of the mendelian randomization (MR) approach using four single nucleotide polymorphisms (rs2282679, rs12785878, rs10741657, rs6013897) previously shown to be associated with plasma 25-OHD. Low plasma 25-OHD levels were associated with CRC risk in the crude model (odds ratio (OR): 0.76, 95% confidence interval (CI): 0.71, 0.81, p: 1.4×10⁻¹⁴) and after adjusting for age, sex and other confounding factors. Using an allele score that combined all four SNPs as the IV, the estimated causal effect was OR 1.16 (95% CI 0.60, 2.23), whilst it was 0.94 (95% CI 0.46, 1.91) and 0.93 (95% CI 0.53, 1.63) when using an upstream (rs12785878, rs10741657) and a downstream (rs2282679, rs6013897) allele score, respectively. 25-OHD levels were inversely associated with CRC risk, in agreement with recent meta-analyses. The fact that this finding was not replicated when the MR approach was employed might be due to weak instruments, giving low power to demonstrate an effect (<0.35). The prevalence and degree of vitamin D deficiency amongst individuals living in northerly latitudes is of considerable importance because of its relationship to disease. To elucidate the effect of vitamin D on CRC cancer risk, additional large studies of vitamin D and CRC risk are required and/or the application of alternative methods that are less sensitive to weak instrument

  8. Breastfeeding and the risk of childhood asthma: A two-stage instrumental variable analysis to address endogeneity.

    Science.gov (United States)

    Sharma, Nivita D

    2017-09-01

    Several explanations for the inconsistent results on the effects of breastfeeding on childhood asthma have been suggested. The purpose of this study was to investigate one unexplored explanation: the presence of a potential endogenous relationship between breastfeeding and childhood asthma. Endogeneity exists when an explanatory variable is correlated with the error term for reasons such as selection bias, reverse causality, and unmeasured confounders. Unadjusted endogeneity will bias the estimated effect of breastfeeding on childhood asthma. To investigate potential endogeneity, a cross-sectional study of breastfeeding practices and incidence of childhood asthma in 87 pediatric patients in Georgia, USA, was conducted using generalized linear modeling and a two-stage instrumental variable analysis. First, the relationship between breastfeeding and childhood asthma was analyzed without considering endogeneity. Second, tests for the presence of endogeneity were performed and, having detected endogeneity between breastfeeding and childhood asthma, a two-stage instrumental variable analysis was performed. The first stage of this analysis estimated the duration of breastfeeding and the second stage estimated the risk of childhood asthma. When endogeneity was not taken into account, duration of breastfeeding was found to significantly increase the risk of childhood asthma (relative risk ratio [RR] = 2.020, 95% confidence interval [CI]: [1.143-3.570]). After adjusting for endogeneity, duration of breastfeeding significantly reduced the risk of childhood asthma (RR = 0.003, 95% CI: [0.000-0.240]). The findings suggest that researchers should consider evaluating how the presence of endogeneity could affect the relationship between duration of breastfeeding and the risk of childhood asthma. © 2017 EAACI and John Wiley and Sons A/S. Published by John Wiley and Sons Ltd.
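
    The two-stage logic used in studies like this one is generic enough to sketch with simulated data. The toy model below is not the authors' analysis; the instrument, effect sizes, and sample size are invented to show how a valid instrument recovers a causal effect that naive OLS misestimates under endogeneity.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# Simulated endogenous setup: an unobserved confounder u drives both the
# exposure x (think: breastfeeding duration) and the outcome y.
u = rng.normal(size=n)
z = rng.normal(size=n)                        # instrument: shifts x, not y
x = 0.8 * z + 0.9 * u + rng.normal(size=n)
y = 0.5 * x - 1.2 * u + rng.normal(size=n)    # true causal effect: 0.5

def fit(X, t):
    """Least-squares coefficients of t on the columns of X."""
    return np.linalg.lstsq(X, t, rcond=None)[0]

ones = np.ones(n)
beta_ols = fit(np.column_stack([ones, x]), y)[1]   # confounded estimate

# Stage 1: regress the exposure on the instrument, keep fitted values.
x_hat = np.column_stack([ones, z]) @ fit(np.column_stack([ones, z]), x)
# Stage 2: regress the outcome on the stage-1 fitted values.
beta_2sls = fit(np.column_stack([ones, x_hat]), y)[1]
```

With the confounder present, `beta_ols` is badly biased, while `beta_2sls` lands near the true effect of 0.5; the same mechanism explains how the sign of the breastfeeding effect can flip once endogeneity is adjusted for.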

  9. Spectral Kernel Approach to Study Radiative Response of Climate Variables and Interannual Variability of Reflected Solar Spectrum

    Science.gov (United States)

    Jin, Zhonghai; Wielicki, Bruce A.; Loukachine, Constantin; Charlock, Thomas P.; Young, David; Noeel, Stefan

    2011-01-01

    The radiative kernel approach provides a simple way to separate the radiative response to different climate parameters and to decompose the feedback into radiative and climate response components. Using CERES/MODIS/Geostationary data, we calculated and analyzed the solar spectral reflectance kernels for various climate parameters on zonal, regional, and global spatial scales. The kernel linearity is tested. Errors in the kernel due to nonlinearity can vary strongly depending on climate parameter, wavelength, surface, and solar elevation; they are large in some absorption bands for some parameters but are negligible in most conditions. The spectral kernels are used to calculate the radiative responses to different climate parameter changes in different latitudes. The results show that the radiative response in high latitudes is sensitive to the coverage of snow and sea ice. The radiative response in low latitudes is contributed mainly by cloud property changes, especially cloud fraction and optical depth. The large cloud height effect is confined to absorption bands, while the cloud particle size effect is found mainly in the near infrared. The kernel approach, which is based on calculations using CERES retrievals, is then tested by direct comparison with spectral measurements from Scanning Imaging Absorption Spectrometer for Atmospheric Cartography (SCIAMACHY) (a different instrument on a different spacecraft). The monthly mean interannual variability of spectral reflectance based on the kernel technique is consistent with satellite observations over the ocean, but not over land, where both model and data have large uncertainty. RMS errors in kernel-derived monthly global mean reflectance over the ocean compared to observations are about 0.001, and the sampling error is likely a major component.
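
    The kernel idea itself (partial derivatives of a radiative quantity with respect to each climate parameter, used to linearly decompose a combined response) can be shown with a toy model. The reflectance function and all numbers below are invented for illustration; they are not the CERES kernels.

```python
import numpy as np

# Toy "radiative model": reflectance as a nonlinear function of two climate
# parameters, cloud fraction f and cloud optical depth tau (invented form).
def reflectance(f, tau):
    return 0.08 + 0.6 * f * (1.0 - np.exp(-tau / 10.0))

f0, tau0 = 0.5, 5.0                    # base climate state

# Finite-difference kernels: the partial radiative response per unit change
# in each parameter around the base state.
df, dtau = 0.01, 0.1
K_f = (reflectance(f0 + df, tau0) - reflectance(f0, tau0)) / df
K_tau = (reflectance(f0, tau0 + dtau) - reflectance(f0, tau0)) / dtau

# Kernel (linear) decomposition of a combined perturbation, checked against
# the directly computed response; the gap is the nonlinearity error.
delta_f, delta_tau = 0.05, 0.5
linear = K_f * delta_f + K_tau * delta_tau
direct = reflectance(f0 + delta_f, tau0 + delta_tau) - reflectance(f0, tau0)
```

The small difference between `linear` and `direct` is exactly the linearity error the abstract reports testing wavelength by wavelength.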

  10. On the shape of posterior densities and credible sets in instrumental variable regression models with reduced rank: an application of flexible sampling methods using neural networks

    NARCIS (Netherlands)

    Hoogerheide, L.F.; Kaashoek, J.F.; van Dijk, H.K.

    2007-01-01

    Likelihoods and posteriors of instrumental variable (IV) regression models with strong endogeneity and/or weak instruments may exhibit rather non-elliptical contours in the parameter space. This may seriously affect inference based on Bayesian credible sets. When approximating posterior

  11. Financial and testamentary capacity evaluations: procedures and assessment instruments underneath a functional approach.

    Science.gov (United States)

    Sousa, Liliana B; Simões, Mário R; Firmino, Horácio; Peisah, Carmelle

    2014-02-01

    Mental health professionals are frequently involved in mental capacity determinations. However, there is a lack of specific measures and well-defined procedures for these evaluations. The main purpose of this paper is to provide a review of financial and testamentary capacity evaluation procedures, including not only traditional neuropsychological and functional assessment but also the more recently developed forensic assessment instruments (FAIs), which were developed to provide a specialized answer to legal systems regarding civil competencies. The main guidelines, papers, and other references are reviewed in order to achieve a complete and comprehensive selection of instruments used in the assessment of financial and testamentary capacity. Although some specific measures for financial abilities have been developed recently, the same is not true for testamentary capacity. Several instruments and methodologies for assessing financial and testamentary capacity are presented, including neuropsychological assessment, functional assessment scales, performance-based functional assessment instruments, and specific FAIs. FAIs are the only instruments intended to provide a specific and direct answer to the assessment of financial capacity based on legal systems. Considering the need to move from a diagnostic to a functional approach in financial and testamentary capacity evaluations, it is essential to consider both general functional examination and cognitive functioning.

  12. Effect of corruption on healthcare satisfaction in post-soviet nations: A cross-country instrumental variable analysis of twelve countries.

    Science.gov (United States)

    Habibov, Nazim

    2016-03-01

    There is a lack of consensus about the effect of corruption on healthcare satisfaction in transitional countries. Interpreting the burgeoning literature on this topic has proven difficult due to reverse causality and omitted variable bias. In this study, the effect of corruption on healthcare satisfaction is investigated in a set of 12 Post-Socialist countries using instrumental variable regression on the sample of the 2010 Life in Transition survey (N = 8,655). The results indicate that experiencing corruption significantly reduces healthcare satisfaction. Copyright © 2016 The Author. Published by Elsevier Ltd. All rights reserved.

  13. On the shape of posterior densities and credible sets in instrumental variable regression models with reduced rank: an application of flexible sampling methods using neural networks

    NARCIS (Netherlands)

    L.F. Hoogerheide (Lennart); J.F. Kaashoek (Johan); H.K. van Dijk (Herman)

    2005-01-01

    Likelihoods and posteriors of instrumental variable regression models with strong endogeneity and/or weak instruments may exhibit rather non-elliptical contours in the parameter space. This may seriously affect inference based on Bayesian credible sets. When approximating such contours

  14. Estimating variability in functional images using a synthetic resampling approach

    International Nuclear Information System (INIS)

    Maitra, R.; O'Sullivan, F.

    1996-01-01

    Functional imaging of biologic parameters like in vivo tissue metabolism is made possible by Positron Emission Tomography (PET). Many techniques, such as mixture analysis, have been suggested for extracting such images from dynamic sequences of reconstructed PET scans. Methods for assessing the variability in these functional images are of scientific interest. The nonlinearity of the methods used in the mixture analysis approach makes analytic formulae for estimating variability intractable. The usual resampling approach is infeasible because of the prohibitive computational effort in simulating a number of sinogram datasets, applying image reconstruction, and generating parametric images for each replication. Here we introduce an approach that approximates the distribution of the reconstructed PET images by a Gaussian random field and generates synthetic realizations in the imaging domain. This eliminates the reconstruction steps in generating each simulated functional image and is therefore practical. Results of experiments done to evaluate the approach on a model one-dimensional problem are very encouraging. Post-processing of the estimated variances is seen to improve the accuracy of the estimation method. Mixture analysis is used to estimate functional images; however, the suggested approach is general enough to extend to other parametric imaging methods
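
    The synthetic resampling idea can be sketched on a one-dimensional toy problem in the spirit of the abstract's model experiment. Everything below (image shape, noise level, the "peak value" functional) is an invented illustration, not the authors' setup: fit a Gaussian random field to an ensemble of reconstructions, then draw cheap synthetic realizations instead of repeating the simulate-and-reconstruct cycle.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy 1-D "reconstructed image" ensemble standing in for repeated PET
# reconstructions of the same object.
npix, nrec = 32, 200
truth = np.sin(np.linspace(0.0, np.pi, npix))
recon = truth + 0.05 * rng.normal(size=(nrec, npix))

# Fit the Gaussian random field: empirical mean and covariance in image space.
mu = recon.mean(axis=0)
cov = np.cov(recon, rowvar=False)

# Synthetic resampling: draw realizations from the fitted field instead of
# repeating the sinogram-simulation and reconstruction steps.
synth = rng.multivariate_normal(mu, cov, size=1000)

# Propagate a nonlinear functional (here simply the image peak) through the
# realizations to estimate its variability.
peak = synth.max(axis=1)
peak_mean, peak_std = peak.mean(), peak.std()
```

Because each realization is a cheap draw in the imaging domain, thousands of replications of the nonlinear analysis become affordable, which is the practical gain the abstract claims.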

  15. Autonomous optical navigation using nanosatellite-class instruments: a Mars approach case study

    Science.gov (United States)

    Enright, John; Jovanovic, Ilija; Kazemi, Laila; Zhang, Harry; Dzamba, Tom

    2018-02-01

    This paper examines the effectiveness of small star trackers for orbital estimation. Autonomous optical navigation has been used for some time to provide local estimates of orbital parameters during close approach to celestial bodies. These techniques have been used extensively on spacecraft dating back to the Voyager missions, but often rely on long exposures and large instrument apertures. Using a hyperbolic Mars approach as a reference mission, we present an EKF-based navigation filter suitable for nanosatellite missions. Observations of Mars and its moons allow the estimator to correct initial errors in both position and velocity. Our results show that nanosatellite-class star trackers can produce good quality navigation solutions with low position (<300 m) and velocity (<0.15 m/s) errors as the spacecraft approaches periapse.

  16. Climate Informed Economic Instruments to Enhance Urban Water Supply Resilience to Hydroclimatological Variability and Change

    Science.gov (United States)

    Brown, C.; Carriquiry, M.; Souza Filho, F. A.

    2006-12-01

    Hydroclimatological variability presents acute challenges to urban water supply providers. The impact is often most severe in developing nations where hydrologic and climate variability can be very high, water demand is unmet and increasing, and the financial resources to mitigate the social effects of that variability are limited. Furthermore, existing urban water systems face a reduced solution space, constrained by competing and conflicting interests, such as irrigation demand, recreation and hydropower production, and new (relative to system design) demands to satisfy environmental flow requirements. These constraints magnify the impacts of hydroclimatic variability and increase the vulnerability of urban areas to climate change. The high economic and social costs of structural responses to hydrologic variability, such as groundwater utilization and the construction or expansion of dams, create a need for innovative alternatives. Advances in hydrologic and climate forecasting, and the increasing sophistication and acceptance of incentive-based mechanisms for achieving economically efficient water allocation offer potential for improving the resilience of existing water systems to the challenge of variable supply. This presentation will explore the performance of a system of climate informed economic instruments designed to facilitate the reduction of hydroclimatologic variability-induced impacts on water-sensitive stakeholders. The system is comprised of bulk water option contracts between urban water suppliers and agricultural users and insurance indexed on reservoir inflows designed to cover the financial needs of the water supplier in situations where the option is likely to be exercised. Contract and insurance parameters are linked to forecasts and the evolution of seasonal precipitation and streamflow and designed for financial and political viability. A simulation of system performance is presented based on ongoing work in Metro Manila, Philippines. 

  17. A variable resolution right TIN approach for gridded oceanographic data

    Science.gov (United States)

    Marks, David; Elmore, Paul; Blain, Cheryl Ann; Bourgeois, Brian; Petry, Frederick; Ferrini, Vicki

    2017-12-01

Many oceanographic applications require multi-resolution representation of gridded data such as bathymetric data. Although triangular irregular networks (TINs) allow for variable resolution, they do not provide a gridded structure. Right TINs (RTINs) are compatible with a gridded structure. We explored the use of two approaches for RTINs termed top-down and bottom-up implementations. We illustrate why the latter is most appropriate for gridded data and describe for this technique how the data can be thinned. While both the top-down and bottom-up approaches accurately preserve the surface morphology of any given region, the top-down method of vertex placement can fail to match the actual vertex locations of the underlying grid in many instances, resulting in obscured topology/bathymetry. Finally, we describe the use of the bottom-up approach and data thinning in two applications. The first is to provide thinned, variable-resolution bathymetry data for tests of storm surge and inundation modeling, in particular for Hurricane Katrina. The second is the application of the approach to an oceanographic data grid of 3-D ocean temperature.

  18. Fuzzy associative memories for instrument fault detection

    International Nuclear Information System (INIS)

    Heger, A.S.

    1996-01-01

A fuzzy logic instrument fault detection scheme is developed for systems having two or three redundant sensors. In the fuzzy logic approach, the deviation between each signal pairing is computed and classified into three fuzzy sets. A rule base is created allowing the human perception of the situation to be represented mathematically. Fuzzy associative memories are then applied. Finally, a defuzzification scheme is used to find the centroid location, and hence the signal status. Real-time analyses are carried out to evaluate the instantaneous signal status as well as the long-term results for the sensor set. Instantaneous signal validation results are used to compute a best estimate for the measured state variable. The long-term sensor validation method uses a frequency fuzzy variable to determine the signal condition over a specific period. To corroborate the methodology, synthetic data representing various anomalies are analyzed with both the fuzzy logic technique and the parity space approach. (Author)
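A minimal sketch of the fuzzy signal-validation idea for redundant sensors. The membership functions, rule consequents and deviation scale below are illustrative assumptions, not the ones from the cited paper: pairwise deviations are fuzzified into three sets, a small rule base maps them to a validity consequent, and centroid defuzzification yields a per-sensor status score.

```python
# Fuzzy validation of redundant sensor readings (illustrative parameters).

def memberships(dev):
    """Fuzzify an absolute pairwise deviation into three fuzzy sets."""
    small = max(0.0, 1.0 - dev / 0.5)                       # 1 at 0, 0 at >= 0.5
    medium = max(0.0, min(dev / 0.5, (1.0 - dev) / 0.5))    # triangle peaking at 0.5
    large = max(0.0, min(1.0, (dev - 0.5) / 0.5))           # ramps to 1 at >= 1.0
    return {"small": small, "medium": medium, "large": large}

# Rule base: small deviation -> valid, medium -> suspect, large -> faulty.
CONSEQUENT = {"small": 1.0, "medium": 0.5, "large": 0.0}

def sensor_status(readings):
    """Centroid-defuzzified validity score in [0, 1] for each sensor."""
    scores = []
    for i, ri in enumerate(readings):
        # Judge each sensor by its closest agreement with any redundant partner.
        dev = min(abs(ri - rj) for j, rj in enumerate(readings) if j != i)
        m = memberships(dev)
        num = sum(m[k] * CONSEQUENT[k] for k in m)
        den = sum(m.values()) or 1.0
        scores.append(num / den)   # centroid of the activated consequents
    return scores
```

The instantaneous scores could then weight a best estimate of the measured state variable, as the abstract describes; in practice deviations would be normalised to the sensor range rather than used in absolute units.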

  19. Bayesian approach to errors-in-variables in regression models

    Science.gov (United States)

    Rozliman, Nur Aainaa; Ibrahim, Adriana Irawati Nur; Yunus, Rossita Mohammad

    2017-05-01

In many applications and experiments, data sets are often contaminated with error or mismeasured covariates. When at least one of the covariates in a model is measured with error, an Errors-in-Variables (EIV) model can be used. Measurement error, when not corrected, would cause misleading statistical inferences and analysis. Therefore, our goal is to examine the relationship between the outcome variable and the unobserved exposure variable, given the observed mismeasured surrogate, by applying the Bayesian formulation to the EIV model. We shall extend the flexible parametric method proposed by Hossain and Gustafson (2009) to another nonlinear regression model, namely the Poisson regression model. We shall then illustrate the application of this approach via a simulation study using Markov chain Monte Carlo sampling methods.

  20. Problems with radiological surveillance instrumentation

    International Nuclear Information System (INIS)

    Swinth, K.L.; Tanner, J.E.; Fleming, D.M.

    1984-09-01

Many radiological surveillance instruments are in use at DOE facilities throughout the country. These instruments are an essential part of all health physics programs, and poor instrument performance can increase program costs or compromise program effectiveness. Generic data from simple tests on newly purchased instruments show that many instruments will not meet requirements due to manufacturing defects. In other cases, lack of consideration of instrument use has resulted in poor acceptance of instruments and poor reliability. Instrument performance is highly variable with respect to electronic and mechanical performance, radiation response, susceptibility to interferences and response to environmental factors. Poor instrument performance in these areas can lead to errors or poor accuracy in measurements.

  1. Problems with radiological surveillance instrumentation

    International Nuclear Information System (INIS)

    Swinth, K.L.; Tanner, J.E.; Fleming, D.M.

    1985-01-01

Many radiological surveillance instruments are in use at DOE facilities throughout the country. These instruments are an essential part of all health physics programs, and poor instrument performance can increase program costs or compromise program effectiveness. Generic data from simple tests on newly purchased instruments show that many instruments will not meet requirements due to manufacturing defects. In other cases, lack of consideration of instrument use has resulted in poor acceptance of instruments and poor reliability. Instrument performance is highly variable with respect to electronic and mechanical performance, radiation response, susceptibility to interferences and response to environmental factors. Poor instrument performance in these areas can lead to errors or poor accuracy in measurements.

  2. On a Modular Approach to the Design of Integrated Social Surveys

    Directory of Open Access Journals (Sweden)

    Ioannidis Evangelos

    2016-06-01

    Full Text Available This article considers a modular approach to the design of integrated social surveys. The approach consists of grouping variables into ‘modules’, each of which is then allocated to one or more ‘instruments’. Each instrument is then administered to a random sample of population units, and each sample unit responds to all modules of the instrument. This approach offers a way of designing a system of integrated social surveys that balances the need to limit the cost and the need to obtain sufficient information. The allocation of the modules to instruments draws on the methodology of split questionnaire designs. The composition of the instruments, that is, how the modules are allocated to instruments, and the corresponding sample sizes are obtained as a solution to an optimisation problem. This optimisation involves minimisation of respondent burden and data collection cost, while respecting certain design constraints usually encountered in practice. These constraints may include, for example, the level of precision required and dependencies between the variables. We propose using a random search algorithm to find approximate optimal solutions to this problem. The algorithm is proved to fulfil conditions that ensure convergence to the global optimum and can also produce an efficient design for a split questionnaire.
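A toy version of the random-search allocation the article describes: modules are partitioned over instruments, each instrument is fielded to enough respondents to satisfy the sample-size (precision) requirement of every module it carries, and the search minimises total respondent burden. The module burdens, required sample sizes and instrument count below are made-up illustration values, and this sketch uses a plain partition rather than the article's full split-questionnaire machinery.

```python
# Random search over module-to-instrument allocations (illustrative data).
import random

random.seed(7)

BURDEN = [5, 3, 8, 2, 6, 4]                 # minutes per module (assumed)
REQUIRED = [500, 800, 300, 800, 400, 600]   # minimum respondents per module (assumed)
N_INSTRUMENTS = 3

def cost(partition):
    """Total burden: per instrument, respondents needed x interview length."""
    total = 0
    for modules in partition:
        if modules:
            sample = max(REQUIRED[m] for m in modules)   # meets every module's need
            length = sum(BURDEN[m] for m in modules)
            total += sample * length
    return total

def random_partition():
    parts = [[] for _ in range(N_INSTRUMENTS)]
    for m in range(len(BURDEN)):
        parts[random.randrange(N_INSTRUMENTS)].append(m)
    return parts

# Baseline design: a single instrument carrying every module.
baseline = [[m for m in range(len(BURDEN))]] + [[] for _ in range(N_INSTRUMENTS - 1)]
best, best_cost = baseline, cost(baseline)
for _ in range(5000):
    cand = random_partition()
    c = cost(cand)
    if c < best_cost:
        best, best_cost = cand, c

print("best cost:", best_cost, "design:", best)
```

Grouping modules with similar sample-size requirements onto the same instrument is what drives the cost down, which mirrors the optimisation trade-off the abstract describes between precision constraints and respondent burden.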

  3. Instrumental variable analysis as a complementary analysis in studies of adverse effects : venous thromboembolism and second-generation versus third-generation oral contraceptives

    NARCIS (Netherlands)

    Boef, Anna G C; Souverein, Patrick C|info:eu-repo/dai/nl/243074948; Vandenbroucke, Jan P; van Hylckama Vlieg, Astrid; de Boer, Anthonius|info:eu-repo/dai/nl/075097346; le Cessie, Saskia; Dekkers, Olaf M

    2016-01-01

    PURPOSE: A potentially useful role for instrumental variable (IV) analysis may be as a complementary analysis to assess the presence of confounding when studying adverse drug effects. There has been discussion on whether the observed increased risk of venous thromboembolism (VTE) for

  4. Knowledge based expert system approach to instrumentation selection (INSEL)

    Directory of Open Access Journals (Sweden)

    S. Barai

    2004-08-01

    Full Text Available The selection of appropriate instrumentation for any structural measurement of a civil engineering structure is a complex task. Recent developments in Artificial Intelligence (AI) can help in an organized use of experiential knowledge available on instrumentation for laboratory and in-situ measurement. Usually, the instrumentation decision is based on the experience and judgment of experimentalists. The heuristic knowledge available for different types of measurement is domain dependent and the information is scattered in varied knowledge sources. The knowledge engineering techniques can help in capturing the experiential knowledge. This paper demonstrates a prototype knowledge based system for INstrument SELection (INSEL) assistant, where the experiential knowledge for various structural domains can be captured and utilized for making instrumentation decisions. In particular, this Knowledge Based Expert System (KBES) encodes the heuristics on measurement and demonstrates the instrument selection process with reference to steel bridges. INSEL runs on a microcomputer and uses an INSIGHT 2+ environment.

  5. A new approach for modelling variability in residential construction projects

    Directory of Open Access Journals (Sweden)

    Mehrdad Arashpour

    2013-06-01

    Full Text Available The construction industry is plagued by long cycle times caused by variability in the supply chain. Variations or undesirable situations are the result of factors such as non-standard practices, work site accidents, inclement weather conditions and faults in design. This paper uses a new approach for modelling variability in construction by linking relative variability indicators to processes. Mass homebuilding sector was chosen as the scope of the analysis because data is readily available. Numerous simulation experiments were designed by varying size of capacity buffers in front of trade contractors, availability of trade contractors, and level of variability in homebuilding processes. The measurements were shown to lead to an accurate determination of relationships between these factors and production parameters. The variability indicator was found to dramatically affect the tangible performance measures such as home completion rates. This study provides for future analysis of the production homebuilding sector, which may lead to improvements in performance and a faster product delivery to homebuyers.


  7. Instrumentation

    International Nuclear Information System (INIS)

    Decreton, M.

    2002-01-01

    SCK-CEN's R and D programme on instrumentation involves the development of advanced instrumentation systems for nuclear applications as well as the assessment of the performance of these instruments in a radiation environment. Particular emphasis is on the use of optical fibres as umbilical links of a remote handling unit for use during maintenance of a fusion reactor; studies on the radiation hardening of plasma diagnostic systems; investigations on new instrumentation for the future MYRRHA accelerator driven system; space applications related to radiation-hardened lenses; the development of new approaches for dose, temperature and strain measurements; the assessment of radiation-hardened sensors and motors for remote handling tasks; and studies of dose measurement systems including the use of optical fibres. Progress and achievements in these areas for 2001 are described.

  8. Instrumentation

    Energy Technology Data Exchange (ETDEWEB)

    Decreton, M

    2002-04-01

    SCK-CEN's R and D programme on instrumentation involves the development of advanced instrumentation systems for nuclear applications as well as the assessment of the performance of these instruments in a radiation environment. Particular emphasis is on the use of optical fibres as umbilical links of a remote handling unit for use during maintenance of a fusion reactor; studies on the radiation hardening of plasma diagnostic systems; investigations on new instrumentation for the future MYRRHA accelerator driven system; space applications related to radiation-hardened lenses; the development of new approaches for dose, temperature and strain measurements; the assessment of radiation-hardened sensors and motors for remote handling tasks; and studies of dose measurement systems including the use of optical fibres. Progress and achievements in these areas for 2001 are described.

  9. Evaluation of Validity and Reliability for Hierarchical Scales Using Latent Variable Modeling

    Science.gov (United States)

    Raykov, Tenko; Marcoulides, George A.

    2012-01-01

    A latent variable modeling method is outlined, which accomplishes estimation of criterion validity and reliability for a multicomponent measuring instrument with hierarchical structure. The approach provides point and interval estimates for the scale criterion validity and reliability coefficients, and can also be used for testing composite or…

  10. Exploring mouthfeel in model wines: Sensory-to-instrumental approaches.

    Science.gov (United States)

    Laguna, Laura; Sarkar, Anwesha; Bryant, Michael G; Beadling, Andrew R; Bartolomé, Begoña; Victoria Moreno-Arribas, M

    2017-12-01

    /absence of tannins. It is therefore necessary to use an integrative approach that combines complementary instrumental techniques for mouthfeel perception characterization. Copyright © 2017 Elsevier Ltd. All rights reserved.

  11. Energy conserving schemes for the simulation of musical instrument contact dynamics

    Science.gov (United States)

    Chatziioannou, Vasileios; van Walstijn, Maarten

    2015-03-01

    Collisions are an innate part of the function of many musical instruments. Due to the nonlinear nature of contact forces, special care has to be taken in the construction of numerical schemes for simulation and sound synthesis. Finite difference schemes and other time-stepping algorithms used for musical instrument modelling purposes are normally arrived at by discretising a Newtonian description of the system. However because impact forces are non-analytic functions of the phase space variables, algorithm stability can rarely be established this way. This paper presents a systematic approach to deriving energy conserving schemes for frictionless impact modelling. The proposed numerical formulations follow from discretising Hamilton's equations of motion, generally leading to an implicit system of nonlinear equations that can be solved with Newton's method. The approach is first outlined for point mass collisions and then extended to distributed settings, such as vibrating strings and beams colliding with rigid obstacles. Stability and other relevant properties of the proposed approach are discussed and further demonstrated with simulation examples. The methodology is exemplified through a case study on tanpura string vibration, with the results confirming the main findings of previous studies on the role of the bridge in sound generation with this type of string instrument.
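A sketch of the kind of energy-conserving time stepping the abstract describes, reduced to its simplest setting: a point mass under gravity hitting a stiff one-sided penalty barrier at q = 0. The discrete-gradient update below conserves the Hamiltonian exactly (up to the solver tolerance), and the implicit step is solved with a Newton iteration as in the paper; the mass, stiffness and step size are illustrative choices, not values from the study.

```python
# Discrete-gradient (energy-conserving) integration of a bouncing point mass.
# Illustrative parameters; the contact is a one-sided quadratic penalty.

M, G, K = 1.0, 9.81, 1e5   # mass, gravity, contact stiffness (assumed)
DT = 1e-3                  # time step

def V(q):
    """Potential: gravity plus penalty below the barrier at q = 0."""
    pen = 0.5 * K * q * q if q < 0.0 else 0.0
    return M * G * q + pen

def dV(q):
    return M * G + (K * q if q < 0.0 else 0.0)

def H(q, p):
    return p * p / (2.0 * M) + V(q)

def step(q0, p0):
    """One implicit discrete-gradient step; conserves H to solver tolerance."""
    def residual(q1):
        # Discrete gradient of V between q0 and q1 (analytic limit when close).
        if abs(q1 - q0) > 1e-12:
            g = (V(q1) - V(q0)) / (q1 - q0)
        else:
            g = dV(0.5 * (q0 + q1))
        p1 = p0 - DT * g
        return q1 - q0 - DT * (p0 + p1) / (2.0 * M), p1

    q1 = q0 + DT * p0 / M            # explicit predictor
    p1 = p0
    for _ in range(50):              # Newton with a finite-difference slope
        r, p1 = residual(q1)
        if abs(r) < 1e-13:
            break
        h = 1e-7
        slope = (residual(q1 + h)[0] - r) / h
        q1 -= r / slope
    return q1, p1

q, p = 1.0, 0.0                      # drop from 1 m at rest
E0 = H(q, p)
drift = 0.0
for _ in range(5000):                # 5 s of repeated bounces
    q, p = step(q, p)
    drift = max(drift, abs(H(q, p) - E0))
print("max energy drift:", drift)
```

Because the momentum update uses the discrete gradient (V(q1) − V(q0))/(q1 − q0), the change in kinetic energy cancels the change in potential energy identically, which is the stability argument the paper makes for non-smooth contact forces.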

  12. Analysis of instrumentation technology for SMART

    International Nuclear Information System (INIS)

    Hur, Seop; Koo, I. S.; Park, H. Y.; Lee, C. K.; Kim, D. H.; Suh, Y. S.; Seong, S. H.; Jang, G. S.

    1998-03-01

    Development requirements, the techniques to be developed, and development tasks and approaches must be established in order to develop the SMART instrumentation system, and sound development strategies are an important input to this work. To meet these needs, general industrial and nuclear instrumentation techniques were analyzed and reviewed. The industrial instrumentation techniques were analyzed on the basis of a classification of instrumentation, and the analysis results, which describe the inherent merits and demerits of each technique, can be used as inputs for selecting the instruments for SMART. For instrumentation techniques in nuclear environments, the major instrumentation techniques were reviewed and the instrumentation system was established. The following development approaches were established based on the development requirements and on the analysis of research and development trends in industrial and nuclear instrumentation techniques. (author). 90 refs., 38 tabs., 33 figs

  13. An Assessment of the Need for Standard Variable Names for Airborne Field Campaigns

    Science.gov (United States)

    Beach, A. L., III; Chen, G.; Northup, E. A.; Kusterer, J.; Quam, B. M.

    2017-12-01

    The NASA Earth Venture Program has led to a dramatic increase in airborne observations, requiring updated data management practices with clearly defined data standards and protocols for metadata. An airborne field campaign can involve multiple aircraft and a variety of instruments. It is quite common to have different instruments/techniques measure the same parameter on one or more aircraft platforms. This creates a need to allow instrument Principal Investigators (PIs) to name their variables in a way that distinguishes them across various data sets. A lack of standardization of variable names presents a challenge for data search tools in enabling discovery of similar data across airborne studies, aircraft platforms, and instruments. This was also identified by data users as one of the top issues in data use. One effective approach for mitigating this problem is to enforce variable name standardization, which can effectively map the unique PI variable names to fixed standard names. In order to ensure consistency amongst the standard names, it will be necessary to choose them from a controlled list. However, no such list currently exists, despite a number of previous efforts to establish a sufficient list of atmospheric variable names. The Atmospheric Composition Variable Standard Name Working Group was established under the auspices of NASA's Earth Science Data Systems Working Group (ESDSWG) to solicit research community feedback and create a list of standard names that are acceptable to data providers and data users. This presentation will discuss the challenges and recommendations of standard variable names in an effort to demonstrate how airborne metadata curation/management can be improved to streamline data ingest and improve interoperability and discoverability for a broader user community.
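The mapping the working group envisions can be sketched as a controlled vocabulary that resolves each PI's native variable name to one standard name. The alias table below is invented for illustration; the real controlled list is the working group's deliverable.

```python
# Toy PI-name -> standard-name resolution (alias table is hypothetical).

# Controlled vocabulary: standard name -> known PI-specific aliases.
ALIASES = {
    "air_temperature": {"T_amb", "TEMP", "StaticAirTemp"},
    "ozone_mixing_ratio": {"O3", "O3_ppbv", "Ozone_MR"},
    "relative_humidity": {"RH", "RelHum"},
}

# Invert to a lookup table, matching case-insensitively.
LOOKUP = {alias.lower(): std
          for std, aliases in ALIASES.items()
          for alias in aliases}

def standard_name(pi_name):
    """Map a PI variable name to its standard name, or None if unknown."""
    return LOOKUP.get(pi_name.lower())

print(standard_name("O3_ppbv"))   # -> ozone_mixing_ratio
```

A search tool indexing on the resolved standard names would then return, say, every ozone measurement across campaigns regardless of how each PI named the variable.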

  14. S-variable approach to LMI-based robust control

    CERN Document Server

    Ebihara, Yoshio; Arzelier, Denis

    2015-01-01

    This book shows how the use of S-variables (SVs) in enhancing the range of problems that can be addressed with the already-versatile linear matrix inequality (LMI) approach to control can, in many cases, be put on a more unified, methodical footing. Beginning with the fundamentals of the SV approach, the text shows how the basic idea can be used for each problem (and when it should not be employed at all). The specific adaptations of the method necessitated by each problem are also detailed. The problems dealt with in the book have the common traits that: analytic closed-form solutions are not available; and LMIs can be applied to produce numerical solutions with a certain amount of conservatism. Typical examples are robustness analysis of linear systems affected by parametric uncertainties and the synthesis of a linear controller satisfying multiple, often conflicting, design specifications. For problems in which LMI methods produce conservative results, the SV approach is shown to achieve greater accuracy...

  15. Development of a new linearly variable edge filter (LVEF)-based compact slit-less mini-spectrometer

    Science.gov (United States)

    Mahmoud, Khaled; Park, Seongchong; Lee, Dong-Hoon

    2018-02-01

    This paper presents the development of a compact charge-coupled device (CCD) spectrometer. We describe the design, concept and characterization of a VNIR linear variable edge filter (LVEF)-based mini-spectrometer. The new instrument has been realized for operation in the 300 nm to 850 nm wavelength range and consists of a linear variable edge filter in front of a CCD array. Small size, light weight and low cost are achieved by using linearly variable filters, with no need for moving parts for wavelength selection as in the case of commercial spectrometers available on the market. This overview discusses the characteristics of the main components and the main concept, with the principal advantages and limitations reported. Experimental characteristics of the LVEFs are described. The mathematical approach for obtaining the position-dependent slit function of the presented prototype spectrometer, and its numerical de-convolution solution for spectrum reconstruction, is described. The performance of our prototype instrument is demonstrated by measuring the spectrum of a reference light source.
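The reconstruction step can be sketched as a linear inverse problem: the detector signal is the true spectrum blurred by a position-dependent slit function, giving a system s = A f that is solved numerically. The Gaussian slit model, its linearly varying width, and the two-line test spectrum below are illustrative assumptions, not the instrument's measured slit functions.

```python
# De-convolution with a position-dependent slit function (illustrative model).
import math

N = 60  # detector pixels

def slit(pixel, j):
    """Slit-function weight of wavelength bin j at a given detector pixel;
    the width varies linearly along the array (LVEF position dependence)."""
    sigma = 0.8 + 0.7 * pixel / N
    return math.exp(-0.5 * ((j - pixel) / sigma) ** 2)

# Forward-model matrix A (row-normalised) and a known two-line spectrum.
A = [[slit(i, j) for j in range(N)] for i in range(N)]
for row in A:
    s = sum(row)
    for j in range(N):
        row[j] /= s
truth = [0.0] * N
truth[15], truth[40] = 1.0, 0.7
signal = [sum(A[i][j] * truth[j] for j in range(N)) for i in range(N)]

def solve(mat, b):
    """Gaussian elimination with partial pivoting."""
    n = len(b)
    aug = [row[:] + [b[i]] for i, row in enumerate(mat)]
    for c in range(n):
        piv = max(range(c, n), key=lambda r: abs(aug[r][c]))
        aug[c], aug[piv] = aug[piv], aug[c]
        for r in range(c + 1, n):
            f = aug[r][c] / aug[c][c]
            for k in range(c, n + 1):
                aug[r][k] -= f * aug[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (aug[r][n] - sum(aug[r][k] * x[k] for k in range(r + 1, n))) / aug[r][r]
    return x

recon = solve(A, signal)
err = max(abs(a - b) for a, b in zip(recon, truth))
print("max reconstruction error:", err)
```

With noisy measured signals the direct solve would be replaced by a regularised inversion, since deconvolution amplifies noise at the strongly blurred end of the array.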

  16. A Comprehensive Approach Towards Optimizing the Xenon Plasma Focused Ion Beam Instrument for Semiconductor Failure Analysis Applications.

    Science.gov (United States)

    Subramaniam, Srinivas; Huening, Jennifer; Richards, John; Johnson, Kevin

    2017-08-01

    The xenon plasma focused ion beam (PFIB) instrument holds significant promise in expanding the applications of focused ion beams in new technology thrust areas. In this paper, we have explored the operational characteristics of a Tescan FERA3 XMH PFIB instrument with the aim of meeting current and future challenges in the semiconductor industry. A two-part approach, with the first part aimed at optimizing the ion column and the second at optimizing specimen preparation, has been undertaken. Detailed studies characterizing the ion column, optimizing for high-current/high-mill-rate activities, have been described to support a better understanding of the PFIB. In addition, a novel single-crystal sacrificial mask method has been developed and implemented for use in the PFIB. Using this combined approach, we have achieved high-quality images with minimal artifacts, while retaining the shorter throughput times of the PFIB. Although the work presented in this paper has been performed on a specific instrument, the authors hope that these studies will provide general insight to direct further improvement of PFIB design and applications.

  17. The Matrix model, a driven state variables approach to non-equilibrium thermodynamics

    NARCIS (Netherlands)

    Jongschaap, R.J.J.

    2001-01-01

    One of the new approaches in non-equilibrium thermodynamics is the so-called matrix model of Jongschaap. In this paper some features of this model are discussed. We indicate the differences with the more common approach based upon internal variables and the more sophisticated Hamiltonian and GENERIC

  18. A Novel Approach to model EPIC variable background

    Science.gov (United States)

    Marelli, M.; De Luca, A.; Salvetti, D.; Belfiore, A.

    2017-10-01

    One of the main aims of the EXTraS (Exploring the X-ray Transient and variable Sky) project is to characterise the variability of serendipitous XMM-Newton sources within each single observation. Unfortunately, 164 Ms of the 774 Ms of cumulative exposure considered (21%) are badly affected by soft proton flares, hampering any classical analysis of field sources. De facto, the latest releases of the 3XMM catalog, as well as most analyses in the literature, simply exclude these 'high background' periods from analysis. We implemented a novel SAS-independent approach to produce background-subtracted light curves, which allows us to treat the case of very faint sources and very bright proton flares. EXTraS light curves of 3XMM-DR5 sources will soon be released to the community, together with new tools we are developing.

  19. Work-nonwork interference: Preliminary results on the psychometric properties of a new instrument

    Directory of Open Access Journals (Sweden)

    Eileen Koekemoer

    2010-11-01

    Research purpose: The objectives of this study were to investigate the internal validity (construct, discriminant and convergent validity), reliability and external validity (relationships with theoretically relevant variables, including job characteristics, home characteristics, burnout, ill health and life satisfaction) of the instrument. Motivation for the study: Work-family interaction is a key topic receiving significant research attention. In order to facilitate comparison across work-family studies, the use of psychometrically sound instruments is of great importance. Research design, approach and method: A cross-sectional survey design was used for the target population of married employees with children working at a tertiary institution in the North West province (n = 366). In addition to the new instrument, job characteristics, home characteristics, burnout, ill health and life satisfaction were measured. Main findings: The results provided evidence for construct, discriminant and convergent validity, reliability and significant relations with external variables. Practical/managerial implications: The new instrument can be used by researchers and managers, as a test under development, to investigate the interference between work and different nonwork roles (i.e. parental role, spousal role, work role, domestic role) and specific relations with antecedents (e.g. job/home characteristics) and well-being (e.g. burnout, ill health and life satisfaction). Contribution/value-add: This study provides preliminary information on the psychometric properties of a new instrument that measures the interference between work and nonwork.

  20. Inverse Ising problem in continuous time: A latent variable approach

    Science.gov (United States)

    Donner, Christian; Opper, Manfred

    2017-12-01

    We consider the inverse Ising problem: the inference of network couplings from observed spin trajectories for a model with continuous time Glauber dynamics. By introducing two sets of auxiliary latent random variables we render the likelihood into a form which allows for simple iterative inference algorithms with analytical updates. The variables are (1) Poisson variables to linearize an exponential term which is typical for point process likelihoods and (2) Pólya-Gamma variables, which make the likelihood quadratic in the coupling parameters. Using the augmented likelihood, we derive an expectation-maximization (EM) algorithm to obtain the maximum likelihood estimate of network parameters. Using a third set of latent variables we extend the EM algorithm to sparse couplings via L1 regularization. Finally, we develop an efficient approximate Bayesian inference algorithm using a variational approach. We demonstrate the performance of our algorithms on data simulated from an Ising model. For data which are simulated from a more biologically plausible network with spiking neurons, we show that the Ising model captures well the low order statistics of the data and how the Ising couplings are related to the underlying synaptic structure of the simulated network.
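The forward model behind this inference problem can be sketched compactly: continuous-time Glauber dynamics simulated with exponential waiting times (Gillespie-style). The coupling values, rates and network size below are illustrative, and the paper's latent-variable EM inference itself is not reproduced here; the sketch only generates the kind of spin trajectories the algorithms would consume.

```python
# Continuous-time Glauber dynamics for a tiny Ising network (forward model only).
import math
import random

random.seed(3)

J = {(0, 1): 2.0, (1, 0): 2.0}   # strong ferromagnetic pair coupling (assumed)
N_SPINS, RATE0, T_END = 2, 1.0, 2000.0

def flip_rate(s, i):
    """Glauber rate for spin i to flip given the current configuration s."""
    field = sum(Jij * s[j] for (a, j), Jij in J.items() if a == i)
    return RATE0 / (1.0 + math.exp(2.0 * s[i] * field))

s = [1, -1]
t, aligned_time = 0.0, 0.0
while t < T_END:
    rates = [flip_rate(s, i) for i in range(N_SPINS)]
    total = sum(rates)
    dt = random.expovariate(total)          # exponential waiting time
    if s[0] == s[1]:
        aligned_time += min(dt, T_END - t)  # clip the final interval
    t += dt
    # Choose which spin flips, with probability proportional to its rate.
    u, acc = random.random() * total, 0.0
    for i, r in enumerate(rates):
        acc += r
        if u <= acc:
            s[i] = -s[i]
            break

frac_aligned = aligned_time / T_END
print("fraction of time aligned:", frac_aligned)
```

The sigmoidal flip rate satisfies detailed balance with respect to the Ising Gibbs distribution, so with a strong positive coupling the pair spends most of the trajectory aligned; it is exactly this exponential-of-the-local-field structure that the Poisson and Pólya-Gamma latent variables linearise in the paper.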

  1. Density dependence and climate effects in Rocky Mountain elk: an application of regression with instrumental variables for population time series with sampling error.

    Science.gov (United States)

    Creel, Scott; Creel, Michael

    2009-11-01

    1. Sampling error in annual estimates of population size creates two widely recognized problems for the analysis of population growth. First, if sampling error is mistakenly treated as process error, one obtains inflated estimates of the variation in true population trajectories (Staples, Taper & Dennis 2004). Second, treating sampling error as process error is thought to overestimate the importance of density dependence in population growth (Viljugrein et al. 2005; Dennis et al. 2006). 2. In ecology, state-space models are used to account for sampling error when estimating the effects of density and other variables on population growth (Staples et al. 2004; Dennis et al. 2006). In econometrics, regression with instrumental variables is a well-established method that addresses the problem of correlation between regressors and the error term, but requires fewer assumptions than state-space models (Davidson & MacKinnon 1993; Cameron & Trivedi 2005). 3. We used instrumental variables to account for sampling error and fit a generalized linear model to 472 annual observations of population size for 35 Elk Management Units in Montana, from 1928 to 2004. We compared this model with state-space models fit with the likelihood function of Dennis et al. (2006). We discuss the general advantages and disadvantages of each method. Briefly, regression with instrumental variables is valid with fewer distributional assumptions, but state-space models are more efficient when their distributional assumptions are met. 4. Both methods found that population growth was negatively related to population density and winter snow accumulation. Summer rainfall and wolf (Canis lupus) presence had much weaker effects on elk (Cervus elaphus) dynamics [though limitation by wolves is strong in some elk populations with well-established wolf populations (Creel et al. 2007; Creel & Christianson 2008)]. 5. Coupled with predictions for Montana from global and regional climate models, our results
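The core idea of point 3 can be illustrated with a stripped-down two-stage least squares (2SLS) fit on synthetic data: when the regressor carries sampling error, naive OLS is attenuated toward zero, while an instrument correlated with the true regressor (but not with the measurement error) recovers the slope. The data-generating numbers below are invented for illustration and have nothing to do with the elk data.

```python
# 2SLS on simulated data with sampling error in the regressor (synthetic demo).
import random

random.seed(42)

TRUE_SLOPE, N = 2.0, 5000
x_true = [random.gauss(0, 1) for _ in range(N)]
x_obs = [x + random.gauss(0, 1) for x in x_true]   # regressor with sampling error
z = [x + random.gauss(0, 1) for x in x_true]       # instrument: correlated with
                                                   # x_true, independent of the error
y = [TRUE_SLOPE * x + random.gauss(0, 0.5) for x in x_true]

def ols_slope(xs, ys):
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    var = sum((a - mx) ** 2 for a in xs)
    return cov / var

b_ols = ols_slope(x_obs, y)            # attenuated by the measurement error
# Stage 1: project the mismeasured regressor on the instrument.
g = ols_slope(z, x_obs)
mz, mx = sum(z) / N, sum(x_obs) / N
x_hat = [mx + g * (zi - mz) for zi in z]
# Stage 2: regress the outcome on the first-stage fitted values.
b_iv = ols_slope(x_hat, y)

print(f"OLS slope {b_ols:.2f} (biased), IV slope {b_iv:.2f} (true {TRUE_SLOPE})")
```

In the elk application the instrument plays the same role for lagged population size, removing the spurious density dependence that sampling error would otherwise induce.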

  2. Instrumental variable estimation in a survival context

    DEFF Research Database (Denmark)

    Tchetgen Tchetgen, Eric J; Walter, Stefan; Vansteelandt, Stijn

    2015-01-01

    The IV approach is very well developed in the context of linear regression and also for certain generalized linear models with a nonlinear link function. However, IV methods are not as well developed for regression analysis with a censored survival outcome. In this article, we develop the IV approach for regression analysis in a survival context, primarily under an additive hazards model, for which we describe 2 simple methods for estimating causal effects. The first method is a straightforward 2-stage regression approach analogous to 2-stage least squares commonly used for IV analysis in linear regression. In this approach, the fitted value from a first-stage regression of the exposure on the IV is entered in place of the exposure in the second-stage hazard model to recover a valid estimate of the treatment effect of interest. The second method is a so-called control function approach, which entails adding...

  3. Instrument Remote Control via the Astronomical Instrument Markup Language

    Science.gov (United States)

    Sall, Ken; Ames, Troy; Warsaw, Craig; Koons, Lisa; Shafer, Richard

    1998-01-01

    The Instrument Remote Control (IRC) project ongoing at NASA's Goddard Space Flight Center's (GSFC) Information Systems Center (ISC) supports NASA's mission by defining an adaptive intranet-based framework that provides robust interactive and distributed control and monitoring of remote instruments. An astronomical IRC architecture that combines the platform-independent processing capabilities of Java with the power of Extensible Markup Language (XML) to express hierarchical data in an equally platform-independent, as well as human readable manner, has been developed. This architecture is implemented using a variety of XML support tools and Application Programming Interfaces (API) written in Java. IRC will enable trusted astronomers from around the world to easily access infrared instruments (e.g., telescopes, cameras, and spectrometers) located in remote, inhospitable environments, such as the South Pole, a high Chilean mountaintop, or an airborne observatory aboard a Boeing 747. Using IRC's frameworks, an astronomer or other scientist can easily define the type of onboard instrument, control the instrument remotely, and return monitoring data all through the intranet. The Astronomical Instrument Markup Language (AIML) is the first implementation of the more general Instrument Markup Language (IML). The key aspects of our approach to instrument description and control applies to many domains, from medical instruments to machine assembly lines. The concepts behind AIML apply equally well to the description and control of instruments in general. IRC enables us to apply our techniques to several instruments, preferably from different observatories.

  4. Variability of floods, droughts and windstorms over the past 500 years in Central Europe based on documentary and instrumental data

    Science.gov (United States)

    Brazdil, Rudolf

    2016-04-01

    Hydrological and meteorological extremes (HMEs) in Central Europe during the past 500 years can be reconstructed from instrumental and documentary data. Documentary data about weather and related phenomena represent the basic source of information for historical climatology and hydrology, which deal with the reconstruction of past climate and HMEs, their perception and their impacts on human society. The paper presents the basic distribution of documentary data on (i) direct descriptions of HMEs and their proxies on the one hand and (ii) individual and institutional data sources on the other. Several groups of documentary evidence, such as narrative written records (annals, chronicles, memoirs), visual daily weather records, official and personal correspondence, special prints, financial and economic records (with particular attention to taxation data), newspapers, pictorial documentation, chronograms, epigraphic data, early instrumental observations, and early scientific papers and communications, are demonstrated with respect to the extraction of information about HMEs, usually concerning their occurrence, severity, seasonality, meteorological causes, perception and human impacts. The paper further presents an analysis of 500-year variability of floods, droughts and windstorms on the basis of series created by combining documentary and instrumental data. The results, advantages and drawbacks of such an approach are illustrated with examples from the Czech Lands. The analysis of floods concentrates on the River Vltava (Prague) and the River Elbe (Děčín), which show the highest frequency of floods in the 19th century (mainly of the winter synoptic type) and in the second half of the 16th century (summer synoptic type). Also reported are the most disastrous floods (August 1501, March and August 1598, February 1655, June 1675, February 1784, March 1845, February 1862, September 1890, August 2002) and the European context of floods in the severe winter of 1783/84. Drought

  5. College quality and hourly wages: evidence from the self-revelation model, sibling models and instrumental variables.

    Science.gov (United States)

    Borgen, Nicolai T

    2014-11-01

    This paper addresses the recent discussion on confounding in the returns to college quality literature using the Norwegian case. The main advantage of studying Norway is the quality of the data. Norwegian administrative data provide information on college applications, family relations and a rich set of control variables for all Norwegian citizens applying to college between 1997 and 2004 (N = 141,319) and their succeeding wages between 2003 and 2010 (676,079 person-year observations). With these data, this paper uses a subset of the models that have rendered mixed findings in the literature in order to investigate to what extent confounding biases the returns to college quality. I compare estimates obtained using standard regression models to estimates obtained using the self-revelation model of Dale and Krueger (2002), a sibling fixed effects model and the instrumental variable model used by Long (2008). Using these methods, I consistently find increasing returns to college quality over the course of students' work careers, with positive returns only later in students' work careers. I conclude that the standard regression estimate provides a reasonable estimate of the returns to college quality. Copyright © 2014 Elsevier Inc. All rights reserved.

  6. On Representative Spaceflight Instrument and Associated Instrument Sensor Web Framework

    Science.gov (United States)

    Kizhner, Semion; Patel, Umeshkumar; Vootukuru, Meg

    2007-01-01

    Sensor Web-based adaptation and sharing of space flight mission resources, including those of the Space-Ground and Control-User communication segment, could greatly benefit from the utilization of heritage Internet Protocols and devices applied for spaceflight (SpaceIP). This has been successfully demonstrated by a few recent spaceflight experiments. However, while terrestrial applications of Internet protocols are well developed and understood (mostly due to billions of dollars in investments by the military and industry), the spaceflight application of Internet protocols is still in its infancy. Progress in the development of SpaceIP-enabled instrument components will largely determine the utilization of those investments and their acceptance in years to come. As with SpaceIP, commercial real-time and instrument-colocated computational resources, data compression and storage can be enabled on board a spacecraft and, in turn, support a powerful application of Sensor Web-based design to a spaceflight instrument. Sensor Web-enabled reconfiguration and adaptation of structures for hardware resources and information systems will put Field Programmable Gate Arrays (FPGA) and other aerospace programmable logic devices to the use for which this technology was intended. These are a few obvious potential benefits of Sensor Web technologies for spaceflight applications. However, they are still waiting to be explored, because a new approach to spaceflight instrumentation is needed to make these mature sensor web technologies applicable to spaceflight. In this paper we present an approach to developing related and enabling spaceflight instrument-level technologies based on the new concept of a representative spaceflight Instrument Sensor Web (ISW).

  7. Industrial instrumentation principles and design

    CERN Document Server

    Padmanabhan, Tattamangalam R

    2000-01-01

    Pneumatic, hydraulic and allied instrumentation schemes have given way to electronic schemes in recent years thanks to the rapid strides in electronics and allied areas. Principles, design and applications of such state-of-the-art instrumentation schemes form the subject matter of this book. Through representative examples, the basic building blocks of instrumentation schemes are identified and each of these building blocks discussed in terms of its design and interface characteristics. The common generic schemes synthesized with such building blocks are dealt with subsequently. This forms the scope of Part I. The focus in Part II is on application. Displacement and allied instrumentation, force and allied instrumentation and process instrumentation in terms of temperature, flow, pressure level and other common process variables are dealt with separately and exhaustively. Despite the diversity in the sensor principles and characteristics and the variety in the applications and their environments, it is possib...

  8. VIGO: Instrumental Interaction in Multi-Surface Environments

    DEFF Research Database (Denmark)

    Klokmose, Clemens Nylandsted; Beaudouin-Lafon, Michel

    2009-01-01

    This paper addresses interaction in multi-surface environments, questions whether current application-centric approaches to user interfaces are adequate in this context, and presents an alternative approach based on instrumental interaction. The paper presents the VIGO (Views, Instruments...

  9. Non-process instrumentation surveillance and test reduction

    International Nuclear Information System (INIS)

    Ferrell, R.; LeDonne, V.; Donat, T.; Thomson, I.; Sarlitto, M.

    1993-12-01

    Analysis of operating experience, instrument failure modes, and degraded instrument performance has led to a reduction in Technical Specification surveillance and test requirements for nuclear power plant process instrumentation. These changes have resulted in lower plant operations and maintenance (O&M) labor costs. This report explores the possibility of realizing similar savings by reducing requirements for non-process instrumentation. The project team reviewed generic Technical Specifications for the four major US nuclear steam supply system (NSSS) vendors (Westinghouse, General Electric, Combustion Engineering, and Babcock & Wilcox) to identify non-process instrumentation for which surveillance/test requirements could be reduced. The team surveyed 10 utilities to identify specific non-process instrumentation at their plants for which requirements could be reduced. The team evaluated utility analytic approaches used to justify changes in surveillance/test requirements for process equipment to determine their applicability to non-process instrumentation. The report presents a prioritized list of non-process instrumentation systems suitable for surveillance/test requirements reduction. The top three systems in the list are vibration monitors, leak detection monitors, and chemistry monitors. In general, most non-process instrumentation governed by Technical Specification requirements are candidates for requirements reduction. If statistical requirements are somewhat relaxed, the analytic approaches previously used to reduce requirements for process instrumentation can be applied to non-process instrumentation. The report identifies as viable the technical approaches developed and successfully used by Southern California Edison, Arizona Public Service, and Boston Edison

  10. Does the Early Bird Catch the Worm? Instrumental Variable Estimates of Educational Effects of Age of School Entry in Germany

    OpenAIRE

    Puhani, Patrick A.; Weber, Andrea M.

    2006-01-01

    We estimate the effect of age of school entry on educational outcomes using two different data sets for Germany, sampling pupils at the end of primary school and in the middle of secondary school. Results are obtained based on instrumental variable estimation exploiting the exogenous variation in month of birth. We find robust and significant positive effects on educational outcomes for pupils who enter school at seven instead of six years of age: Test scores at the end of primary school incr...

  11. Integrated variable projection approach (IVAPA) for parallel magnetic resonance imaging.

    Science.gov (United States)

    Zhang, Qiao; Sheng, Jinhua

    2012-10-01

    Parallel magnetic resonance imaging (pMRI) is a fast imaging method that requires algorithms for reconstructing the image from a small number of measured k-space lines. The accurate estimation of the coil sensitivity functions is still a challenging problem in parallel imaging. The joint estimation of the coil sensitivity functions and the desired image has recently been proposed to improve the situation by iteratively optimizing both the coil sensitivity functions and the image reconstruction. It regards both the coil sensitivities and the desired image as unknowns to be solved for jointly. In this paper, we propose an integrated variable projection approach (IVAPA) for pMRI, which integrates two individual processing steps (coil sensitivity estimation and image reconstruction) into a single processing step to improve the accuracy of the coil sensitivity estimation using the variable projection approach. The method is demonstrated to give an optimal solution with considerably reduced artifacts for high reduction factors and a low number of auto-calibration signal (ACS) lines, and our implementation has a fast convergence rate. The performance of the proposed method is evaluated using a set of in vivo experiment data. Copyright © 2012 Elsevier Ltd. All rights reserved.
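
    The variable projection idea that IVAPA builds on, solving the linear unknowns in closed form inside a search over the nonlinear ones, can be sketched on a toy separable model (this is the generic technique, not the paper's pMRI formulation):

```python
import math

def project(a, t, y):
    # For a fixed nonlinear parameter a, solve the linear part (c1, c2) of
    # y ~ c1 + c2*exp(-a*t) via 2x2 normal equations; return the residual too.
    phi = [math.exp(-a * ti) for ti in t]
    n = len(t)
    s1, s2 = sum(phi), sum(p * p for p in phi)
    sy, spy = sum(y), sum(p * yi for p, yi in zip(phi, y))
    det = n * s2 - s1 * s1
    c1 = (s2 * sy - s1 * spy) / det
    c2 = (n * spy - s1 * sy) / det
    res = sum((yi - c1 - c2 * p) ** 2 for yi, p in zip(y, phi))
    return res, c1, c2

# Synthetic data from known parameters c1=1, c2=2, a=0.7.
t = [0.1 * i for i in range(50)]
y = [1.0 + 2.0 * math.exp(-0.7 * ti) for ti in t]

# The outer search is over the nonlinear parameter only (a grid, for simplicity);
# the linear coefficients are "projected out" at each candidate.
grid = [0.01 * k for k in range(1, 200)]
best_a = min(grid, key=lambda a: project(a, t, y)[0])
_, c1, c2 = project(best_a, t, y)
print(round(best_a, 2), round(c1, 3), round(c2, 3))
```

Reducing the search space to the nonlinear parameters alone is what gives variable projection methods their fast, stable convergence.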

  12. On the relationship between optical variability, visual saliency, and eye fixations: a computational approach.

    Science.gov (United States)

    Garcia-Diaz, Antón; Leborán, Víctor; Fdez-Vidal, Xosé R; Pardo, Xosé M

    2012-06-12

    A hierarchical definition of optical variability is proposed that links physical magnitudes to visual saliency and yields a more reductionist interpretation than previous approaches. This definition is shown to be grounded on the classical efficient coding hypothesis. Moreover, we propose that a major goal of contextual adaptation mechanisms is to ensure the invariance of the behavior that the contribution of an image point to optical variability elicits in the visual system. This hypothesis and the necessary assumptions are tested through comparison with human fixations and state-of-the-art approaches to saliency in three open access eye-tracking datasets, including one devoted to images with faces, as well as in a novel experiment using hyperspectral representations of surface reflectance. The results on faces yield a significant reduction of the potential strength of semantic influences compared to previous works. The results on hyperspectral images support the assumptions used to estimate optical variability. The proposed approach also explains quantitative results related to a visual illusion observed for images of corners, which does not involve eye movements.
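
    As a toy stand-in for the link between local optical variability and saliency (not the authors' hierarchical model), one can score each pixel of a grayscale grid by the variance of its neighborhood, so that regions differing from their surround score highest:

```python
def local_variance_map(img, r=1):
    # Toy saliency: variance of the (2r+1)x(2r+1) neighborhood at each pixel,
    # clipped at the image borders.
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            vals = [img[ii][jj]
                    for ii in range(max(0, i - r), min(h, i + r + 1))
                    for jj in range(max(0, j - r), min(w, j + r + 1))]
            m = sum(vals) / len(vals)
            out[i][j] = sum((v - m) ** 2 for v in vals) / len(vals)
    return out

# A uniform field with a single bright spot: only windows touching the spot
# have nonzero "variability", so the map peaks around it.
img = [[0.0] * 9 for _ in range(9)]
img[4][4] = 1.0
sal = local_variance_map(img)
peak = max((sal[i][j], i, j) for i in range(9) for j in range(9))
print(peak)
```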

  13. Design and Implementation of Data Collection Instruments for Neonatology Research

    Directory of Open Access Journals (Sweden)

    Monica G. HĂŞMĂŞANU

    2014-12-01

    Full Text Available Aim: The aim of our research was to design and implement data collection instruments to be used in the context of an observational prospective clinical study with follow-up, conducted on newborns with intrauterine growth restriction. Methods: The structure of the data collection forms (paper-based and electronic) was first identified, and for each variable the type best suited to the research aim was established. The codes for categorical variables were also decided, as were the units of measurement for quantitative variables. In line with good practice, a set of confounding factors (such as gender, date of birth, etc.) was also identified and integrated in the data collection instruments. Data-entry validation rules were implemented for each variable to reduce data input errors when the electronic data collection instrument was created. Results: Two data collection instruments were developed and successfully implemented: a paper-based form and an electronic data collection instrument. The developed forms included demographics, neonatal complications (such as hypoglycemia, hypocalcemia, etc.), biochemical data at birth and follow-up, immunological data, as well as basal and follow-up echocardiographic data. Data-entry validation criteria were implemented in the electronic data collection instrument to assure validity and precision when paper-based data are transcribed into electronic form. Furthermore, to assure subjects' confidentiality, careful attention was given to HIPAA identifiers when the electronic data collection instrument was developed. Conclusion: The data collection instruments were successfully developed and implemented as an a priori step in a clinical research project, assisting data collection and management in an observational prospective study with follow-up visits.
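
    Data-entry validation rules of the kind described can be sketched as simple range and code checks. The field names and acceptable ranges below are illustrative examples, not the study's actual criteria:

```python
# Illustrative validation rules for a neonatal data form; the fields and
# acceptable ranges are invented examples, not the study's actual criteria.
RULES = {
    "sex": lambda v: v in {"M", "F"},                # coded categorical variable
    "birth_weight_g": lambda v: 300 <= v <= 6000,    # quantitative, grams
    "gestational_age_wk": lambda v: 22 <= v <= 44,   # weeks
    "glycemia_mg_dl": lambda v: 0 < v < 500,         # biochemical value
}

def validate(record):
    """Return the list of field names that fail their data-entry rule."""
    return [f for f, rule in RULES.items() if f in record and not rule(record[f])]

good = {"sex": "F", "birth_weight_g": 2450, "gestational_age_wk": 36}
bad = {"sex": "X", "birth_weight_g": 150}
print(validate(good), validate(bad))
```

Rejecting out-of-range values at entry time is what keeps the transcription from paper to electronic form consistent.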

  14. Improved installation approach for variable spring setting on a pipe yet to be insulated

    International Nuclear Information System (INIS)

    Shah, H.H.; Chitnis, S.S.; Rencher, D.

    1993-01-01

    This paper provides an approach to setting variable spring supports for non-insulated or partially insulated piping systems so that resetting these supports is not required when the insulation is fully installed. It shows a method of deriving spring cold-load setting tolerance values that can be readily utilized by craft personnel. The method is based on the percentage of the weight of the insulation compared to the total weight of the pipe, and the applicable tolerance. Use of these setting tolerances eliminates reverification of the original cold-load settings for the majority of variable springs when the insulation is fully installed.
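
    The criterion described, comparing the insulation's share of the total supported weight against the applicable setting tolerance, can be sketched as follows. The threshold logic and the numbers are illustrative assumptions for this sketch, not values or rules from the paper:

```python
def reset_needed(pipe_weight, insulation_weight, setting_tolerance=0.10):
    """Illustrative check: if the insulation is a small enough fraction of the
    total supported weight relative to the cold-load setting tolerance, the
    spring need not be reset after the insulation is installed.
    The threshold rule and default tolerance are assumptions, not the paper's.
    """
    fraction = insulation_weight / (pipe_weight + insulation_weight)
    return fraction > setting_tolerance

print(reset_needed(900.0, 50.0))   # light insulation: within tolerance
print(reset_needed(400.0, 100.0))  # heavy insulation share: reset required
```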

  15. Focus on variability : New tools to study intra-individual variability in developmental data

    NARCIS (Netherlands)

    van Geert, P; van Dijk, M

    2002-01-01

    In accordance with dynamic systems theory, we assume that variability is an important developmental phenomenon. However, the standard methodological toolkit of the developmental psychologist offers few instruments for the study of variability. In this article we will present several new methods that

  16. Calibration through on-line monitoring of instruments channels

    International Nuclear Information System (INIS)

    James, R.W.

    1996-01-01

    Plant technical specifications require periodic calibration of instrument channels, and this has traditionally meant calibration at fixed time intervals for nearly all instruments. Experience has shown that unnecessarily frequent calibrations reduce channel availability and reliability, impact outage durations, and increase maintenance costs. An alternative approach to satisfying existing requirements for periodic calibration consists of on-line monitoring and quantitative comparison of instrument channels during operation to identify instrument degradation and failure. A Utility Working Group has been formed by EPRI to support the technical activities necessary to achieve generic NRC acceptance of on-line monitoring of redundant instrument channels as a basis for determining when to perform calibrations. A topical report proposing NRC acceptance of this approach was submitted in August 1995, and the Working Group is currently resolving NRC technical questions. This paper describes the proposed approach and the current status of the topical report with regard to NRC review. While these activities will not preclude utilities from continuing to use existing calibration approaches, successful acceptance of this performance-based approach will allow utilities to substantially reduce the number of calibrations which are performed. Concurrent benefits will include reduced I&C impact on outage durations and improved sensitivity to instrument channel performance
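
    The core of on-line monitoring, comparing each redundant channel against a process estimate and flagging drift for calibration, can be sketched as follows (the channel values and the acceptance limit are illustrative, not from the topical report):

```python
def flag_drifting(readings, limit=2.0):
    """Illustrative on-line monitoring check: compare each redundant channel
    to the average of all channels and flag those deviating beyond `limit`
    (in process units). The limit is an invented acceptance value.
    """
    estimate = sum(readings.values()) / len(readings)
    return sorted(ch for ch, r in readings.items() if abs(r - estimate) > limit)

# Four redundant channels sampled during operation; channel "C" has drifted.
scan = {"A": 100.1, "B": 99.8, "C": 105.6, "D": 100.0}
print(flag_drifting(scan))
```

Only flagged channels would then be calibrated, instead of recalibrating every channel at a fixed interval.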

  17. Organizational learning, pilot test of Likert-type instruments

    Directory of Open Access Journals (Sweden)

    Manuel Alfonso Garzón Castrillón

    2010-09-01

    Full Text Available This paper presents the results obtained in the pilot study of instruments created to meet the specific objective of designing and validating instruments to study the capacity for organizational learning. The Likert measurement scale was used because it allowed the pertinence of each dimension as a variable in the context of organizational learning to be established. A one-way analysis of variance (ANOVA) was used, with the statistical package SPSS. Some 138 variables in 3 factors and 40 affirmations were simplified.
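
    A one-way ANOVA of the kind used in the pilot can be sketched from first principles on toy Likert-style data (invented for illustration, not the study's responses):

```python
def one_way_anova_F(groups):
    # F = (SSB / (k-1)) / (SSW / (N-k)) for k groups with N total observations,
    # where SSB is the between-group and SSW the within-group sum of squares.
    k = len(groups)
    N = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / N
    ssb = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ssw = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ssb / (k - 1)) / (ssw / (N - k))

# Toy Likert responses (1-5) for one item across three respondent groups.
groups = [[4, 5, 4, 5], [3, 3, 2, 3], [4, 4, 3, 4]]
print(round(one_way_anova_F(groups), 3))
```

A large F indicates the groups differ on the item more than within-group noise would explain, which is what makes the dimension pertinent as a variable.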

  18. Nonparametric instrumental regression with non-convex constraints

    International Nuclear Information System (INIS)

    Grasmair, M; Scherzer, O; Vanhems, A

    2013-01-01

    This paper considers the nonparametric regression model with an additive error that is dependent on the explanatory variables. As is common in empirical studies in epidemiology and economics, it also supposes that valid instrumental variables are observed. A classical example in microeconomics considers the consumer demand function as a function of the price of goods and the income, both variables often considered as endogenous. In this framework, the economic theory also imposes shape restrictions on the demand function, such as integrability conditions. Motivated by this illustration in microeconomics, we study an estimator of a nonparametric constrained regression function using instrumental variables by means of Tikhonov regularization. We derive rates of convergence for the regularized model both in a deterministic and stochastic setting under the assumption that the true regression function satisfies a projected source condition including, because of the non-convexity of the imposed constraints, an additional smallness condition. (paper)
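
    In its simplest finite-dimensional form, the Tikhonov-regularized estimator minimizes ||Ax - b||^2 + alpha*||x||^2, with closed-form solution x = (A^T A + alpha*I)^{-1} A^T b. A minimal sketch on a small, nearly singular system (a generic illustration, unrelated to the paper's infinite-dimensional setting and shape constraints):

```python
def tikhonov_2x2(A, b, alpha):
    # Solve (A^T A + alpha*I) x = A^T b in closed form for a 2x2 system.
    a11 = A[0][0] * A[0][0] + A[1][0] * A[1][0] + alpha
    a12 = A[0][0] * A[0][1] + A[1][0] * A[1][1]
    a22 = A[0][1] * A[0][1] + A[1][1] * A[1][1] + alpha
    r1 = A[0][0] * b[0] + A[1][0] * b[1]
    r2 = A[0][1] * b[0] + A[1][1] * b[1]
    det = a11 * a22 - a12 * a12
    return [(a22 * r1 - a12 * r2) / det, (a11 * r2 - a12 * r1) / det]

# Nearly singular system: the unregularized solution is the extreme [2, 0],
# while a little regularization pulls the answer toward the stable [1, 1].
A = [[1.0, 1.0], [1.0, 1.0001]]
b = [2.0, 2.0]
x = tikhonov_2x2(A, b, alpha=1e-3)
print([round(v, 3) for v in x])
```

The regularization parameter trades data fit against solution size, which is what stabilizes ill-posed inverse problems of this type.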

  20. R Package multiPIM: A Causal Inference Approach to Variable Importance Analysis

    Directory of Open Access Journals (Sweden)

    Stephan J Ritter

    2014-04-01

    Full Text Available We describe the R package multiPIM, including statistical background, functionality and user options. The package is for variable importance analysis, and is meant primarily for analyzing data from exploratory epidemiological studies, though it could certainly be applied in other areas as well. The approach taken to variable importance comes from the causal inference field, and is different from approaches taken in other R packages. By default, multiPIM uses a double robust targeted maximum likelihood estimator (TMLE of a parameter akin to the attributable risk. Several regression methods/machine learning algorithms are available for estimating the nuisance parameters of the models, including super learner, a meta-learner which combines several different algorithms into one. We describe a simulation in which the double robust TMLE is compared to the graphical computation estimator. We also provide example analyses using two data sets which are included with the package.

  1. Innovative Approach to Implementation of FPGA-based NPP Instrumentation and Control Systems

    Energy Technology Data Exchange (ETDEWEB)

    Andrashov, Anton; Kharchenko, Vyacheslav; Sklyar, Volodymir [Centre for Safety Infrastructure-Oriented Research and Analysis, Kharkov (Ukraine); SIORA Alexander [Research and Production Corporation Radiy, Kirovograd (Ukraine)

    2011-08-15

    Advantages of the application of Field Programmable Gate Arrays (FPGA) technology for the implementation of Instrumentation and Control (I and C) systems for Nuclear Power Plants (NPP) are outlined. Specific features of FPGA technology in the context of cyber security threats for NPP I and C systems are analyzed. A description of the FPGA-based platform used for implementation of different safety I and C systems for NPPs is presented. The typical architecture of an NPP safety I and C system based on the platform, as well as the approach to implementing I and C systems using the FPGA-based platform, are discussed. The paper concludes with data on experience from applying the platform in NPP safety I and C system modernization projects.

  2. Innovative approach to implementation of FPGA-based NPP instrumentation and control systems

    International Nuclear Information System (INIS)

    Andrashov, Anton; Kharchenko, Vyacheslav; Sklyar, Volodymir; Siora, Alexander

    2011-01-01

    Advantages of the application of Field Programmable Gate Arrays (FPGA) technology for the implementation of Instrumentation and Control (I and C) systems for Nuclear Power Plants (NPP) are outlined. Specific features of FPGA technology in the context of cyber security threats for NPP I and C systems are analyzed. A description of the FPGA-based platform used for implementation of different safety I and C systems for NPPs is presented. The typical architecture of an NPP safety I and C system based on the platform, as well as the approach to implementing I and C systems using the FPGA-based platform, are discussed. The paper concludes with data on experience from applying the platform in NPP safety I and C system modernization projects. (author)

  4. An integrated approach for integrated intelligent instrumentation and control system (I3CS)

    International Nuclear Information System (INIS)

    Jung, C.H.; Kim, J.T.; Kwon, K.C.

    1997-01-01

    To guarantee public safety, nuclear power plants should be designed to reduce the operator interventions that result in operating human errors, to identify process states in transients, and to aid operators in decision-making and guide their actions. For this purpose, the MMIS (Man-Machine Interface System) in NPPs should follow an integrated top-down approach tightly focused on function-based task analysis, incorporating advanced digital technology, operator support functions, and so on. The advanced I and C research team at KAERI has embarked on developing an Integrated Intelligent Instrumentation and Control System (I3CS) for Korea's next-generation nuclear power plants. I3CS bases the integrated top-down approach on function-based task analysis, modern digital technology, standardization and simplification, availability and reliability, and protection of investment. (author). 4 refs, 6 figs

  5. Emerging adulthood features and criteria for adulthood : Variable- and person-centered approaches

    NARCIS (Netherlands)

    Tagliabue, Semira; Crocetti, Elisabetta; Lanz, Margherita

    Reaching adulthood is the aim of the transition to adulthood; however, emerging adults differently define both adulthood and the transitional period they are living. Variable-centered and person-centered approaches were integrated in the present paper to investigate if the criteria used to define

  6. Medical instruments in museums

    DEFF Research Database (Denmark)

    Söderqvist, Thomas; Arnold, Ken

    2011-01-01

    This essay proposes that our understanding of medical instruments might benefit from adding a more forthright concern with their immediate presence to the current historical focus on simply decoding their meanings and context. This approach is applied to the intriguingly tricky question of what actually is meant by a "medical instrument." It is suggested that a pragmatic part of the answer might lie simply in reconsidering the holdings of medical museums, where the significance of the physical actuality of instruments comes readily to hand.

  7. Contribution to the discussion of P.M. Fayers and David J. Hand: Causal variables, indicator variables and measurement scales: an example from quality of life

    DEFF Research Database (Denmark)

    Keiding, Niels

    2002-01-01

    Causal variables; Clinimetric scales; Composite scales; Construct validity; Measurement scales; Multi-item scales; Quality-of-life instruments

  8. On the Integrity of Online Testing for Introductory Statistics Courses: A Latent Variable Approach

    Directory of Open Access Journals (Sweden)

    Alan Fask

    2015-04-01

    Full Text Available There has been a remarkable growth in distance learning courses in higher education. Despite indications that distance learning courses are more vulnerable to cheating behavior than traditional courses, there has been little research studying whether online exams facilitate a relatively greater level of cheating. This article examines this issue by developing an approach that uses a latent variable to measure student cheating. This latent variable is linked to both known student mastery related variables and variables unrelated to student mastery. Grade scores from a proctored final exam and an unproctored final exam are used to test for increased cheating behavior in the unproctored exam.
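
    A simplified stand-in for the article's latent-variable idea (not its actual model): treat the proctored score as the mastery proxy and inspect residuals of the unproctored score, where score inflation shows up as unusually large positive residuals. All data below are simulated:

```python
import random

random.seed(1)

# Simulated mastery and two exam scores; a subset of students inflates the
# unproctored score. Numbers and thresholds are invented for illustration.
n = 400
mastery = [random.gauss(70, 10) for _ in range(n)]
proctored = [m + random.gauss(0, 3) for m in mastery]
cheated = [i < 40 for i in range(n)]   # first 40 students inflate their score
unproctored = [m + random.gauss(0, 3) + (12 if c else 0)
               for m, c in zip(mastery, cheated)]

# Regress unproctored on proctored; large positive residuals are suspicious.
mx = sum(proctored) / n
my = sum(unproctored) / n
beta = (sum((x - mx) * (y - my) for x, y in zip(proctored, unproctored))
        / sum((x - mx) ** 2 for x in proctored))
resid = [y - (my + beta * (x - mx)) for x, y in zip(proctored, unproctored)]

flagged = [r > 8 for r in resid]       # illustrative cut-off
true_pos = sum(1 for f, c in zip(flagged, cheated) if f and c)
print(round(beta, 3), true_pos)
```

At the group level, a systematic gap between unproctored and mastery-predicted scores is the kind of signal the article's latent-variable approach formalizes.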

  9. An integrated approach to endoscopic instrument tracking for augmented reality applications in surgical simulation training.

    Science.gov (United States)

    Loukas, Constantinos; Lahanas, Vasileios; Georgiou, Evangelos

    2013-12-01

    Despite the popular use of virtual and physical reality simulators in laparoscopic training, the educational potential of augmented reality (AR) has not received much attention. A major challenge is the robust tracking and three-dimensional (3D) pose estimation of the endoscopic instrument, which are essential for achieving interaction with the virtual world and for realistic rendering when the virtual scene is occluded by the instrument. In this paper we propose a method that addresses these issues, based solely on visual information obtained from the endoscopic camera. Two different tracking algorithms are combined for estimating the 3D pose of the surgical instrument with respect to the camera. The first tracker creates an adaptive model of a colour strip attached to the distal part of the tool (close to the tip). The second algorithm tracks the endoscopic shaft, using a combined Hough-Kalman approach. The 3D pose is estimated with perspective geometry, using appropriate measurements extracted by the two trackers. The method has been validated on several complex image sequences for its tracking efficiency, pose estimation accuracy and applicability in AR-based training. Using a standard endoscopic camera, the absolute average error of the tip position was 2.5 mm for working distances commonly found in laparoscopic training. The average error of the instrument's angle with respect to the camera plane was approximately 2°. The results are also supplemented by video segments of laparoscopic training tasks performed in a physical and an AR environment. The experiments yielded promising results regarding the potential of applying AR technologies for laparoscopic skills training, based on a computer vision framework. The issue of occlusion handling was adequately addressed. The estimated trajectory of the instruments may also be used for surgical gesture interpretation and assessment. Copyright © 2013 John Wiley & Sons, Ltd.
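
    The Kalman component of the combined Hough-Kalman shaft tracker can be illustrated with a minimal constant-velocity filter smoothing noisy 1-D detections (a generic textbook sketch, not the paper's implementation; all noise values are invented):

```python
import random

def kalman_1d(zs, dt=1.0, q=0.01, r=4.0):
    """Minimal constant-velocity Kalman filter for noisy 1-D detections.
    State is (position, velocity); q is process noise, r measurement variance.
    """
    x = [zs[0], 0.0]                  # initial state from the first detection
    P = [[r, 0.0], [0.0, 1.0]]        # initial covariance
    out = []
    for z in zs:
        # Predict with the constant-velocity model x' = F x, P' = F P F^T + Q.
        x = [x[0] + dt * x[1], x[1]]
        P = [[P[0][0] + dt * (P[1][0] + P[0][1]) + dt * dt * P[1][1] + q,
              P[0][1] + dt * P[1][1]],
             [P[1][0] + dt * P[1][1], P[1][1] + q]]
        # Update with the measurement z (position only is observed).
        S = P[0][0] + r
        K = [P[0][0] / S, P[1][0] / S]
        y = z - x[0]
        x = [x[0] + K[0] * y, x[1] + K[1] * y]
        P = [[(1 - K[0]) * P[0][0], (1 - K[0]) * P[0][1]],
             [P[1][0] - K[1] * P[0][0], P[1][1] - K[1] * P[0][1]]]
        out.append(x[0])
    return out

# Noisy detections of a point moving at a constant 2 units/frame.
random.seed(2)
truth = [2.0 * t for t in range(60)]
detections = [p + random.gauss(0, 2.0) for p in truth]
est = kalman_1d(detections)
raw_err = sum(abs(d - p) for d, p in zip(detections, truth)) / 60
kf_err = sum(abs(e - p) for e, p in zip(est, truth)) / 60
print(round(raw_err, 2), round(kf_err, 2))
```

Filtering the per-frame detections is what keeps a pose estimate stable when individual detections are noisy or briefly wrong.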

  10. Improved instrumentation for intensity-, wavelength-, temperature-, and magnetic field-resolved photoconductivity spectroscopy

    International Nuclear Information System (INIS)

    Cottingham, Patrick; Morey, Jennifer R.; Lemire, Amanda; Lemire, Penny; McQueen, Tyrel M.

    2016-01-01

    We report instrumentation for photovoltage and photocurrent spectroscopy over a larger continuous range of wavelengths, temperatures, and applied magnetic fields than other instruments described in the literature: 350 nm ≤ λ ≤ 1700 nm, 1.8 K ≤ T ≤ 300 K, and B ≤ 9 T. This instrument uses a modulated monochromated incoherent light source with total power < 30 μW in combination with an LED in order to probe selected regions of non-linear responses while maintaining low temperatures and avoiding thermal artifacts. The instrument may also be used to measure a related property, the photomagnetoresistance. We demonstrate the importance of normalizing measured responses for variations in light power and describe a rigorous process for performing these normalizations. We discuss several circuits suited to measuring different types of samples and provide analysis for converting measured values into physically relevant properties. Uniform approaches to measurement of these photoproperties are essential for reliable quantitative comparisons between emerging new materials with energy applications. - Highlights: • A novel instrument for measuring photoconductivity and photocurrents of materials and devices. • Continuous parameter space: 350 nm ≤ λ ≤ 1700 nm, 1.8 K ≤ T ≤ 300 K, and B ≤ 9 T. • Methodology for treating non-linear responses and variable lamp intensity. • Mathematical detail for extracting properties of materials from measured values is provided.

  11. Development of a Blood Pressure Measurement Instrument with Active Cuff Pressure Control Schemes

    Directory of Open Access Journals (Sweden)

    Chung-Hsien Kuo

    2017-01-01

    Full Text Available This paper presents an oscillometric blood pressure (BP) measurement approach based on active control schemes of cuff pressure. Compared with conventional electronic BP instruments, the novelty of the proposed BP measurement approach is to utilize a variable volume chamber which actively and stably alters the cuff pressure during inflating or deflating cycles. The variable volume chamber is operated with a closed-loop pressure control scheme, and it is activated by controlling the piston position of a single-acting cylinder driven by a screw motor. Therefore, the variable volume chamber could significantly eliminate the air turbulence disturbance during the air injection stage when compared to an air pump mechanism. Furthermore, the proposed active BP measurement approach is capable of measuring BP characteristics, including systolic blood pressure (SBP) and diastolic blood pressure (DBP), during the inflating cycle. Two modes, air injection measurement (AIM) and accurate dual-way measurement (ADM), were proposed. According to the experimental results with healthy subjects, AIM reduced measurement time by 34.21% and ADM by 15.78% when compared to a commercial BP monitor. Furthermore, the ADM performed much more consistently (i.e., with less standard deviation in the measurements) when compared to a commercial BP monitor.
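    For context, the classical maximum-amplitude oscillometric algorithm that instruments of this kind build on can be sketched as follows. The characteristic ratios (0.55 systolic, 0.85 diastolic) are commonly cited textbook values used here as illustrative assumptions; commercial monitors use proprietary variants:

```python
import numpy as np

def oscillometric_bp(cuff_pressure, oscillation_amp, sys_ratio=0.55, dia_ratio=0.85):
    """Estimate SBP/DBP from a cuff deflation sweep with the
    maximum-amplitude algorithm. The mean pressure is taken at the
    amplitude peak; SBP/DBP are read where the amplitude envelope
    crosses illustrative fractions of the peak."""
    i_map = int(np.argmax(oscillation_amp))
    a_max = oscillation_amp[i_map]
    # systolic: first (highest-pressure) sample before the peak where
    # the envelope reaches sys_ratio * a_max
    above = np.where(oscillation_amp[:i_map + 1] >= sys_ratio * a_max)[0]
    sbp = cuff_pressure[above[0]]
    # diastolic: last (lowest-pressure) sample after the peak where
    # the envelope is still >= dia_ratio * a_max
    after = np.where(oscillation_amp[i_map:] >= dia_ratio * a_max)[0]
    dbp = cuff_pressure[i_map + after[-1]]
    return sbp, dbp, cuff_pressure[i_map]
```

    The abstract's contribution is upstream of this step: the variable volume chamber stabilizes the cuff-pressure sweep so the oscillation envelope can be read cleanly, even during inflation.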

  12. MEANINGFUL VARIABILITY: A SOCIOLINGUISTICALLY-GROUNDED APPROACH TO VARIATION IN OPTIMALITY THEORY

    Directory of Open Access Journals (Sweden)

    Juan Antonio Cutillas Espinosa

    2004-12-01

    Full Text Available Most approaches to variability in Optimality Theory have attempted to make variation possible within the OT framework, i.e. to reformulate constraints and rankings to accommodate variable and gradient linguistic facts. Sociolinguists have attempted to apply these theoretical advances to the study of language variation, with an emphasis on language-internal variables (Auger 2001, Cardoso 2001). Little attention has been paid to the array of external factors that influence the patterning of variation. In this paper, we argue that some variation patterns, especially those that are socially meaningful, are actually the result of a three-grammar system. The first is the standard grammar, which has to be available to the speaker to obtain these variation patterns. The second is the vernacular grammar, which the speaker is likely to have acquired in his local community. Finally, the third is an intergrammar, which is used by the speaker as his 'default' constraint set. The intergrammar is a continuous ranking (Boersma & Hayes 2001), and domination relations are consciously altered by speakers to shape the appropriate and variable linguistic output. We illustrate this model with analyses of English and Spanish.

  13. New approaches for examining associations with latent categorical variables: applications to substance abuse and aggression.

    Science.gov (United States)

    Feingold, Alan; Tiberio, Stacey S; Capaldi, Deborah M

    2014-03-01

    Assessments of substance use behaviors often include categorical variables that are frequently related to other measures using logistic regression or chi-square analysis. When the categorical variable is latent (e.g., extracted from a latent class analysis [LCA]), classification of observations is often used to create an observed nominal variable from the latent one for use in a subsequent analysis. However, recent simulation studies have found that this classical 3-step analysis championed by the pioneers of LCA produces underestimates of the associations of latent classes with other variables. Two preferable but underused alternatives for examining such linkages, each of which is most appropriate under certain conditions, are (a) a corrected 3-step analysis, which removes the underestimation bias of the classical approach, and (b) 1-step analysis. The purpose of this article is to dissuade researchers from conducting classical 3-step analysis and to promote the use of the 2 newer approaches that are described and compared. In addition, the applications of these newer models - for use when the independent, the dependent, or both categorical variables are latent - are illustrated through substantive analyses relating classes of substance abusers to classes of intimate partner aggressors.
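    A toy simulation illustrates the underestimation bias of classical 3-step analysis that the abstract describes: assigning observations to their most likely class and then relating the assigned class to an outcome attenuates the estimated association. The 15% misclassification rate below is a hypothetical choice for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 20000
true_class = rng.integers(0, 2, n)            # latent class membership
# distal outcome differs by 1.0 between the true classes
y = true_class * 1.0 + rng.normal(0, 1, n)
# imperfect classification, as in classical 3-step analysis:
# 15% of cases are assigned to the wrong class
flip = rng.random(n) < 0.15
assigned = np.where(flip, 1 - true_class, true_class)

true_gap = y[true_class == 1].mean() - y[true_class == 0].mean()
naive_gap = y[assigned == 1].mean() - y[assigned == 0].mean()
# the naive (classical 3-step) estimate is attenuated toward zero
```

    The corrected 3-step estimators discussed in the article work by modeling this classification error explicitly; the sketch only demonstrates why a correction is needed.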

  14. A Systematic Review of Statistical Methods Used to Test for Reliability of Medical Instruments Measuring Continuous Variables

    Directory of Open Access Journals (Sweden)

    Rafdzah Zaki

    2013-06-01

    Full Text Available Objective(s): Reliability measures precision, or the extent to which test results can be replicated. This is the first systematic review to identify statistical methods used to measure the reliability of equipment measuring continuous variables. This study also aims to highlight inappropriate statistical methods used in reliability analyses and their implications for medical practice. Materials and Methods: In 2010, five electronic databases were searched for reliability studies published between 2007 and 2009. A total of 5,795 titles were initially identified. Only 282 titles were potentially related, and finally 42 fitted the inclusion criteria. Results: The Intra-class Correlation Coefficient (ICC) is the most popular method, with 25 (60%) studies having used it, followed by comparison of means (8, or 19%). Out of the 25 studies using the ICC, only 7 (28%) reported the confidence intervals and the type of ICC used. Most studies (71%) also tested the agreement of instruments. Conclusion: This study finds that the Intra-class Correlation Coefficient is the most popular method used to assess the reliability of medical instruments measuring continuous outcomes. There are also inappropriate applications and interpretations of statistical methods in some studies. It is important for medical researchers to be aware of this issue and to be able to correctly perform analysis in reliability studies.
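    As an illustration of the review's most common method, one widely used variant, ICC(2,1) in the Shrout-Fleiss taxonomy (two-way random effects, absolute agreement, single measures), can be computed from an n-subjects-by-k-raters table as follows; this is a didactic sketch, not code from the reviewed studies:

```python
import numpy as np

def icc_2_1(ratings):
    """ICC(2,1): two-way random effects, absolute agreement, single
    measurement. `ratings` is an (n subjects x k raters) array."""
    ratings = np.asarray(ratings, dtype=float)
    n, k = ratings.shape
    grand = ratings.mean()
    row_means = ratings.mean(axis=1)
    col_means = ratings.mean(axis=0)
    ss_total = ((ratings - grand) ** 2).sum()
    ss_rows = k * ((row_means - grand) ** 2).sum()   # between subjects
    ss_cols = n * ((col_means - grand) ** 2).sum()   # between raters
    ss_err = ss_total - ss_rows - ss_cols
    msr = ss_rows / (n - 1)                          # mean squares
    msc = ss_cols / (k - 1)
    mse = ss_err / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)
```

    The review's point that authors should report the type of ICC matters here: other variants (consistency vs. agreement, single vs. average measures) use different formulas and can give quite different values on the same data.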

  15. Optimal speech motor control and token-to-token variability: a Bayesian modeling approach.

    Science.gov (United States)

    Patri, Jean-François; Diard, Julien; Perrier, Pascal

    2015-12-01

    The remarkable capacity of the speech motor system to adapt to various speech conditions is due to an excess of degrees of freedom, which enables producing similar acoustical properties with different sets of control strategies. To explain how the central nervous system selects one of the possible strategies, a common approach, in line with optimal motor control theories, is to model speech motor planning as the solution of an optimality problem based on cost functions. Despite the success of this approach, one of its drawbacks is the intrinsic contradiction between the concept of optimality and the observed experimental intra-speaker token-to-token variability. The present paper proposes an alternative approach by formulating feedforward optimal control in a probabilistic Bayesian modeling framework. This is illustrated by controlling a biomechanical model of the vocal tract for speech production and by comparing it with an existing optimal control model (GEPPETO). The essential elements of this optimal control model are presented first. From them the Bayesian model is constructed in a progressive way. Performance of the Bayesian model is evaluated based on computer simulations and compared to the optimal control model. This approach is shown to be appropriate for solving the speech planning problem while accounting for variability in a principled way.

  16. Data, instruments, and theory a dialectical approach to understanding science

    CERN Document Server

    Ackermann, Robert John

    1985-01-01

    Robert John Ackermann deals decisively with the problem of relativism that has plagued post-empiricist philosophy of science. Recognizing that theory and data are mediated by data domains (bordered data sets produced by scientific instruments), he argues that the use of instruments breaks the dependency of observation on theory and thus creates a reasoned basis for scientific objectivity.

  17. Software design for the VIS instrument onboard the Euclid mission: a multilayer approach

    Science.gov (United States)

    Galli, E.; Di Giorgio, A. M.; Pezzuto, S.; Liu, S. J.; Giusi, G.; Li Causi, G.; Farina, M.; Cropper, M.; Denniston, J.; Niemi, S.

    2014-07-01

    The Euclid mission scientific payload is composed of two instruments: a VISible Imaging Instrument (VIS) and a Near Infrared Spectrometer and Photometer instrument (NISP). Each instrument has its own control unit. The Instrument Command and Data Processing Unit (VI-CDPU) is the control unit of the VIS instrument. The VI-CDPU is connected directly to the spacecraft by means of a MIL-STD-1553B bus and to the satellite Mass Memory Unit via a SpaceWire link. All the internal interfaces are implemented via SpaceWire links and include 12 high speed lines for the data provided by the 36 focal plane CCD readout electronics (ROEs) and one link to the Power and Mechanisms Control Unit (VI-PMCU). VI-CDPU is in charge of distributing commands to the instrument sub-systems, collecting their housekeeping parameters and monitoring their health status. Moreover, the unit has the task of acquiring, reordering, compressing and transferring the science data to the satellite Mass Memory. This last feature is probably the most challenging one for the VI-CDPU, since stringent constraints about the minimum lossless compression ratio, the maximum time for the compression execution and the maximum power consumption have to be satisfied. Therefore, an accurate performance analysis at the hardware layer is necessary, which could significantly delay the design and development of the software. In order to mitigate this risk, in the multilayered design of the software we decided to design a middleware layer that provides a set of APIs with the aim of hiding the implementation of the HW-connected layer from the application one. The middleware is built on top of the Operating System layer (which includes the Real-Time OS that will be adopted) and the onboard Computer Hardware. The middleware itself has a multi-layer architecture composed of 4 layers: the Abstract RTOS Adapter Layer (AOSAL), the Specific RTOS Adapter Layer (SOSAL), the Common Patterns Layer (CPL), the Service Layer composed of two subgroups which

  18. Unit-specific calibration of Actigraph accelerometers in a mechanical setup - is it worth the effort? The effect on random output variation caused by technical inter-instrument variability in the laboratory and in the field

    DEFF Research Database (Denmark)

    Moeller, Niels C; Korsholm, Lars; Kristensen, Peter L

    2008-01-01

    BACKGROUND: Potentially, unit-specific in-vitro calibration of accelerometers could increase field data quality and study power. However, reduced inter-unit variability would only be important if random instrument variability contributes considerably to the total variation in field data. Therefor...

  19. A new approach to hazardous materials transportation risk analysis: decision modeling to identify critical variables.

    Science.gov (United States)

    Clark, Renee M; Besterfield-Sacre, Mary E

    2009-03-01

    We take a novel approach to analyzing hazardous materials transportation risk in this research. Previous studies analyzed this risk from an operations research (OR) or quantitative risk assessment (QRA) perspective by minimizing or calculating risk along a transport route. Further, even though the majority of incidents occur when containers are unloaded, the research has not focused on transportation-related activities, including container loading and unloading. In this work, we developed a decision model of a hazardous materials release during unloading using actual data and an exploratory data modeling approach. Previous studies have had a theoretical perspective in terms of identifying and advancing the key variables related to this risk, and there has not been a focus on probability and statistics-based approaches for doing this. Our decision model empirically identifies the critical variables using an exploratory methodology for a large, highly categorical database involving latent class analysis (LCA), loglinear modeling, and Bayesian networking. Our model identified the most influential variables and countermeasures for two consequences of a hazmat incident, dollar loss and release quantity, and is one of the first models to do this. The most influential variables were found to be related to the failure of the container. In addition to analyzing hazmat risk, our methodology can be used to develop data-driven models for strategic decision making in other domains involving risk.

  20. The Formation of Instruments of Management of Industrial Enterprises According to the Theoretical and Functional Approaches

    Directory of Open Access Journals (Sweden)

    Raiko Diana V.

    2018-03-01

    Full Text Available The article aims to substantiate, on the basis of an analysis of theories of the firm, the basic theoretical provisions on the formation of industrial enterprise management instruments. The article determines that the subject of research in these theories is the enterprise; the object is the process of managing its potential according to the forms of business organization and the technology of partnership relations; and the goal is high financial results, stabilization of activity, and social responsibility. The publication analyzes enterprise theories to determine the essence of the enterprise as a socio-economic system in the following directions: technical preparation of production, economic theory and law, systems theory, and marketing-management. As a result of the research, a general set of socio-economic functions of the enterprise has been identified, in the following groups: information-legal, production, marketing-management, and social responsibility. When building management instruments, it is suggested to take into consideration the direct and inverse relationships of the enterprise at all levels of management: micro, meso, and macro. On this ground, the authors have developed provisions on the formation of instruments of management of industrial enterprises according to two approaches, theoretical and functional.

  1. International Approaches to Financial Instruments and Their Application in Ukraine

    OpenAIRE

    Viktor Zamlynskyy

    2013-01-01

    Introduction of International Financial Reporting Standards in Ukraine requires scientific and methodological study of their specific use in national practice. The essence and types of financial instruments have been researched. The regulatory support for their accounting in Ukraine has been established. The authors have analyzed the provisions of the International Financial Reporting Standards governing the financial instruments accounting, worked out characteristics of existing methodology ...

  2. Predicting College Women's Career Plans: Instrumentality, Work, and Family

    Science.gov (United States)

    Savela, Alexandra E.; O'Brien, Karen M.

    2016-01-01

    This study examined how college women's instrumentality and expectations about combining work and family predicted early career development variables. Specifically, 177 undergraduate women completed measures of instrumentality (i.e., traits such as ambition, assertiveness, and risk taking), willingness to compromise career for family, anticipated…

  3. An integrated approach for integrated intelligent instrumentation and control system (I³CS)

    Energy Technology Data Exchange (ETDEWEB)

    Jung, C H; Kim, J T; Kwon, K C [Korea Atomic Energy Research Inst., Yusong, Taejon (Korea, Republic of)

    1997-07-01

    To guarantee public safety, nuclear power plants should be designed to reduce the operator interventions that result in operating human errors, to identify process states in transients, and to aid operators in decision making and guide their actions. For this purpose, the MMIS (Man-Machine Interface System) in NPPs should follow an integrated top-down approach tightly focused on function-based task analysis, including advanced digital technology, operator support functions, and so on. The advanced I and C research team at KAERI has embarked on developing an Integrated Intelligent Instrumentation and Control System (I³CS) for Korea's next generation nuclear power plants. I³CS bases its integrated top-down approach on function-based task analysis, modern digital technology, standardization and simplification, availability and reliability, and protection of investment. (author). 4 refs, 6 figs.

  4. Microprocessor-based, on-line decision aid for resolving conflicting nuclear reactor instrumentation

    International Nuclear Information System (INIS)

    Alesso, H.P.

    1981-01-01

    We describe one design for a microprocessor-based, on-line decision aid for identifying and resolving false, conflicting, or misleading instrument indications resulting from certain systems interactions for a pressurized water reactor. The system processes sensor signals from groups of instruments that track together under nominal transient and certain accident conditions, and alarms when they do not track together. We examine multiple-casualty systems interaction and formulate a trial grouping of variables that track together under specified conditions. A two-of-three type redundancy check of key variables provides alarm and indication of conflicting information when one signal suddenly tracks in opposition due to multiple casualty, instrument failure, and/or locally abnormal conditions. Since a vote count of two of three variables in conflict is inconclusive evidence, the system is not designed to provide tripping or corrective action, but improves the operator/instrument interface by providing additional and partially digested information.
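    The two-of-three consistency check described above can be sketched as a simple voting function. The tolerance value is an illustrative assumption, and, matching the abstract, the function only flags conflicts rather than initiating any corrective action:

```python
def conflict_alarm(signals, tol=0.05):
    """2-of-3 consistency check on one group of variables that should
    track together. Returns None when all three channels agree, the
    index of the single outlier channel when two agree, or -1 when no
    two channels agree (inconclusive). `tol` is a hypothetical
    agreement tolerance in normalized signal units."""
    a, b, c = signals
    ab = abs(a - b) <= tol
    ac = abs(a - c) <= tol
    bc = abs(b - c) <= tol
    if ab and ac and bc:
        return None      # all channels track together
    if bc and not ab and not ac:
        return 0         # channel 0 conflicts with the other two
    if ac and not ab and not bc:
        return 1
    if ab and not ac and not bc:
        return 2
    return -1            # inconclusive evidence: alarm, but take no action
```

    In the paper's design the grouped variables are physical plant signals that track together only under specified conditions, so the grouping itself, not just the vote, carries diagnostic content.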

  5. Non-invasive identification of organic materials in historical stringed musical instruments by reflection infrared spectroscopy: a methodological approach.

    Science.gov (United States)

    Invernizzi, Claudia; Daveri, Alessia; Vagnini, Manuela; Malagodi, Marco

    2017-05-01

    The analysis of historical musical instruments is becoming more relevant and the interest is increasingly moving toward the non-invasive reflection FTIR spectroscopy, especially for the analysis of varnishes. In this work, a specific infrared reflectance spectral library of organic compounds was created with the aim of identifying musical instrument materials in a totally non-invasive way. The analyses were carried out on pure organic compounds, as bulk samples and laboratory wooden models, to evaluate the diagnostic reflection mid-infrared (MIR) bands of proteins, polysaccharides, lipids, and resins by comparing reflection spectra before and after the KK correction. This methodological approach was applied to real case studies represented by four Stradivari violins and a Neapolitan mandolin.

  6. Variable system: An alternative approach for the analysis of mediated moderation.

    Science.gov (United States)

    Kwan, Joyce Lok Yin; Chan, Wai

    2018-06-01

    Mediated moderation (meMO) occurs when the moderation effect of the moderator (W) on the relationship between the independent variable (X) and the dependent variable (Y) is transmitted through a mediator (M). To examine this process empirically, 2 different model specifications (Type I meMO and Type II meMO) have been proposed in the literature. However, both specifications are found to be problematic, either conceptually or statistically. For example, it can be shown that each type of meMO model is statistically equivalent to a particular form of moderated mediation (moME), another process that examines the condition when the indirect effect from X to Y through M varies as a function of W. Consequently, it is difficult for one to differentiate these 2 processes mathematically. This study therefore has 2 objectives. First, we attempt to differentiate moME and meMO by proposing an alternative specification for meMO. Conceptually, this alternative specification is intuitively meaningful and interpretable, and, statistically, it offers meMO a unique representation that is no longer identical to its moME counterpart. Second, using structural equation modeling, we propose an integrated approach for the analysis of meMO as well as for other general types of conditional path models. VS, a computer software program that implements the proposed approach, has been developed to facilitate the analysis of conditional path models for applied researchers. Real examples are considered to illustrate how the proposed approach works in practice and to compare its performance against the traditional methods. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
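    The moderated-mediation (moME) process that the abstract contrasts with meMO can be illustrated numerically: estimate the two path models by OLS and read off the conditional indirect effect of X on Y through M as a function of W. The data and coefficients below are simulated for illustration, not drawn from the article:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50000
x = rng.normal(size=n)
w = rng.normal(size=n)
# first-stage model: the X -> M path is moderated by W (the x*w term)
m = 0.5 * x + 0.2 * w + 0.4 * x * w + rng.normal(size=n)
# second-stage model: M -> Y plus a direct effect of X
y = 0.3 * x + 0.6 * m + rng.normal(size=n)

# estimate both path models by OLS
Xm = np.column_stack([np.ones(n), x, w, x * w])
a = np.linalg.lstsq(Xm, m, rcond=None)[0]
Xy = np.column_stack([np.ones(n), x, m])
b = np.linalg.lstsq(Xy, y, rcond=None)[0]

def indirect(w_value):
    """Conditional indirect effect of X on Y through M at a given W:
    (a1 + a3*W) * b_M."""
    return (a[1] + a[3] * w_value) * b[2]
```

    The statistical-equivalence problem the authors raise is visible here: the same product terms can be rearranged into a meMO specification, which is why their alternative specification is needed to tell the two processes apart.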

  7. Confidence Intervals for a Semiparametric Approach to Modeling Nonlinear Relations among Latent Variables

    Science.gov (United States)

    Pek, Jolynn; Losardo, Diane; Bauer, Daniel J.

    2011-01-01

    Compared to parametric models, nonparametric and semiparametric approaches to modeling nonlinearity between latent variables have the advantage of recovering global relationships of unknown functional form. Bauer (2005) proposed an indirect application of finite mixtures of structural equation models where latent components are estimated in the…

  8. State-Space Modeling and Performance Analysis of Variable-Speed Wind Turbine Based on a Model Predictive Control Approach

    Directory of Open Access Journals (Sweden)

    H. Bassi

    2017-04-01

    Full Text Available Advancements in wind energy technologies have led wind turbines from fixed speed to variable speed operation. This paper introduces an innovative version of a variable-speed wind turbine based on a model predictive control (MPC approach. The proposed approach provides maximum power point tracking (MPPT, whose main objective is to capture the maximum wind energy in spite of the variable nature of the wind’s speed. The proposed MPC approach also reduces the constraints of the two main functional parts of the wind turbine: the full load and partial load segments. The pitch angle for full load and the rotating force for the partial load have been fixed concurrently in order to balance power generation as well as to reduce the operations of the pitch angle. A mathematical analysis of the proposed system using state-space approach is introduced. The simulation results using MATLAB/SIMULINK show that the performance of the wind turbine with the MPC approach is improved compared to the traditional PID controller in both low and high wind speeds.
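    The receding-horizon idea behind MPC can be sketched for a scalar linear plant: at each step, search candidate inputs over a prediction horizon, apply only the best first move, and re-plan. This grid search is a didactic stand-in for the constrained optimization solved by real MPC controllers, and the plant and cost parameters are illustrative, not the paper's turbine model:

```python
import numpy as np

def mpc_step(x, ref, a=0.9, b=0.5, horizon=10,
             u_candidates=np.linspace(-1.0, 1.0, 41)):
    """One receding-horizon step for the scalar plant x+ = a*x + b*u.
    Searches a constant input over the horizon, trading tracking error
    against (lightly weighted) control effort, and returns the best
    first move. The input grid encodes an actuator constraint |u| <= 1."""
    best_u, best_cost = 0.0, float("inf")
    for u in u_candidates:
        xk, cost = x, 0.0
        for _ in range(horizon):
            xk = a * xk + b * u
            cost += (xk - ref) ** 2 + 0.01 * u ** 2
        if cost < best_cost:
            best_u, best_cost = u, cost
    return best_u

# closed loop: re-plan at every step and drive the state to the reference
x, ref = 0.0, 1.0
for _ in range(30):
    x = 0.9 * x + 0.5 * mpc_step(x, ref)
```

    In the paper's setting the state would come from the turbine model, the reference from the MPPT objective, and the constrained inputs would be pitch angle and generator torque for the full-load and partial-load segments.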

  9. Assessing Mucoadhesion in Polymer Gels: The Effect of Method Type and Instrument Variables

    Directory of Open Access Journals (Sweden)

    Jéssica Bassi da Silva

    2018-03-01

    Full Text Available The process of mucoadhesion has been widely studied using a wide variety of methods, which are influenced by instrumental variables and experiment design, making the comparison between the results of different studies difficult. The aim of this work was to standardize the conditions of the detachment test and the rheological methods of mucoadhesion assessment for semisolids, and introduce a texture profile analysis (TPA) method. A factorial design was developed to suggest standard conditions for performing the detachment force method. To evaluate the method, binary polymeric systems were prepared containing poloxamer 407 and Carbopol 971P®, Carbopol 974P®, or Noveon® Polycarbophil. The mucoadhesion of these systems was evaluated, and the reproducibility of the measurements investigated. The detachment force method was demonstrated to be reproducible, and gave different adhesion results depending on whether a mucin disk or ex vivo oral mucosa was used. The factorial design demonstrated that all evaluated parameters had an effect on measurements of mucoadhesive force, but the same was not observed for the work of adhesion. It was suggested that the work of adhesion is a more appropriate metric for evaluating mucoadhesion. Oscillatory rheology was more capable of investigating adhesive interactions than flow rheology. The TPA method was demonstrated to be reproducible and can evaluate the adhesiveness interaction parameter. This investigation demonstrates the need for standardized methods to evaluate mucoadhesion and makes suggestions for a standard study design.
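    The work of adhesion favored by the authors is the area under the force-displacement curve of the detachment test; it can be computed alongside the peak detachment force with a simple trapezoid-rule sketch (a hypothetical helper, not the authors' software):

```python
import numpy as np

def detachment_metrics(displacement, force):
    """Peak detachment force and work of adhesion (area under the
    force-displacement curve) from one tensile detachment test.
    Negative (compressive) force readings are clipped to zero."""
    d = np.asarray(displacement, dtype=float)
    f = np.clip(np.asarray(force, dtype=float), 0.0, None)
    peak = float(f.max())
    # trapezoid rule over possibly non-uniform displacement steps
    work = float(np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(d)))
    return peak, work
```

    The paper's finding that instrument settings shifted the peak force but not the work of adhesion is one motivation for reporting the integral rather than the maximum.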

  10. A Quantitative Approach to Variables Affecting Production of Short Films in Turkey

    Directory of Open Access Journals (Sweden)

    Vedat Akman

    2011-08-01

    Full Text Available This study aims to explore the influence of various variables affecting the production of migration-themed short films in Turkey. We proceeded to our analysis using descriptive statistics to describe the main features of the sample data quantitatively. Due to the non-uniformity of the available data, we were unable to use inductive statistics. Our basic sample statistics indicated that short film producers preferred to produce short films on domestic migration themes rather than international ones. Gender and university seemed on the surface to be significant determinants of the production of migration-themed short films in Turkey. We also looked at demographic variables to provide more insights into our quantitative approach.

  11. Economic instruments for environmental mitigation

    International Nuclear Information System (INIS)

    Wilkinson, A.

    1995-01-01

    A joint International Chamber of Commerce (ICC)/World Energy Council (WEC) Working Group has been studying a range of policy instruments which are being used or considered for use to address the question of ever increasing energy demand versus environmental protection, and pollution reduction. Economic instruments for such environmental protection include direct regulation, market-based instruments, and voluntary approaches. No single policy or device was likely to suffice in addressing the diversity of environmental problems currently faced. Altering energy prices must be seen in a social context, but some direct regulation may also be inevitable. Generally economic instruments of change were preferred as these were viewed as more flexible and cost-effective. (UK)

  12. A Miniaturized Variable Pressure Scanning Electron Microscope (MVP-SEM) for the Surface of Mars: An Instrument for the Planetary Science Community

    Science.gov (United States)

    Edmunson, J.; Gaskin, J. A.; Danilatos, G.; Doloboff, I. J.; Effinger, M. R.; Harvey, R. P.; Jerman, G. A.; Klein-Schoder, R.; Mackie, W.; Magera, B.; et al.

    2016-01-01

    The Miniaturized Variable Pressure Scanning Electron Microscope (MVP-SEM) project, funded by the NASA Planetary Instrument Concepts for the Advancement of Solar System Observations (PICASSO) Research Opportunities in Space and Earth Science (ROSES) program, will build upon previous miniaturized SEM designs for lunar and International Space Station (ISS) applications and recent advancements in variable pressure SEMs to design and build an SEM to complete analyses of samples on the surface of Mars using the atmosphere as an imaging medium. By the end of the PICASSO work, a prototype of the primary proof-of-concept components (i.e., the electron gun, focusing optics, and scanning system) will be assembled and preliminary testing in a Mars analog chamber at the Jet Propulsion Laboratory will be completed to partially fulfill Technology Readiness Level 5 requirements for those components. The team plans to provide Secondary Electron Imaging (SEI), Backscattered Electron (BSE) detection, and Energy Dispersive Spectroscopy (EDS) capabilities through the MVP-SEM.

  13. A labview approach to instrumentation for the TFTR bumper limiter alignment project

    International Nuclear Information System (INIS)

    Skelly, G.N.; Owens, D.K.

    1992-01-01

    This paper reports on a project recently undertaken to measure the alignment of the TFTR bumper limiter in relation to the toroidal magnetic field axis. The process involved the measurement of the toroidal magnetic field, and the positions of the tiles that make up the bumper limiter. The basis for the instrument control and data acquisition system was National Instruments' LabVIEW 2. LabVIEW is a graphical programming system for developing scientific and engineering applications on a Macintosh. For this project, a Macintosh IIci controlled the IEEE-488 GPIB programmable instruments via an interface box connected to the SCSI port of the computer. With LabVIEW, users create graphical software modules called virtual instruments instead of writing conventional text-based code. To measure the magnetic field, the control system acquired data from two nuclear magnetic resonance magnetometers while the toroidal field coils were pulsed. To measure the position of the tiles on the limiter, an instrumented mechanical arm was used inside the vessel.

  14. Comparing surgical trays with redundant instruments with trays with reduced instruments: a cost analysis.

    Science.gov (United States)

    John-Baptiste, A; Sowerby, L J; Chin, C J; Martin, J; Rotenberg, B W

    2016-01-01

    When prearranged standard surgical trays contain instruments that are repeatedly unused, the redundancy can result in unnecessary health care costs. Our objective was to estimate potential savings by performing an economic evaluation comparing the cost of surgical trays with redundant instruments with surgical trays with reduced instruments ("reduced trays"). We performed a cost-analysis from the hospital perspective over a 1-year period. Using a mathematical model, we compared the direct costs of trays containing redundant instruments to reduced trays for 5 otolaryngology procedures. We incorporated data from several sources including local hospital data on surgical volume, the number of instruments on redundant and reduced trays, wages of personnel and time required to pack instruments. From the literature, we incorporated instrument depreciation costs and the time required to decontaminate an instrument. We performed 1-way sensitivity analyses on all variables, including surgical volume. Costs were estimated in 2013 Canadian dollars. The cost of redundant trays was $21 806 and the cost of reduced trays was $8803, for a 1-year cost saving of $13 003. In sensitivity analyses, cost savings ranged from $3262 to $21 395, based on the surgical volume at the institution. Variation in surgical volume resulted in a wider range of estimates, with a minimum of $3253 for low-volume to a maximum of $52 012 for high-volume institutions. Our study suggests moderate savings may be achieved by reducing surgical tray redundancy and, if applied to other surgical specialties, may result in savings to Canadian health care systems.
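    The structure of such a cost analysis can be sketched with a simple model in which the annual cost of a tray configuration scales with surgical volume, instrument count, packing labour, decontamination, and depreciation. All parameter values below are hypothetical placeholders, not the paper's data (which yielded the $13 003 figure above):

```python
def annual_tray_cost(volume, n_instruments, pack_seconds_per_instrument,
                     wage_per_hour, decontam_cost_per_instrument,
                     depreciation_per_instrument):
    """Direct annual cost of one tray configuration: packing labour
    plus per-use decontamination and depreciation. Illustrative cost
    model in the spirit of the abstract."""
    uses = volume * n_instruments                       # instrument-uses per year
    labour = uses * pack_seconds_per_instrument / 3600 * wage_per_hour
    processing = uses * decontam_cost_per_instrument
    capital = uses * depreciation_per_instrument
    return labour + processing + capital

# hypothetical inputs: 500 cases/year, 40-instrument redundant tray
# vs. a 16-instrument reduced tray, $30/h wages, $0.20 decontamination
# and $0.15 depreciation per instrument-use
redundant = annual_tray_cost(500, 40, 10, 30.0, 0.20, 0.15)
reduced = annual_tray_cost(500, 16, 10, 30.0, 0.20, 0.15)
saving = redundant - reduced
```

    Because every term is linear in surgical volume, the sensitivity result in the abstract (savings scaling with volume) follows directly from this kind of model.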

  15. A geostatistical approach to the change-of-support problem and variable-support data fusion in spatial analysis

    Science.gov (United States)

    Wang, Jun; Wang, Yang; Zeng, Hui

    2016-01-01

    A key issue to address in synthesizing spatial data with variable-support in spatial analysis and modeling is the change-of-support problem. We present an approach for solving the change-of-support and variable-support data fusion problems. This approach is based on geostatistical inverse modeling that explicitly accounts for differences in spatial support. The inverse model is applied here to produce both the best predictions of a target support and prediction uncertainties, based on one or more measurements, while honoring measurements. Spatial data covering large geographic areas often exhibit spatial nonstationarity and can lead to computational challenge due to the large data size. We developed a local-window geostatistical inverse modeling approach to accommodate these issues of spatial nonstationarity and alleviate computational burden. We conducted experiments using synthetic and real-world raster data. Synthetic data were generated and aggregated to multiple supports and downscaled back to the original support to analyze the accuracy of spatial predictions and the correctness of prediction uncertainties. Similar experiments were conducted for real-world raster data. Real-world data with variable-support were statistically fused to produce single-support predictions and associated uncertainties. The modeling results demonstrate that geostatistical inverse modeling can produce accurate predictions and associated prediction uncertainties. It is shown that the local-window geostatistical inverse modeling approach suggested offers a practical way to solve the well-known change-of-support problem and variable-support data fusion problem in spatial analysis and modeling.
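    The core of the geostatistical inverse approach can be sketched for a 1-D field: given block-average measurements (coarse support) and a prior covariance on the fine support, the best linear unbiased prediction downscales the blocks while honoring them when re-aggregated. The covariance model and block layout below are illustrative assumptions, not the paper's setup:

```python
import numpy as np

# fine-support grid with an exponential prior covariance (range 5)
n = 20
coords = np.arange(n, dtype=float)
Q = np.exp(-np.abs(coords[:, None] - coords[None, :]) / 5.0)

# H maps fine support to two coarse-support (block-average) measurements,
# encoding the change of support explicitly
H = np.zeros((2, n))
H[0, :10] = 1 / 10.0
H[1, 10:] = 1 / 10.0
R = 1e-6 * np.eye(2)          # tiny measurement-error covariance

z = np.array([2.0, 5.0])      # the two coarse observations
mu = np.zeros(n)              # prior mean on the fine support

# best linear unbiased prediction of the fine support given the blocks
K = Q @ H.T @ np.linalg.inv(H @ Q @ H.T + R)
x_hat = mu + K @ (z - H @ mu)
V = Q - K @ H @ Q             # prediction (posterior) covariance

# re-aggregating the downscaled field recovers the measurements
recon = H @ x_hat
```

    The paper's local-window variant applies this same linear algebra within moving windows, which is what keeps the matrices small and accommodates spatial nonstationarity.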

  16. How the 2SLS/IV estimator can handle equality constraints in structural equation models: a system-of-equations approach.

    Science.gov (United States)

    Nestler, Steffen

    2014-05-01

    Parameters in structural equation models are typically estimated using the maximum likelihood (ML) approach. Bollen (1996) proposed an alternative non-iterative, equation-by-equation estimator that uses instrumental variables. Although this two-stage least squares/instrumental variables (2SLS/IV) estimator has good statistical properties, one problem with its application is that parameter equality constraints cannot be imposed. This paper presents a mathematical solution to this problem that is based on an extension of the 2SLS/IV approach to a system of equations. We present an example in which our approach was used to examine strong longitudinal measurement invariance. We also investigated the new approach in a simulation study that compared it with ML in the examination of the equality of two latent regression coefficients and strong measurement invariance. Overall, the results show that the suggested approach is a useful extension of the original 2SLS/IV estimator and allows for the effective handling of equality constraints in structural equation models. © 2013 The British Psychological Society.
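
The core 2SLS/IV logic that such an estimator builds on can be sketched in a few lines. This is a hypothetical single-equation example with simulated data, not the paper's SEM setting: an instrument z that affects the regressor x but has no direct effect on the outcome y removes the bias that OLS suffers from an unobserved confounder.

```python
import numpy as np

# Hypothetical illustration of 2SLS with one endogenous regressor x and
# one instrument z: regress x on z (first stage), then y on fitted x.
rng = np.random.default_rng(1)
n = 100_000
z = rng.normal(size=n)                  # instrument: affects x, not y directly
u = rng.normal(size=n)                  # unobserved confounder
x = 1.0 * z + u + rng.normal(size=n)    # endogenous regressor
y = 2.0 * x + u + rng.normal(size=n)    # true coefficient is 2.0

beta_ols = (x @ y) / (x @ x)            # biased upward by Cov(x, u)
x_hat = z * ((z @ x) / (z @ z))         # first stage: project x on z
beta_2sls = (x_hat @ y) / (x_hat @ x)   # second stage

print(round(beta_ols, 2))               # ≈ 2.33, biased
print(round(beta_2sls, 2))              # ≈ 2.00, consistent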

  17. A critical appraisal of instruments to measure outcomes of interprofessional education.

    Science.gov (United States)

    Oates, Matthew; Davidson, Megan

    2015-04-01

    Interprofessional education (IPE) is believed to prepare health professional graduates for successful collaborative practice. A range of instruments have been developed to measure the outcomes of IPE. An understanding of the psychometric properties of these instruments is important if they are to be used to measure the effectiveness of IPE. This review set out to identify instruments available to measure outcomes of IPE and collaborative practice in pre-qualification health professional students and to critically appraise the psychometric properties of validity, responsiveness and reliability against contemporary standards for instrument design. Instruments were selected from a pool of extant instruments and subjected to critical appraisal to determine whether they satisfied inclusion criteria. The qualitative and psychometric attributes of the included instruments were appraised using a checklist developed for this review. Nine instruments were critically appraised, including the widely adopted Readiness for Interprofessional Learning Scale (RIPLS) and the Interdisciplinary Education Perception Scale (IEPS). Validity evidence for instruments was predominantly based on test content and internal structure. Ceiling effects and lack of scale width contribute to the inability of some instruments to detect change in variables of interest. Limited reliability data were reported for two instruments. Scale development and scoring protocols were generally reported by instrument developers, but the inconsistent application of scoring protocols for some instruments was apparent. A number of instruments have been developed to measure outcomes of IPE in pre-qualification health professional students. Based on reported validity evidence and reliability data, the psychometric integrity of these instruments is limited. The theoretical test construction paradigm on which instruments have been developed may be contributing to the failure of some instruments to detect change in variables of interest.

  18. Novel Harmonic Regularization Approach for Variable Selection in Cox’s Proportional Hazards Model

    Directory of Open Access Journals (Sweden)

    Ge-Jin Chu

    2014-01-01

    Variable selection is an important issue in regression, and a number of variable selection methods have been proposed involving nonconvex penalty functions. In this paper, we investigate a novel harmonic regularization method, which can approximate nonconvex Lq (1/2 < q < 1) regularization and can produce solutions that closely approximate those for the convex loss function and the nonconvex regularization. Simulation results based on artificial datasets and four real microarray gene expression datasets, such as the diffuse large B-cell lymphoma (DLBCL), lung cancer, and AML datasets, show that the harmonic regularization method can be more accurate for variable selection than existing Lasso series methods.

  19. We Know the Yin-But Where Is the Yang? Toward a Balanced Approach on Common Source Bias in Public Administration Scholarship.

    Science.gov (United States)

    George, Bert; Pandey, Sanjay K

    2017-06-01

    Surveys have long been a dominant instrument for data collection in public administration. However, it has become widely accepted in the last decade that the usage of a self-reported instrument to measure both the independent and dependent variables results in common source bias (CSB). In turn, CSB is argued to inflate correlations between variables, resulting in biased findings. Subsequently, a narrow blinkered approach on the usage of surveys as single data source has emerged. In this article, we argue that this approach has resulted in an unbalanced perspective on CSB. We argue that claims on CSB are exaggerated, draw upon selective evidence, and project what should be tentative inferences as certainty over large domains of inquiry. We also discuss the perceptual nature of some variables and measurement validity concerns in using archival data. In conclusion, we present a flowchart that public administration scholars can use to analyze CSB concerns.

  20. Bayesian Population Physiologically-Based Pharmacokinetic (PBPK) Approach for a Physiologically Realistic Characterization of Interindividual Variability in Clinically Relevant Populations.

    Directory of Open Access Journals (Sweden)

    Markus Krauss

    Interindividual variability in anatomical and physiological properties results in significant differences in drug pharmacokinetics. The consideration of such pharmacokinetic variability supports optimal drug efficacy and safety for each single individual, e.g. by identification of individual-specific dosings. One clear objective in clinical drug development is therefore a thorough characterization of the physiological sources of interindividual variability. In this work, we present a Bayesian population physiologically-based pharmacokinetic (PBPK) approach for the mechanistically and physiologically realistic identification of interindividual variability. The consideration of a generic and highly detailed mechanistic PBPK model structure enables the integration of large amounts of prior physiological knowledge, which is then updated with new experimental data in a Bayesian framework. A covariate model integrates known relationships of physiological parameters to age, gender and body height. We further provide a framework for estimation of the a posteriori parameter dependency structure at the population level. The approach is demonstrated considering a cohort of healthy individuals and theophylline as an application example. The variability and co-variability of physiological parameters are specified within the population. Significant correlations are identified between population parameters and are applied for individual- and population-specific visual predictive checks of the pharmacokinetic behavior, which leads to improved results compared to present population approaches. In the future, the integration of a generic PBPK model into a hierarchical approach allows for extrapolations to other populations or drugs, while the Bayesian paradigm allows for an iterative application of the approach and thereby a continuous updating of physiological knowledge with new data. This will facilitate decision making, e.g. from preclinical to clinical development.

  1. Effect of geotropism on instrument readings

    International Nuclear Information System (INIS)

    Rolph, James T.

    2006-01-01

    A review of gravity's effect on instrument readings, also referred to as geotropism. This essay reviews meter movement construction and the geotropic effect as it applies to portable radiation instruments. The three relevant ANSI standards and their requirements are also reviewed. An alternative approach to testing for these effects is offered.

  2. Instrumentation for the follow-up of severe accidents

    International Nuclear Information System (INIS)

    Munoz Sanchez, A.; Nino Perote, R.

    2000-01-01

    During severe accidents, it is foreseeable that the instrumentation installed in a plant is subjected to conditions which are more hostile than those for which the instrumentation was designed and qualified. Moreover, new, specific instrumentation is required to monitor variables which have not been considered until now, and to control systems which lessen the consequences of severe accidents. Both existing instrumentation used to monitor critical functions in design basis accident conditions and additional instrumentation which provides the information necessary to control and mitigate the consequences of severe accidents, have to be designed to withstand such conditions, especially in terms of measurements range, functional characteristics and qualification to withstand pressure and temperature loads resulting from steam explosion, hydrogen combustion/explosion and high levels of radiation over long periods of time. (Author)

  3. Systems approach to the design of the CCD sensors and camera electronics for the AIA and HMI instruments on solar dynamics observatory

    Science.gov (United States)

    Waltham, N.; Beardsley, S.; Clapp, M.; Lang, J.; Jerram, P.; Pool, P.; Auker, G.; Morris, D.; Duncan, D.

    2017-11-01

    Solar Dynamics Observatory (SDO) is imaging the Sun in many wavelengths near simultaneously and with a resolution ten times higher than the average high-definition television. In this paper we describe our innovative systems approach to the design of the CCD cameras for two of SDO's remote sensing instruments, the Atmospheric Imaging Assembly (AIA) and the Helioseismic and Magnetic Imager (HMI). Both instruments share use of a custom-designed 16 million pixel science-grade CCD and common camera readout electronics. A prime requirement was for the CCD to operate with significantly lower drive voltages than before, motivated by our wish to simplify the design of the camera readout electronics. Here, the challenge lies in the design of circuitry to drive the CCD's highly capacitive electrodes and to digitize its analogue video output signal with low noise and to high precision. The challenge is greatly exacerbated when forced to work with only fully space-qualified, radiation-tolerant components. We describe our systems approach to the design of the AIA and HMI CCD and camera electronics, and the engineering solutions that enabled us to comply with both mission and instrument science requirements.

  4. Surveillance of instruments by noise analysis

    International Nuclear Information System (INIS)

    Thie, J.A.

    1981-01-01

    Random fluctuations of neutron flux, temperature, and pressure in a reactor provide multifrequency excitation of the corresponding instrumentation chains. Mathematical descriptors suitable for characterizing the output, or noise, of the instrumentation are reviewed with a view toward using such noise in detecting instrument faults. Demonstrations of the feasibility of this approach in a number of reactors provide illustrative examples. Comparisons with traditional surveillance testing are made, and a number of advantages and some disadvantages of using noise analysis as a supplementary technique are pointed out

  5. Minimally invasive instrumentation without fusion during posterior thoracic corpectomies: a comparison of percutaneously instrumented nonfused segments with open instrumented fused segments.

    Science.gov (United States)

    Lau, Darryl; Chou, Dean

    2017-07-01

    OBJECTIVE During the mini-open posterior corpectomy, percutaneous instrumentation without fusion is performed above and below the corpectomy level. In this study, the authors' goal was to compare the perioperative and long-term implant failure rates of patients who underwent nonfused percutaneous instrumentation with those of patients who underwent traditional open instrumented fusion. METHODS Adult patients who underwent posterior thoracic corpectomies with cage reconstruction between 2009 and 2014 were identified. Patients who underwent mini-open corpectomy had percutaneous instrumentation without fusion, and patients who underwent open corpectomy had instrumented fusion above and below the corpectomy site. The authors compared perioperative outcomes and rates of implant failure requiring reoperation between the open (fused) and mini-open (unfused) groups. RESULTS A total of 75 patients were identified, and 53 patients (32 open and 21 mini-open) were available for follow-up. The mean patient age was 52.8 years, and 56.6% of patients were male. There were no significant differences in baseline variables between the 2 groups. The overall perioperative complication rate was 15.1%, and there was no significant difference between the open and mini-open groups (18.8% vs 9.5%; p = 0.359). The mean hospital stay was 10.5 days. The open group required a significantly longer stay than the mini-open group (12.8 vs 7.1 days). Rates of implant failure requiring reoperation were similar between the open and mini-open groups at 6 months (3.1% vs 0.0%, p = 0.413), 1 year (10.7% vs 6.2%, p = 0.620), and 2 years (18.2% vs 8.3%, p = 0.438). The overall mean follow-up was 29.2 months. CONCLUSIONS These findings suggest that percutaneous instrumentation without fusion in mini-open transpedicular corpectomies offers implant failure and reoperation rates similar to those of open instrumented fusion as far out as 2 years of follow-up.

  6. Importance of Intrinsic and Instrumental Value of Education in Pakistan

    Science.gov (United States)

    Kumar, Mahendar

    2017-01-01

    Normally, the effectiveness of any object or thing is judged by two values: intrinsic and instrumental. To compare the intrinsic value of education with its instrumental value, this study used the following variables: getting knowledge for its own sake, getting knowledge for social status, getting knowledge for job or business endeavor and getting…

  7. Incorporating Latent Variables into Discrete Choice Models - A Simultaneous Estimation Approach Using SEM Software

    Directory of Open Access Journals (Sweden)

    Dirk Temme

    2008-12-01

    Integrated choice and latent variable (ICLV) models represent a promising new class of models which merge classic choice models with the structural equation approach (SEM) for latent variables. Despite their conceptual appeal, applications of ICLV models in marketing remain rare. We extend previous ICLV applications by first estimating a multinomial choice model and, second, by estimating hierarchical relations between latent variables. An empirical study on travel mode choice clearly demonstrates the value of ICLV models to enhance the understanding of choice processes. In addition to the usually studied directly observable variables such as travel time, we show how abstract motivations such as power and hedonism as well as attitudes such as a desire for flexibility impact on travel mode choice. Furthermore, we show that it is possible to estimate such a complex ICLV model with the widely available structural equation modeling package Mplus. This finding is likely to encourage more widespread application of this appealing model class in the marketing field.

  8. Sensitivity analysis on uncertainty variables affecting the NPP's LUEC with probabilistic approach

    International Nuclear Information System (INIS)

    Nuryanti; Akhmad Hidayatno; Erlinda Muslim

    2013-01-01

    One thing that is quite crucial to review prior to any investment decision on a nuclear power plant (NPP) project is the project economics, including the calculation of the Levelized Unit Electricity Cost (LUEC). Infrastructure projects such as NPPs are vulnerable to a number of uncertainty variables. Information on the uncertainty variables to which LUEC is most sensitive is necessary so that cost overruns can be avoided. This study therefore performs a sensitivity analysis on the variables that affect LUEC with a probabilistic approach. The analysis uses the Monte Carlo technique to simulate the relationship between the uncertainty variables and their impact on LUEC. The results show significant changes in the LUEC of the AP1000 and OPR due to the sensitivity of investment cost and capacity factor, while the changes in LUEC due to the sensitivity of the U3O8 price are not significant. (author)
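
The Monte Carlo sensitivity idea can be sketched as follows. All input ranges and the capital recovery factor below are illustrative assumptions, not the study's data: sample the uncertain inputs, compute LUEC for each draw, and rank inputs by their correlation with the result.

```python
import numpy as np

# Hedged sketch of Monte Carlo sensitivity analysis on LUEC
# (illustrative input ranges, not the study's data).
rng = np.random.default_rng(42)
n = 50_000

capex = rng.uniform(4000, 7000, n)   # overnight cost, $/kW (assumed)
cf = rng.uniform(0.75, 0.95, n)      # capacity factor (assumed)
fuel = rng.uniform(5, 9, n)          # fuel cost incl. U3O8, $/MWh (assumed)

crf = 0.08                           # capital recovery factor, 1/yr (assumed)
# $/kW/yr divided by full-load hours gives $/kWh; x1000 gives $/MWh
luec = capex * crf * 1000 / (cf * 8760) + fuel

# Rank-order sensitivity: |correlation| of each input with LUEC
for name, v in [("capex", capex), ("capacity factor", cf), ("fuel", fuel)]:
    print(name, round(abs(np.corrcoef(v, luec)[0, 1]), 2))
```

With these assumed ranges, investment cost and capacity factor dominate the spread in LUEC while the fuel price contributes little, mirroring the qualitative pattern reported in the abstract.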

  9. RECOVERY OF LARGE ANGULAR SCALE CMB POLARIZATION FOR INSTRUMENTS EMPLOYING VARIABLE-DELAY POLARIZATION MODULATORS

    Energy Technology Data Exchange (ETDEWEB)

    Miller, N. J.; Marriage, T. A.; Appel, J. W.; Bennett, C. L.; Eimer, J.; Essinger-Hileman, T.; Harrington, K.; Rostem, K.; Watts, D. J. [Department of Physics and Astronomy, Johns Hopkins University, 3400 N. Charles St., Baltimore, MD 21218 (United States); Chuss, D. T. [Department of Physics, Villanova University, 800 E Lancaster, Villanova, PA 19085 (United States); Wollack, E. J.; Fixsen, D. J.; Moseley, S. H.; Switzer, E. R., E-mail: Nathan.J.Miller@nasa.gov [Observational Cosmology Laboratory, Code 665, NASA Goddard Space Flight Center, Greenbelt, MD 20771 (United States)

    2016-02-20

    Variable-delay Polarization Modulators (VPMs) are currently being implemented in experiments designed to measure the polarization of the cosmic microwave background on large angular scales because of their capability for providing rapid, front-end polarization modulation and control over systematic errors. Despite the advantages provided by the VPM, it is important to identify and mitigate any time-varying effects that leak into the synchronously modulated component of the signal. In this paper, the effect of emission from a 300 K VPM on the system performance is considered and addressed. Though instrument design can greatly reduce the influence of modulated VPM emission, some residual modulated signal is expected. VPM emission is treated in the presence of rotational misalignments and temperature variation. Simulations of time-ordered data are used to evaluate the effect of these residual errors on the power spectrum. The analysis and modeling in this paper guides experimentalists on the critical aspects of observations using VPMs as front-end modulators. By implementing the characterizations and controls as described, front-end VPM modulation can be very powerful for mitigating 1/f noise in large angular scale polarimetric surveys. None of the systematic errors studied fundamentally limit the detection and characterization of B-modes on large scales for a tensor-to-scalar ratio of r = 0.01. Indeed, r < 0.01 is achievable with commensurately improved characterizations and controls.

  10. Probabilistic approaches to accounting for data variability in the practical application of bioavailability in predicting aquatic risks from metals.

    Science.gov (United States)

    Ciffroy, Philippe; Charlatchka, Rayna; Ferreira, Daniel; Marang, Laura

    2013-07-01

    The biotic ligand model (BLM) theoretically enables the derivation of environmental quality standards that are based on true bioavailable fractions of metals. Several physicochemical variables (especially pH, major cations, dissolved organic carbon, and dissolved metal concentrations) must, however, be assigned to run the BLM, but they are highly variable in time and space in natural systems. This article describes probabilistic approaches for integrating such variability during the derivation of risk indexes. To describe each variable using a probability density function (PDF), several methods were combined to 1) treat censored data (i.e., data below the limit of detection), 2) incorporate the uncertainty of the solid-to-liquid partitioning of metals, and 3) detect outliers. From a probabilistic perspective, 2 alternative approaches that are based on log-normal and Γ distributions were tested to estimate the probability of the predicted environmental concentration (PEC) exceeding the predicted non-effect concentration (PNEC), i.e., p(PEC/PNEC>1). The probabilistic approach was tested on 4 real-case studies based on Cu-related data collected from stations on the Loire and Moselle rivers. The approach described in this article is based on BLM tools that are freely available for end-users (i.e., the Bio-Met software) and on accessible statistical data treatments. This approach could be used by stakeholders who are involved in risk assessments of metals for improving site-specific studies. Copyright © 2013 SETAC.
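
The exceedance-probability calculation can be sketched as follows. The lognormal parameters are illustrative assumptions, not the Loire/Moselle data: draw PEC and PNEC from fitted distributions and estimate p(PEC/PNEC > 1) by simple Monte Carlo.

```python
import numpy as np

# Illustrative sketch of the probabilistic risk index (assumed lognormal
# parameters, not the case-study data): estimate p(PEC/PNEC > 1).
rng = np.random.default_rng(7)
n = 200_000

# Lognormal parameters are the mean/sd of log concentration (assumed)
pec = rng.lognormal(mean=np.log(2.0), sigma=0.6, size=n)    # µg/L Cu
pnec = rng.lognormal(mean=np.log(5.0), sigma=0.4, size=n)   # µg/L Cu

p_exceed = np.mean(pec / pnec > 1.0)
print(round(p_exceed, 3))   # probability that exposure exceeds the threshold
```

Because both draws are lognormal, the same probability is available in closed form from the normal distribution of log(PEC) - log(PNEC), which is a useful cross-check on the simulation.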

  11. Laser-induced breakdown spectroscopy (LIBS), part II: review of instrumental and methodological approaches to material analysis and applications to different fields.

    Science.gov (United States)

    Hahn, David W; Omenetto, Nicoló

    2012-04-01

    The first part of this two-part review focused on the fundamental and diagnostics aspects of laser-induced plasmas, only touching briefly upon concepts such as sensitivity and detection limits and largely omitting any discussion of the vast panorama of the practical applications of the technique. Clearly a true LIBS community has emerged, which promises to quicken the pace of LIBS developments, applications, and implementations. With this second part, a more applied flavor is taken, and its intended goal is summarizing the current state-of-the-art of analytical LIBS, providing a contemporary snapshot of LIBS applications, and highlighting new directions in laser-induced breakdown spectroscopy, such as novel approaches, instrumental developments, and advanced use of chemometric tools. More specifically, we discuss instrumental and analytical approaches (e.g., double- and multi-pulse LIBS to improve the sensitivity), calibration-free approaches, hyphenated approaches in which techniques such as Raman and fluorescence are coupled with LIBS to increase sensitivity and information power, resonantly enhanced LIBS approaches, signal processing and optimization (e.g., signal-to-noise analysis), and finally applications. An attempt is made to provide an updated view of the role played by LIBS in the various fields, with emphasis on applications considered to be unique. We finally try to assess where LIBS is going as an analytical field, where in our opinion it should go, and what should still be done for consolidating the technique as a mature method of chemical analysis. © 2012 Society for Applied Spectroscopy

  12. Risk implications of renewable support instruments: Comparative analysis of feed-in tariffs and premiums using a mean–variance approach

    International Nuclear Information System (INIS)

    Kitzing, Lena

    2014-01-01

    Different support instruments for renewable energy expose investors differently to market risks. This has implications on the attractiveness of investment. We use mean–variance portfolio analysis to identify the risk implications of two support instruments: feed-in tariffs and feed-in premiums. Using cash flow analysis, Monte Carlo simulations and mean–variance analysis, we quantify risk-return relationships for an exemplary offshore wind park in a simplified setting. We show that feed-in tariffs systematically require lower direct support levels than feed-in premiums while providing the same attractiveness for investment, because they expose investors to less market risk. These risk implications should be considered when designing policy schemes. - Highlights: • Mean–variance portfolio approach to analyse risk implications of policy instruments. • We show that feed-in tariffs require lower support levels than feed-in premiums. • This systematic effect stems from the lower exposure of investors to market risk. • We created a stochastic model for an exemplary offshore wind park in West Denmark. • We quantify risk-return, Sharpe Ratios and differences in required support levels
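
The risk asymmetry between the two instruments can be sketched with a stylized simulation. Prices and support levels here are illustrative assumptions, not the paper's Danish offshore case: a feed-in tariff fixes the revenue per MWh, while a feed-in premium adds a constant on top of a volatile spot price, so the two can have the same mean revenue but very different variance.

```python
import numpy as np

# Stylized revenue-per-MWh comparison (assumed numbers): fixed feed-in
# tariff vs. fixed premium added to a volatile market price.
rng = np.random.default_rng(3)
n = 100_000

market = rng.normal(50.0, 15.0, n)   # spot price, EUR/MWh (assumed)
tariff = np.full(n, 70.0)            # fixed feed-in tariff (assumed)
premium = market + 20.0              # feed-in premium on top of spot

print(round(tariff.mean(), 1), round(premium.mean(), 1))   # same mean
print(round(tariff.std(), 1), round(premium.std(), 1))     # 0.0 vs market risk
```

Under mean-variance preferences, a risk-averse investor accepts a lower certainty-equivalent revenue when the variance is zero, which is the mechanism behind the paper's finding that tariffs require lower direct support than premiums.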

  14. Patient perspective: choosing or developing instruments.

    Science.gov (United States)

    Kirwan, John R; Fries, James F; Hewlett, Sarah; Osborne, Richard H

    2011-08-01

    Previous Outcome Measures in Rheumatology (OMERACT) meetings recognized that patients view outcomes of intervention from a different perspective. This preconference position paper briefly sets out 2 patient-reported outcome (PRO) instrument approaches, the PROMIS computer adaptive testing (CAT) system and the development of a rheumatoid arthritis-specific questionnaire to measure fatigue; a tentative proposal for a PRO instrument development pathway is also made.

  15. Exploratory and Creative Properties of Physical-Modeling-based Musical Instruments

    DEFF Research Database (Denmark)

    Gelineck, Steven

    Digital musical instruments are developed to enable musicians to find new ways of expressing themselves. The development and evaluation of these instruments can be approached from many different perspectives depending on which capabilities one wants the musicians to have. This thesis attempts to approach development and evaluation of these instruments with the notion that instruments today are able to facilitate the creative process that is so crucial for creating music. The fundamental question pursued throughout the thesis is how creative work processes of composers of electronic music can be supported and even challenged by the instruments they use. What is it that makes one musical instrument more creatively inspiring than another, and how do we evaluate how well it succeeds? In order to present answers to these questions, the thesis focusses on the sound synthesis technique of physical modeling.

  16. Remote and Virtual Instrumentation Platform for Distance Learning

    Directory of Open Access Journals (Sweden)

    Tom Eppes

    2010-08-01

    This article presents distance learning using the National Instruments ELVIS II and shows how Multisim can be combined with ELVIS II for distance learning. National Instruments' ELVIS II is a new version that can easily be used for e-learning. It features 12 of the instruments commonly used in engineering and science laboratories, including an oscilloscope, a function generator, a variable power supply, and an isolated digital multimeter, in a low-cost and easy-to-use platform, and it offers complete integration with Multisim software for SPICE simulation, which simplifies the teaching of circuit design. As NI ELVIS II is based on LabVIEW, designers can easily customize the 12 instruments or create their own using the provided source code for the instruments.

  17. Flexible software process lines in practice: A metamodel-based approach to effectively construct and manage families of software process models

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Ternité, Thomas; Friedrich, Jan

    2017-01-01

    Process flexibility and adaptability is frequently discussed, and several proposals aim to improve software processes for a given organization-/project context. A software process line (SPrL) is an instrument to systematically construct and manage variable software processes by combining pre-defined process assets. We show how to construct flexible SPrLs and demonstrate the practical application of the approach in the German V-Modell XT. We contribute a proven approach that is presented as a metamodel fragment for reuse and implementation in further process modeling approaches. This summary refers to the paper "Flexible software process lines in practice: A metamodel-based approach to effectively construct and manage families of software process models" [Ku16], published as an original research article in the Journal of Systems and Software.

  18. An analytical approach to separate climate and human contributions to basin streamflow variability

    Science.gov (United States)

    Li, Changbin; Wang, Liuming; Wanrui, Wang; Qi, Jiaguo; Linshan, Yang; Zhang, Yuan; Lei, Wu; Cui, Xia; Wang, Peng

    2018-04-01

    Climate variability and anthropogenic regulation are two interwoven factors in the ecohydrologic system across large basins. Understanding the roles these two factors play under various hydrologic conditions is of great significance for basin hydrology and sustainable water utilization. In this study, we present an analytical approach, based on coupling the water balance method with the Budyko hypothesis, to derive effectiveness coefficients (ECs) of climate change as a way to disentangle the contributions of climate and human activities to the variability of river discharge under different hydro-transitional situations. The climate-dominated streamflow change (ΔQc) from the EC approach was compared with estimates from the elasticity method and a sensitivity index. The results suggest that the EC approach is valid and applicable for hydrologic studies at the large basin scale. Analyses of various scenarios revealed that the contributions of climate change and human activities to river discharge variation differed among the regions of the study area. Over the past several decades, climate change dominated hydro-transitions from dry to wet, while human activities played the key role in the reduction of streamflow during wet-to-dry periods. The remarkable decline of discharge upstream was mainly due to human interventions, although climate contributed more to increasing runoff during dry periods in the semi-arid downstream. The estimated effectiveness on streamflow changes indicated a contribution ratio of 49% for climate and 51% for human activities at the basin scale from 1956 to 2015. This simple, derivation-based approach, together with the case example of temporal segmentation and spatial zoning, could help people understand the variation of river discharge in more detail at the large basin scale against the background of climate change and human regulation.
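
The attribution step behind such analyses can be sketched as follows, using Fu's form of the Budyko curve as one common parameterization; all numbers are illustrative, not the study's basin data.

```python
# Hedged sketch of climate/human attribution of streamflow change:
# the climate-driven change is predicted from precipitation (P) and
# potential evapotranspiration (PET) alone via a Budyko-type curve;
# the residual change is attributed to human activities.

def budyko_q(p, pet, w=2.6):
    """Mean annual runoff Q = P - E, with E from Fu's equation:
    E/P = 1 + PET/P - (1 + (PET/P)**w)**(1/w). w is a catchment parameter."""
    e = p * (1 + pet / p - (1 + (pet / p) ** w) ** (1 / w))
    return p - e

# Two periods (mm/yr, assumed): observed runoff falls from 85 to 60
p1, pet1, q1_obs = 500.0, 800.0, 85.0
p2, pet2, q2_obs = 480.0, 840.0, 60.0

dq_total = q2_obs - q1_obs                              # -25 mm/yr
dq_climate = budyko_q(p2, pet2) - budyko_q(p1, pet1)    # climate share
dq_human = dq_total - dq_climate                        # residual: human share

print(round(dq_climate, 1), round(dq_human, 1))
```

In this assumed example the drier, warmer second period explains roughly half the observed decline, and the residual half is attributed to human regulation, which is the same decomposition logic as the contribution ratios reported in the abstract.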

  19. Knowledge sharing behaviour and demographic variables amongst secondary school teachers in and around Gaborone, Botswana

    Directory of Open Access Journals (Sweden)

    Isaac C. Mogotsi

    2011-11-01

    The purpose of this study was to investigate the relationships between knowledge sharing behaviour and the demographic variables gender, age, organisational tenure and professional tenure. Following a correlational survey approach, the study sourced its data from senior secondary school teachers in and around Gaborone, Botswana. Knowledge sharing behaviour was measured using an instrument sourced from the extant literature. No statistically significant relationship was detected between knowledge sharing behaviour and gender, age, or professional tenure. Only organisational tenure weakly negatively correlated with knowledge sharing behaviour. Thus, according to these findings, demographic variables do not appear to be important determinants of knowledge sharing behaviour.

  20. Instrumental Approach and Diagnosis of Total Inorganics in a ...

    African Journals Online (AJOL)

    Typical carbonaceous matter is often widespread and reveals a relatively wide range of dominant organic types. Instrumental diagnosis, subjecting such matter to oxidation, combustion and/or incineration, remains to date a mandatory fundamental requirement in the further pursuit of its mineralogical and wet chemical ...

  1. Age-Related Changes in Bimanual Instrument Playing with Rhythmic Cueing

    Directory of Open Access Journals (Sweden)

    Soo Ji Kim

    2017-09-01

    Deficits in the bimanual coordination of older adults have been demonstrated to significantly limit their functioning in daily life. As a bimanual sensorimotor task, instrument playing has great potential for motor and cognitive training in advanced age. While the process of matching a person’s repetitive movements to auditory rhythmic cueing during instrument playing has been documented to involve motor and attentional control, investigation into whether the level of cognitive functioning influences the ability to rhythmically coordinate movement to an external beat in older populations is relatively limited. Therefore, the current study aimed to examine how timing accuracy during bimanual instrument playing with rhythmic cueing differed depending on the degree of participants’ cognitive aging. Twenty-one young adults, 20 healthy older adults, and 17 older adults with mild dementia participated in this study. Each participant tapped an electronic drum in time to rhythmic cueing, using both hands simultaneously and in alternation. During bimanual instrument playing with rhythmic cueing, the mean and variability of synchronization errors were measured and compared across the groups and across the tempi of cueing for each type of tapping task. Correlations of these timing parameters with cognitive measures were also analyzed. The results showed significant group differences in the parameters related to synchronization errors. During bimanual tapping tasks, cognitive decline resulted in differences in synchronization errors between younger adults and older adults with mild dementia. Also, in terms of variability of synchronization errors, younger adults differed significantly in maintaining timing performance from older adults with and without mild dementia, which may be attributed to decreased processing time for bimanual coordination due to aging. Significant correlations were observed between variability of

  2. An automated calibration laboratory for flight research instrumentation: Requirements and a proposed design approach

    Science.gov (United States)

    Oneill-Rood, Nora; Glover, Richard D.

    1990-01-01

    NASA's Dryden Flight Research Facility (Ames-Dryden) operates a diverse fleet of research aircraft which are heavily instrumented to provide both real-time data for in-flight monitoring and recorded data for postflight analysis. Ames-Dryden's existing automated calibration (AUTOCAL) laboratory is a computerized facility which tests aircraft sensors to certify accuracy for anticipated harsh flight environments. Recently, a major AUTOCAL lab upgrade was initiated; the goal of this modernization is to enhance productivity and improve configuration management for both software and test data. The new system will have multiple testing stations employing distributed processing linked by a local area network to a centralized database. The baseline requirements for the new AUTOCAL lab and the design approach being taken for its mechanization are described.

  3. Cognitive Preconditions of Early Reading and Spelling: A Latent-Variable Approach with Longitudinal Data

    Science.gov (United States)

    Preßler, Anna-Lena; Könen, Tanja; Hasselhorn, Marcus; Krajewski, Kristin

    2014-01-01

    The aim of the present study was to empirically disentangle the interdependencies of the impact of nonverbal intelligence, working memory capacities, and phonological processing skills on early reading decoding and spelling within a latent variable approach. In a sample of 127 children, these cognitive preconditions were assessed before the onset…

  4. Determinants of The Application of Macro Prudential Instruments

    Directory of Open Access Journals (Sweden)

    Zakaria Firano

    2017-09-01

    The use of macro prudential instruments today gives rise to a major debate within central banks and other authorities in charge of financial stability. Contrary to micro prudential instruments, whose effects remain limited, macro prudential instruments are different in nature and can affect the stability of the financial system. By influencing the financial cycle and the financial structure of financial institutions, the use of such instruments should be conducted with great vigilance as well as macroeconomic and financial expertise. But the experiences of central banks in this area are sketchy, and only some emerging countries have experience using these types of instruments in different ways. This paper presents an analysis of instruments of macro prudential policy and attempts to demonstrate empirically that these instruments should be used only in specific economic and financial situations. Indeed, the results obtained, using bivariate panel modeling, confirm that these instruments are more effective when used to mitigate the euphoria of financial and economic cycles. In this sense, the output gap, describing the economic cycle, and the Z-score are the intermediate variables for the activation of capital instruments. Moreover, the liquidity ratio and changes in bank profitability are the two early-warning indicators for the activation of liquidity instruments.

  5. Fair Value Accounting for Financial Instruments – Conceptual Approach and Implications

    OpenAIRE

    Dumitru MATIS; Carmen Giorgiana BONACI

    2008-01-01

    This study complements the growing literature on the value relevance of fair value by examining the validity of the hypothesis that fair value is more informative than historical cost as a financial reporting standard for financial instruments. We therefore compare the relative explanatory power of fair value and historical cost in explaining equity values. In order to reflect fair value’s role in offering a fair view where financial instruments are concerned, we briefly reviewed capital mar...

  6. The Propagation of Movement Variability in Time: A Methodological Approach for Discrete Movements with Multiple Degrees of Freedom

    Science.gov (United States)

    Krüger, Melanie; Straube, Andreas; Eggert, Thomas

    2017-01-01

    In recent years, theory-building in motor neuroscience and our understanding of the synergistic control of the redundant human motor system have profited significantly from the emergence of a range of mathematical approaches for analyzing the structure of movement variability. Approaches such as the Uncontrolled Manifold method or the Noise-Tolerance-Covariance decomposition make it possible to detect and interpret changes in movement coordination due to, e.g., learning, external task constraints, or disease by analyzing the structure of within-subject, inter-trial movement variability. Whereas mathematical approaches exist to investigate the propagation of movement variability in time for cyclical movements (e.g., locomotion, via time series analysis), similar approaches are missing for discrete, goal-directed movements such as reaching. Here, we propose canonical correlation analysis as a suitable method for analyzing the propagation of within-subject variability across different time points during the execution of discrete movements. While similar analyses have already been applied to discrete movements with only one degree of freedom (DoF; e.g., Pearson's product-moment correlation), canonical correlation analysis makes it possible to evaluate the coupling of inter-trial variability across different time points along the movement trajectory for multiple-DoF effector systems, such as the arm. The theoretical analysis is illustrated by empirical data from a study on reaching movements under normal and disturbed proprioception. The results show increased movement duration, decreased movement amplitude, and altered movement coordination under ischemia, which results in a reduced complexity of movement control. Movement endpoint variability is not increased under ischemia. This suggests that healthy adults are able to immediately and efficiently adjust the control of complex reaching movements to compensate for the loss of proprioceptive information.
Further, it is …
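
    The proposed analysis can be illustrated with a small synthetic example: canonical correlations between the joint configurations of a hypothetical 3-DoF effector at two time points, where part of the inter-trial variability is shared (propagated) between them. The data, dimensions, and mixing matrices below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(7)

# 200 trials; two latent sources of variability propagate between time points.
n = 200
shared = rng.normal(size=(n, 2))
A = np.array([[1.0, 0.5, 0.0], [0.0, 1.0, -0.5]])   # mixing at time t1
B = np.array([[0.8, 0.0, 0.6], [0.3, -0.9, 0.4]])   # mixing at time t2
X = shared @ A + 0.3 * rng.normal(size=(n, 3))      # joint config at t1
Y = shared @ B + 0.3 * rng.normal(size=(n, 3))      # joint config at t2

def canonical_correlations(X, Y, eps=1e-9):
    """Canonical correlations between two multivariate samples."""
    Xc, Yc = X - X.mean(0), Y - Y.mean(0)
    m = len(X)
    Cxx = Xc.T @ Xc / (m - 1)
    Cyy = Yc.T @ Yc / (m - 1)
    Cxy = Xc.T @ Yc / (m - 1)
    # Whiten each block, then take singular values of the cross-covariance.
    Wx = np.linalg.inv(np.linalg.cholesky(Cxx + eps * np.eye(X.shape[1])))
    Wy = np.linalg.inv(np.linalg.cholesky(Cyy + eps * np.eye(Y.shape[1])))
    return np.linalg.svd(Wx @ Cxy @ Wy.T, compute_uv=False)

rho = canonical_correlations(X, Y)
print("canonical correlations:", rho.round(2))  # two large, one near zero
```

    Two canonical correlations come out large (the two shared sources) and the third near zero, which is the kind of coupling pattern across time points the method is meant to expose.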

  7. Technical Training seminar: Texas Instruments

    CERN Multimedia

    2006-01-01

    Monday 6 November, TECHNICAL TRAINING SEMINAR, 14:00 to 17:30 - Training Centre Auditorium (bldg. 593). Texas Instruments Technical Seminar. Michael Scholtholt, Field Application Engineer / TEXAS INSTRUMENTS (US, D, CH). POWER - a short approach to Texas Instruments power products: voltage mode vs. current mode control; differentiating DC/DC converters by analyzing control and compensation schemes (line / load regulation, transient response, BOM, board space, ease-of-use); introduction to the SWIFT software; FPGA + CPLD power solutions. WIRELESS / CHIPCON: decision criteria when choosing an RF platform; introduction to Texas Instruments wireless products (standardized platforms; proprietary platforms, 2.4 GHz / sub-1 GHz; development tools); antenna design: example for 2.4 GHz; questions, discussion. Industrial partners: Robert Medioni, François Caloz / Spoerle Electronic, CH-1440 Montagny (VD), Switzerland Phone: +41 24 447 0137, email: RMedioni@spoerle.com, http://www.spoerle.com Language: English. Free s...

  8. Analysis of optically variable devices using a photometric light-field approach

    Science.gov (United States)

    Soukup, Daniel; Štolc, Svorad; Huber-Mörk, Reinhold

    2015-03-01

    Diffractive Optically Variable Image Devices (DOVIDs), sometimes loosely referred to as holograms, are popular security features for protecting banknotes, ID cards, or other security documents. Inspection, authentication, as well as forensic analysis of these security features are still demanding tasks requiring special hardware tools and expert knowledge. Existing equipment for such analyses is based either on a microscopic analysis of the grating structure or a point-wise projection and recording of the diffraction patterns. We investigated approaches for an examination of DOVID security features based on sampling the Bidirectional Reflectance Distribution Function (BRDF) of DOVIDs using photometric stereo- and light-field-based methods. Our approach is demonstrated on the practical task of automated discrimination between genuine and counterfeited DOVIDs on banknotes. For this purpose, we propose a tailored feature descriptor which is robust against several expected sources of inaccuracy but still specific enough for the given task. The suggested approach is analyzed from both theoretical as well as practical viewpoints and w.r.t. analysis based on photometric stereo and light fields. We show that especially the photometric method provides a reliable and robust tool for revealing DOVID behavior and authenticity.
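
    As background, the photometric-stereo building block the paper relies on can be shown for the simplest (Lambertian) case. Real DOVIDs are strongly non-Lambertian — which is exactly why the authors sample the full BRDF — so this is only the underlying geometric idea, with made-up light directions and albedo.

```python
import numpy as np

# Photometric stereo, Lambertian case: intensity I_i = rho * (L_i . n) under
# known light directions L_i, so the scaled normal g = rho*n solves L g = I.
L = np.array([[0.0, 0.0, 1.0],
              [0.7, 0.0, 0.714],
              [0.0, 0.7, 0.714]])          # three (unit) light directions
n_true = np.array([0.2, -0.1, 0.97])
n_true /= np.linalg.norm(n_true)           # true surface normal
rho = 0.8                                  # true albedo
I = rho * L @ n_true                       # ideal noise-free measurements

g, *_ = np.linalg.lstsq(L, I, rcond=None)  # least-squares for >= 3 lights
rho_hat = np.linalg.norm(g)                # recovered albedo
n_hat = g / rho_hat                        # recovered unit normal
print("albedo:", round(rho_hat, 3), "normal:", n_hat.round(3))
```

    With three or more non-coplanar lights the system is (over)determined and both albedo and normal are recovered; BRDF-sampling approaches generalize this by measuring reflectance as a function of viewing/illumination direction rather than assuming a diffuse model.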

  9. The Relationship between Executive Functions and Language Abilities in Children: A Latent Variables Approach

    Science.gov (United States)

    Kaushanskaya, Margarita; Park, Ji Sook; Gangopadhyay, Ishanti; Davidson, Meghan M.; Weismer, Susan Ellis

    2017-01-01

    Purpose: We aimed to outline the latent variables approach for measuring nonverbal executive function (EF) skills in school-age children, and to examine the relationship between nonverbal EF skills and language performance in this age group. Method: Seventy-one typically developing children, ages 8 through 11, participated in the study. Three EF…

  10. Increased Science Instrumentation Funding Strengthens Mars Program

    Science.gov (United States)

    Graham, Lee D.; Graff, T. G.

    2012-01-01

    As the strategic knowledge gaps mature for the exploration of Mars, Mars sample return (MSR), and Phobos/Deimos missions, one approach that becomes more probable involves smaller science instrumentation and integrated science suites. Recent technological advances provide the foundation for a significant evolution of instrumentation; however, funding support is currently too small to fully utilize these advances. We propose that an increase in funding for instrumentation development occur in the near term so that these foundational technologies can be applied. These instruments would directly address the significant knowledge gaps for humans to Mars orbit, humans to the Martian surface, and humans to Phobos/Deimos. They would also address the topics covered by the Decadal Survey and the Mars scientific goals, objectives, investigations, and priorities stated by MEPAG. We argue that an increase in science instrumentation funding would be of great benefit to the Mars program as well as to the potential for human exploration of the Mars system. If the total non-Earth-related planetary science instrumentation budget were increased by 100%, it would not add an appreciable amount to the overall NASA budget and would provide real potential for future breakthroughs. If such an approach were implemented in the near term, NASA would benefit greatly in terms of scientific knowledge of the Mars and Phobos/Deimos system, exploration risk mitigation, technology development, and public interest.

  11. Instrument surveillance and calibration verification through plant wide monitoring using autoassociative neural networks

    International Nuclear Information System (INIS)

    Wrest, D.J.; Hines, J.W.; Uhrig, R.E.

    1996-01-01

    The approach to instrument surveillance and calibration verification (ISCV) through plant wide monitoring proposed in this paper is an autoassociative neural network (AANN) which will utilize digitized data presently available in the Safety Parameter Display computer system from Florida Power Corporations Crystal River number 3 nuclear power plant. An autoassociative neural network is one in which the outputs are trained to emulate the inputs over an appropriate dynamic range. The relationships between the different variables are embedded in the weights by the training process. As a result, the output can be a correct version of an input pattern that has been distorted by noise, missing data, or non-linearities. Plant variables that have some degree of coherence with each other constitute the inputs to the network. Once the network has been trained with normal operational data it has been shown to successfully monitor the selected plant variables to detect sensor drift or failure by simply comparing the network inputs with the outputs. The AANN method of monitoring many variables not only indicates that there is a sensor failure, it clearly indicates the signal channel in which the signal error has occurred. (author). 11 refs, 8 figs, 2 tabs
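
    The input-equals-output idea can be sketched compactly. A rank-limited PCA reconstruction stands in here for the trained autoassociative network (a linear simplification of the AANN), and the plant data are synthetic: five correlated channels driven by two shared process variables.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "plant": 5 sensor channels driven by 2 shared process variables.
t = rng.normal(size=(2000, 2))
W = np.array([[1.0, 0.8, -0.5, 0.3, 1.2],
              [0.2, -1.0, 0.7, 1.1, -0.4]])
X = t @ W + 0.01 * rng.normal(size=(2000, 5))

# "Training": fit a rank-2 linear auto-association (PCA stand-in for the AANN).
mu = X.mean(axis=0)
_, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
P = Vt[:2].T @ Vt[:2]                    # projector onto the signal subspace

def reconstruct(X):
    """Network output: best rank-2 estimate of every input channel."""
    return (X - mu) @ P + mu

# Inject a calibration drift on channel 3, then compare inputs with outputs.
X_drift = X.copy()
X_drift[:, 3] += 0.5
resid = np.abs(reconstruct(X_drift) - X_drift).mean(axis=0)
print("per-channel residual:", resid.round(3))
print("flagged channel:", int(resid.argmax()))
```

    Because the drifted channel no longer agrees with the redundancy learned from the healthy channels, the input-output residual concentrates on that channel — the property the abstract highlights for isolating the faulty signal channel.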

  13. A variational conformational dynamics approach to the selection of collective variables in metadynamics

    Science.gov (United States)

    McCarty, James; Parrinello, Michele

    2017-11-01

    In this paper, we combine two powerful computational techniques, well-tempered metadynamics and time-lagged independent component analysis. The aim is to develop a new tool for studying rare events and exploring complex free energy landscapes. Metadynamics is a well-established and widely used enhanced sampling method whose efficiency depends on an appropriate choice of collective variables. Often the initial choice is not optimal, leading to slow convergence. However, by analyzing the dynamics generated in one such run with time-lagged independent component analysis and the techniques recently developed in the area of conformational dynamics, we obtain much more efficient collective variables that are also better able to illuminate the physics of the system. We demonstrate the power of this approach in two paradigmatic examples.
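
    The essential computation — time-lagged independent component analysis extracting the slow mode from mixed observables — can be sketched in a few lines. This is a minimal numpy sketch on a synthetic two-timescale signal, not the authors' molecular systems.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two hidden degrees of freedom: a slow and a fast Ornstein-Uhlenbeck process.
n, dt = 50000, 0.01
slow = np.zeros(n)
fast = np.zeros(n)
for i in range(1, n):
    slow[i] = slow[i-1] - 0.1 * slow[i-1] * dt + np.sqrt(dt) * rng.normal()
    fast[i] = fast[i-1] - 10.0 * fast[i-1] * dt + np.sqrt(dt) * rng.normal()

# The observed collective-variable candidates mix the two timescales.
X = np.column_stack([slow + 0.5 * fast, slow - 0.5 * fast])
X -= X.mean(axis=0)

# TICA: solve the generalized eigenproblem  C(tau) v = lambda C(0) v.
tau = 100
C0 = X[:-tau].T @ X[:-tau] / (n - tau)
Ct = X[:-tau].T @ X[tau:] / (n - tau)
Ct = 0.5 * (Ct + Ct.T)                     # symmetrized lagged covariance
vals, vecs = np.linalg.eig(np.linalg.solve(C0, Ct))
vals, vecs = vals.real, vecs.real
ic1 = X @ vecs[:, np.argmax(vals)]         # slowest independent component

# The leading IC should recover the hidden slow mode (up to sign).
corr = abs(np.corrcoef(ic1, slow)[0, 1])
print("correlation of IC1 with the slow mode:", round(corr, 3))
```

    The eigenvalues approximate autocorrelations at lag tau, so the top eigenvector points along the slowest direction — which is what makes the resulting components attractive as improved collective variables for a subsequent metadynamics run.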

  14. Assessing compositional variability through graphical analysis and Bayesian statistical approaches: case studies on transgenic crops.

    Science.gov (United States)

    Harrigan, George G; Harrison, Jay M

    2012-01-01

    New transgenic (GM) crops are subjected to extensive safety assessments that include compositional comparisons with conventional counterparts as a cornerstone of the process. The influence of germplasm, location, environment, and agronomic treatments on compositional variability is, however, often obscured in these pair-wise comparisons. Furthermore, classical statistical significance testing can often provide an incomplete and over-simplified summary of highly responsive variables such as crop composition. In order to more clearly describe the influence of the numerous sources of compositional variation we present an introduction to two alternative but complementary approaches to data analysis and interpretation. These include i) exploratory data analysis (EDA) with its emphasis on visualization and graphics-based approaches and ii) Bayesian statistical methodology that provides easily interpretable and meaningful evaluations of data in terms of probability distributions. The EDA case-studies include analyses of herbicide-tolerant GM soybean and insect-protected GM maize and soybean. Bayesian approaches are presented in an analysis of herbicide-tolerant GM soybean. Advantages of these approaches over classical frequentist significance testing include the more direct interpretation of results in terms of probabilities pertaining to quantities of interest and no confusion over the application of corrections for multiple comparisons. It is concluded that a standardized framework for these methodologies could provide specific advantages through enhanced clarity of presentation and interpretation in comparative assessments of crop composition.

  15. Feedback control of acoustic musical instruments: collocated control using physical analogs.

    Science.gov (United States)

    Berdahl, Edgar; Smith, Julius O; Niemeyer, Günter

    2012-01-01

    Traditionally, the average professional musician has owned numerous acoustic musical instruments, many of them having distinctive acoustic qualities. However, a modern musician could prefer to have a single musical instrument whose acoustics are programmable by feedback control, where acoustic variables are estimated from sensor measurements in real time and then fed back in order to influence the controlled variables. In this paper, theory is presented that describes stable feedback control of an acoustic musical instrument. The presentation should be accessible to members of the musical acoustics community who may have limited or no experience with feedback control. First, the only control strategy guaranteed to be stable subject to any musical instrument mobility is described: the sensors and actuators must be collocated, and the controller must emulate a physical analog system. Next, the most fundamental feedback controllers and the corresponding physical analog systems are presented. The effects that these controllers have on acoustic musical instruments are described. Finally, practical design challenges are discussed. A proof explains why changing the resonance frequency of a musical resonance requires much more control power than changing the decay time of the resonance. © 2012 Acoustical Society of America.
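
    The core claim — collocated velocity feedback emulates a physical damper and stably shortens a resonance's decay time — can be sketched with a one-mode simulation. The mass, stiffness, and gain values are invented, and a semi-implicit Euler loop stands in for a real-time controller.

```python
import numpy as np

# One musical resonance as a mass-spring-damper: m x'' + c x' + k x = F.
m, c, k = 0.01, 0.02, 1000.0            # -> f0 near 50 Hz, light damping
dt, n = 1e-4, 20000                     # 2 s of simulated sound

def simulate(gain):
    """Collocated velocity feedback F = -gain * x' (a virtual damper)."""
    x, v = 1e-3, 0.0                    # pluck: initial displacement
    env = np.empty(n)
    for i in range(n):
        F = -gain * v                   # sensor and actuator at same point
        a = (F - c * v - k * x) / m
        v += a * dt                     # semi-implicit Euler integration
        x += v * dt
        env[i] = abs(x)
    return env

free = simulate(0.0)
damped = simulate(0.1)                  # virtual damper engaged
# The virtual damper shortens the decay: late-time amplitude collapses.
print("free tail:", free[-2000:].max(), "controlled tail:", damped[-2000:].max())
```

    Negative velocity feedback only ever removes energy, which is why this collocated controller is passive and stable for any instrument mobility; changing the resonance *frequency* would instead require a virtual spring force proportional to displacement, consistent with the paper's observation that it demands much more control power.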

  16. DEVELOPING ONLINE CO-CREATION INSTRUMENTS BASED ON A FOCUS GROUP APPROACH: THE E-PICUS CASE

    Directory of Open Access Journals (Sweden)

    ALEXA Lidia

    2016-09-01

    Full Text Available The current business environment is in constant change, characterized by increased competition and in order to remain relevant and to create products and services that respond better to the customers’ needs and expectations, companies need to become more innovative and proactive. To address the competitive challenges, more and more companies are using innovation co-creation where all the relevant stakeholders are participating across the value chain, from idea generation, selection, development and eventually, even to marketing the new products or services.The paper presents the process of developing an online cocreation. The platform, within the framework of a research project, underlying the importance of using a focus group approach for requirements elicitation in IT instruments development.

  17. Fatigue Crack Propagation Under Variable Amplitude Loading Analyses Based on Plastic Energy Approach

    Directory of Open Access Journals (Sweden)

    Sofiane Maachou

    2014-04-01

    Plasticity effects at the crack tip have been recognized as the “motor” of crack propagation: the growth of cracks is related to the existence of a crack-tip plastic zone, whose formation and intensification is accompanied by energy dissipation. In the current state of knowledge, fatigue crack propagation is modeled using the crack closure concept. The fatigue crack growth behavior of the aluminum alloy 2024 T351 under constant-amplitude and variable-amplitude loading is analyzed in terms of energy parameters. In the case of VAL (variable amplitude loading) tests, the evolution of the hysteretic energy dissipated per block is similar to that observed under constant-amplitude loading. A linear relationship between the crack growth rate and the hysteretic energy dissipated per block is obtained at high growth rates. For lower growth rates, the relationship between crack growth rate and hysteretic energy dissipated per block can be represented by a power law. In this paper, an analysis of fatigue crack propagation under variable amplitude loading based on an energetic approach is proposed.
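
    The power-law branch described above is straightforward to fit: a power law is linear in log-log coordinates, so ordinary least squares on the logs recovers both parameters. The data below are synthetic, not the 2024 T351 measurements.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical pairs of hysteretic energy per block Q and growth rate da/dN,
# generated from da/dN = C * Q**m with a little multiplicative scatter.
C_true, m_true = 1e-7, 1.5
Q = np.logspace(0, 3, 40)                       # energy per block
dadN = C_true * Q**m_true * np.exp(0.05 * rng.normal(size=Q.size))

# log(da/dN) = log C + m log Q  ->  a straight-line fit in log-log space.
m_fit, logC_fit = np.polyfit(np.log(Q), np.log(dadN), 1)
print(f"m = {m_fit:.3f}, C = {np.exp(logC_fit):.2e}")
```

    The same machinery fits the high-growth-rate regime by constraining the exponent to 1, i.e., a linear relation between growth rate and dissipated energy.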

  18. Robotic-surgical instrument wrist pose estimation.

    Science.gov (United States)

    Fabel, Stephan; Baek, Kyungim; Berkelman, Peter

    2010-01-01

    The Compact Lightweight Surgery Robot from the University of Hawaii includes two teleoperated instruments and one endoscope manipulator which act in concert to perform assisted interventional medicine. The relative positions and orientations of the robotic instruments and endoscope must be known to the teleoperation system so that the directions of the instrument motions can be controlled to correspond closely to the directions of the motions of the master manipulators, as seen by the endoscope and displayed to the surgeon. If the manipulator bases are mounted in known locations and all manipulator joint variables are known, then the necessary coordinate transformations between the master and slave manipulators can be easily computed. The versatility and ease of use of the system can be increased, however, by allowing the endoscope or instrument manipulator bases to be moved to arbitrary positions and orientations without reinitializing each manipulator or remeasuring their relative positions. The aim of this work is to find the pose of the instrument end effectors using the video image from the endoscope camera. The P3P pose estimation algorithm is used with a Levenberg-Marquardt optimization to ensure convergence. The correct transformations between the master and slave coordinate frames can then be calculated and updated when the bases of the endoscope or instrument manipulators are moved to new, unknown, positions at any time before or during surgical procedures.
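
    The refinement step can be sketched with SciPy: given hypothetical 3D marker points on the wrist and their image projections, Levenberg-Marquardt minimizes the reprojection error over the 6-DoF pose. In the paper a closed-form P3P solver supplies the initial guess; here a rough guess stands in, and the camera model and all numbers are invented.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

f = 800.0                                 # pinhole focal length, pixels
# Hypothetical marker points on the instrument wrist (metres, wrist frame).
pts = np.array([[0.0, 0.0, 0.0], [0.01, 0.0, 0.0],
                [0.0, 0.01, 0.0], [0.0, 0.0, 0.01]])

def project(pose):
    """Pinhole projection of the markers; pose = (rotation vector, translation)."""
    cam = Rotation.from_rotvec(pose[:3]).apply(pts) + pose[3:]
    return f * cam[:, :2] / cam[:, 2:3]

true_pose = np.array([0.1, -0.2, 0.3, 0.02, -0.01, 0.15])
obs = project(true_pose)                  # synthetic endoscope observations

def residual(pose):
    return (project(pose) - obs).ravel()  # reprojection error, pixels

x0 = np.array([0.0, 0.0, 0.0, 0.0, 0.0, 0.1])    # rough initial guess
fit = least_squares(residual, x0, method="lm")   # Levenberg-Marquardt
print("recovered pose:", fit.x.round(4))
```

    Four non-coplanar points give eight residuals for six pose parameters, so the pose is generically unique; the LM step then polishes whatever the P3P stage proposes.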

  19. Instrumented Compliant Wrist with Proximity and Contact Sensing for Close Robot Interaction Control

    Directory of Open Access Journals (Sweden)

    Pascal Laferrière

    2017-06-01

    Compliance has been exploited in various forms in robotic systems to allow rigid mechanisms to come into contact with fragile objects, or with complex shapes that cannot be accurately modeled. Force feedback control has been the classical approach for providing compliance in robotic systems. However, by integrating other forms of instrumentation with compliance into a single device, it is possible to extend close monitoring of nearby objects before and after contact occurs. As a result, safer and smoother robot control can be achieved both while approaching and while touching surfaces. This paper presents the design and extensive experimental evaluation of a versatile, lightweight, and low-cost instrumented compliant wrist mechanism which can be mounted on any rigid robotic manipulator in order to introduce a layer of compliance while providing the controller with extra sensing signals during close interaction with an object’s surface. Arrays of embedded range sensors provide real-time measurements on the position and orientation of surfaces, either located in proximity or in contact with the robot’s end-effector, which permits close guidance of its operation. Calibration procedures are formulated to overcome inter-sensor variability and achieve the highest available resolution. A versatile solution is created by embedding all signal processing, while wireless transmission connects the device to any industrial robot’s controller to support path control. Experimental work demonstrates the device’s physical compliance as well as the stability and accuracy of the device outputs. Primary applications of the proposed instrumented compliant wrist include smooth surface following in manufacturing, inspection, and safe human-robot interaction.

  20. A Hybrid ICA-SVM Approach for Determining the Quality Variables at Fault in a Multivariate Process

    Directory of Open Access Journals (Sweden)

    Yuehjen E. Shao

    2012-01-01

    The monitoring of a multivariate process with the use of multivariate statistical process control (MSPC) charts has received considerable attention. In practice, however, the use of an MSPC chart typically encounters a difficulty: determining which quality variable, or which set of quality variables, is responsible for the generation of a signal. This study proposes a hybrid scheme composed of independent component analysis (ICA) and a support vector machine (SVM) to determine the quality variables at fault when a step-change disturbance exists in a multivariate process. The proposed hybrid ICA-SVM scheme first applies ICA to the Hotelling T2 MSPC chart to generate independent components (ICs). The hidden information about the fault quality variables can be identified in these ICs. The ICs then serve as the input variables of the SVM classifier, which performs the classification. The performance of various process designs is investigated and compared with a typical classification method. Using the proposed approach, the quality variables at fault in a multivariate process can be accurately and reliably determined.
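
    The two-stage idea can be sketched with scikit-learn on synthetic data. The process model, shift size, and mixing matrix are invented, and this simplified pipeline (ICA on raw observations, then an SVM over the components) only illustrates the flavor of the paper's scheme, not its exact construction on the Hotelling T2 chart.

```python
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(3)

# Synthetic multivariate process: 4 correlated quality variables per sample.
n = 3000
mix = np.array([[1.0, 0.4, 0.2, 0.0],
                [0.4, 1.0, 0.3, 0.2],
                [0.2, 0.3, 1.0, 0.4],
                [0.0, 0.2, 0.4, 1.0]])
X = rng.normal(size=(n, 4)) @ mix

# Step-change disturbance: one (labelled) variable shifts in each sample.
y = rng.integers(0, 4, size=n)
X[np.arange(n), y] += 4.0

# Stage 1 (ICA): unmix the correlated variables into independent components,
# where the signature of the shifted variable is easier to isolate.
S = FastICA(n_components=4, random_state=0).fit_transform(X)

# Stage 2 (SVM): classify which quality variable is at fault from the ICs.
Xtr, Xte, ytr, yte = train_test_split(S, y, test_size=0.3, random_state=0)
acc = SVC(kernel="rbf").fit(Xtr, ytr).score(Xte, yte)
print("fault-variable identification accuracy:", round(acc, 3))
```

    The classifier answers exactly the question an out-of-control MSPC signal leaves open: which of the correlated quality variables actually moved.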

  1. The effects of competition on premiums: using United Healthcare's 2015 entry into Affordable Care Act's marketplaces as an instrumental variable.

    Science.gov (United States)

    Agirdas, Cagdas; Krebs, Robert J; Yano, Masato

    2018-01-08

    One goal of the Affordable Care Act is to increase insurance coverage by improving competition and lowering premiums. To facilitate this goal, the federal government enacted online marketplaces in the 395 rating areas spanning 34 states that chose not to establish their own state-run marketplaces. Most multivariate regression studies analyzing the effects of competition on premiums suffer from endogeneity, due to simultaneity and omitted-variable biases. However, United Healthcare's decision to enter these marketplaces in 2015 provides the researcher with an opportunity to address this endogeneity problem. Exploiting the variation caused by United Healthcare's entry decision as an instrument for competition, we study the impact of competition on premiums during the first 2 years of these marketplaces. Combining panel data from five different sources and controlling for 12 variables, we find that one more insurer in a rating area leads to a 6.97% reduction in the second-lowest-priced silver plan premium, which is larger than the estimated effects in the existing literature. Furthermore, we run a threshold analysis and find that competition's effects on premiums become statistically insignificant if there are four or more insurers in a rating area. These findings are robust to alternative measures of premiums, inclusion of a non-linear term in the regression models, and a county-level analysis.
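
    The instrumental-variable logic can be sketched on synthetic data: an unobserved confounder drives both competition and premiums (biasing OLS), while an instrument shifts competition without directly affecting premiums. Hand-rolled two-stage least squares stands in for the paper's econometric specification, and every magnitude below is invented.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic rating areas: unobserved demand u raises premiums AND attracts
# insurers, biasing OLS; z (an entry decision made on other grounds) shifts
# competition but not premiums directly -> a valid instrument.
n = 5000
u = rng.normal(size=n)                       # omitted confounder
z = rng.binomial(1, 0.5, size=n).astype(float)
competition = 2.0 + 1.0 * z + 0.8 * u + rng.normal(size=n)
premium = 10.0 - 0.7 * competition + 1.5 * u + rng.normal(size=n)

# OLS: biased toward zero because u sits in the error term.
X = np.column_stack([np.ones(n), competition])
beta_ols = np.linalg.lstsq(X, premium, rcond=None)[0]

# 2SLS: first stage projects competition on the instrument; second stage
# regresses premiums on the fitted (exogenous) part of competition.
Z = np.column_stack([np.ones(n), z])
stage1 = np.linalg.lstsq(Z, competition, rcond=None)[0]
X_hat = np.column_stack([np.ones(n), Z @ stage1])
beta_iv = np.linalg.lstsq(X_hat, premium, rcond=None)[0]

print("OLS slope:", round(beta_ols[1], 3))   # contaminated by u
print("2SLS slope:", round(beta_iv[1], 3))   # close to the true -0.7
```

    The contrast between the two slopes is the whole point of the design: the instrument recovers the causal effect that the naive regression misses.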

  2. Describing the interannual variability of precipitation with the derived distribution approach: effects of record length and resolution

    Directory of Open Access Journals (Sweden)

    C. I. Meier

    2016-10-01

    … analyzing the ability of the DD to estimate the long-term standard deviation of annual rainfall, as compared to direct computation from the sample of annual totals. Our results show that, as compared to the fitting of a normal or lognormal distribution (or, equivalently, direct estimation of the sample moments), the DD approach reduces the uncertainty in annual precipitation estimates (especially interannual variability) when only short records (below 6–8 years) are available. In such cases, it also reduces the bias in annual precipitation quantiles with high return periods. We demonstrate that using precipitation data aggregated every 24 h, as commonly available at most weather stations, introduces a noticeable bias in the DD. These results point to the tangible benefits of installing high-resolution (hourly, at least) precipitation gauges, next to the customary manual rain-measuring instrument, at previously ungauged locations. We propose that the DD approach is a suitable tool for the statistical description and study of annual rainfall, not only when short records are available, but also when dealing with nonstationary time series of precipitation. Finally, to avert any misinterpretation of the presented method, we emphasize that it applies only to climatic analyses of annual precipitation totals; even though storm data are used, there is no relation to the study of extreme rainfall intensities needed for engineering design.
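
    Why the DD route wins for short records can be seen in a toy Monte Carlo: a short record contains only a handful of annual totals but hundreds of storms, so estimating storm statistics and propagating them through the model's moment relations is far more stable than taking the standard deviation of five numbers. The compound-Poisson storm model below is an invented stand-in for the paper's derived distribution.

```python
import numpy as np

rng = np.random.default_rng(6)

# Toy storm model: N ~ Poisson(lam) storms per year, depths ~ Exp(mu).
lam, mu = 40.0, 12.0                       # 40 storms/yr, mean depth 12 mm
true_std = np.sqrt(lam * 2.0 * mu**2)      # analytic std of annual totals

def simulate_years(k):
    return [rng.exponential(mu, rng.poisson(lam)) for _ in range(k)]

# Compare two estimators of interannual std from a SHORT (5-year) record.
n_rep, k = 2000, 5
err_direct, err_dd = [], []
for _ in range(n_rep):
    years = simulate_years(k)
    totals = np.array([y.sum() for y in years])
    err_direct.append(totals.std(ddof=1) - true_std)
    # Derived-distribution route: estimate storm statistics from ALL storms,
    # then propagate them through the compound-Poisson moment relations.
    storms = np.concatenate(years)
    lam_hat = storms.size / k
    mu_hat = storms.mean()
    err_dd.append(np.sqrt(lam_hat * 2.0 * mu_hat**2) - true_std)

rms_direct = float(np.sqrt(np.mean(np.square(err_direct))))
rms_dd = float(np.sqrt(np.mean(np.square(err_dd))))
print("RMS error, direct std of 5 annual totals:", round(rms_direct, 1))
print("RMS error, derived-distribution estimate:", round(rms_dd, 1))
```

    In this toy setting the DD estimator's error is several times smaller, mirroring the paper's finding for records below 6–8 years.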

  3. Integrating Cost as an Independent Variable Analysis with Evolutionary Acquisition - A Multiattribute Design Evaluation Approach

    Science.gov (United States)

    2003-03-01

    within the Automated Cost Estimating Integrated Tools (ACEIT) software suite (version 5.x). With this capability, one can set cost targets or time...not allow the user to vary more than one decision variable. This limitation of the ACEIT approach thus hinders a holistic view when attempting to

  4. Error-in-variables models in calibration

    Science.gov (United States)

    Lira, I.; Grientschnig, D.

    2017-12-01

    In many calibration operations, the stimuli applied to the measuring system or instrument under test are derived from measurement standards whose values may be considered to be perfectly known. In that case, it is assumed that calibration uncertainty arises solely from inexact measurement of the responses, from imperfect control of the calibration process and from the possible inaccuracy of the calibration model. However, the premise that the stimuli are completely known is never strictly fulfilled and in some instances it may be grossly inadequate. Then, error-in-variables (EIV) regression models have to be employed. In metrology, these models have been approached mostly from the frequentist perspective. In contrast, not much guidance is available on their Bayesian analysis. In this paper, we first present a brief summary of the conventional statistical techniques that have been developed to deal with EIV models in calibration. We then proceed to discuss the alternative Bayesian framework under some simplifying assumptions. Through a detailed example about the calibration of an instrument for measuring flow rates, we provide advice on how the user of the calibration function should employ the latter framework for inferring the stimulus acting on the calibrated device when, in use, a certain response is measured.

  5. Einstein x-ray observations of cataclysmic variables

    International Nuclear Information System (INIS)

    Mason, K.O.; Cordova, F.A.

    1982-01-01

    Observations with the imaging x-ray detectors on the Einstein Observatory have led to a large increase in the number of low luminosity x-ray sources known to be associated with cataclysmic variable stars (CVs). The high sensitivity of the Einstein instrumentation has permitted study of their short timescale variability and spectra. The data are adding significantly to our knowledge of the accretion process in cataclysmic variables and forcing some revision in our ideas concerning the origin of the optical variability in these stars

  6. A Comparison of Approaches for the Analysis of Interaction Effects between Latent Variables Using Partial Least Squares Path Modeling

    Science.gov (United States)

    Henseler, Jorg; Chin, Wynne W.

    2010-01-01

    In social and business sciences, the importance of the analysis of interaction effects between manifest as well as latent variables steadily increases. Researchers using partial least squares (PLS) to analyze interaction effects between latent variables need an overview of the available approaches as well as their suitability. This article…

  7. Adaptation of WHOQOL as health-related quality of life instrument to develop a vision-specific instrument.

    Directory of Open Access Journals (Sweden)

    Dandona Lalit

    2000-01-01

    Full Text Available The WHOQOL instrument was adapted as a health-related QOL instrument for a population-based epidemiologic study of eye diseases in southern India, the Andhra Pradesh Eye Disease Study (APEDS). A follow-up question was added to each item in the WHOQOL to determine whether the decrease in QOL was due to any health reasons, including eye-related reasons. Modifications to the WHOQOL and its translation into the local language were made using focus groups including health professionals and people not related to health care. The modified instrument has 28 items across the 6 domains of the WHOQOL and was translated into the local language, Telugu, using the pragmatic approach. It takes 10-20 minutes to be administered by a trained interviewer. Reliability was within the acceptable range. This health-related QOL instrument is being used in the population-based study APEDS to develop a vision-specific QOL instrument that could potentially be used to assess the impact of visual impairment on QOL across different cultures and in evaluating eye-care interventions. This health-related QOL instrument could also be used to develop other disease-specific instruments, as it allows assessment of the extent to which various aspects of QOL are affected by a variety of health problems.

  8. Outcome indicators for the evaluation of energy policy instruments and technical change

    International Nuclear Information System (INIS)

    Neij, Lena; Astrand, Kerstin

    2006-01-01

    The aim of this paper is to propose a framework for the evaluation of policy instruments designed to affect development and dissemination of new energy technologies. The evaluation approach is based on the analysis of selected outcome indicators describing the process of technical change, i.e. the development and dissemination of new energy technologies, on the basis of a socio-technical systems approach. The outcome indicators are used to analyse the effect, in terms of outcome, and outcome scope of the policy instruments as well as the extent to which the policy instruments support diversity, learning and institutional change. The analysis of two cases of evaluations, of energy efficiency policy and wind energy policy in Sweden, shows that the approach has several advantages, allowing continuous evaluation and providing important information for the redesign of policy instruments. There are also disadvantages associated with the approach, such as complexity, possible high cost and the requirement of qualified evaluators. Nevertheless, it is concluded that the information on the continuous performance of different policy instruments and their effects on the introduction and dissemination of new energy technologies, provided by this evaluation approach, is essential for an improved adaptation and implementation of energy and climate policy

  9. Hitting emissions targets with (statistical) confidence in multi-instrument Emissions Trading Schemes

    International Nuclear Information System (INIS)

    Shipworth, David

    2003-12-01

    A means of assessing, monitoring and controlling aggregate emissions from multi-instrument Emissions Trading Schemes is proposed. The approach allows contributions from different instruments with different forms of emissions targets to be integrated. Where Emissions Trading Schemes are helping to meet specific national targets, the approach allows the entry requirements of new participants to be calculated and set at a level that will achieve these targets. The approach is multi-levelled, and may be extended downwards to support pooling of participants within instruments, or upwards to embed Emissions Trading Schemes within a wider suite of policies and measures with hard and soft targets. Aggregate emissions from each instrument are treated stochastically. Emissions from the scheme as a whole are then the joint probability distribution formed by integrating the emissions from its instruments. Because a Bayesian approach is adopted, qualitative and semi-qualitative data from expert opinion can be used where quantitative data is not currently available, or is incomplete. This approach helps government retain sufficient control over emissions trading scheme targets to allow them to meet their emissions reduction obligations, while minimising the need for retrospectively adjusting existing participants' conditions of entry. This maintains participant confidence, while providing the necessary policy levers for good governance

  10. Quantifying measurement uncertainty and spatial variability in the context of model evaluation

    Science.gov (United States)

    Choukulkar, A.; Brewer, A.; Pichugina, Y. L.; Bonin, T.; Banta, R. M.; Sandberg, S.; Weickmann, A. M.; Djalalova, I.; McCaffrey, K.; Bianco, L.; Wilczak, J. M.; Newman, J. F.; Draxl, C.; Lundquist, J. K.; Wharton, S.; Olson, J.; Kenyon, J.; Marquis, M.

    2017-12-01

    In an effort to improve wind forecasts for the wind energy sector, the Department of Energy and NOAA funded the second Wind Forecast Improvement Project (WFIP2). As part of the WFIP2 field campaign, a large suite of in-situ and remote sensing instrumentation was deployed to the Columbia River Gorge in Oregon and Washington from October 2015 to March 2017. The array of instrumentation deployed included 915-MHz wind profiling radars, sodars, wind-profiling lidars, and scanning lidars. The role of these instruments was to provide wind measurements at high spatial and temporal resolution for model evaluation and improvement of model physics. To properly determine model errors, the uncertainties in instrument-model comparisons need to be quantified accurately. These uncertainties arise from several factors, such as measurement uncertainty, spatial variability, and interpolation of model output to instrument locations, to name a few. In this presentation, we will introduce a formalism to quantify measurement uncertainty and spatial variability. The accuracy of this formalism will be tested using existing datasets such as the eXperimental Planetary boundary layer Instrumentation Assessment (XPIA) campaign. Finally, the uncertainties in wind measurement and the spatial variability estimates from the WFIP2 field campaign will be discussed to understand the challenges involved in model evaluation.

  11. Tackling regional health inequalities in france by resource allocation : a case for complementary instrumental and process-based approaches?

    Science.gov (United States)

    Bellanger, Martine M; Jourdain, Alain

    2004-01-01

    This article aims to evaluate the results of two different approaches underlying the attempts to reduce health inequalities in France. In the 'instrumental' approach, resource allocation is based on an indicator to assess the well-being or the quality of life associated with healthcare provision, the argument being that additional resources would respond to needs that could then be treated quickly and efficiently. This governs the distribution of regional hospital budgets. In the second approach, health professionals and users in a given region are involved in a consensus process to define those priorities to be included in programme formulation. This 'procedural' approach is employed in the case of the regional health programmes. In this second approach, the evaluation of the results runs parallel with an analysis of the process using Rawlsian principles, whereas the first approach is based on the classical economic model. At this stage, a pragmatic analysis based on both the comparison of regional hospital budgets during the period 1992-2003 (calculated using a 'RAWP [resource allocation working party]-like' formula) and the evolution of regional health policies through the evaluation of programmes for the prevention of suicide, alcohol-related diseases and cancers provides a partial assessment of the impact of the two types of approaches, the second having a greater effect on the reduction of regional inequalities.

  12. Wartościowanie zdrowia w opinii pielęgniarek w odniesieniu do zmiennych społeczno – demograficznych = Health evaluation with relation to socio-demographic variables – nurses’ opinions

    Directory of Open Access Journals (Sweden)

    Alina Deluga

    2016-05-01

    Conclusions Health evaluation and significance are prominent elements of pro-health awareness in the nurses. Moreover, health behaviours depend on socio-demographic variables. The nurses pay much attention to health as a property and a state, which characterizes an instrumental approach to health.

  13. Variable selection in multivariate calibration based on clustering of variable concept.

    Science.gov (United States)

    Farrokhnia, Maryam; Karimi, Sadegh

    2016-01-01

    Recently we proposed a new variable selection algorithm based on the clustering of variables concept (CLoVA) for classification problems. With the same idea, the concept has now been applied to a regression problem and the results compared with conventional variable selection strategies for PLS. The basic idea behind the clustering of variables is that the instrument channels are grouped into different clusters via clustering algorithms. Then, the spectral data of each cluster are subjected to PLS regression. Different real data sets (Cargill corn, Biscuit dough, ACE QSAR, Soy, and Tablet) have been used to evaluate the influence of the clustering of variables on the prediction performance of PLS. In almost all cases, the statistical parameters, especially the prediction error, show the superiority of CLoVA-PLS with respect to other variable selection strategies. Finally, synergy clustering of variables (sCLoVA-PLS), which uses a combination of clusters, is proposed as an efficient modification of the CLoVA algorithm. The obtained statistical parameters indicate that variable clustering can separate the useful part from the redundant one, so that a stable model can be reached based on the informative clusters. Copyright © 2015 Elsevier B.V. All rights reserved.

  14. Statistical Modeling Approach to Quantitative Analysis of Interobserver Variability in Breast Contouring

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Jinzhong, E-mail: jyang4@mdanderson.org [Department of Radiation Physics, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Woodward, Wendy A.; Reed, Valerie K.; Strom, Eric A.; Perkins, George H.; Tereffe, Welela; Buchholz, Thomas A. [Department of Radiation Oncology, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Zhang, Lifei; Balter, Peter; Court, Laurence E. [Department of Radiation Physics, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Li, X. Allen [Department of Radiation Oncology, Medical College of Wisconsin, Milwaukee, Wisconsin (United States); Dong, Lei [Department of Radiation Physics, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Scripps Proton Therapy Center, San Diego, California (United States)

    2014-05-01

    Purpose: To develop a new approach for interobserver variability analysis. Methods and Materials: Eight radiation oncologists specializing in breast cancer radiation therapy delineated a patient's left breast “from scratch” and from a template that was generated using deformable image registration. Three of the radiation oncologists had previously received training in the Radiation Therapy Oncology Group (RTOG) consensus contouring atlas for breast cancer. The simultaneous truth and performance level estimation algorithm was applied to the 8 contours delineated “from scratch” to produce a group consensus contour. Individual Jaccard scores were fitted to a beta distribution model. We also applied this analysis to 2 more patients, who were contoured by 9 breast radiation oncologists from 8 institutions. Results: The beta distribution model had a mean of 86.2%, standard deviation (SD) of ±5.9%, a skewness of −0.7, and excess kurtosis of 0.55, exemplifying broad interobserver variability. The 3 RTOG-trained physicians had higher agreement scores than average, indicating that their contours were close to the group consensus contour. One physician had high sensitivity but lower specificity than the others, implying that this physician tended to contour a structure larger than those of the others. Two other physicians had low sensitivity but specificity similar to the others, implying that they tended to contour a structure smaller than the others. With this information, they could adjust their contouring practice to be more consistent with the others if desired. When contouring from the template, the beta distribution model had a mean of 92.3%, SD of ±3.4%, skewness of −0.79, and excess kurtosis of 0.83, indicating much better consistency among individual contours. Similar results were obtained for the analysis of the 2 additional patients. Conclusions: The proposed statistical approach was able to measure interobserver variability quantitatively.
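The beta-distribution fitting step can be sketched with SciPy. The Jaccard scores below are made-up values for illustration, not the study's data; location and scale are fixed at 0 and 1 because agreement scores are bounded on that interval.

```python
import numpy as np
from scipy import stats

# Hypothetical per-observer Jaccard agreement scores against a consensus contour.
scores = np.array([0.81, 0.88, 0.90, 0.86, 0.79, 0.92, 0.84, 0.87])

# Fit a beta distribution on (0, 1); floc/fscale pin the support so only the
# two shape parameters are estimated by maximum likelihood.
a, b, loc, scale = stats.beta.fit(scores, floc=0.0, fscale=1.0)

mean = stats.beta.mean(a, b)
sd = stats.beta.std(a, b)
skew = stats.beta.stats(a, b, moments="s")
```

The fitted mean, SD, skewness, and kurtosis are exactly the kinds of summary statistics the abstract reports (e.g., mean 86.2%, SD ±5.9%) for quantifying the spread of observer agreement.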

  15. Intrajudge and Interjudge Reliability of the Stuttering Severity Instrument-Fourth Edition.

    Science.gov (United States)

    Davidow, Jason H; Scott, Kathleen A

    2017-11-08

    The Stuttering Severity Instrument (SSI) is a tool used to measure the severity of stuttering. Previous versions of the instrument have known limitations (e.g., Lewis, 1995). The present study examined the intra- and interjudge reliability of the newest version, the Stuttering Severity Instrument-Fourth Edition (SSI-4) (Riley, 2009). Twelve judges who were trained on the SSI-4 protocol participated. Judges collected SSI-4 data while viewing 4 videos of adults who stutter at Time 1 and, 4 weeks later, at Time 2. Data were analyzed for intra- and interjudge reliability of the SSI-4 subscores (for Frequency, Duration, and Physical Concomitants), total score, and final severity rating. Intra- and interjudge reliability across the subscores and total score concurred with the manual's reported reliability when reliability was calculated using the methods described in the manual. New calculations of judge agreement produced values different from those in the manual for the 3 subscores, total score, and final severity rating, and provided data absent from the manual. Clinicians and researchers who use the SSI-4 should carefully consider the limitations of the instrument. Investigation into the multitasking demands of the instrument may provide information on whether separating the collection of data for specific variables will improve intra- and interjudge reliability of those variables.

  16. Variable-Structure Control of a Model Glider Airplane

    Science.gov (United States)

    Waszak, Martin R.; Anderson, Mark R.

    2008-01-01

    A variable-structure control system designed to enable a fuselage-heavy airplane to recover from spin has been demonstrated in a hand-launched, instrumented model glider airplane. Variable-structure control is a high-speed switching feedback control technique that has been developed for control of nonlinear dynamic systems.

  17. Principles relating to the digital instrumentation and control design approach 2017

    International Nuclear Information System (INIS)

    2017-01-01

    The design of the instrumentation and control of nuclear facilities uses digital systems that offer increasing computation and interconnection capabilities. They enable advanced functions to be carried out, such as calculation of the critical heat flux ratio, help to detect hardware failures in real time and provide operators with rich, flexible interfaces. However, these evolved functions may be affected by faults that make their logic systematically inadequate in certain cases, which introduces sources of failure other than random hardware failures and raises questions about the informal concept of the increased 'complexity' of instrumentation and control. Appropriate design principles shall therefore be applied so that this logic is as fault-free as possible and can be assessed by an independent body such as IRSN. This document presents the main problems associated with the design of the digital instrumentation and control of a complex facility, as well as the general principles to follow to demonstrate that a satisfactory safety level has been achieved. The doctrine elements presented in this document are the result of the experience acquired during assessments carried out for the French nuclear power plants, enhanced by exchanges with experts from the nuclear sector, and reflect French practice; they apply in other sectors in which a high level of confidence can be attributed to instrumentation and control. The normative texts cited in this document provide detailed requirements that are open to considerable interpretation, as the nature of the problem posed does not enable relevant and measurable criteria to be defined in all cases. This document aims to explain the principles underlying these detailed requirements and to give the means for interpreting them in each situation. (authors)

  18. A new model of wheezing severity in young children using the validated ISAAC wheezing module: A latent variable approach with validation in independent cohorts.

    Science.gov (United States)

    Brunwasser, Steven M; Gebretsadik, Tebeb; Gold, Diane R; Turi, Kedir N; Stone, Cosby A; Datta, Soma; Gern, James E; Hartert, Tina V

    2018-01-01

    The International Study of Asthma and Allergies in Children (ISAAC) Wheezing Module is commonly used to characterize pediatric asthma in epidemiological studies, including nearly all airway cohorts participating in the Environmental Influences on Child Health Outcomes (ECHO) consortium. However, there is no consensus model for operationalizing wheezing severity with this instrument in explanatory research studies. Severity is typically measured using coarsely-defined categorical variables, reducing power and potentially underestimating etiological associations. More precise measurement approaches could improve testing of etiological theories of wheezing illness. We evaluated a continuous latent variable model of pediatric wheezing severity based on four ISAAC Wheezing Module items. Analyses included subgroups of children from three independent cohorts whose parents reported past wheezing: infants ages 0-2 in the INSPIRE birth cohort study (Cohort 1; n = 657), 6-7-year-old North American children from Phase One of the ISAAC study (Cohort 2; n = 2,765), and 5-6-year-old children in the EHAAS birth cohort study (Cohort 3; n = 102). Models were estimated using structural equation modeling. In all cohorts, covariance patterns implied by the latent variable model were consistent with the observed data, as indicated by non-significant χ2 goodness of fit tests (no evidence of model misspecification). Cohort 1 analyses showed that the latent factor structure was stable across time points and child sexes. In both cohorts 1 and 3, the latent wheezing severity variable was prospectively associated with wheeze-related clinical outcomes, including physician asthma diagnosis, acute corticosteroid use, and wheeze-related outpatient medical visits when adjusting for confounders. We developed an easily applicable continuous latent variable model of pediatric wheezing severity based on items from the well-validated ISAAC Wheezing Module. This model prospectively associates with

  19. Approaches for developing a sizing method for stand-alone PV systems with variable demand

    Energy Technology Data Exchange (ETDEWEB)

    Posadillo, R. [Grupo de Investigacion en Energias y Recursos Renovables, Dpto. de Fisica Aplicada, E.P.S., Universidad de Cordoba, Avda. Menendez Pidal s/n, 14004 Cordoba (Spain); Lopez Luque, R. [Grupo de Investigacion de Fisica para las Energias y Recursos Renovables, Dpto. de Fisica Aplicada. Edificio C2 Campus de Rabanales, 14071 Cordoba (Spain)

    2008-05-15

    Accurate sizing is one of the most important aspects to take into consideration when designing a stand-alone photovoltaic system (SAPV). Various methods, which differ in terms of their simplicity or reliability, have been developed for this purpose. Analytical methods, which seek functional relationships between variables of interest to the sizing problem, are one of these approaches. A series of rational considerations are presented in this paper with the aim of shedding light upon the basic principles and results of various sizing methods proposed by different authors. These considerations set the basis for a new analytical method that has been designed for systems with variable monthly energy demands. Following previous approaches, the method proposed is based on the concept of loss of load probability (LLP) - a parameter that is used to characterize system design. The method includes information on the standard deviation of loss of load probability (σ_LLP) and on two new parameters: annual number of system failures (f) and standard deviation of annual number of failures (σ_f). The method proves useful for sizing a PV system in a reliable manner and serves to explain the discrepancies found in the research on systems with LLP < 10⁻². We demonstrate that reliability depends not only on the sizing variables and on the distribution function of solar radiation, but on the minimum value as well, which in a given location and with a monthly average clearness index, achieves total solar radiation on the receiver surface. (author)
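The loss-of-load probability (LLP) concept that anchors this sizing method can be illustrated with a crude daily-balance Monte Carlo. The irradiation model, efficiency, and system sizes below are invented for illustration and are not the authors' method.

```python
import numpy as np

rng = np.random.default_rng(2)

def simulate_llp(pv_area_m2, battery_kwh, daily_load_kwh, n_days=365 * 20):
    """Monte Carlo estimate of loss-of-load probability: the fraction of
    simulated days on which the PV array plus battery cannot serve the load."""
    # Hypothetical daily irradiation (kWh/m^2/day), lognormal around 4.5.
    irradiation = rng.lognormal(mean=np.log(4.5), sigma=0.35, size=n_days)
    efficiency = 0.12          # assumed module efficiency
    soc = battery_kwh          # state of charge, start full
    failures = 0
    for h in irradiation:
        # Charge with the day's generation, capped at battery capacity.
        soc = min(battery_kwh, soc + pv_area_m2 * h * efficiency)
        if soc >= daily_load_kwh:
            soc -= daily_load_kwh
        else:
            soc = 0.0          # load shed: count a loss-of-load day
            failures += 1
    return failures / n_days
```

Sweeping `pv_area_m2` and `battery_kwh` and reading off the resulting LLP surface is, in rough terms, what the analytical sizing curves in this literature summarize in closed form.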

  20. Nonlinear internal friction, chaos, fractal and musical instruments

    International Nuclear Information System (INIS)

    Sun, Z.Q.; Lung, C.W.

    1995-08-01

    Nonlinear and structure-sensitive internal friction phenomena in materials are used for characterizing musical instruments. It may be one of the most important factors influencing the timbre of instruments. As a nonlinear dissipative system, chaos and fractals are fundamental peculiarities of sound spectra. It is shown that the concept of multi-range fractals can be used to decompose the frequency spectra of melody. New approaches are suggested to improve the fabrication, property characterization and physical understanding of instruments. (author). 18 refs, 4 figs

  1. Assessment of the quality and variability of health information on chronic pain websites using the DISCERN instrument

    Directory of Open Access Journals (Sweden)

    Buckley Norman

    2010-10-01

    Full Text Available Abstract Background The Internet is used increasingly by providers as a tool for disseminating pain-related health information and by patients as a resource about health conditions and treatment options. However, health information on the Internet remains unregulated and varies in quality, accuracy and readability. The objective of this study was to determine the quality of pain websites, and explain variability in quality and readability between pain websites. Methods Five key terms (pain, chronic pain, back pain, arthritis, and fibromyalgia) were entered into the Google, Yahoo and MSN search engines. Websites were assessed using the DISCERN instrument as a quality index. Grade level readability ratings were assessed using the Flesch-Kincaid Readability Algorithm. Univariate (using alpha = 0.20) and multivariable regression (using alpha = 0.05) analyses were used to explain the variability in DISCERN scores and grade level readability using potential for commercial gain, health related seals of approval, language(s) and multimedia features as independent variables. Results A total of 300 websites were assessed, 21 excluded in accordance with the exclusion criteria and 110 duplicate websites, leaving 161 unique sites. About 6.8% (11/161) of the websites offered patients commercial products for their pain condition, 36.0% (58/161) had a health related seal of approval, 75.8% (122/161) presented information in English only and 40.4% (65/161) offered an interactive multimedia experience. In assessing the quality of the unique websites, of a maximum score of 80, the overall average DISCERN score was 55.9 (13.6) and readability (grade level) was 10.9 (3.9). The multivariable regressions demonstrated that website seals of approval (P = 0.015) and potential for commercial gain (P = 0.189) were contributing factors to higher DISCERN scores, while seals of approval (P = 0.168) and interactive multimedia (P = 0.244) contributed to

  2. Six Degree-of-Freedom Haptic Simulation of a Stringed Musical Instrument for Triggering Sounds.

    Science.gov (United States)

    Dangxiao Wang; Xiaohan Zhao; Youjiao Shi; Yuru Zhang; Jing Xiao

    2017-01-01

    Six degree-of-freedom (DoF) haptic rendering of multi-region contacts between a moving hand avatar and varied-shaped components of a musical instrument is fundamental to realizing interactive simulation of music playing. There are two aspects of computational challenges: first, some components have significantly small sizes in some dimensions, such as the strings on a seven-string plucked instrument (e.g., Guqin), which makes it challenging to avoid pop-through during multi-region contact scenarios. Second, deformable strings may produce high-frequency vibration, which requires simulating diversified and subtle force sensations when a hand interacts with strings in different ways. In this paper, we propose a constraint-based approach to haptic interaction and simulation between a moving hand avatar and various parts of a string instrument, using a cylinder model for the string, which has a large length-radius ratio, and a sphere-tree model for the other parts, which have complex shapes. Collision response algorithms based on configuration-based optimization are adapted to solve for the contact configuration of the hand avatar interacting with thin strings without penetration. To simulate the deformation and vibration of a string, a cylindrical volume with variable diameters is defined in response to the interaction force applied by the operator. Experimental results have validated the stability and efficiency of the proposed approach. Subtle force feelings can be simulated to reflect varied interaction patterns, to differentiate collisions between the hand avatar and a static or vibrating string, and to convey the effects of various colliding forces and touch locations on the strings.

  3. Predicting approach to homework in Primary school students.

    Science.gov (United States)

    Valle, Antonio; Pan, Irene; Regueiro, Bibiana; Suárez, Natalia; Tuero, Ellián; Nunes, Ana R

    2015-01-01

    The goal of this research was to study the weight of student variables related to homework (intrinsic homework motivation, perceived homework instrumentality, homework attitude, time spent on homework, and homework time management) and context (teacher feedback on homework and parental homework support) in the prediction of approaches to homework. 535 students in the last three years of primary education participated in the study. Data were analyzed with hierarchical regression models and path analysis. The results obtained suggest that students’ homework engagement (high or low) is related to students’ level of intrinsic motivation and positive attitude towards homework. Furthermore, it was also observed that students who manage their homework time well (and not necessarily those who spend more time) are more likely to show the deepest approach to homework. Parental support and teacher feedback on homework affect student homework engagement through their effect on the levels of intrinsic homework motivation (directly), and on homework attitude, homework time management, and perceived homework instrumentality (indirectly). Data also indicated a strong and significant relationship between parental and teacher involvement.

  4. An Undergraduate Research Experience on Studying Variable Stars

    Science.gov (United States)

    Amaral, A.; Percy, J. R.

    2016-06-01

    We describe and evaluate a summer undergraduate research project and experience by one of us (AA), under the supervision of the other (JP). The aim of the project was to sample current approaches to analyzing variable star data, and topics related to the study of Mira variable stars and their astrophysical importance. This project was done through the Summer Undergraduate Research Program (SURP) in astronomy at the University of Toronto. SURP allowed undergraduate students to explore and learn about many topics within astronomy and astrophysics, from instrumentation to cosmology. SURP introduced students to key skills which are essential for students hoping to pursue graduate studies in any scientific field. Variable stars proved to be an excellent topic for a research project. For beginners to independent research, it introduces key concepts in research such as critical thinking and problem solving, while illuminating previously learned topics in stellar physics. The focus of this summer project was to compare observations with structural and evolutionary models, including modelling the random walk behavior exhibited in the (O-C) diagrams of most Mira stars. We found that the random walk could be modelled by using random fluctuations of the period. This explanation agreed well with observations.

  5. How to get rid of W: a latent variables approach to modelling spatially lagged variables

    NARCIS (Netherlands)

    Folmer, H.; Oud, J.

    2008-01-01

    In this paper we propose a structural equation model (SEM) with latent variables to model spatial dependence. Rather than using the spatial weights matrix W, we propose to use latent variables to represent spatial dependence and spillover effects, of which the observed spatially lagged variables are

  7. The static and dynamic efficiency of instruments of promotion of renewables

    International Nuclear Information System (INIS)

    Finon, D; Menanteau, P.

    2003-01-01

    Results of a comparative analysis of the economic and social efficiency of the instruments used to promote renewable energy sources are discussed. The analysis was carried out first from a static point of view and then using dynamic criteria, to ascertain the ability of the instruments to stimulate technological progress and cost reduction. The first part of the analysis is devoted to characterizing the rationale of these policies in a cost-efficiency framework, followed by an analysis of the instruments in relation to the classic environmental-policy debate between price-based and quantity-based approaches. The final part examines the incentives to invest and innovate under each framework, in relation to the surplus allowed by each policy instrument and its effects on the tendency of manufacturers and renewable energy producers to innovate. The evidence shows that, in terms of installed capacity, price-based approaches have been superior to the quantity-based approaches that were used until recently. In contrast, quantity-based approaches have been found to be more efficient for defining and adjusting overall goals. Adjusting the quotas also provides an indirect way of controlling overall costs. 22 refs., 3 tabs., 6 figs

  8. Cost-effective design of economic instruments in nutrition policy

    Directory of Open Access Journals (Sweden)

    Smed Sinne

    2007-04-01

    This paper addresses the potential for using economic regulation, e.g. taxes or subsidies, as instruments to combat the increasing problems of inappropriate diets, which lead to health problems such as obesity, type 2 diabetes, and cardiovascular disease in most countries. Such policy measures may be considered as alternatives or supplements to other regulatory instruments, including information campaigns, bans, or enhancement of technological solutions to the problems of obesity or related diseases. Seven different food tax and subsidy instruments, or combinations of instruments, are analysed quantitatively. The analyses demonstrate that the average cost-effectiveness with regard to changing the intake of selected nutritional variables can be improved by 10–30 per cent if taxes/subsidies are targeted at these nutrients, compared with targeting selected food categories. Finally, the paper raises a range of issues that need to be investigated further before firm conclusions about the suitability of economic instruments in nutrition policy can be drawn.

  9. Ultrasonic imaging with a fixed instrument configuration

    Energy Technology Data Exchange (ETDEWEB)

    Witten, A.; Tuggle, J.; Waag, R.C.

    1988-07-04

    Diffraction tomography is a technique based on an inversion of the wave equation which has been proposed for high-resolution ultrasonic imaging. While this approach has been considered for diagnostic medical applications, it has, until recently, been constrained by practical limits on the speed of data acquisition associated with instrument motions. This letter presents the results of an experimental study directed towards demonstrating tomography with a fixed instrument configuration.

  10. Advances in control and instrumentation

    International Nuclear Information System (INIS)

    Surendar, Ch.

    1994-01-01

    Control and instrumentation systems have seen significant changes, from pneumatic to electronic, with the advent of transistors and integrated circuits, and miniaturization was realised. With the introduction of microprocessors there has been a revolutionary change in the approach to instrumentation and control systems in the areas of sensors, data acquisition/transmission, processing for control, and presentation of information to the operator. An effort is made to give some insight into these areas, with some idea of the advantages to which these systems are being put in nuclear facilities, particularly nuclear power reactors. (author)

  11. Instrumental approach and diagnosis of total inorganics in a typical ...

    African Journals Online (AJOL)

    Instrumental diagnosis, subjecting samples to oxidation, combustion and/or incineration, remains to date a mandatory fundamental requirement in the further pursuit of mineralogical and wet chemical analysis. The results of the ash content from three (3) carbonaceous coal and shaly coal samples from Northeastern ...

  12. Advanced instrumentation and teleoperation

    International Nuclear Information System (INIS)

    Decreton, M.

    1998-01-01

    SCK-CEN's advanced instrumentation and teleoperation project aims at evaluating the potential of a telerobotic approach in a nuclear environment and, in particular, the use of remote-perception systems. Main achievements in 1997 in the areas of R and D on radiation tolerance for remote sensing, optical fibres and optical-fibre sensors, and computer-aided teleoperation are reported

  13. The mediation proportion: a structural equation approach for estimating the proportion of exposure effect on outcome explained by an intermediate variable

    DEFF Research Database (Denmark)

    Ditlevsen, Susanne; Christensen, Ulla; Lynch, John

    2005-01-01

    It is often of interest to assess how much of the effect of an exposure on a response is mediated through an intermediate variable. However, systematic approaches are lacking, other than assessment of a surrogate marker for the endpoint of a clinical trial. We review a measure of "proportion explained" … of several intermediate variables. Binary or categorical variables can be included directly through threshold models. We call this measure the mediation proportion, that is, the part of an exposure effect on outcome explained by a third, intermediate variable. Two examples illustrate the approach. The first example is a randomized clinical trial of the effects of interferon-alpha on visual acuity in patients with age-related macular degeneration. In this example, the exposure, mediator and response are all binary. The second example is a common problem in social epidemiology: to find the proportion …
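
    For the simple linear case, the mediation proportion can be estimated as one minus the ratio of the direct effect to the total effect. A minimal sketch on simulated data (plain least squares rather than the paper's structural equation machinery; all coefficients are invented):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5000

# Simulated exposure -> mediator -> outcome chain (continuous case only;
# the paper also handles binary variables via threshold models).
x = rng.normal(size=n)                       # exposure
m = 0.8 * x + rng.normal(size=n)             # intermediate variable
y = 0.5 * m + 0.3 * x + rng.normal(size=n)   # outcome

def ols(dep, regressors):
    """Least-squares coefficients with an intercept prepended."""
    design = np.column_stack([np.ones(len(dep)), regressors])
    return np.linalg.lstsq(design, dep, rcond=None)[0]

total = ols(y, x)[1]                          # total effect of x on y
direct = ols(y, np.column_stack([x, m]))[1]   # direct effect, holding m fixed

# Mediation proportion: the share of the total effect routed through m.
mediation_proportion = 1 - direct / total
# True value for these coefficients: (0.8*0.5) / (0.8*0.5 + 0.3), about 0.57
```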

  14. VIMOS Instrument Control Software Design: an Object Oriented Approach

    Science.gov (United States)

    Brau-Nogué, Sylvie; Lucuix, Christian

    2002-12-01

    The Franco-Italian VIMOS instrument is a VIsible imaging Multi-Object Spectrograph with outstanding multiplex capabilities, allowing spectra of more than 800 objects to be taken simultaneously, or integral field spectroscopy in a 54x54 arcsec area. VIMOS is being installed at the Nasmyth focus of the third Unit Telescope of the European Southern Observatory Very Large Telescope (VLT) at Mount Paranal in Chile. This paper describes the analysis, design and implementation of the VIMOS Instrument Control System (ICS), using UML notation. Our control group followed an object-oriented software process while keeping in mind the ESO VLT standard control concepts; at ESO VLT a complete software library is available. Rather than applying a waterfall lifecycle, the ICS project used iterative development, a lifecycle consisting of several iterations. Each iteration consisted of capturing and evaluating the requirements, visual modeling for analysis and design, implementation, testing, and deployment. Depending on the project phase, iterations focused more or less on specific activities. The result is an object model (the design model), including use-case realizations, complemented by an implementation view and a deployment view. An extract of the VIMOS ICS UML model will be presented and some implementation, integration and test issues will be discussed.

  15. A Synergetic Approach to Describe the Stability and Variability of Motor Behavior

    Science.gov (United States)

    Witte, Kerstin; Bock, Holger; Storb, Ulrich; Blaser, Peter

    At the beginning of the 20th century, the Russian physiologist and biomechanist Bernstein developed his cyclograms, in which he showed the non-repetition of the same movement under constant conditions. We can also observe this phenomenon when we analyze several cyclic sports movements. For example, we investigated the trajectories of single joints and segments of the body in breaststroke, walking, and running. The problem of the stability and variability of movement, and the relation between the two, cannot be satisfactorily tackled by means of linear methods. Thus, several authors (Turvey, 1977; Kugler et al., 1980; Haken et al., 1985; Schöner et al., 1986; Mitra et al., 1997; Kay et al., 1991; Ganz et al., 1996; Schöllhorn, 1999) use nonlinear models to describe human movement. These models and approaches have shown that nonlinear theories of complex systems provide a new understanding of the stability and variability of motor control. The purpose of this chapter is a presentation of a common synergetic model of motor behavior and its application to foot tapping, walking, and running.

  16. Investigating Factorial Invariance of Latent Variables Across Populations When Manifest Variables Are Missing Completely.

    Science.gov (United States)

    Widaman, Keith F; Grimm, Kevin J; Early, Dawnté R; Robins, Richard W; Conger, Rand D

    2013-07-01

    Difficulties arise in multiple-group evaluations of factorial invariance if particular manifest variables are missing completely in certain groups. Ad hoc analytic alternatives can be used in such situations (e.g., deleting manifest variables), but some common approaches, such as multiple imputation, are not viable. At least 3 solutions to this problem are viable: analyzing differing sets of variables across groups, using pattern mixture approaches, and a new method using random number generation. The latter solution, proposed in this article, is to generate pseudo-random normal deviates for all observations for manifest variables that are missing completely in a given sample and then to specify multiple-group models in a way that respects the random nature of these values. An empirical example is presented in detail comparing the 3 approaches. The proposed solution can enable quantitative comparisons at the latent variable level between groups using programs that require the same number of manifest variables in each group.
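
    The data-preparation step of the proposed solution can be sketched as follows (variable names and sample sizes are invented; the crucial companion step, specifying the multiple-group model so it respects the random nature of the filled-in values, happens in the SEM software and is only described in a comment):

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical two-group data: group B was never administered item 'v3'.
n_a, n_b = 200, 150
group_a = {"v1": rng.normal(size=n_a),
           "v2": rng.normal(size=n_a),
           "v3": rng.normal(size=n_a)}
group_b = {"v1": rng.normal(size=n_b),
           "v2": rng.normal(size=n_b)}   # v3 missing completely

# Step 1: generate pseudo-random normal deviates for all observations of
# the completely-missing manifest variable, so both groups present the
# same number of manifest variables to the multiple-group program.
group_b["v3"] = rng.normal(size=n_b)

# Step 2 (in the SEM program, not shown): specify group B's model so the
# random column cannot distort the fit, e.g. by fixing its loading and
# freeing its intercept and unique variance in that group.
```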

  17. Children's Learning in Scientific Thinking: Instructional Approaches and Roles of Variable Identification and Executive Function

    Science.gov (United States)

    Blums, Angela

    The present study examines instructional approaches and cognitive factors involved in elementary school children's thinking and learning of the Control of Variables Strategy (CVS), a critical aspect of scientific reasoning. Previous research has identified several features related to effective instruction of CVS, including using a guided learning approach, the use of self-reflective questions, and learning in individual and group contexts. The current study examined the roles of procedural and conceptual instruction in learning CVS and investigated the role of executive function in the learning process. Additionally, this study examined how learning to identify variables is a part of the CVS process. In two studies (individual and classroom experiments), 139 third-, fourth-, and fifth-grade students participated in hands-on and paper-and-pencil CVS learning activities and, in each study, were assigned to either a procedural instruction, conceptual instruction, or control (no instruction) group. Participants also completed a series of executive function tasks. The study was carried out in two parts: Study 1 used an individual context and Study 2 was carried out in a group setting. Results indicated that procedural and conceptual instruction were more effective than no instruction, and the ability to identify variables was identified as a key piece of the CVS process. Executive function predicted the ability to identify variables and predicted success on CVS tasks. Developmental differences were present, in that older children outperformed younger children on CVS tasks, and conceptual instruction was slightly more effective for older children. Some differences between individual and group instruction were found, with those in the individual context showing some advantage over those in the group setting in learning CVS concepts. Conceptual implications about scientific thinking and practical implications for science education are discussed.

  18. Instrumentation Cables Test Plan

    Energy Technology Data Exchange (ETDEWEB)

    Muna, Alice Baca [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); LaFleur, Chris Bensdotter [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-10-01

    A fire at a nuclear power plant (NPP) has the potential to damage structures, systems, and components important to safety, if not promptly detected and suppressed. At Browns Ferry Nuclear Power Plant on March 22, 1975, a fire in the reactor building damaged electrical power and control systems. Damage to instrumentation cables impeded the function of both normal and standby reactor coolant systems, and degraded the operators' plant monitoring capability. This event resulted in additional NRC involvement with utilities to ensure that NPPs are properly protected from fire as intended by the NRC principal design criteria (i.e., General Design Criterion 3, Fire Protection). Current guidance and methods for both deterministic and performance-based approaches typically make conservative (bounding) assumptions regarding the fire-induced failure modes of instrumentation cables and those failure modes' effects on component and system response. Numerous fire testing programs have been conducted in the past to evaluate the failure modes and effects of electrical cables exposed to severe thermal conditions. However, that testing has primarily focused on control circuits, with only a limited number of tests performed on instrumentation circuits. In 2001, the Nuclear Energy Institute (NEI) and the Electric Power Research Institute (EPRI) conducted a series of cable fire tests designed to address specific aspects of the cable failure and circuit fault issues of concern. The NRC was invited to observe and participate in that program. The NRC sponsored Sandia National Laboratories to support this participation and, among other things, to add a 4-20 mA instrumentation circuit and instrumentation cabling to six of the tests. Although limited, one insight drawn from those instrumentation circuit tests was that the failure characteristics appeared to depend on the cable insulation material. The results showed that for thermoset insulated cables, the instrument reading tended to drift

  19. ASPECT OF LANGUAGE ON A QUALITATIVE ANALYSIS OF STUDENT’S EVALUATION INSTRUMENT

    Directory of Open Access Journals (Sweden)

    Ismanto Ismanto

    2016-11-01

    This article examined the characteristics of a good student evaluation instrument. There are at least two requirements that must be met: the instrument must be valid and reliable. The validity of an instrument can be seen in its ability to measure what it is supposed to measure. Evidence for the validity of an instrument may come from item content, the response process, the internal structure, relationships with other variables, and the consequences of administering the instrument. Analysis of the content, known as content validity, is a rational analysis of the domain to be measured to determine how well each item on the instrument represents the ability to be measured. Content validity involves submitting the blueprint and the items of the instrument to experts to be analyzed quantitatively and qualitatively.

  20. Some notes concerning ISABELLE electronic instrumentation

    International Nuclear Information System (INIS)

    Schwartz, M.

    1976-01-01

    A discussion is given to stimulate early thought relative to the electronic instrumentation which will be required at ISABELLE at the inception of its experimental program. This represents a unique opportunity to approach the instrumentation problem rationally, making provisions for most of the needs in the various intersection regions at minimal cost. In particular, as will be seen, it may be possible to develop a number of specialized integrated circuits which will go a long way toward facilitating the rapid acquisition and examination of data

  1. Treatment of thoracolumbar burst fractures with variable screw placement or Isola instrumentation and arthrodesis: case series and literature review.

    Science.gov (United States)

    Alvine, Gregory F; Swain, James M; Asher, Marc A; Burton, Douglas C

    2004-08-01

    The controversy of burst fracture surgical management is addressed in this retrospective case study and literature review. The series consisted of 40 consecutive patients, index included, with 41 fractures treated with stiff, limited segment transpedicular bone-anchored instrumentation and arthrodesis from 1987 through 1994. No major acute complications such as death, paralysis, or infection occurred. For the 30 fractures with pre- and postoperative computed tomography studies, spinal canal compromise was 61% and 32%, respectively. Neurologic function improved in 7 of 14 patients (50%) and did not worsen in any. The principal problem encountered was screw breakage, which occurred in 16 of the 41 (39%) instrumented fractures. As we have previously reported, transpedicular anterior bone graft augmentation significantly decreased variable screw placement (VSP) implant breakage. However, it did not prevent Isola implant breakage in two-motion segment constructs. Compared with VSP, Isola provided better sagittal plane realignment and constructs that have been found to be significantly stiffer. Unplanned reoperation was necessary in 9 of the 40 patients (23%). At 1- and 2-year follow-up, 95% and 79% of patients were available for study, and a satisfactory outcome was achieved in 84% and 79%, respectively. These satisfaction and reoperation rates are consistent with the literature of the time. Based on these observations and the loads to which implant constructs are exposed following posterior realignment and stabilization of burst fractures, we recommend that three- or four-motion segment constructs, rather than two motion, be used. To save valuable motion segments, planned construct shortening can be used. An alternative is sequential or staged anterior corpectomy and structural grafting.

  2. Design and validation of a standards-based science teacher efficacy instrument

    Science.gov (United States)

    Kerr, Patricia Reda

    National standards for K--12 science education address all aspects of science education, with their main emphasis on curriculum---both science subject matter and the process involved in doing science. Standards for science teacher education programs have been developing along a parallel plane, as is self-efficacy research involving classroom teachers. Generally, studies about efficacy have been dichotomous---basing the theoretical underpinnings on the work of either Rotter's Locus of Control theory or on Bandura's explanations of efficacy beliefs and outcome expectancy. This study brings all three threads together---K--12 science standards, teacher education standards, and efficacy beliefs---in an instrument designed to measure science teacher efficacy with items based on identified critical attributes of standards-based science teaching and learning. Based on Bandura's explanation of efficacy being task-specific and having outcome expectancy, a developmental, systematic progression from standards-based strategies and activities to tasks to critical attributes was used to craft items for a standards-based science teacher efficacy instrument. Demographic questions related to school characteristics, teacher characteristics, preservice background, science teaching experience, and post-certification professional development were included in the instrument. The instrument was completed by 102 middle level science teachers, with complete data for 87 teachers. A principal components analysis of the science teachers' responses to the instrument resulted in two components: Standards-Based Science Teacher Efficacy: Beliefs About Teaching (BAT, reliability = .92) and Standards-Based Science Teacher Efficacy: Beliefs About Student Achievement (BASA, reliability = .82). Variables that were characteristic of professional development activities, science content preparation, and school environment were identified as members of the sets of variables predicting the BAT and BASA
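
    Reliability coefficients such as the .92 and .82 reported for the BAT and BASA components are conventionally Cronbach's alpha; a minimal sketch of that computation on simulated responses (all numbers invented, not the study's data):

```python
import numpy as np

rng = np.random.default_rng(6)

# Hypothetical responses: 500 teachers answering 8 items that share one
# underlying efficacy belief.
n, k = 500, 8
belief = rng.normal(size=n)
items = 0.8 * belief[:, None] + rng.normal(size=(n, k))

def cronbach_alpha(x):
    """alpha = k/(k-1) * (1 - sum of item variances / variance of item total)."""
    n_items = x.shape[1]
    item_vars = x.var(axis=0, ddof=1)
    total_var = x.sum(axis=1).var(ddof=1)
    return n_items / (n_items - 1) * (1 - item_vars.sum() / total_var)

alpha = cronbach_alpha(items)
```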

  3. Modern spinal instrumentation. Part 1: Normal spinal implants

    International Nuclear Information System (INIS)

    Davis, W.; Allouni, A.K.; Mankad, K.; Prezzi, D.; Elias, T.; Rankine, J.; Davagnanam, I.

    2013-01-01

    The general radiologist frequently encounters studies demonstrating spinal instrumentation, either as part of the patient's postoperative evaluation or as incidental to a study performed for another purpose. There are various surgical approaches and devices used in spinal surgery, and an increased understanding of spinal and spinal-implant biomechanics drives the development of modern fixation devices. It is, therefore, important that the radiologist can recognize commonly used devices and identify their potential complications as demonstrated on imaging. The aim of part 1 of this review is to familiarize the reader with terms used to describe surgical approaches to the spine, review the function and normal appearances of commonly used instrumentation, and explain the importance of the different fixation techniques. The second part of this review will concentrate on the roles that the different imaging techniques play in assessing the instrumented spine and on the recognition of complications that can potentially occur.

  4. Fusion instrumentation and control: a development strategy

    International Nuclear Information System (INIS)

    Hsu, P.Y.; Greninger, R.C.; Longhurst, G.R.; Madden, P.

    1981-01-01

    We have examined requirements for a fusion instrumentation and control development program to determine where emphasis is needed. The complex, fast, and closely coupled system dynamics of fusion reactors reveal a need for a rigorous approach to the development of instrumentation and control systems. A framework for such a development program should concentrate on three principal need areas: the operator-machine interface, the data and control system architecture, and fusion compatible instruments and sensors. System dynamics characterization of the whole fusion reactor system is also needed to facilitate the implementation process in each of these areas. Finally, the future need to make the instrumentation and control system compatible with the requirements of a commercial plant is met by applying transition technology. These needs form the basis for the program tasks suggested

  5. Optical Methods and Instrumentation in Brain Imaging and Therapy

    CERN Document Server

    2013-01-01

    This book provides a comprehensive up-to-date review of optical approaches used in brain imaging and therapy. It covers a variety of imaging techniques including diffuse optical imaging, laser speckle imaging, photoacoustic imaging and optical coherence tomography. A number of laser-based therapeutic approaches are reviewed, including photodynamic therapy, fluorescence guided resection and photothermal therapy. Fundamental principles and instrumentation are discussed for each imaging and therapeutic technique. Represents the first publication dedicated solely to optical diagnostics and therapeutics in the brain Provides a comprehensive review of the principles of each imaging/therapeutic modality Reviews the latest advances in instrumentation for optical diagnostics in the brain Discusses new optical-based therapeutic approaches for brain diseases

  6. "Fibromyalgia and quality of life: mapping the revised fibromyalgia impact questionnaire to the preference-based instruments".

    Science.gov (United States)

    Collado-Mateo, Daniel; Chen, Gang; Garcia-Gordillo, Miguel A; Iezzi, Angelo; Adsuar, José C; Olivares, Pedro R; Gusi, Narcis

    2017-05-30

    The revised version of the Fibromyalgia Impact Questionnaire (FIQR) is one of the most widely used disease-specific questionnaires in FM studies. However, this questionnaire does not allow calculation of QALYs, as it is not a preference-based measure. The aim of this study was to develop mapping algorithms that enable FIQR scores to be transformed into utility scores for use in cost-utility analyses. A cross-sectional survey was conducted. One hundred and ninety-two Spanish women with fibromyalgia were asked to complete four general quality of life questionnaires, i.e. EQ-5D-5L, 15D, AQoL-8D and SF-12, and one disease-specific instrument, the FIQR. A direct mapping approach was adopted to derive mapping algorithms between the FIQR and each of the four multi-attribute utility (MAU) instruments. Health state utility was treated as the dependent variable in the regression analysis, whilst the FIQR score and age were predictors. The mean utility scores ranged from 0.47 (AQoL-8D) to 0.69 (15D). All correlations between the FIQR total score and the MAU instruments' utility scores were highly significant (p … fibromyalgia-specific questionnaire.
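
    The direct mapping approach reduces to a regression of utility on the FIQR total score and age. A hedged sketch on simulated data (the coefficients and data below are invented and are not the published algorithm):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 192  # matches the study's sample size, but the data here are simulated

# Hypothetical data: FIQR total score (0-100, higher = worse) and age.
fiqr = rng.uniform(0, 100, n)
age = rng.uniform(25, 70, n)
# Simulated EQ-5D-like utilities from an assumed linear model plus noise.
utility = 0.95 - 0.006 * fiqr - 0.001 * age + rng.normal(0, 0.05, n)

# Direct mapping: utility is the dependent variable; the FIQR score and
# age are the predictors, as in the paper.
X = np.column_stack([np.ones(n), fiqr, age])
beta, *_ = np.linalg.lstsq(X, utility, rcond=None)

def predict_utility(fiqr_score, age_years):
    """Map a FIQR score and age to a predicted utility for cost-utility use."""
    return beta[0] + beta[1] * fiqr_score + beta[2] * age_years
```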

  7. The wavelength frame multiplication chopper system for the ESS test beamline at the BER II reactor—A concept study of a fundamental ESS instrument principle

    International Nuclear Information System (INIS)

    Strobl, M.; Bulat, M.; Habicht, K.

    2013-01-01

    Contributing to the design update phase of the European Spallation Source (ESS), scheduled to start operation in 2019, a test beamline is under construction at the BER II research reactor at Helmholtz Zentrum Berlin (HZB). This beamline offers capabilities for experimentally testing instrument concepts viable for the ESS. The experiments envisaged at this dedicated beamline comprise testing of components as well as of novel experimental approaches and methods taking advantage of the long-pulse characteristic of the ESS source. The test beamline will therefore be equipped with a sophisticated chopper system that provides the specific time structure of the ESS and enables variable wavelength resolution via wavelength frame multiplication (WFM), a fundamental instrument concept beneficial for a number of instruments at the ESS. We describe the unique chopper system developed for these purposes, which allows constant wavelength resolution over a wide wavelength band, and we discuss the implications for the conceptual design of related instrumentation at the ESS

  8. Instrumented fusion in a 12-month-old with atlanto-occipital dislocation: case report and literature review of infant occipitocervical fusion.

    Science.gov (United States)

    Hale, Andrew T; Dewan, Michael C; Patel, Bhairav; Geck, Matthew J; Tomycz, Luke D

    2017-08-01

    The treatment of atlantoaxial dislocation in very young children is challenging and lacks a consensus management strategy. The literature on infantile occipitocervical (OC) fusion is appraised, and technical considerations are organized for ease of reference. Surgical decisions such as graft type and instrumentation details are summarized, along with the use of bone morphogenetic protein and post-operative orthoses. We present the case of a 12-month-old who underwent instrumented OC fusion in the setting of traumatic atlanto-occipital dislocation (AOD). OC arthrodesis is achievable in very young infants and children. Surgical approaches are variable and use a combination of autologous grafting and creative screw and/or wire constructs. The heterogeneity of pathologic etiologies leading to OC fusion makes it difficult to make definitive recommendations for surgical management.

  9. Shortening of an existing generic online health-related quality of life instrument for dogs.

    Science.gov (United States)

    Reid, J; Wiseman-Orr, L; Scott, M

    2017-10-11

    Development, initial validation and reliability testing of a shortened version of a web-based questionnaire instrument to measure generic health-related quality of life in companion dogs, to facilitate smartphone and online use. The original 46 items were reduced using expert judgment and factor analysis. Items were removed on the basis of item loadings and communalities on factors identified through factor analysis of responses from owners of healthy and unwell dogs, intrafactor item correlations, readability of items in the UK, USA and Australia and ability of individual items to discriminate between healthy and unwell dogs. Validity was assessed through factor analysis and a field trial using a "known groups" approach. Test-retest reliability was assessed using intraclass correlation coefficients. The new instrument comprises 22 items, each of which was rated by dog owners using a 7-point Likert scale. Factor analysis revealed a structure with four health-related quality of life domains (energetic/enthusiastic, happy/content, active/comfortable, and calm/relaxed) accounting for 72% of the variability in the data compared with 64% for the original instrument. The field test involving 153 healthy and unwell dogs demonstrated good discriminative properties and high intraclass correlation coefficients. The 22-item shortened form is superior to the original instrument and can be accessed via a mobile phone app. This is likely to increase the acceptability to dog owners as a routine wellness measure in health care packages and as a therapeutic monitoring tool. © 2017 British Small Animal Veterinary Association.
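
    The loading-and-communality screening used in item reduction can be sketched as follows (a one-factor shortcut via the leading eigenvector of the item correlation matrix, not the authors' full factor-analysis procedure; the data and threshold are invented):

```python
import numpy as np

rng = np.random.default_rng(4)
n, k = 300, 6

# Hypothetical owner ratings: items 0-4 load on one latent quality-of-life
# trait; item 5 is pure noise and should be dropped.
trait = rng.normal(size=n)
true_loadings = np.array([0.9, 0.8, 0.8, 0.7, 0.7, 0.0])
items = trait[:, None] * true_loadings + rng.normal(size=(n, k))

# One-factor loadings from the leading eigenpair of the correlation matrix.
corr = np.corrcoef(items, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(corr)           # ascending eigenvalue order
loadings = np.sqrt(eigvals[-1]) * eigvecs[:, -1]
communalities = loadings ** 2

# Screening rule: keep items whose communality clears a threshold (the
# study additionally weighed readability and discriminative ability).
keep = np.where(communalities > 0.2)[0]
```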

  10. Concept Teaching in Instrumental Music Education: A Literature Review

    Science.gov (United States)

    Tan, Leonard

    2017-01-01

    This article is a review of research literature on the teaching of concepts in instrumental music education. It is organized in four parts (a) the value of concept teaching in large instrumental ensembles, (b) time spent teaching concepts during rehearsals, (c) approaches to concept teaching, and (d) implications for music education. Research has…

  11. Instrumenting an upland research catchment in Canterbury, New Zealand to study controls on variability of soil moisture, shallow groundwater and streamflow

    Science.gov (United States)

    McMillan, Hilary; Srinivasan, Ms

    2015-04-01

    Hydrologists recognise the importance of vertical drainage and deep flow paths in runoff generation, even in headwater catchments. Both soil and groundwater stores are highly variable over multiple scales, and the distribution of water has a strong control on flow rates and timing. In this study, we instrumented an upland headwater catchment in New Zealand to measure the temporal and spatial variation in unsaturated and saturated-zone responses. In NZ, upland catchments are the source of much of the water used in lowland agriculture, but the hydrology of such catchments and their role in water partitioning, storage and transport is poorly understood. The study area is the Langs Gully catchment in the North Branch of the Waipara River, Canterbury: this catchment was chosen to be representative of the foothills environment, with lightly managed dryland pasture and native Matagouri shrub vegetation cover. Over a period of 16 months we measured continuous soil moisture at 32 locations and near-surface water table across contrasting terrain types (near-stream versus hillslope locations, and convergent versus divergent hillslopes). We found that temporal variability is strongly controlled by the climatic seasonal cycle, for both soil moisture and water table, and for both the mean and extremes of their distributions. Groundwater is a larger water storage component than soil moisture, and the difference increases with catchment wetness. The spatial standard deviation of both soil moisture and groundwater is larger in winter than in summer. It peaks during rainfall events due to partial saturation of the catchment, and also rises in spring as different locations dry out at different rates. The most important controls on spatial variability are aspect and distance from stream. South-facing and near-stream locations have higher water tables and more, larger soil moisture wetting events. Typical hydrological models do not explicitly account for aspect, but our results suggest that it is an important factor in hillslope hydrology.
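    The seasonal contrast in spatial variability reported above can be illustrated with a toy calculation on synthetic data. The sensor count matches the study's 32 locations, but all values, the seasonal cycle, and the wetness-dependent scaling of spatial contrasts are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic illustration (not the study's data): daily volumetric soil
# moisture at 32 sensor locations over roughly 16 months (~480 days).
days = np.arange(480)
seasonal = 0.25 + 0.10 * np.cos(2 * np.pi * days / 365.0)   # wetter "winter"
site_offsets = rng.normal(0.0, 0.03, size=(32, 1))           # fixed spatial contrasts
noise = rng.normal(0.0, 0.01, size=(32, 480))
# Assume spatial contrasts grow with catchment wetness (scales with the seasonal signal)
theta = seasonal + site_offsets * (seasonal / seasonal.mean()) + noise

spatial_mean = theta.mean(axis=0)   # catchment-average moisture per day
spatial_sd = theta.std(axis=0)      # spatial variability per day

wet_sd = spatial_sd[seasonal > seasonal.mean()].mean()
dry_sd = spatial_sd[seasonal <= seasonal.mean()].mean()
print(f"mean spatial SD, wet season: {wet_sd:.4f}; dry season: {dry_sd:.4f}")
```

    Under these assumptions the spatial standard deviation is larger in the wet season, mirroring the winter-versus-summer pattern the record describes.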

  12. Simple Approaches to Minimally-Instrumented, Microfluidic-Based Point-of-Care Nucleic Acid Amplification Tests

    Science.gov (United States)

    Mauk, Michael G.; Song, Jinzhao; Liu, Changchun; Bau, Haim H.

    2018-01-01

    Designs and applications of microfluidics-based devices for molecular diagnostics (Nucleic Acid Amplification Tests, NAATs) in infectious disease testing are reviewed, with emphasis on minimally instrumented, point-of-care (POC) tests for resource-limited settings. Microfluidic cartridges (‘chips’) that combine solid-phase nucleic acid extraction; isothermal enzymatic nucleic acid amplification; pre-stored, paraffin-encapsulated lyophilized reagents; and real-time or endpoint optical detection are described. These chips can be used with a companion module for separating plasma from blood through a combined sedimentation-filtration effect. Three reporter types: Fluorescence, colorimetric dyes, and bioluminescence; and a new paradigm for end-point detection based on a diffusion-reaction column are compared. Multiplexing (parallel amplification and detection of multiple targets) is demonstrated. Low-cost detection and added functionality (data analysis, control, communication) can be realized using a cellphone platform with the chip. Some related and similar-purposed approaches by others are surveyed. PMID:29495424

  13. Simple Approaches to Minimally-Instrumented, Microfluidic-Based Point-of-Care Nucleic Acid Amplification Tests

    Directory of Open Access Journals (Sweden)

    Michael G. Mauk

    2018-02-01

    Full Text Available Designs and applications of microfluidics-based devices for molecular diagnostics (Nucleic Acid Amplification Tests, NAATs) in infectious disease testing are reviewed, with emphasis on minimally instrumented, point-of-care (POC) tests for resource-limited settings. Microfluidic cartridges (‘chips’) that combine solid-phase nucleic acid extraction; isothermal enzymatic nucleic acid amplification; pre-stored, paraffin-encapsulated lyophilized reagents; and real-time or endpoint optical detection are described. These chips can be used with a companion module for separating plasma from blood through a combined sedimentation-filtration effect. Three reporter types: fluorescence, colorimetric dyes, and bioluminescence; and a new paradigm for end-point detection based on a diffusion-reaction column are compared. Multiplexing (parallel amplification and detection of multiple targets) is demonstrated. Low-cost detection and added functionality (data analysis, control, communication) can be realized using a cellphone platform with the chip. Some related and similar-purposed approaches by others are surveyed.

  14. Aversive pavlovian responses affect human instrumental motor performance.

    Science.gov (United States)

    Rigoli, Francesco; Pavone, Enea Francesco; Pezzulo, Giovanni

    2012-01-01

    In neuroscience and psychology, an influential perspective distinguishes between two kinds of behavioral control: instrumental (habitual and goal-directed) and Pavlovian. Understanding the instrumental-Pavlovian interaction is fundamental for the comprehension of decision-making. Animal studies (such as those using the negative auto-maintenance paradigm) have demonstrated that Pavlovian mechanisms can have maladaptive effects on instrumental performance. However, evidence for a similar effect in humans is scarce. In addition, the mechanisms modulating the impact of Pavlovian responses on instrumental performance are largely unknown, both in human and non-human animals. The present paper describes a behavioral experiment investigating the effects of Pavlovian conditioned responses on performance in humans, focusing on the aversive domain. Results showed that Pavlovian responses influenced human performance and, similar to animal studies, could have maladaptive effects. In particular, Pavlovian responses either impaired or increased performance depending on modulator variables such as threat distance, task controllability, punishment history, amount of training, and explicit punishment expectancy. Overall, these findings help elucidate the computational mechanisms underlying the instrumental-Pavlovian interaction, which might be at the base of apparently irrational phenomena in economics, social behavior, and psychopathology.

  15. Aversive Pavlovian responses affect human instrumental motor performance

    Directory of Open Access Journals (Sweden)

    Francesco eRigoli

    2012-10-01

    Full Text Available In neuroscience and psychology, an influential perspective distinguishes between two kinds of behavioural control: instrumental (habitual and goal-directed) and Pavlovian. Understanding the instrumental-Pavlovian interaction is fundamental for the comprehension of decision-making. Animal studies (such as those using the negative auto-maintenance paradigm) have demonstrated that Pavlovian mechanisms can have maladaptive effects on instrumental performance. However, evidence for a similar effect in humans is scarce. In addition, the mechanisms modulating the impact of Pavlovian responses on instrumental performance are largely unknown, both in human and non-human animals. The present paper describes a behavioural experiment investigating the effects of Pavlovian conditioned responses on performance in humans, focusing on the aversive domain. Results showed that Pavlovian responses influenced human performance and, similar to animal studies, could have maladaptive effects. In particular, Pavlovian responses either impaired or increased performance depending on modulator variables such as threat distance, task controllability, punishment history, amount of training, and explicit punishment expectancy. Overall, these findings help elucidate the computational mechanisms underlying the instrumental-Pavlovian interaction, which might be at the base of apparently irrational phenomena in economics, social behaviour, and psychopathology.

  16. Investigation on Motorcyclist Riding Behaviour at Curve Entry Using Instrumented Motorcycle

    Science.gov (United States)

    Yuen, Choon Wah; Karim, Mohamed Rehan; Saifizul, Ahmad

    2014-01-01

    This paper details the study on the changes in riding behaviour, such as changes in speed as well as the brake force and throttle force applied, when motorcyclists ride over a curve section road using an instrumented motorcycle. In this study, an instrumented motorcycle equipped with various types of sensors, on-board cameras, and data loggers was developed in order to collect the riding data on the study site. Results from the statistical analysis showed that riding characteristics, such as changes in speed, brake force, and throttle force applied, are influenced by the distance from the curve entry, riding experience, and travel mileage of the riders. Structural equation modeling was used to study the impact of these variables on the change of riding behaviour in the curve entry section. Four regression equations were formed to study the relationship between four dependent variables — speed, throttle force, front brake force, and rear brake force applied — and the independent variables. PMID:24523660
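    The four regression equations mentioned above can be sketched, hypothetically, as ordinary least-squares fits on synthetic riding data. The paper's actual structural equation model and data are not reproduced here; plain OLS stands in, and all variable names, units, and coefficients are invented:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200

# Synthetic stand-ins for the study's predictors (hypothetical units)
dist_to_curve = rng.uniform(0, 100, n)    # metres before curve entry
experience = rng.uniform(1, 20, n)        # years of riding
mileage = rng.uniform(5, 50, n)           # thousand km per year

def simulate(b0, b_dist, b_exp, b_mil, sigma):
    return (b0 + b_dist * dist_to_curve + b_exp * experience
            + b_mil * mileage + rng.normal(0, sigma, n))

# Four dependent variables, as in the record's four regression equations
outcomes = {
    "speed":       simulate(40.0,  0.15,  0.3,   0.1,  2.0),
    "throttle":    simulate(10.0,  0.05,  0.1,   0.02, 1.0),
    "front_brake": simulate(30.0, -0.10, -0.2,  -0.05, 2.0),
    "rear_brake":  simulate(20.0, -0.08, -0.1,  -0.03, 2.0),
}

# One OLS fit per dependent variable: y = b0 + b1*dist + b2*exp + b3*mileage
X = np.column_stack([np.ones(n), dist_to_curve, experience, mileage])
coefs = {name: np.linalg.lstsq(X, y, rcond=None)[0] for name, y in outcomes.items()}
for name, b in coefs.items():
    print(name, np.round(b, 3))
```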

  17. Applying neural networks to optimize instrumentation performance

    Energy Technology Data Exchange (ETDEWEB)

    Start, S.E.; Peters, G.G.

    1995-06-01

    Well calibrated instrumentation is essential in providing meaningful information about the status of a plant. Signals from plant instrumentation frequently have inherent non-linearities, may be affected by environmental conditions and can therefore cause calibration difficulties for the people who maintain them. Two neural network approaches are described in this paper for improving the accuracy of a non-linear, temperature-sensitive level probe used in Experimental Breeder Reactor II (EBR-II) that was difficult to calibrate.
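    A minimal sketch of the idea, assuming a generic one-hidden-layer network trained by gradient descent on synthetic probe data. This is not the EBR-II model or data; the probe's non-linearity and temperature drift are invented, and the network maps (raw reading, temperature) to the true level:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic calibration problem: a probe whose raw reading depends
# non-linearly on the true level and drifts with temperature.
n = 2000
level = rng.uniform(0.0, 1.0, n)
temp = rng.uniform(0.0, 1.0, n)          # normalized temperature
raw = level + 0.2 * level**2 + 0.1 * temp * level + 0.01 * rng.normal(size=n)

X = np.column_stack([raw, temp])          # inputs: raw reading + temperature
y = level                                 # target: true level

# One-hidden-layer network trained by plain full-batch gradient descent
W1 = rng.normal(0, 0.5, (2, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.5, (16, 1)); b2 = np.zeros(1)
lr = 0.1
for _ in range(3000):
    h = np.tanh(X @ W1 + b1)              # hidden activations (n, 16)
    pred = (h @ W2 + b2).ravel()          # predicted level (n,)
    err = pred - y
    gpred = (2.0 / n) * err[:, None]      # gradient of MSE w.r.t. pred
    gW2 = h.T @ gpred; gb2 = gpred.sum(0)
    gh = gpred @ W2.T * (1 - h**2)        # backprop through tanh
    gW1 = X.T @ gh; gb1 = gh.sum(0)
    W2 -= lr * gW2; b2 -= lr * gb2; W1 -= lr * gW1; b1 -= lr * gb1

rmse_net = np.sqrt(np.mean((pred - y) ** 2))
rmse_raw = np.sqrt(np.mean((raw - y) ** 2))   # naive "reading = level" calibration
print(f"RMSE raw: {rmse_raw:.4f}, RMSE net: {rmse_net:.4f}")
```

    The point of the sketch is only that a small network can learn the temperature-dependent correction that a fixed calibration curve cannot.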

  18. Applying neural networks to optimize instrumentation performance

    International Nuclear Information System (INIS)

    Start, S.E.; Peters, G.G.

    1995-01-01

    Well calibrated instrumentation is essential in providing meaningful information about the status of a plant. Signals from plant instrumentation frequently have inherent non-linearities, may be affected by environmental conditions and can therefore cause calibration difficulties for the people who maintain them. Two neural network approaches are described in this paper for improving the accuracy of a non-linear, temperature-sensitive level probe used in Experimental Breeder Reactor II (EBR-II) that was difficult to calibrate.

  19. Determination of acid ionization constants for weak acids by osmometry and the instrumental analysis self-evaluation feedback approach to student preparation of solutions

    Science.gov (United States)

    Kakolesha, Nyanguila

    One focus of this work was to develop an alternative to conductivity methods for determining acid ionization constants. Computer-controlled osmometry is one of the emerging analytical tools in industrial research and clinical laboratories, and it is slowly finding its way into chemistry laboratories. The instrument's microprocessor control ensures shortened data collection time, repeatability, accuracy, and automatic calibration. The equilibrium constants of acetic acid, chloroacetic acid, bromoacetic acid, cyanoacetic acid, and iodoacetic acid have been measured using osmometry and their values compared with the existing literature values obtained, usually, from conductometric measurements. Ionization constants determined by osmometry for the moderately strong weak acids were in reasonably good agreement with literature values. The results showed that two factors, the ionic strength and the osmotic coefficient, exert opposite effects in solutions of such weak acids. Another focus of the work was analytical chemistry students' solution-preparation skills. The prevailing teacher-structured experiments leave little room for students' ingenuity in quantitative volumetric analysis. The purpose of this part of the study was to improve students' skills in making solutions using instrument feedback in a constructivist learning model. After making solutions by weighing and dissolving solutes or by serial dilution, students used the spectrophotometer and the osmometer to compare their solutions with standard solutions. Students perceived the instrument feedback as a nonthreatening approach to monitoring the development of their skill levels and liked to clarify their understanding through interacting with an instructor-observer. An assessment of the instrument feedback and the constructivist model indicated that students would assume responsibility for their own learning if given the opportunity. This study involved 167 students enrolled in Quantitative Chemical Analysis.
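    The link between an osmometer reading and an acid ionization constant can be sketched under an ideal-solution assumption (no osmotic-coefficient or ionic-strength corrections, which the study shows act in opposite directions). For a monoprotic weak acid HA at formal concentration C, the van 't Hoff factor is i = 1 + α, so a measured osmolality of i·C reveals the degree of dissociation α, and Ka = α²C/(1 − α). The acid and concentration below are hypothetical round-trip inputs:

```python
import math

def ka_from_osmolality(osmolality, C):
    """Ideal-solution estimate of Ka for a monoprotic weak acid HA."""
    alpha = osmolality / C - 1.0          # degree of dissociation, i = 1 + alpha
    return alpha**2 * C / (1.0 - alpha)

# Round-trip check with an acetic-acid-like Ka = 1.8e-5 at C = 0.1 M.
C = 0.1
Ka_true = 1.8e-5
# Solve alpha from Ka: C*alpha**2 + Ka*alpha - Ka = 0 (quadratic in alpha)
alpha = (-Ka_true + math.sqrt(Ka_true**2 + 4 * Ka_true * C)) / (2 * C)
osm = (1 + alpha) * C                     # what an ideal osmometer would read
print(f"alpha = {alpha:.5f}, recovered Ka = {ka_from_osmolality(osm, C):.3e}")
```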

  20. Two-port laparoscopic ovarian cystectomy using 3-mm instruments

    Directory of Open Access Journals (Sweden)

    Naoyuki Yoshiki

    2016-05-01

    Conclusion: Two-port laparoscopic ovarian cystectomy using 3-mm instruments is a feasible and safe approach by which surgeons expert in conventional multiport laparoscopy achieve minimally invasive surgery with low morbidity and a low rate of conversion to the conventional approach.

  1. High-Performance Operational and Instrumentation Amplifiers

    NARCIS (Netherlands)

    Shahi, B.

    2015-01-01

    This thesis describes techniques to reduce the offset error in precision instrumentation and operational amplifiers. The offset error, which is considered a major error source associated with gain blocks, is reviewed together with other errors. Conventional and newer approaches to removing offset and other errors are then discussed.

  2. STATUS SOSIAL EKONOMI DAN FERTILITAS: A Latent Variable Approach

    Directory of Open Access Journals (Sweden)

    Suandi -

    2012-11-01

    Full Text Available The main problems faced by developing countries, including Indonesia, are not only economic problems that tend to harm, but also persistently high fertility rates. The purpose of this paper is to examine the relationship between socioeconomic status and the level of fertility through a latent variable approach. The study adopts an economic-development approach to fertility. Economic development is framed by the theories of Malthus: an increase in income is slower than the increase in births (fertility), and this is the root of people falling into poverty. However, Becker made a linkage model of the influence of children, income and price. According to Becker, viewed from the demand side, the price effect of children is greater than the income effect. The study shows that (1) level of education correlates positively with income and negatively affects fertility; (2) the age structure of women (contraceptive control) adversely affects fertility — that is, the older the age, the lower individual productivity and fertility become; and (3) husband's employment status correlates positively with earnings (income). Permanent income, or household income, has a negative influence on fertility. There are differences in the value orientation of children between advanced (rich) and backward (poor) societies. For the poor, the value of children lies more in production: emphasis is placed on the number of children owned (quantity), and children born to the poor are expected to help their parents at retirement age or when no longer productive, providing economic support, security, and social insurance. In developed (rich) societies, children have more consumption value, and the quality of the child matters more.

  3. Approaches to and Treatment Strategies for Playing-Related Pain Problems Among Czech Instrumental Music Students: An Epidemiological Study.

    Science.gov (United States)

    Ioannou, Christos I; Altenmüller, Eckart

    2015-09-01

    The current study examined the severity of playing-related pain (PRP) problems among music students at the Prague State Conservatoire, as well as the various treatment methods used by these students and how they approach and deal with these phenomena while studying. In total, 180 instrumental students participated and completed a paper questionnaire. Of these, 88.9% reported that they had experienced PRP at least once in their lives, with 12.6% experiencing pain every time they play. The onset of PRP seemed to coincide with the transition period on entry to the conservatoire and was associated with the increase in hours of practice. Specific body regions associated with playing each particular instrument were most frequently affected, with females being more susceptible than males to the development of PRP. An alarming 35% of the affected students tended not to seek help at all, whereas those who did tended to seek advice first from their instrument tutor and second from medical doctors. Most students who visited doctors reported that medical treatments only partially helped them to overcome PRP problems. The most frequent treatment methods used were resting, gel or creams, and physical exercises. Students believed that inappropriate posture played a key role in the development of their PRP problems. Finally, students indicated a willingness to be aware of and educated about PRP issues during their studies. Further exploration of PRP problems among student musicians is warranted. Better understanding of differing attitudes toward, use of, and efficiency of various treatment methods after the occurrence of PRPs will provide additional insight for prevention and treatment.

  4. Sharp or broad pulse peak for high resolution instruments? Choice of moderator performance

    International Nuclear Information System (INIS)

    Arai, M.; Watanabe, N.; Teshigawara, M.

    2001-01-01

    We demonstrate a concept of how moderator performance should be chosen to realize the required performance for instruments. A neutron burst pulse can be characterized by its peak intensity, peak width and tail. These can be controlled in the moderator design, i.e. material, temperature, shape, decoupling, poisoning and the use of a premoderator. Hence there is a large number of variable parameters to be determined. Here we discuss the required moderator performance for some typical examples, i.e. a high resolution powder instrument, a chopper instrument, and a high resolution backscattering machine. (author)

  5. Transgressive or Instrumental?

    DEFF Research Database (Denmark)

    Chemi, Tatiana

    2018-01-01

    Contemporary practices that connect the arts with learning are widespread at all level of educational systems and in organisations, but they include very diverse approaches, multiple methods and background values. Regardless of explicit learning benefits, the arts/learning partnerships bring about...... creativity and the other on practices of arts-integration. My final point rests on the belief that the opposition of transgression and instrumentality is a deceiving perspective on the arts, against the background of the aesthetic plurality and hybridity....

  6. Transient response of level instruments in a research reactor

    International Nuclear Information System (INIS)

    Cheng, Lap Y.

    1989-01-01

    A numerical model has been developed to simulate the dynamics of water level instruments in a research nuclear reactor. A bubbler device, with helium gas as the working fluid, is used to monitor liquid level by sensing the static head pressure due to the height of liquid in the reactor vessel. A finite-difference model is constructed to study the transient response of the water level instruments to pressure perturbations. The field equations which describe the hydraulics of the helium gas in the bubbler device are arranged in the form of a tridiagonal matrix and the field variables are solved at each time step by the Thomas algorithm. Simulation results indicate that the dynamic response of the helium gas depends mainly on the volume and the inertia of the gas in the level instrument tubing. The anomalies in the simulated level indication are attributed to the inherent lag in the level instrument due to the hydraulics of the system. 1 ref., 5 figs
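    The Thomas algorithm named in the record is the standard O(n) forward-elimination/back-substitution solver for tridiagonal systems. A generic implementation (not the paper's hydraulics model; the small test system below is an arbitrary diffusion-like matrix) looks like:

```python
import numpy as np

def thomas(a, b, c, d):
    """Solve a tridiagonal system A x = d.
    a: sub-diagonal (n-1), b: diagonal (n), c: super-diagonal (n-1)."""
    n = len(b)
    cp = np.zeros(n - 1)
    dp = np.zeros(n)
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    for i in range(1, n):                 # forward elimination
        m = b[i] - a[i - 1] * cp[i - 1]
        if i < n - 1:
            cp[i] = c[i] / m
        dp[i] = (d[i] - a[i - 1] * dp[i - 1]) / m
    x = np.zeros(n)
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):        # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# Check against a dense solve on a small diagonally dominant system
n = 6
a = -np.ones(n - 1); c = -np.ones(n - 1); b = 2.5 * np.ones(n)
d = np.arange(1.0, n + 1)
A = np.diag(b) + np.diag(a, -1) + np.diag(c, 1)
x = thomas(a, b, c, d)
print(np.allclose(A @ x, d))
```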

  7. Inspector-instrument interface in portable NDA instrumentation

    International Nuclear Information System (INIS)

    Halbig, J.K.; Klosterbuer, S.F.

    1981-01-01

    Recent electronics technology advances make it possible to design sophisticated instruments in small packages for convenient field implementation. An inspector-instrument interface design that allows communication of procedures, responses, and results between the instrument and user is presented. This capability has been incorporated into new spent-fuel instrumentation and a battery-powered multichannel analyzer

  8. Inspector-instrument interface in portable NDA instrumentation

    International Nuclear Information System (INIS)

    Halbig, J.K.; Klosterbuer, S.F.

    1981-01-01

    Recent electronics technology advances make it possible to design sophisticated instruments in small packages for convenient field implementation. This report describes an inspector-instrument interface design which allows communication of procedures, responses, and results between the instrument and user. The interface has been incorporated into new spent-fuel instrumentation and a battery-powered multichannel analyzer

  9. Seismic qualification of safety-related instrumentation cabinets for nuclear generating stations

    International Nuclear Information System (INIS)

    Sauve, R.G.; Bell, R.P.; Cuttler, J.M.

    1979-06-01

    The problem of seismically qualifying different electrical instruments mounted in cabinets of a standard design is discussed and the following economical approach is described in detail. An analytical model of the cabinet structure is developed and validated by comparison with the results of shake table tests on a prototype cabinet. Modal analysis is then used to calculate the input spectra for shake table tests to qualify the individual instruments that are mounted at the required elevations in the cabinet. The worst input spectrum, appropriate to qualify each instrument, is then specified to the suppliers. This approach avoids the need to test each cabinet configuration in an assembled state in order to qualify it. (auth)

  10. Policy instruments for energy conservation: A multidimensional assessment

    International Nuclear Information System (INIS)

    Giraudet, Louis-Gaetan

    2011-01-01

    This thesis evaluates the effectiveness of various forms of taxes, subsidies and regulations implemented to correct for market failures that may prevent energy savings. It builds on various approaches, with gradual complexity. First, a standard micro-economic model is developed to compare the static performances of these instruments. Second, the representation of consumer behaviour is strengthened in a model of the residential energy demand for space heating in France, which makes it possible to identify the dynamic mechanisms by which instruments can correct for the main market failures. Third, an empirical evaluation of 'white certificate' schemes - tradable energy saving obligations imposed on energy operators - is made from the comparison between the British, Italian and French experiences, taking into account national institutions. The following conclusions can be drawn from these various approaches: (i) energy taxes, by encouraging both energy efficiency investment and sufficiency behaviour, are particularly effective; (ii) energy efficiency regulations have a significant impact on the diffusion of efficient technologies; (iii) subsidies to energy efficiency induce a large rebound effect; (iv) depending on the institutional environment in which they operate, white certificate schemes combine different properties of these instruments. Applied to the French residential building sector, the most effective combination of these instruments does not suffice to reach the energy saving target set by the Government. (author)

  11. Developing an instrument to identify MBChB students' approaches ...

    African Journals Online (AJOL)

    The constructs of deep, surface and achieving approaches to learning are well defined in the literature and amply supported by research. Quality learning results from a deep approach to learning, and a deep-achieving approach to learning is regarded as the most adaptive approach institutionally. It is therefore felt that ...

  12. Validation of a Job Satisfaction Instrument for Residential-Care Employees.

    Science.gov (United States)

    Sluyter, Gary V.; Mukherjee, Ajit K.

    1986-01-01

    A new job satisfaction instrument for employees of a residential care facility for mentally retarded persons effectively measures the employees' satisfaction with 12 work-related variables: salary, company policies, supervision, working conditions, interpersonal relations, security, advancement, recognition, achievement, work responsibility, and…

  13. Process instrumentation for nuclear power station

    International Nuclear Information System (INIS)

    Yanai, Katsuya; Shinohara, Katsuhiko

    1978-01-01

    Nuclear power stations are large-scale compound systems composed of many process systems. Accordingly, for safe and highly reliable operation of the plants, it is necessary to grasp the conditions of the respective processes exactly and to control operation correctly. For this purpose, process instrumentation undertakes the important function of monitoring plant operation. Hitachi Ltd. has long worked to establish the basic technology for process instrumentation in nuclear power stations, to develop and improve hardware of high reliability, and to establish a quality control system. The features of process instrumentation in nuclear power stations include the enormous quantity of measurement, the diversity of measured variables, the remote measurement and monitoring methods, and the need to ensure high reliability. The hardware must also withstand earthquakes, loss-of-coolant accidents, radiation, leaks and fires. The Hitachi Unitrol Sigma Series is a measurement system suitable for general process instrumentation in nuclear power stations that satisfies these basic requirements. It has various features for nuclear energy systems, such as high reliability through the use of ICs, calculation and transmission methods that consider signal linkage, a loop-controller architecture and small size. The HIACS-1000 Series is a highly reliable analog controller for water control. (Kako, I.)

  14. Economic instruments for environmental policy making in Ontario

    International Nuclear Information System (INIS)

    Barg, S.; Duraiappah, A.; Van Exan, S.

    2000-01-01

    The conditions and approaches required for a successful implementation of economic instruments in Ontario are reviewed. The advantages and disadvantages of economic instruments are discussed, as are some design issues. Some best practices and practical experiences from Canada, the United States, and Europe are examined through the use of nine specific case studies. Each one highlights a different environmental challenge, such as energy efficiency, air pollution, water pollution, and waste management, along with the solutions that were implemented. The situations described were not all successful, but there is much to be learned from unsuccessful episodes. Lessons learned from the review of the case studies are presented, and the points to ponder when using economic instruments in Ontario are highlighted. The command and control policy instrument must be kept in context when considering economic instruments. The reasons that underlie the economic theory's preference for economic instruments are discussed. The different types of economic instruments are described, and the considerations related to the design and comparison of economic instruments are briefly discussed. The authors conclude with several points: there are a number of options available, details must not be neglected, consultation with the interested parties is important, there is a need for frequent reassessment, and using a number of instruments is helpful. 55 refs., tabs., figs

  15. Psychological variables involved in teacher’s job performance

    OpenAIRE

    Torres Valladares, Manuel; Lajo Lazo, Rosario

    2014-01-01

    The purpose of this study is to analyze the causal relations that may exist between some psychological variables (Type A personality, stress coping, and burnout syndrome) and the job performance of university teachers from five faculties of medicine of Lima Metropolitana. The instruments used were: Blumenthal's self-report inventory of Type A behaviour, the COPE inventory, Maslach's Burnout Inventory, and the teacher job performance scale developed by Manuel Fernández Arata. All these instruments were subj...

  16. Variable speed wind turbine control by discrete-time sliding mode approach.

    Science.gov (United States)

    Torchani, Borhen; Sellami, Anis; Garcia, Germain

    2016-05-01

    The aim of this paper is to propose a new design for variable speed wind turbine control by a discrete-time sliding mode approach. The methodology is designed for linear saturated systems, with the saturation constraint imposed on the input vector. To this end, a backstepping design procedure is followed to construct a suitable sliding manifold that guarantees the attainment of the stabilization control objective. The mechanism is modelled under the usual assumptions for the damping, shaft stiffness and inertia effects of the gear. The objectives are to synthesize robust controllers that maximize the energy extracted from the wind while reducing mechanical loads, with rotor speed tracking achieved through the electromagnetic torque. Simulation results of the proposed scheme are presented. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.
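    A toy discrete-time sliding mode regulator with input saturation can illustrate the general scheme. This uses a double integrator rather than the paper's wind turbine model, and all gains, the boundary-layer width, and the saturation limit are invented:

```python
import numpy as np

# Minimal discrete-time sliding-mode regulator for a double integrator
# (a stand-in for drive-train dynamics; gains are illustrative only).
dt = 0.01
c, K, u_max = 2.0, 5.0, 10.0      # surface slope, switching gain, input limit

x, v = 1.0, 0.0                   # initial position error and velocity
for _ in range(2000):
    s = c * x + v                 # sliding variable on the error dynamics
    # boundary-layer (saturated) switching law to limit chattering
    u = -K * np.clip(s / 0.05, -1.0, 1.0)
    u = np.clip(u, -u_max, u_max) # actuator saturation constraint
    v += u * dt                   # discrete double-integrator update
    x += v * dt

print(f"final |error| = {abs(x):.4f}")
```

    The state first reaches the manifold s = 0 under the switching law, then slides toward the origin; the `clip` on s/0.05 is the usual boundary-layer smoothing of the sign function.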

  17. Musical Sound, Instruments, and Equipment

    Science.gov (United States)

    Photinos, Panos

    2017-12-01

    'Musical Sound, Instruments, and Equipment' offers a basic understanding of sound, musical instruments and music equipment, geared towards a general audience and non-science majors. The book begins with an introduction of the fundamental properties of sound waves, and the perception of the characteristics of sound. The relation between intensity and loudness, and the relation between frequency and pitch are discussed. The basics of propagation of sound waves, and the interaction of sound waves with objects and structures of various sizes are introduced. Standing waves, harmonics and resonance are explained in simple terms, using graphics that provide a visual understanding. The development is focused on musical instruments and acoustics. The construction of musical scales and the frequency relations are reviewed and applied in the description of musical instruments. The frequency spectrum of selected instruments is explored using freely available sound analysis software. Sound amplification and sound recording, including analog and digital approaches, are discussed in two separate chapters. The book concludes with a chapter on acoustics, the physical factors that affect the quality of the music experience, and practical ways to improve the acoustics at home or small recording studios. A brief technical section is provided at the end of each chapter, where the interested reader can find the relevant physics and sample calculations. These quantitative sections can be skipped without affecting the comprehension of the basic material. Questions are provided to test the reader's understanding of the material. Answers are given in the appendix.
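    The kind of spectrum exploration the book describes can be reproduced in a few lines: synthesize a tone with harmonics and locate the fundamental in its FFT magnitude spectrum. The 220 Hz fundamental, sample rate, and harmonic amplitudes below are arbitrary choices, not examples from the book:

```python
import numpy as np

# Toy spectrum analysis of a synthesized "instrument" tone: a 220 Hz
# fundamental plus two weaker harmonics at 440 Hz and 660 Hz.
fs = 8000                      # sample rate (Hz)
t = np.arange(0, 1.0, 1 / fs)  # one second of samples
tone = (1.0 * np.sin(2 * np.pi * 220 * t)
        + 0.5 * np.sin(2 * np.pi * 440 * t)
        + 0.25 * np.sin(2 * np.pi * 660 * t))

spectrum = np.abs(np.fft.rfft(tone))        # magnitude spectrum
freqs = np.fft.rfftfreq(len(tone), 1 / fs)  # bin frequencies (Hz)
fundamental = freqs[np.argmax(spectrum)]    # strongest component -> pitch
print(f"strongest component: {fundamental:.1f} Hz")
```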

  18. Infectious complications in head and neck cancer patients treated with cetuximab: propensity score and instrumental variable analysis.

    Directory of Open Access Journals (Sweden)

    Ching-Chih Lee

    Full Text Available BACKGROUND: To compare the infection rates between cetuximab-treated patients with head and neck cancers (HNC) and untreated patients. METHODOLOGY: A national cohort of 1083 HNC patients identified in 2010 from the Taiwan National Health Insurance Research Database was established. After patients were followed for one year, propensity score analysis and instrumental variable analysis were performed to assess the association between cetuximab therapy and infection rates. RESULTS: HNC patients receiving cetuximab (n = 158) were older, had lower SES, and resided more frequently in rural areas as compared to those without cetuximab therapy. In total, 125 patients presented infections: 32 (20.3%) in the group using cetuximab and 93 (10.1%) in the group not using it. The propensity score analysis revealed a 2.3-fold (adjusted odds ratio [OR] = 2.27; 95% CI, 1.46-3.54; P = 0.001) increased risk for infection in HNC patients treated with cetuximab. However, using IVA, the average treatment effect of cetuximab was not statistically associated with increased risk of infection (OR, 0.87; 95% CI, 0.61-1.14). CONCLUSIONS: Cetuximab therapy was not statistically associated with infection rate in HNC patients. However, older HNC patients using cetuximab may incur up to a 33% infection rate during one year. Particular attention should be given to older HNC patients treated with cetuximab.
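    The logic behind the instrumental variable analysis in this record — and in this topic generally — can be sketched with a synthetic two-stage least squares (2SLS) example, where an unobserved confounder biases the naive regression but not the IV estimate. All data and coefficients are simulated; none come from the study:

```python
import numpy as np

# Schematic 2SLS: Z (instrument) shifts treatment T but affects outcome Y
# only through T; U is an unobserved confounder of T and Y.
rng = np.random.default_rng(3)
n = 50_000
U = rng.normal(size=n)                       # unobserved confounder
Z = rng.binomial(1, 0.5, n).astype(float)    # instrument
T = (0.8 * Z + 0.8 * U + rng.normal(size=n) > 0.5).astype(float)
Y = 1.0 * T + 2.0 * U + rng.normal(size=n)   # true treatment effect = 1.0

# Naive OLS of Y on T is biased upward because U raises both T and Y
X = np.column_stack([np.ones(n), T])
naive = np.linalg.lstsq(X, Y, rcond=None)[0][1]

# Stage 1: predict T from Z; Stage 2: regress Y on the predicted T
XZ = np.column_stack([np.ones(n), Z])
T_hat = XZ @ np.linalg.lstsq(XZ, T, rcond=None)[0]
X2 = np.column_stack([np.ones(n), T_hat])
iv = np.linalg.lstsq(X2, Y, rcond=None)[0][1]

print(f"naive OLS estimate: {naive:.2f}, 2SLS estimate: {iv:.2f}")
```

    The predicted treatment carries only the instrument-driven variation in T, which is independent of U, so the second-stage slope recovers the true effect.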

  19. Squeezing more information out of time variable gravity data with a temporal decomposition approach

    DEFF Research Database (Denmark)

    Barletta, Valentina Roberta; Bordoni, A.; Aoudia, A.

    2012-01-01

    A measure of the Earth's gravity contains contributions from the solid Earth as well as climate-related phenomena, which cannot be easily distinguished in time or space. After more than 7 years, the GRACE gravity data available now support more elaborate analysis of the time series. We propose an explorative approach based on a suitable time series decomposition, which does not rely on predefined time signatures. The comparison and validation against the fitting approach commonly used in the GRACE literature shows a very good agreement for what concerns trends and periodic signals. The decomposition is then used to assess the possibility of finding evidence of meaningful geophysical signals different from hydrology over Africa in GRACE data. In this case we conclude that hydrological phenomena are dominant and so time variable gravity data in Africa can be directly used to calibrate hydrological models.
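    The "fitting approach commonly used in GRACE literature" that the decomposition is validated against is typically a least-squares fit of bias, trend, and periodic terms to each time series. A minimal sketch on synthetic monthly data (trend and annual amplitudes are invented, not GRACE values):

```python
import numpy as np

rng = np.random.default_rng(4)
t = np.arange(84) / 12.0                      # 7 years of monthly epochs (years)
true_trend = -1.5                             # e.g. cm/yr equivalent water height
series = (true_trend * t + 3.0 * np.sin(2 * np.pi * t)
          + 1.0 * np.cos(2 * np.pi * t) + 0.3 * rng.normal(size=t.size))

# Design matrix: bias + linear trend + annual sine/cosine pair
A = np.column_stack([np.ones_like(t), t,
                     np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])
bias, trend, s1, c1 = np.linalg.lstsq(A, series, rcond=None)[0]
amplitude = np.hypot(s1, c1)                  # annual amplitude from the pair
print(f"trend = {trend:.2f}/yr, annual amplitude = {amplitude:.2f}")
```

    The explorative decomposition in the record differs precisely in that it does not impose these predefined time signatures, but this fit is the baseline it is compared against.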

  20. Physical Interactions with Digital Strings - A hybrid approach to a digital keyboard instrument

    DEFF Research Database (Denmark)

    Dahlstedt, Palle

    2017-01-01

    A hybrid approach to a digital keyboard instrument is presented, combining a physical keyboard with a digital string model capable of stopping and muting the strings at arbitrary positions. The parameters of the string model are controlled through TouchKeys multitouch sensors on each key, combined with MIDI data and acoustic signals from the digital keyboard frame, using a novel mapping. The instrument is evaluated from a performer's perspective, with a focus on the sense of control. The contributions are two-fold. First, the use of acoustic sounds from a physical keyboard for excitations and resonances results in a novel hybrid keyboard instrument in itself. Second, the digital model of "inside piano" playing, using multitouch keyboard data, allows for extended performance techniques.

  1. Microfluidic Apps for off-the-shelf instruments.

    Science.gov (United States)

    Mark, Daniel; von Stetten, Felix; Zengerle, Roland

    2012-07-21

    Within the last decade a huge increase in research activity in microfluidics could be observed. However, despite several commercial success stories, microfluidic chips are still not sold in high numbers in mass markets so far. Here we promote a new concept that could be an alternative approach to commercialization: designing microfluidic chips for existing off-the-shelf instruments. Such "Microfluidic Apps" could significantly lower market entry barriers and provide many advantages: developers of microfluidic chips make use of existing equipment or platforms and do not have to develop instruments from scratch; end-users can profit from microfluidics without the need to invest in new equipment; instrument manufacturers benefit from an expanded customer base due to the new applications that can be implemented in their instruments. Microfluidic Apps could be considered as low-cost disposables which can easily be distributed globally via web-shops. Therefore they could be a door-opener for high-volume mass markets.

  2. Identifying the most informative variables for decision-making problems – a survey of recent approaches and accompanying problems

    Czech Academy of Sciences Publication Activity Database

    Pudil, Pavel; Somol, Petr

    2008-01-01

    Roč. 16, č. 4 (2008), s. 37-55 ISSN 0572-3043 R&D Projects: GA MŠk 1M0572 Grant - others:GA MŠk(CZ) 2C06019 Institutional research plan: CEZ:AV0Z10750506 Keywords : variable selection * decision making Subject RIV: BD - Theory of Information http://library.utia.cas.cz/separaty/2008/RO/pudil-identifying%20the%20most%20informative%20variables%20for%20decision-making%20problems%20a%20survey%20of%20recent%20approaches%20and%20accompanying%20problems.pdf

  3. A socio-cultural instrumental approach to emotion regulation: Culture and the regulation of positive emotions.

    Science.gov (United States)

    Ma, Xiaoming; Tamir, Maya; Miyamoto, Yuri

    2018-02-01

    We propose a sociocultural instrumental approach to emotion regulation. According to this approach, cultural differences in the tendency to savor rather than dampen positive emotions should be more pronounced when people are actively pursuing goals (i.e., contexts requiring higher cognitive effort) than when they are not (i.e., contexts requiring lower cognitive efforts), because cultural beliefs about the utility of positive emotions should become most relevant when people are engaging in active goal pursuit. Four studies provided support for our theory. First, European Americans perceived more utility and less harm of positive emotions than Japanese did (Study 1). Second, European Americans reported a stronger relative preference for positive emotions than Asians, but this cultural difference was larger in high cognitive effort contexts than in moderate or low cognitive effort contexts (Study 2). Third, European Americans reported trying to savor rather than dampen positive emotions more than Asians did when preparing to take an exam, a typical high cognitive effort context (Studies 3-4), but these cultural differences were attenuated when an exam was not expected (Study 3) and disappeared when participants expected to interact with a stranger (Study 4). These findings suggest that cultural backgrounds and situational demands interact to shape how people regulate positive emotions. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  4. Musical Instrument Classification Based on Nonlinear Recurrence Analysis and Supervised Learning

    Directory of Open Access Journals (Sweden)

    R. Rui

    2013-04-01

    Full Text Available In this paper, the phase space reconstruction of time series produced by different instruments is discussed based on nonlinear dynamic theory. The dense ratio, a novel quantitative recurrence parameter, is proposed to describe the difference between wind instruments, stringed instruments and keyboard instruments in the phase space by analyzing the recursive property of every instrument. Furthermore, a novel supervised learning algorithm for automatic classification of individual musical instrument signals is addressed, derived from the idea of the supervised non-negative matrix factorization (NMF) algorithm. In our approach, the orthogonal basis matrix can be obtained without updating the matrix iteratively, which NMF is unable to do. The experimental results indicate that the accuracy of the proposed method is improved by 3% compared with conventional features in individual instrument classification.

  5. Alternative approaches to pollution control and waste management: Regulatory and economic instruments

    International Nuclear Information System (INIS)

    Bernstein, J.D.

    1993-01-01

    The purpose of the paper is to present an overview of the most common strategies and policy instruments (that is, regulatory and economic) used in developed and developing countries to achieve pollution control and waste management objectives. Although this topic has been at the center of theoretical controversy both within and outside the World Bank, the paper is not intended to contribute to this debate. Rather, its purpose is to explore how regulatory and economic instruments are used to control air and water pollution, protect ground water, and manage solid and hazardous wastes. The paper is directed to policy makers at the national, state, and local levels of government, as well as to other parties responsible for pollution control and waste management programs

  6. A Fiji multi-coral δ18O composite approach to obtaining a more accurate reconstruction of the last two-centuries of the ocean-climate variability in the South Pacific Convergence Zone region

    Science.gov (United States)

    Dassié, Emilie P.; Linsley, Braddock K.; Corrège, Thierry; Wu, Henry C.; Lemley, Gavin M.; Howe, Steve; Cabioch, Guy

    2014-12-01

    The limited availability of oceanographic data in the tropical Pacific Ocean prior to the satellite era makes coral-based climate reconstructions a key tool for extending the instrumental record back in time, thereby providing a much needed test for climate models and projections. We have generated a unique regional network consisting of five Porites coral δ18O time series from different locations in the Fijian archipelago. Our results indicate that using a minimum of three Porites coral δ18O records from Fiji is statistically sufficient to obtain a reliable signal for climate reconstruction, and that application of an approach used in tree ring studies is a suitable tool to determine this number. The coral δ18O composite indicates that while sea surface temperature (SST) variability is the primary driver of seasonal δ18O variability in these Fiji corals, annual average coral δ18O is more closely correlated to sea surface salinity (SSS), as previously reported. Our results highlight the importance of water mass advection in controlling Fiji coral δ18O and salinity variability at interannual and decadal time scales, despite the corals being located in the heavy rainfall region of the South Pacific Convergence Zone (SPCZ). The Fiji δ18O composite presents a secular freshening and warming trend since the 1850s coupled with changes in both interannual (IA) and decadal/interdecadal (D/I) variance. The changes in IA and D/I variance suggest a re-organization of climatic variability in the SPCZ region beginning in the late 1800s towards a period of more dominant interannual variability, which could correspond to a southeast expansion of the SPCZ.
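The abstract does not name the tree-ring statistic used to decide how many coral records suffice, but a standard choice in dendrochronology for this question is the Expressed Population Signal (EPS), computed from the mean inter-series correlation r̄ and the number of series n. A hedged sketch: the 0.85 threshold and the example r̄ = 0.60 below are conventional illustrative values, not taken from this study.

```python
def eps(rbar: float, n: int) -> float:
    """Expressed Population Signal: fraction of the hypothetical common signal
    captured by a composite of n series with mean inter-series correlation rbar."""
    return n * rbar / (n * rbar + (1 - rbar))

# How many series until EPS clears the customary 0.85 threshold?
for n in range(1, 6):
    print(n, round(eps(0.60, n), 3))
```

As r̄ rises, fewer individual records are needed before the composite is dominated by the shared climate signal rather than record-specific noise.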

  7. The Overtone Fiddle: an Actuated Acoustic Instrument

    DEFF Research Database (Denmark)

    Overholt, Daniel

    2011-01-01

    The Overtone Fiddle is an actuated acoustic instrument supporting both traditional violin techniques and extended playing techniques that incorporate shared man/machine control of the resulting sound. A magnetic pickup system is mounted to the end of the fiddle's fingerboard in order to detect the signals from the vibrating strings, deliberately not capturing vibrations from the full body of the instrument. This focused sensing approach allows less restrained use of DSP-generated feedback signals, as there is very little direct leakage from the actuator embedded in the body of the instrument back to the pickup.

  8. Asymptotics of diagonal elements of projection matrices under many instruments/regressors

    Czech Academy of Sciences Publication Activity Database

    Anatolyev, Stanislav; Yaskov, P.

    2017-01-01

    Roč. 33, č. 3 (2017), s. 717-738 ISSN 0266-4666 Institutional support: RVO:67985998 Keywords : instrumental variable estimation * inference * models Subject RIV: AH - Economics OBOR OECD: Applied Economics, Econometrics Impact factor: 1.011, year: 2016

  9. ReOBS: a new approach to synthesize long-term multi-variable dataset and application to the SIRTA supersite

    Science.gov (United States)

    Chiriaco, Marjolaine; Dupont, Jean-Charles; Bastin, Sophie; Badosa, Jordi; Lopez, Julio; Haeffelin, Martial; Chepfer, Helene; Guzman, Rodrigo

    2018-05-01

    A scientific approach is presented to aggregate and harmonize a set of 60 geophysical variables at an hourly timescale over a decade, and to allow multiannual and multi-variable studies combining atmospheric dynamics and thermodynamics, radiation, clouds and aerosols from ground-based observations. Many datasets from ground-based observations are currently in use worldwide. They are very valuable because they contain complete and precise information due to their spatio-temporal co-localization over more than a decade. These datasets, in particular the synergy between different types of observations, are under-used because of their complexity and diversity in calibration, quality control, treatment, format, temporal averaging, metadata, etc. Two main results are presented in this article: (1) a set of methods available for the community to robustly and reliably process ground-based data at an hourly timescale over a decade is described, and (2) a single netCDF file is provided based on the SIRTA supersite observations. This file contains approximately 60 geophysical variables (atmospheric and in-ground) hourly averaged over a decade for the longest variables. The netCDF file is available and easy to use for the community. In this article, observations are re-analyzed. The prefix "re" refers to six main steps: calibration, quality control, treatment, hourly averaging, homogenization of formats and associated metadata, as well as expertise on more than a decade of observations. In contrast, previous studies (i) took only some of these six steps into account for each variable, (ii) did not aggregate all variables together in a single file and (iii) did not offer an hourly resolution for about 60 variables over a decade. The approach described in this article can be applied to different supersites and to additional variables. The main implication of this work is that complex atmospheric observations are made readily available for scientists.
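The core of the re-analysis pipeline described above — quality control followed by hourly averaging — can be sketched in miniature. The record layout, flag convention, and values below are invented for illustration, not taken from the SIRTA files:

```python
from collections import defaultdict
from datetime import datetime

# toy observation stream: (timestamp, value, qc_flag), with qc_flag 0 = good
records = [
    (datetime(2014, 6, 1, 12, 5), 21.4, 0),
    (datetime(2014, 6, 1, 12, 35), 21.8, 0),
    (datetime(2014, 6, 1, 12, 50), -999.0, 1),   # flagged sample, excluded by QC
    (datetime(2014, 6, 1, 13, 10), 22.1, 0),
]

# quality control, then grouping into hourly bins keyed by the truncated hour
hourly = defaultdict(list)
for ts, value, flag in records:
    if flag == 0:
        hourly[ts.replace(minute=0, second=0, microsecond=0)].append(value)

# hourly averaging over the surviving samples
hourly_mean = {hour: sum(vals) / len(vals) for hour, vals in hourly.items()}
print(hourly_mean)
```

A real pipeline would add per-variable calibration and metadata harmonization around this core, but the QC-then-average ordering is the essential step.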

  10. MANU. Instrumentation of Buffer Demo. Preliminary Study

    International Nuclear Information System (INIS)

    Laaksonen, R.

    2010-01-01

    The purpose of this work is to describe feasible measuring and monitoring alternatives which can be used, if needed, in medium- to full-scale nuclear waste repository deposition hole mock-up tests. The focus of the work was to determine which variables can actually be measured, how the measurements can be achieved, and what demands arise from the modelling, scientific and technical points of view. This project includes a review of previous waste repository mock-up tests carried out in several European countries, such as Belgium, the Czech Republic, Spain and Sweden. Information was also gathered by interviewing domestic and foreign scientists specialized in the fields of measurement instrumentation and related in-situ and laboratory work. On the basis of this review, recommendations were developed for the actions needed from the instrumentation point of view for future tests. It is possible to measure and monitor the processes going on in a deposition hole in in-situ conditions. The data received during a test in real repository conditions make it possible to follow the processes and to verify the hypotheses made on the behaviour of the various components of the repository: buffer, canister, rock and backfill. Because full-scale testing is expensive, the objectives and hypotheses must be carefully set, and the test itself, with its instrumentation, must serve very specific objectives. The main purpose of mock-up tests is to verify that the conditions surrounding the canister meet the design requirements. A whole mock-up test and demonstration process requires a lot of time and effort. The instrumentation part of the work must also start at an early stage to ensure that the instrumentation itself does not become a bottleneck or suffer from low-quality solutions. The planning of the instrumentation work could be done in collaboration with foreign scientists who have participated in previous instrumentation projects. (orig.)

  11. Instrumentation

    International Nuclear Information System (INIS)

    Prieur, G.; Nadi, M.; Hedjiedj, A.; Weber, S.

    1995-01-01

    This second chapter on instrumentation gives a brief general overview of the history and classification of instrumentation, followed by two specific states of the art. The first concerns NMR (block diagram of the instrumentation chain, with details on the magnets, gradients, probes and reception unit). The second concerns precision instrumentation (optical fibre gyrometer and scanning electron microscope) and its data-processing tools (programmability, the VXI standard and its history). The chapter ends with future trends in smart sensors and Field Emission Displays. (D.L.). Refs., figs

  12. Longitudinal Research with Latent Variables

    CERN Document Server

    van Montfort, Kees; Satorra, Albert

    2010-01-01

    This book combines longitudinal research and latent variable research, i.e. it explains how longitudinal studies with objectives formulated in terms of latent variables should be carried out, with an emphasis on detailing how the methods are applied. Because longitudinal research with latent variables currently utilizes different approaches with different histories, different types of research questions, and different computer programs to perform the analysis, the book is divided into nine chapters. Starting from some background information about the specific approach, short history and the ma

  13. A framework for assessing Health Economic Evaluation (HEE) quality appraisal instruments.

    Science.gov (United States)

    Langer, Astrid

    2012-08-16

    Health economic evaluations support the health care decision-making process by providing information on costs and consequences of health interventions. The quality of such studies is assessed by health economic evaluation (HEE) quality appraisal instruments. At present, there is no instrument for measuring and improving the quality of such HEE quality appraisal instruments. Therefore, the objectives of this study are to establish a framework for assessing the quality of HEE quality appraisal instruments to support and improve their quality, and to apply this framework to those HEE quality appraisal instruments which have been subject to more scrutiny than others, in order to test the framework and to demonstrate the shortcomings of existing HEE quality appraisal instruments. To develop the quality assessment framework for HEE quality appraisal instruments, the experiences of using appraisal tools for clinical guidelines are used. Based on a deductive iterative process, clinical guideline appraisal instruments identified through literature search are reviewed, consolidated, and adapted to produce the final quality assessment framework for HEE quality appraisal instruments. The final quality assessment framework for HEE quality appraisal instruments consists of 36 items organized within 7 dimensions, each of which captures a specific domain of quality. Applying the quality assessment framework to four existing HEE quality appraisal instruments, it is found that these four quality appraisal instruments are of variable quality. The framework described in this study should be regarded as a starting point for appraising the quality of HEE quality appraisal instruments. This framework can be used by HEE quality appraisal instrument producers to support and improve the quality and acceptance of existing and future HEE quality appraisal instruments. By applying this framework, users of HEE quality appraisal instruments can become aware of methodological deficiencies.

  14. Variation in posture quality across musical instruments and its impact during performances.

    Science.gov (United States)

    Blanco-Piñeiro, Patricia; Díaz-Pereira, M Pino; Martínez Vidal, Aurora

    2018-06-01

    Bad posture increases the risk that a musician may suffer from musculoskeletal disorders. This study compared the posture quality required by different instruments or families of instruments. Using an ad-hoc postural observation instrument embracing 11 postural variables, four experts evaluated the postures of 100 students attending a Spanish higher conservatory of music. The agreement of the experts' evaluations was statistically confirmed by a Cohen's κ value between 0.855 and 1.000 and a Kendall value between 0.709 and 1.000 (p values indicating significant agreement). The analysis showed an association between instrument families and seated posture with respect to pelvic attitude, dorsal curvature and head alignment in both sagittal and frontal planes. This analysis also showed an association between instrument families and standing posture with respect to the frontal plane of the axis of gravity, pelvic attitude, head alignment in the frontal plane, the sagittal plane of the shoulders and overall posture. While certain postural defects appear to be common to all families of instruments, others are more characteristic of some families than others. The instrument associated with the best posture quality was the bagpipe, followed by percussion and strings.

  15. Radiation Budget Instrument (RBI) for JPSS-2

    Science.gov (United States)

    Georgieva, Elena; Priestley, Kory; Dunn, Barry; Cageao, Richard; Barki, Anum; Osmundsen, Jim; Turczynski, Craig; Abedin, Nurul

    2015-01-01

    The Radiation Budget Instrument (RBI) will be one of five instruments flying aboard the JPSS-2 spacecraft, a polar-orbiting sun-synchronous satellite in Low Earth Orbit. RBI is a passive remote sensing instrument that will follow the successful legacy of the Clouds and Earth's Radiant Energy System (CERES) instruments to make measurements of Earth's shortwave and longwave radiation budget. The goal of RBI is to provide an independent measurement of the broadband reflected solar radiance and Earth's emitted thermal radiance by using three spectral bands (Shortwave, Longwave, and Total) that will have the same overlapped point spread function (PSF) footprint on Earth. To ensure precise NIST-traceable calibration in space, the RBI sensor is designed to use a visible calibration target (VCT), a solar calibration target (SCT), and an infrared calibration target (ICT) containing phase change cells (PCC) to enable on-board temperature calibration. The VCT is a thermally controlled integrating sphere with space-grade Spectralon covering the inner surface. Two sides of the sphere will have fiber-coupled laser diodes in the UV to IR wavelength region. An electrical substitution radiometer on the integrating sphere will monitor the long-term stability of the sources and the possible degradation of the Spectralon in space. In addition, the radiometric calibration operations will use the Spectralon diffusers of the SCT to provide accurate measurements of solar degradation. All these stable on-orbit references will ensure that calibration stability is maintained over the RBI sensor lifetime. For the preflight calibration, the RBI will view five calibration sources: two integrating spheres and three CrIS (Cross-track Infrared Sounder)-like blackbodies whose outputs will be validated with a NIST calibration approach. Thermopiles are the selected detectors for the RBI. The sensor is also required to perform lunar calibration in addition to solar calibration in space, in a manner similar to CERES.

  16. Beam Instrumentation and Diagnostics

    CERN Document Server

    Strehl, Peter

    2006-01-01

    This treatise covers all aspects of the design and daily operation of a beam diagnostic system for a large particle accelerator. A very interdisciplinary field, it involves contributions from physicists, electrical and mechanical engineers, and computer experts alike, so as to satisfy the ever-increasing demands for beam parameter variability across a vast range of operating modes and particles. The author draws upon 40 years of research and work, most of them spent as the head of the beam diagnostics group at GSI. He has illustrated the more theoretical aspects with many real-life examples that will provide beam instrumentation designers with ideas and tools for their work.

  17. Asymptotics of diagonal elements of projection matrices under many instruments/regressors

    Czech Academy of Sciences Publication Activity Database

    Anatolyev, Stanislav; Yaskov, P.

    2017-01-01

    Roč. 33, č. 3 (2017), s. 717-738 ISSN 0266-4666 Institutional support: Progres-Q24 Keywords : instrumental variable estimation * inference * models Subject RIV: AH - Economics OBOR OECD: Applied Economics, Econometrics Impact factor: 1.011, year: 2016

  18. Instrumentation

    Energy Technology Data Exchange (ETDEWEB)

    Decreton, M

    2001-04-01

    SCK-CEN's research and development programme on instrumentation involves the assessment and the development of sensitive measurement systems used within a radiation environment. Particular emphasis is on the assessment of optical fibre components and their adaptability to radiation environments. The evaluation of ageing processes of instrumentation in fission plants, the development of specific data evaluation strategies to compensate for ageing induced degradation of sensors and cable performance form part of these activities. In 2000, particular emphasis was on in-core reactor instrumentation applied to fusion, accelerator driven and water-cooled fission reactors. This involved the development of high performance instrumentation for irradiation experiments in the BR2 reactor in support of new instrumentation needs for MYRRHA, and for diagnostic systems for the ITER reactor.

  19. Instrumentation

    International Nuclear Information System (INIS)

    Decreton, M.

    2001-01-01

    SCK-CEN's research and development programme on instrumentation involves the assessment and the development of sensitive measurement systems used within a radiation environment. Particular emphasis is on the assessment of optical fibre components and their adaptability to radiation environments. The evaluation of ageing processes of instrumentation in fission plants, the development of specific data evaluation strategies to compensate for ageing induced degradation of sensors and cable performance form part of these activities. In 2000, particular emphasis was on in-core reactor instrumentation applied to fusion, accelerator driven and water-cooled fission reactors. This involved the development of high performance instrumentation for irradiation experiments in the BR2 reactor in support of new instrumentation needs for MYRRHA, and for diagnostic systems for the ITER reactor

  20. A View of Current Evaluative Practices in Instrumental Music Teacher Education

    Science.gov (United States)

    Peterson, Amber Dahlén

    2014-01-01

    The purpose of this study was to examine how instrumental music educator skills are being evaluated in current undergraduate programs. While accrediting organizations mandate certain elements of these programs, they provide limited guidance on what evaluative approaches should be used. Instrumental music teacher educators in the College Music…

  1. Manufacturing challenge: An employee perception of the impact of BEM variables on motivation

    Science.gov (United States)

    Nyaude, Alaster

    The study examines the impact of Thomas F. Gilbert's Behavior Engineering Model (BEM) variables on employee perception of motivation at an aerospace equipment manufacturing plant in Georgia. The research process involved a literature review and determination of an appropriate survey instrument for the study. The Hersey-Chevalier modified PROBE instrument (Appendix C) was used with Dr Roger Chevalier's validation. The participants' responses were further examined to determine the influence of the demographic control variables of age, gender, length of service with the company and education on employee perception of motivation. The results indicated that the top three highly motivating variables were knowledge and skills, capacity and resources. Knowledge and skills was perceived to be the most highly motivating, capacity the second and resources the third most highly motivating variable. Interestingly, the fourth most highly motivating variable was information, the fifth was motives and the sixth was incentives. The results also showed that the demographic control variables had no influence on employee perception of motivation. Further research may be required to understand to what extent these BEM variables impact employee perceptions of motivation.

  2. Linear latent variable models: the lava-package

    DEFF Research Database (Denmark)

    Holst, Klaus Kähler; Budtz-Jørgensen, Esben

    2013-01-01

    An R package for specifying and estimating linear latent variable models is presented. The philosophy of the implementation is to separate the model specification from the actual data, which leads to a dynamic and easy way of modeling complex hierarchical structures. Several advanced features are implemented, including robust standard errors for clustered correlated data, multigroup analyses, non-linear parameter constraints, inference with incomplete data, maximum likelihood estimation with censored and binary observations, and instrumental variable estimators. In addition, an extensive simulation interface is provided.

  3. A Nonlinear Mixed Effects Approach for Modeling the Cell-To-Cell Variability of Mig1 Dynamics in Yeast.

    Directory of Open Access Journals (Sweden)

    Joachim Almquist

    Full Text Available The last decade has seen a rapid development of experimental techniques that allow data collection from individual cells. These techniques have enabled the discovery and characterization of variability within a population of genetically identical cells. Nonlinear mixed effects (NLME) modeling is an established framework for studying variability between individuals in a population, frequently used in pharmacokinetics and pharmacodynamics, but its potential for studies of cell-to-cell variability in molecular cell biology is yet to be exploited. Here we take advantage of this novel application of NLME modeling to study cell-to-cell variability in the dynamic behavior of the yeast transcription repressor Mig1. In particular, we investigate a recently discovered phenomenon where Mig1 during a short and transient period exits the nucleus when cells experience a shift from high to intermediate levels of extracellular glucose. A phenomenological model based on ordinary differential equations describing the transient dynamics of nuclear Mig1 is introduced, and according to the NLME methodology the parameters of this model are in turn modeled by a multivariate probability distribution. Using time-lapse microscopy data from nearly 200 cells, we estimate this parameter distribution according to the approach of maximizing the population likelihood. Based on the estimated distribution, parameter values for individual cells are furthermore characterized and the resulting Mig1 dynamics are compared to the single-cell time-series data. The proposed NLME framework is also compared to the intuitive but limited standard two-stage (STS) approach. We demonstrate that the latter may overestimate variabilities by up to almost fivefold. Finally, Monte Carlo simulations of the inferred population model are used to predict the distribution of key characteristics of the Mig1 transient response. We find that with decreasing levels of post-shift glucose, the transient
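The claim that the standard two-stage (STS) approach can overestimate variability has a simple arithmetic core: fitting each cell separately and then taking the spread of the per-cell estimates adds estimation noise on top of the true cell-to-cell variance. A minimal simulation, with invented parameter values and a deliberately simplified one-parameter "model" (the actual study fits an ODE model, not a constant):

```python
import numpy as np

rng = np.random.default_rng(1)
n_cells, n_obs = 2000, 5
pop_mean, pop_sd, noise_sd = 1.0, 0.1, 0.2

theta = rng.normal(pop_mean, pop_sd, size=n_cells)                   # true per-cell parameters
obs = theta[:, None] + rng.normal(0, noise_sd, size=(n_cells, n_obs))  # noisy per-cell data

# STS: estimate each cell in isolation, then take the variance of the estimates
theta_hat = obs.mean(axis=1)
sts_var = theta_hat.var(ddof=1)

# E[sts_var] = pop_sd**2 + noise_sd**2 / n_obs, i.e. inflated by measurement noise;
# an NLME fit models the two variance components jointly and can separate them.
print(f"true cell-to-cell variance: {pop_sd**2:.4f}  STS estimate: {sts_var:.4f}")
```

With these values the STS variance is expected to be nearly double the true cell-to-cell variance; the fewer observations per cell, the worse the inflation.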

  4. A spray flamelet/progress variable approach combined with a transported joint PDF model for turbulent spray flames

    Science.gov (United States)

    Hu, Yong; Olguin, Hernan; Gutheil, Eva

    2017-05-01

    A spray flamelet/progress variable approach is developed for use in spray combustion with partly pre-vaporised liquid fuel, where a laminar spray flamelet library accounts for evaporation within the laminar flame structures. For this purpose, the standard spray flamelet formulation for pure evaporating liquid fuel and oxidiser is extended by a chemical reaction progress variable in both the turbulent spray flame model and the laminar spray flame structures, in order to account for the effect of pre-vaporised liquid fuel for instance through use of a pilot flame. This new approach is combined with a transported joint probability density function (PDF) method for the simulation of a turbulent piloted ethanol/air spray flame, and the extension requires the formulation of a joint three-variate PDF depending on the gas phase mixture fraction, the chemical reaction progress variable, and gas enthalpy. The molecular mixing is modelled with the extended interaction-by-exchange-with-the-mean (IEM) model, where source terms account for spray evaporation and heat exchange due to evaporation as well as the chemical reaction rate for the chemical reaction progress variable. This is the first formulation using a spray flamelet model considering both evaporation and partly pre-vaporised liquid fuel within the laminar spray flamelets. Results with this new formulation show good agreement with the experimental data provided by A.R. Masri, Sydney, Australia. The analysis of the Lagrangian statistics of the gas temperature and the OH mass fraction indicates that partially premixed combustion prevails near the nozzle exit of the spray, whereas further downstream, the non-premixed flame is promoted towards the inner rich-side of the spray jet since the pilot flame heats up the premixed inner spray zone. In summary, the simulation with the new formulation considering the reaction progress variable shows good performance, greatly improving the standard formulation, and it provides new

  5. The Instrumentation Channel for the MUCOOL Experiment

    International Nuclear Information System (INIS)

    Kahn, S. A.; Guler, H.; Lu, C.; McDonald, K. T.; Prebys, E. J.; Vahsen, S. E.

    1999-01-01

    The MUCOOL facility is proposed to examine cooling techniques that could be used in a muon collider. The solenoidal beam channel before and after the cooling test section are instrumented to measure the beam emittance. This instrumentation channel includes a bent solenoid to provide dispersion and time projection chambers to measure the beam variables before and after the bend. The momentum of the muons is obtained from a measurement of the drift of the muon trajectory in the bent solenoid. The timing measurement is made by determining the phase from the momentum of the muon before and after it traverses RF cavities or by the use of a fast Cherenkov chamber. A computer simulation of the muon solenoidal channel is performed using GEANT. This study evaluates the resolution of the beam emittance measurement for MUCOOL

  6. The Research Quality Plus (RQ+) Assessment Instrument

    International Development Research Centre (IDRC) Digital Library (Canada)

    Thomas Schwandt

    THE RESEARCH QUALITY PLUS (RQ+) ASSESSMENT INSTRUMENT ... consistent way to allow for further meta-analysis about research quality over time. ... Addresses complex and integrative problems, requiring systems-based approaches ..... benefits or financial costs for participants that might not be appropriate in the ...

  7. A cognitive approach to social assistance instruments: how public assistance recipients represent and interpret the anti-poverty program Familias en Acción

    Directory of Open Access Journals (Sweden)

    Valeria Ayola Betancourt

    2016-06-01

    Full Text Available This article considers the contribution of the cognitive framework of public policy and the instrument approach to understanding poverty regulation in Colombia through its instruments. We analyze the relationship between the ideological frame of the Familias en Acción program and the recipients' construction of representations, meanings and interpretations about it. The article describes the manner in which the beneficiaries interpret the State intervention, but also some social effects arising from the implementation of the program, taking into account the local scope. For this we use qualitative research techniques, principally the semi-structured interview. Our fieldwork is based on a comparison between the city of Cartagena and the rural zone of San Jacinto.

  8. “You Can’t Play a Sad Song on the Banjo:” Acoustic Factors in the Judgment of Instrument Capacity to Convey Sadness

    Directory of Open Access Journals (Sweden)

    David Huron

    2014-05-01

    Full Text Available Forty-four Western-enculturated musicians completed two studies. The first group was asked to judge the relative sadness of forty-four familiar Western instruments. An independent group was asked to assess a number of acoustical properties for those same instruments. Using the estimated acoustical properties as predictor variables in a multiple regression analysis, a significant correlation was found between those properties known to contribute to sad prosody in speech and the judged sadness of the instruments. The best predictor variable was the ability of the instrument to make small pitch movements. Other variables investigated included the darkness of the timbre, the ability to play low pitches, the ability to play quietly, and the capacity of the instrument to "mumble." Four of the acoustical factors were found to exhibit a considerable amount of shared variance, suggesting that they may originate in a common underlying factor. It is suggested that the shared proximal cause of these acoustical features may be low physical energy.
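
The multiple-regression step described above can be sketched as follows. The data are synthetic and the predictor names merely mirror the acoustic properties listed in the abstract, so this is an illustration of the method, not the study's actual analysis.

```python
import numpy as np

# Hypothetical example: regress judged sadness of instruments on
# estimated acoustic properties via ordinary least squares.
rng = np.random.default_rng(0)
n = 44  # number of instruments rated
# Predictor columns stand in for: small-pitch-movement ability, timbre
# darkness, low-pitch ability, quiet-playing ability, "mumble" capacity.
X = rng.normal(size=(n, 5))
true_beta = np.array([0.9, 0.3, 0.2, 0.25, 0.15])  # illustrative weights
sadness = X @ true_beta + rng.normal(scale=0.3, size=n)

# Fit OLS with an intercept column.
A = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(A, sadness, rcond=None)

# Variance explained (R^2) by the acoustic predictors
pred = A @ coef
r2 = 1 - np.sum((sadness - pred) ** 2) / np.sum((sadness - sadness.mean()) ** 2)
print(round(r2, 2))
```

In such a fit, the coefficient magnitudes indicate which acoustic property predicts judged sadness most strongly, which is how the study identifies small pitch movement as the best predictor.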

  9. Comparing Diagnostic Accuracy of Cognitive Screening Instruments: A Weighted Comparison Approach

    Directory of Open Access Journals (Sweden)

    A.J. Larner

    2013-03-01

    Full Text Available Background/Aims: There are many cognitive screening instruments available to clinicians when assessing patients' cognitive function, but the best way to compare the diagnostic utility of these tests is uncertain. One method is to undertake a weighted comparison which takes into account the difference in sensitivity and specificity of two tests, the relative clinical misclassification costs of true- and false-positive diagnosis, and also disease prevalence. Methods: Data were examined from four pragmatic diagnostic accuracy studies from one clinic which compared the Mini-Mental State Examination (MMSE with the Addenbrooke's Cognitive Examination-Revised (ACE-R, the Montreal Cognitive Assessment (MoCA, the Test Your Memory (TYM test, and the Mini-Mental Parkinson (MMP, respectively. Results: Weighted comparison calculations suggested a net benefit for ACE-R, MoCA, and MMP compared to MMSE, but a net loss for TYM test compared to MMSE. Conclusion: Routine incorporation of weighted comparison or other similar net benefit measures into diagnostic accuracy studies merits consideration to better inform clinicians of the relative value of cognitive screening instruments.
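
One common way to formalize such a weighted comparison is sketched below. This is an assumed formulation, not necessarily the exact one used in the study: the sensitivity difference is combined with the specificity difference weighted by the non-disease/disease odds and a relative misclassification cost ratio. The numbers are illustrative, not the study's data.

```python
def weighted_comparison(se1, sp1, se2, sp2, prevalence, cost_ratio):
    """Assumed weighted-comparison formulation: net benefit of test 1
    over test 2, combining the sensitivity difference with the
    specificity difference weighted by the odds of being disease-free
    and the relative cost of a false negative vs a false positive."""
    odds = (1 - prevalence) / prevalence
    return (se1 - se2) + (odds / cost_ratio) * (sp1 - sp2)

# Illustrative numbers: test 1 is more sensitive but less specific,
# prevalence is 50%, and a false negative costs twice a false positive.
wc = weighted_comparison(0.90, 0.70, 0.80, 0.85, prevalence=0.5, cost_ratio=2.0)
print(round(wc, 3))  # positive value -> net benefit for test 1
```

A positive value favours the first test and a negative value the second, which is the sense in which the study reports a net benefit for ACE-R, MoCA, and MMP but a net loss for TYM versus MMSE.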

  10. Environmental Variables and Pupils' Academic Performance in ...

    African Journals Online (AJOL)

    This causal-comparative study was carried out to investigate the influence of environmental variables on pupils' academic performance in primary science in Cross River State, Nigeria. Three hypotheses were formulated to guide the study. Two instruments were used to collect data for the study namely: environmental ...

  11. Instrumental variable analysis

    NARCIS (Netherlands)

    Stel, Vianda S.; Dekker, Friedo W.; Zoccali, Carmine; Jager, Kitty J.

    2013-01-01

    The main advantage of the randomized controlled trial (RCT) is the random assignment of treatment that prevents selection by prognosis. Nevertheless, only a few RCTs can be performed given their high cost and the difficulties in conducting such studies. Therefore, several analytical methods for ...
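
The logic of the instrumental variable method that such analyses rely on can be sketched with a minimal two-stage least squares (2SLS) simulation. The data, effect sizes, and instrument below are synthetic, chosen only to show how an instrument recovers a treatment effect that naive regression gets wrong.

```python
import numpy as np

# 2SLS sketch on simulated data with an unobserved confounder u:
# z affects treatment x but not outcome y directly, so it is a
# valid instrument; the true treatment effect is 2.0.
rng = np.random.default_rng(1)
n = 100_000
u = rng.normal(size=n)                       # unobserved confounder
z = rng.normal(size=n)                       # instrument
x = 0.8 * z + u + rng.normal(size=n)         # treatment, confounded by u
y = 2.0 * x + 3.0 * u + rng.normal(size=n)   # outcome

# Naive OLS (through the origin; all variables are mean-zero)
# is biased upward because u drives both x and y.
ols = np.sum(x * y) / np.sum(x * x)

# Stage 1: regress x on z. Stage 2: regress y on the fitted x.
x_hat = (np.sum(z * x) / np.sum(z * z)) * z
iv = np.sum(x_hat * y) / np.sum(x_hat * x_hat)

print(round(ols, 2), round(iv, 2))  # OLS biased; IV close to 2.0
```

The instrument isolates the variation in treatment that is independent of the unobserved patient characteristics, which is exactly the role instruments play in the mental-health productivity study above.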

  12. Investigation of Music Student Efficacy as Influenced by Age, Experience, Gender, Ethnicity, and Type of Instrument Played in South Carolina

    Science.gov (United States)

    White, Norman

    2010-01-01

    The purpose of this research study was to quantitatively examine South Carolina high school instrumental music students' self-efficacy as measured by the Generalized Self-Efficacy (GSE) instrument (Schwarzer & Jerusalem, 1993). The independent variables (age, experience, gender, ethnicity, and type of instrument played) were correlated with…

  13. Instrument comparison for Aerosolized Titanium Dioxide

    Science.gov (United States)

    Ranpara, Anand

    Recent toxicological studies have shown that the surface area of ultrafine particles (UFP, i.e., particles with diameters less than 0.1 micrometer) has a stronger correlation with adverse health effects than does the mass of these particles. Ultrafine titanium dioxide (TiO2) particles are widely used in industry, and their use is associated with adverse health outcomes, such as microvascular dysfunction and pulmonary damage. The primary aim of this experimental study was to compare a variety of laboratory and industrial hygiene (IH) field study instruments all measuring the same aerosolized TiO2. The study also observed intra-instrument variability between measurements made by two apparently identical devices of the same type of instrument placed side-by-side. The types of instruments studied were (1) DustTrak(TM) DRX, (2) Personal DataRAM(TM) (PDR), (3) GRIMM, (4) diffusion charger (DC) and (5) Scanning Mobility Particle Sizer (SMPS). Two devices of each of the four IH field study instrument types were used to measure six levels of mass concentration of fine and ultrafine TiO2 aerosols in controlled chamber tests. Metrics evaluated included real-time mass, active surface area and number/geometric surface area distributions, and off-line gravimetric mass and morphology on filters. DustTrak(TM) DRXs and PDRs were used for mass concentration measurements. DCs were used for active surface area concentration measurements. GRIMMs were used for number concentration measurements. SMPS was used for inter-instrument comparisons of surface area and number concentrations. The results indicated that the two apparently identical devices of each of the DRX and PDR were not statistically different from each other for all trials with both powder sizes (at the 5% significance level). The mean difference between mass concentrations measured by the two DustTrak DRX devices was smaller than that measured by the two PDR devices. DustTrak DRX measurements were closer to the reference method, gravimetric mass concentration.

  14. The MCLIB library: Monte Carlo simulation of neutron scattering instruments

    Energy Technology Data Exchange (ETDEWEB)

    Seeger, P.A.

    1995-09-01

    Monte Carlo is a method to integrate over a large number of variables. Random numbers are used to select a value for each variable, and the integrand is evaluated. The process is repeated a large number of times and the resulting values are averaged. For a neutron transport problem, first select a neutron from the source distribution, and project it through the instrument using either deterministic or probabilistic algorithms to describe its interaction whenever it hits something, and then (if it hits the detector) tally it in a histogram representing where and when it was detected. This is intended to simulate the process of running an actual experiment (but it is much slower). This report describes the philosophy and structure of MCLIB, a Fortran library of Monte Carlo subroutines which has been developed for design of neutron scattering instruments. A pair of programs (LQDGEOM and MC_RUN) which use the library are shown as an example.
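
The integration idea described in the first sentences can be illustrated with a minimal sketch (plain Python, not MCLIB code): pick random values for each variable, evaluate the integrand, and average.

```python
import random

# Monte Carlo estimate of the integral of f(x, y) = x*y over the
# unit square. The exact value is 1/4, so the average of x*y over
# many random (x, y) samples should converge to 0.25.
random.seed(42)
N = 200_000
total = 0.0
for _ in range(N):
    x = random.random()   # random value for the first variable
    y = random.random()   # random value for the second variable
    total += x * y        # evaluate the integrand
estimate = total / N      # average over all samples
print(round(estimate, 3))  # close to the exact value 0.25
```

A neutron-transport simulation follows the same pattern, except that each "sample" is a neutron drawn from the source distribution and the "integrand evaluation" is its projection through the instrument geometry.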

  15. The MCLIB library: Monte Carlo simulation of neutron scattering instruments

    International Nuclear Information System (INIS)

    Seeger, P.A.

    1995-01-01

    Monte Carlo is a method to integrate over a large number of variables. Random numbers are used to select a value for each variable, and the integrand is evaluated. The process is repeated a large number of times and the resulting values are averaged. For a neutron transport problem, first select a neutron from the source distribution, and project it through the instrument using either deterministic or probabilistic algorithms to describe its interaction whenever it hits something, and then (if it hits the detector) tally it in a histogram representing where and when it was detected. This is intended to simulate the process of running an actual experiment (but it is much slower). This report describes the philosophy and structure of MCLIB, a Fortran library of Monte Carlo subroutines which has been developed for design of neutron scattering instruments. A pair of programs (LQDGEOM and MC RUN) which use the library are shown as an example

  16. Ergonomic investigation of weight distribution of laparoscopic instruments.

    Science.gov (United States)

    Lin, Chiuhsiang Joe; Chen, Hung-Jen; Lo, Ying-Chu

    2011-06-01

    Laparoscopic surgery procedures require highly specialized visually controlled movements. Investigations of industrial applications indicate that the length as well as the weight of hand-held tools substantially affects movement time (MT). Different weight distributions may have similar effects on long-shafted laparoscopic instruments when performing surgical procedures. For this reason, the current experiment aimed at finding direct evidence of the weight distribution effect in an accurate task. Ten right-handed subjects made continuous Fitts' pointing tasks using a long laparoscopic instrument. The factors and levels were target width: (2.5, 3, 3.5, and 4 cm), target distance (14, 23, and 37 cm), and weight distribution (uniform, front, middle, and rear). Weight distribution was made by chips of lead attached to the laparoscopic instrument. MT, error rate, and throughput (TP) were recorded as dependent variables. There were significant differences between the weight distribution in MT and in TP. The middle position was found to require the least time to manipulate the laparoscopic instrument in pointing tasks and also obtained the highest TP. These analyses and findings pointed to a design direction for the ergonomics and usability of the long hand-held tool such as the laparoscopic instrument in this study. To optimize efficiency in using these tools, the consideration of a better weight design is important and should not be neglected.
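
The dependent variables MT and TP in such Fitts' pointing tasks are related through the index of difficulty. The sketch below uses the experiment's target distances and widths; the movement time is an assumed, illustrative value, not study data.

```python
import math

# Fitts' law quantities: index of difficulty ID = log2(2D / W),
# and throughput TP = ID / MT (bits per second).
def index_of_difficulty(distance_cm, width_cm):
    return math.log2(2 * distance_cm / width_cm)

def throughput(distance_cm, width_cm, movement_time_s):
    return index_of_difficulty(distance_cm, width_cm) / movement_time_s

ID = index_of_difficulty(37, 2.5)   # hardest condition: far, narrow target
tp = throughput(37, 2.5, 1.2)       # assumed MT of 1.2 s for illustration
print(round(ID, 2), round(tp, 2))
```

Higher throughput at equal task difficulty is why the middle weight distribution, which gave the shortest MT, also obtained the highest TP.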

  17. Comparison of instrumented anterior interbody fusion with instrumented circumferential lumbar fusion.

    Science.gov (United States)

    Madan, S S; Boeree, N R

    2003-12-01

    Posterior lumbar interbody fusion (PLIF) restores disc height, the load bearing ability of anterior ligaments and muscles, root canal dimensions, and spinal balance. It immobilizes the painful degenerate spinal segment and decompresses the nerve roots. Anterior lumbar interbody fusion (ALIF) does the same, but could have complications of graft extrusion, compression and instability contributing to pseudarthrosis in the absence of instrumentation. The purpose of this study was to assess and compare the outcome of instrumented circumferential fusion through a posterior approach [PLIF and posterolateral fusion (PLF)] with instrumented ALIF using the Hartshill horseshoe cage, for comparable degrees of internal disc disruption and clinical disability. It was designed as a prospective study, comparing the outcome of two methods of instrumented interbody fusion for internal disc disruption. Between April 1994 and June 1998, the senior author (N.R.B.) performed 39 instrumented ALIF procedures and 35 instrumented circumferential fusion with PLIF procedures. The second author, an independent assessor (S.M.), performed the entire review. Preoperative radiographic assessment included plain radiographs, magnetic resonance imaging (MRI) and provocative discography in all the patients. The outcome in the two groups was compared in terms of radiological improvement and clinical improvement, measured on the basis of improvement of back pain and work capacity. Preoperatively, patients were asked to fill out a questionnaire giving their demographic details, maximum walking distance and current employment status in order to establish the comparability of the two groups. Patient assessment was with the Oswestry Disability Index, quality of life questionnaire (subjective), pain drawing, visual analogue scale, disability benefit, compensation status, and psychological profile. 
    The results of the study showed a satisfactory outcome (quality of life questionnaire score) in 71.8% (28 patients) ...

  18. An emotional processing writing intervention and heart rate variability: the role of emotional approach.

    Science.gov (United States)

    Seeley, Saren H; Yanez, Betina; Stanton, Annette L; Hoyt, Michael A

    2017-08-01

    Expressing and understanding one's own emotional responses to negative events, particularly those that challenge the attainment of important life goals, is thought to confer physiological benefit. Individual preferences and/or abilities in approaching emotions might condition the efficacy of interventions designed to encourage written emotional processing (EP). This study examines the physiological impact (as indexed by heart rate variability (HRV)) of an emotional processing writing (EPW) task, as well as the moderating influence of a dispositional preference for coping through emotional approach (emotional processing (EP) and emotional expression (EE)), in response to a laboratory stress task designed to challenge an important life goal. Participants (n = 98) were randomly assigned to either EPW or fact control writing (FCW) following the stress task. Regression analyses revealed a significant dispositional EP by condition interaction, such that high EP participants in the EPW condition demonstrated higher HRV after writing compared to low EP participants. No significant main effects of condition or EE coping were observed. These findings suggest that EPW interventions may be best suited to those with a preference or ability to process emotions related to a stressor, or might require adaptation for those who less often cope through emotional approach.

  19. A regression modeling approach for studying carbonate system variability in the northern Gulf of Alaska

    Science.gov (United States)

    Evans, Wiley; Mathis, Jeremy T.; Winsor, Peter; Statscewich, Hank; Whitledge, Terry E.

    2013-01-01

    The northern Gulf of Alaska (GOA) shelf experiences carbonate system variability on seasonal and annual time scales, but little information exists to resolve higher-frequency variability in this region. To resolve this variability using platforms-of-opportunity, we present multiple linear regression (MLR) models constructed from hydrographic data collected along the Northeast Pacific Global Ocean Ecosystems Dynamics (GLOBEC) Seward Line. The empirical algorithms predict dissolved inorganic carbon (DIC) and total alkalinity (TA) using observations of nitrate (NO3-), temperature, salinity and pressure from the surface to 500 m, with R2 values > 0.97 and RMSE values of 11 µmol kg-1 for DIC and 9 µmol kg-1 for TA. We applied these relationships to high-resolution NO3- data sets collected during a novel 20 h glider flight and a GLOBEC mesoscale SeaSoar survey. Results from the glider flight demonstrated time/space along-isopycnal variability of aragonite saturations (Ωarag) associated with a dicothermal layer (a cold near-surface layer found in high-latitude oceans) that rivaled changes seen vertically through the thermocline. The SeaSoar survey captured an uplift of the aragonite saturation horizon (the depth where Ωarag = 1), which shoaled to a previously unseen depth in the northern GOA. This work is similar to recent studies aimed at predicting the carbonate system in continental margin settings, but demonstrates that a NO3--based approach can be applied to high-latitude data collected from platforms capable of high-frequency measurements.
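
The MLR idea behind these empirical algorithms can be sketched as follows. The coefficients and data here are entirely synthetic and for illustration only; only the predictor names (NO3-, temperature, salinity, pressure) and the target (DIC) follow the abstract.

```python
import numpy as np

# Multiple linear regression predicting DIC from hydrographic predictors.
rng = np.random.default_rng(7)
n = 500
no3 = rng.uniform(0, 30, n)     # nitrate, µmol kg-1
temp = rng.uniform(4, 14, n)    # temperature, deg C
sal = rng.uniform(30, 34, n)    # salinity
pres = rng.uniform(0, 500, n)   # pressure, dbar
# Synthetic "true" relationship plus 10 µmol kg-1 noise:
dic = (1950 + 6.0 * no3 - 8.0 * temp + 4.0 * (sal - 32) + 0.05 * pres
       + rng.normal(scale=10, size=n))

# Fit the MLR and report the in-sample RMSE, the figure of merit
# quoted in the abstract.
A = np.column_stack([np.ones(n), no3, temp, sal, pres])
coef, *_ = np.linalg.lstsq(A, dic, rcond=None)
rmse = float(np.sqrt(np.mean((A @ coef - dic) ** 2)))
print(round(rmse, 1))  # close to the injected noise level of 10 µmol kg-1
```

Once fitted on ship-based hydrography, such a model can be evaluated on high-resolution NO3-/temperature/salinity/pressure streams from gliders or towed platforms, which is how the study extends carbonate-system coverage.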

  20. Study The role of latent variables in lost working days by Structural Equation Modeling Approach

    Directory of Open Access Journals (Sweden)

    Meysam Heydari

    2016-12-01

    Full Text Available Background: Based on estimations, each year about 250 million work-related injuries and many temporary or permanent disabilities occur, most of which are preventable. The oil and gas industries are among those with the highest incidence of injuries in the world. The aim of this study was to investigate the role and effect of different risk management variables on lost working days (LWD) in seismic projects. Methods: This study was a retrospective, cross-sectional and systematic analysis of occupational accidents between 2008 and 2015 (an 8-year period) in different seismic projects for oilfield exploration at Dana Energy (Iranian Seismic Company). The preliminary sample size of the study was 487 accidents. A systems analysis approach was applied using root cause analysis (RCA) and structural equation modeling (SEM). Tools for the data analysis included SPSS 23 and AMOS 23 software. Results: The mean of lost working days (LWD) was calculated as 49.57. The final model showed that the latent variables of safety and health training (-0.33), risk assessment (-0.55) and risk control (-0.61) as direct causes significantly affected lost working days (LWD) in the seismic industry (p < 0.05). Conclusion: The findings of the present study revealed that a combination of variables affected lost working days (LWD). Therefore, the role of these variables in accidents should be investigated and suitable programs should be considered for them.

  1. Real-time instrument-failure detection in the LOFT pressurizer using functional redundancy

    International Nuclear Information System (INIS)

    Tylee, J.L.

    1982-07-01

    The functional redundancy approach to detecting instrument failures in a pressurized water reactor (PWR) pressurizer is described and evaluated. This real-time method uses a bank of Kalman filters (one for each instrument) to generate optimal estimates of the pressurizer state. By performing consistency checks between the output of each filter, failed instruments can be identified. Simulation results and actual pressurizer data are used to demonstrate the capabilities of the technique
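
The consistency-check idea behind functional redundancy can be sketched in simplified form. Exponential smoothing stands in for the per-instrument Kalman filters, and all numbers (pressure, noise, drift, threshold) are illustrative, not LOFT data.

```python
import itertools
import random

# One estimator per instrument plus pairwise consistency checks:
# the instrument whose estimate disagrees with every other one
# is flagged as failed.
random.seed(3)
true_pressure = 15.5  # nominal state, arbitrary units (illustrative)
n_steps, alpha, threshold = 200, 0.1, 0.5

est = [0.0, 0.0, 0.0]
for t in range(n_steps):
    readings = [true_pressure + random.gauss(0, 0.05) for _ in range(3)]
    if t > 100:
        readings[2] += 2.0          # instrument 2 develops a bias (fails)
    for i, z in enumerate(readings):
        est[i] += alpha * (z - est[i])   # stand-in for the Kalman update

# Pairwise consistency checks between the filter outputs.
inconsistent = {i: 0 for i in range(3)}
for i, j in itertools.combinations(range(3), 2):
    if abs(est[i] - est[j]) > threshold:
        inconsistent[i] += 1
        inconsistent[j] += 1
failed = [i for i, c in inconsistent.items() if c == 2]
print(failed)  # instrument 2 is flagged
```

A real implementation would use one Kalman filter per instrument, built from a pressurizer state model, and statistical tests on the filter innovations rather than a fixed threshold, but the voting logic is the same.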

  2. Metrological Array of Cyber-Physical Systems. Part 7. Additive Error Correction for Measuring Instrument

    Directory of Open Access Journals (Sweden)

    Yuriy YATSUK

    2015-06-01

    Full Text Available During design, the uncertainty approach cannot be used because measurement results are not yet available; as noted, the error approach can instead be applied successfully, taking the nominal value of the instrument's transformation function as true. The limiting possibilities of additive error correction of measuring instruments for Cyber-Physical Systems are studied based on general and special methods of measurement. Principles of maximal symmetry of the measuring circuit and its minimal reconfiguration are proposed for measurement and/or calibration. It is theoretically justified, for a variety of correction methods, that a minimum additive error of measuring instruments exists when the real equivalent parameters of the input electronic switches are considered. Terms of self-calibration and verification of the measuring instruments in place are studied.

  3. 8 years of Solar Spectral Irradiance Observations from the ISS with the SOLAR/SOLSPEC Instrument

    Science.gov (United States)

    Damé, L.; Bolsée, D.; Meftah, M.; Irbah, A.; Hauchecorne, A.; Bekki, S.; Pereira, N.; Cessateur, G.; Marchand, M.; Thiéblemont, R.; Foujols, T.

    2016-12-01

    Accurate measurements of Solar Spectral Irradiance (SSI) are of primary importance for a better understanding of solar physics and of the impact of solar variability on climate (via Earth's atmospheric photochemistry). The acquisition of a top-of-atmosphere reference solar spectrum and of its temporal and spectral variability during the unusual solar cycle 24 is of prime interest for these studies. These measurements have been performed since April 2008 with the SOLSPEC spectro-radiometer from the far ultraviolet to the infrared (166 nm to 3088 nm). This instrument, developed under a fruitful LATMOS/BIRA-IASB collaboration, is part of the Solar Monitoring Observatory (SOLAR) payload, externally mounted on the Columbus module of the International Space Station (ISS). The SOLAR mission, with its current 8-year duration, will cover almost the entire solar cycle 24. We present here the in-flight operations and performance of the SOLSPEC instrument, including the engineering corrections, calibrations and improved procedures for aging corrections. Accordingly, an SSI reference spectrum from the UV to the NIR will be presented, together with its UV variability, as measured by SOLAR/SOLSPEC. Uncertainties on these measurements and comparisons with other instruments will be briefly discussed.

  4. Instrument-mounted displays for reducing cognitive load during surgical navigation.

    Science.gov (United States)

    Herrlich, Marc; Tavakol, Parnian; Black, David; Wenig, Dirk; Rieder, Christian; Malaka, Rainer; Kikinis, Ron

    2017-09-01

    Surgical navigation systems rely on a monitor placed in the operating room to relay information. Optimal monitor placement can be challenging in crowded rooms, and it is often not possible to place the monitor directly beside the situs. The operator must split attention between the navigation system and the situs. We present an approach for needle-based interventions to provide navigational feedback directly on the instrument and close to the situs by mounting a small display onto the needle. By mounting a small and lightweight smartwatch display directly onto the instrument, we are able to provide navigational guidance close to the situs and directly in the operator's field of view, thereby reducing the need to switch the focus of view between the situs and the navigation system. We devise a specific variant of the established crosshair metaphor suitable for the very limited screen space. We conduct an empirical user study comparing our approach to using a monitor and a combination of both. Results from the empirical user study show significant benefits for cognitive load, user preference, and general usability for the instrument-mounted display, while achieving the same level of performance in terms of time and accuracy compared to using a monitor. We successfully demonstrate the feasibility of our approach and potential benefits. With ongoing technological advancements, instrument-mounted displays might complement standard monitor setups for surgical navigation in order to lower cognitive demands and for improved usability of such systems.

  5. Biodiversity conservation and climate mitigation: What role can economic instruments play?

    NARCIS (Netherlands)

    Ring, I.; Drechsler, M.; Teeffelen, van A.J.A.; Irawan, S.; Venter, O.

    2010-01-01

    Tradable permits and intergovernmental fiscal transfers play an increasing role in both biodiversity conservation and climate mitigation. In comparison to regulatory and planning approaches these economic instruments offer a more flexible and cost-effective approach to biodiversity conservation.

  6. Radioisotope instruments

    CERN Document Server

    Cameron, J F; Silverleaf, D J

    1971-01-01

    International Series of Monographs in Nuclear Energy, Volume 107: Radioisotope Instruments, Part 1 focuses on the design and applications of instruments based on the radiation released by radioactive substances. The book first offers information on the physical basis of radioisotope instruments; technical and economic advantages of radioisotope instruments; and radiation hazards. The manuscript then discusses commercial radioisotope instruments, including radiation sources and detectors, computing and control units, and measuring heads. The text describes the applications of radioisotope ...

  7. Hydroclimate variability: comparing dendroclimatic records and future GCM scenarios

    International Nuclear Information System (INIS)

    Lapp, S.

    2008-01-01

    Drought events of the 20th century in western North America have been linked to teleconnections that influence climate variability on inter-annual and decadal to multi-decadal time scales. These teleconnections represent changes in sea surface temperatures (SSTs) in the tropical and extra-tropical regions of the Pacific Ocean, namely ENSO (El Nino-Southern Oscillation) and PDO (Pacific Decadal Oscillation), respectively, and in the Atlantic Ocean (AMO: Atlantic Multidecadal Oscillation), and also relate to atmospheric circulation patterns (PNA: Pacific-North American). A network of precipitation-sensitive tree-ring chronologies from Montana, Alberta, Saskatchewan and the NWT correlates highly with the climate moisture index (CMI), precipitation minus potential evapotranspiration (P-PET), thus capturing the long-term hydroclimatic variability of the region. Reconstructions of annual and seasonal CMI identify drought events in previous centuries that are more extreme in magnitude, frequency and duration than any recorded during the instrumental period. Variability in the future climate will include these natural climate cycles as well as modulations of these cycles affected by human-induced global warming. The proxy hydroclimate records derived from tree rings provide information on decadal and multi-decadal hydroclimatic variability for the past millennium, therefore offering a unique opportunity to validate the climate variability simulated by GCMs (Global Climate Models) on time scales longer than the short observational record allows. Developing scenarios of future variability depends on: 1) our understanding of the interaction of these teleconnections; and 2) identifying climate models that are able to accurately simulate the hydroclimatic variability detected in the instrumental and proxy records. (author)

  8. Systemic Approach - a Complexity Management Instrument

    Directory of Open Access Journals (Sweden)

    Vadim Dumitrascu

    2006-04-01

    Full Text Available The systemic principle uses deduction and induction, analysis and synthesis, inference and proference, in order to find out the interdependencies and the inner connections that drive complex organized entities. The true valences of this approach can be found neither in the simplistic models of the "in-out" type, nor in the "circular" models that fill Economics and Management handbooks and consecrate another kind of formalism, but in the constructivist-reflexive strategies used to explain economic and social structures.

  9. Monte Carlo simulations as a part of the configuration for neutron instruments

    International Nuclear Information System (INIS)

    Andersen, P.; Lefmann, K.; Theil Kuhn, L.; Willendrup, P.K.; Farhi, E.

    2004-01-01

    We demonstrate the use of simulations in the process of determining the optimal instrument setup by virtual experiments using the simulation package McStas. The focus of the analysis is the optimization of the multi-blade analyzer present at the RITA-II spectrometer at PSI for a test experiment with powder diffraction. The agreement in resolution and signal-to-background ratio between measurements and simulations shows the validity of the approach. For a sample with broadened Bragg peaks, the optimal instrument configuration could be established only on careful consideration. This shows that in general instrument simulations are approaching a state where they can assist users in selecting the optimal configuration and possibly demonstrate the feasibility of their experiments prior to allocated beamtime

  10. Response of Nuclear Power Plant Instrumentation Cables Exposed to Fire Conditions.

    Energy Technology Data Exchange (ETDEWEB)

    Muna, Alice Baca [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); LaFleur, Chris Bensdotter [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Brooks, Dusty Marie [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-09-01

    This report presents the results of instrumentation cable tests sponsored by the US Nuclear Regulatory Commission (NRC) Office of Nuclear Regulatory Research and performed at Sandia National Laboratories (SNL). The goal of the tests was to assess thermal and electrical response behavior under fire-exposure conditions for instrumentation cables and circuits. The test objective was to assess how severe radiant heating conditions surrounding an instrumentation cable affect current or voltage signals in an instrumentation circuit. A total of thirty-nine small-scale tests were conducted. Ten different instrumentation cables were tested, ranging from one conductor to eight twisted pairs. Because the focus of the tests was thermoset (TS) cables, only two of the ten cables had thermoplastic (TP) insulation and jacket material; the remaining eight used one of three different TS insulation and jacket materials. Two instrumentation cables from previous cable fire testing were included, one TS and one TP. Three test circuits were used to simulate instrumentation circuits present in nuclear power plants: a 4–20 mA current loop, a 10–50 mA current loop and a 1–5 VDC voltage loop. A regression analysis was conducted to determine key variables affecting signal leakage time.

  11. Classic electrocardiogram-based and mobile technology derived approaches to heart rate variability are not equivalent.

    Science.gov (United States)

    Guzik, Przemyslaw; Piekos, Caroline; Pierog, Olivia; Fenech, Naiman; Krauze, Tomasz; Piskorski, Jaroslaw; Wykretowicz, Andrzej

    2018-05-01

    We compared classic ECG-derived versus a mobile approach to heart rate variability (HRV) measurement. 29 young adult healthy volunteers underwent a simultaneous recording of heart rate using an ECG and a chest heart rate monitor at supine rest, during mental stress and active standing. Mean RR interval, Standard Deviation of Normal-to-Normal (SDNN) RR intervals, and Root Mean Square of the Successive Differences (RMSSD) between RR intervals were computed in 168 pairs of 5-minute epochs by in-house software on a PC (only sinus beats) and by the mobile application "ELITEHRV" on a smartphone (no beat type identification). ECG analysis showed that 33.9% of the recordings contained at least one non-sinus beat or artefact; the mobile app did not report this. The mean RR intervals were significantly longer (p = 0.0378), while SDNN (p = 0.0001) and RMSSD (p = 0.0199) were smaller for the mobile approach. Measures of identical HRV parameters by ECG-based and mobile approaches are not equivalent. Copyright © 2018 Elsevier B.V. All rights reserved.
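
    The time-domain HRV measures compared in this study have standard definitions, which can be sketched as follows. This is an illustrative implementation of the textbook formulas only, not the in-house software or the mobile app mentioned in the abstract:

```python
import numpy as np

def hrv_time_domain(rr_ms):
    """Mean RR, SDNN and RMSSD from a sequence of RR intervals in ms."""
    rr = np.asarray(rr_ms, dtype=float)
    mean_rr = rr.mean()
    sdnn = rr.std(ddof=1)                 # SD of normal-to-normal intervals
    diffs = np.diff(rr)
    rmssd = np.sqrt(np.mean(diffs ** 2))  # RMS of successive differences
    return mean_rr, sdnn, rmssd

mean_rr, sdnn, rmssd = hrv_time_domain([800, 810, 790, 805, 795])
```

    Note that these formulas assume the input contains only normal (sinus) beats; the study's central point is that skipping that beat-type screening, as the mobile app does, changes the resulting SDNN and RMSSD.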

  12. The long view: Causes of climate change over the instrumental period

    Science.gov (United States)

    Hegerl, G. C.; Schurer, A. P.; Polson, D.; Iles, C. E.; Bronnimann, S.

    2016-12-01

    The period of instrumentally recorded data has seen remarkable changes in climate, with periods of rapid warming, and periods of stagnation or cooling. A recent analysis of the observed temperature change from the instrumental record confirms that most of the warming recorded since the middle of the 20th century has been caused by human influences, but shows large uncertainty in separating the greenhouse gas from the aerosol response if accounting for model uncertainty. The contribution by natural forcing and internal variability to the recent warming is estimated to be small, but becomes more important when analysing climate change over earlier or shorter time periods. For example, the enigmatic early 20th century warming was a period of strong climate anomalies, including the US dustbowl drought, exceptional heat waves, and pronounced Arctic warming. Attribution results suggest that about half of the global warming over 1901-1950 was forced by greenhouse gas increases, with an anomalously strong contribution by climate variability, and contributions by natural forcing. Long-term variations in circulation are important for some regional climate anomalies. Precipitation is important for the impacts of climate change, and precipitation changes are uncertain in models. Analysis of the instrumental record suggests a human influence on mean and heavy precipitation, and supports climate model estimates of the spatial pattern of precipitation sensitivity to warming. Broadly, and particularly over the ocean, wet regions are getting wetter and dry regions are getting drier. In conclusion, the historical record provides evidence for a strong response to external forcings, supports climate models, and raises questions about multi-decadal variability.

  13. Teachers' Pedagogical Management and Instrumental Performance in Students of an Artistic Higher Education School

    Science.gov (United States)

    De La Cruz Bautista, Edwin

    2017-01-01

    This research aims to know the relationship between the variables teachers' pedagogical management and instrumental performance in students from an Artistic Higher Education School. It is a descriptive and correlational research that seeks to find the relationship between both variables. The sample of the study consisted of 30 students of the…

  14. DEVELOPING EVALUATION INSTRUMENT FOR MATHEMATICS EDUCATIONAL SOFTWARE

    Directory of Open Access Journals (Sweden)

    Wahyu Setyaningrum

    2012-02-01

    Full Text Available The rapid increase and availability of mathematics software, either for classroom or individual learning activities, presents a challenge for teachers. It has been argued that many products are limited in quality. Some of the more commonly used software products have been criticized for poor content, activities which fail to address some learning issues, poor graphics presentation, inadequate documentation, and other technical problems. The challenge for schools is to ensure that the educational software used in classrooms is appropriate and effective in supporting intended outcomes and goals. This paper aimed to develop an instrument for evaluating mathematics educational software in order to help teachers select appropriate software. The instrument considers educational aspects, including content, teaching and learning skill, interaction, and feedback and error correction; and technical aspects, including design, clarity, assessment and documentation, cost, and hardware and software interdependence. The instrument uses a checklist approach, one of the easiest and most effective methods for assessing the quality of educational software: the user simply puts a tick against each criterion. The criteria in this instrument are adapted and extended from standard evaluation instruments in several references.   Keywords: mathematics educational software, educational aspect, technical aspect.
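
    A checklist instrument of this kind reduces naturally to a simple scoring scheme. In the sketch below, the criterion names merely paraphrase the aspects listed in the abstract and are not the instrument's actual wording:

```python
# Hypothetical criteria grouped under the two aspects described above;
# the names paraphrase the abstract and are not the instrument's wording.
checklist = {
    "educational": ["content", "teaching and learning skill",
                    "interaction", "feedback and error correction"],
    "technical":   ["design", "clarity", "assessment and documentation",
                    "cost", "hardware and software interdependence"],
}

def score(ticks):
    """Fraction of ticked criteria per aspect; `ticks` is a set of names."""
    return {aspect: sum(c in ticks for c in items) / len(items)
            for aspect, items in checklist.items()}

result = score({"content", "interaction", "design"})
```

    The per-aspect fractions give a teacher a quick profile of where a software product is strong or weak, which is the intended use of the checklist.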

  15. MODERN TENDENCIES OF USING INTEGRATIVE APPROACH TO ORGANIZING ART TEACHERS’ INSTRUMENTAL AND PERFORMING TRAINING

    Directory of Open Access Journals (Sweden)

    Zhanna Kartashova

    2016-04-01

    Full Text Available In the article the modern tendencies of using an integrative approach to organizing art teachers’ instrumental and performing training are considered. The concept “integration” is singled out; it is defined as the process of recovery, replenishment and combination of previously isolated parts, moving the system towards a greater organic integrity. It is disclosed that integration means taking into account the multidimensional features of the elements being integrated: quantitative features accumulate, a new quality emerges, and the individual features of the integrated elements are preserved. It is argued that integration is the interrelation of art varieties in the process of their aesthetic and educational impact on pupils, that is, the whole perception of a work (its content and its emotional, rational, ethical and aesthetic form) in the unity of the tasks of developing artistic and aesthetic senses, thoughts, tastes and ideals in pupils. It is noted that integration in art pedagogy operates at three levels: internal artistic and aesthetic synthesis of various arts organically combined with students’ creative activity; interdisciplinary humanistic synthesis of the arts, the native language, literature and folklore; and the search for semantic blocks, images and concepts of universal meaning which, entering all spheres of human consciousness, such as science and mathematics, seamlessly combine them into a coherent system. It is noted that the most efficient approach is an appeal to the subjects of the Humanities cycle – music, literature and art. It is concluded that the design of training should start with an analysis of the prospective art teacher’s activity: it should be understood what the teacher has to do, not in a general formulation, but at the level of actions and operations.

  16. Semiparametric methods for estimation of a nonlinear exposure‐outcome relationship using instrumental variables with application to Mendelian randomization

    Science.gov (United States)

    Staley, James R.

    2017-01-01

    ABSTRACT Mendelian randomization, the use of genetic variants as instrumental variables (IV), can test for and estimate the causal effect of an exposure on an outcome. Most IV methods assume that the function relating the exposure to the expected value of the outcome (the exposure‐outcome relationship) is linear. However, in practice, this assumption may not hold. Indeed, often the primary question of interest is to assess the shape of this relationship. We present two novel IV methods for investigating the shape of the exposure‐outcome relationship: a fractional polynomial method and a piecewise linear method. We divide the population into strata using the exposure distribution, and estimate a causal effect, referred to as a localized average causal effect (LACE), in each stratum of population. The fractional polynomial method performs metaregression on these LACE estimates. The piecewise linear method estimates a continuous piecewise linear function, the gradient of which is the LACE estimate in each stratum. Both methods were demonstrated in a simulation study to estimate the true exposure‐outcome relationship well, particularly when the relationship was a fractional polynomial (for the fractional polynomial method) or was piecewise linear (for the piecewise linear method). The methods were used to investigate the shape of relationship of body mass index with systolic blood pressure and diastolic blood pressure. PMID:28317167
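
    The core stratification idea can be illustrated with a toy sketch: within each stratum, a localized causal effect is a Wald-type ratio of instrument–outcome to instrument–exposure covariances. This is a deliberately simplified illustration; the paper's actual methods stratify on the IV-free exposure (the exposure residual after removing the instrument's effect) to avoid collider bias, and then smooth the LACE estimates with fractional polynomials or a piecewise linear function:

```python
import numpy as np

def stratified_lace(g, x, y, n_strata=3):
    """Wald-ratio 'localized' causal effects cov(G,Y)/cov(G,X) within
    strata of the exposure distribution.

    Illustrative only: stratifying directly on the exposure X (as here)
    introduces collider bias, which is why the published methods stratify
    on the IV-free exposure instead.
    """
    g, x, y = (np.asarray(a, float) for a in (g, x, y))
    edges = np.quantile(x, np.linspace(0, 1, n_strata + 1))
    lace = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        m = (x >= lo) & (x <= hi)
        num = np.cov(g[m], y[m])[0, 1]   # instrument-outcome covariance
        den = np.cov(g[m], x[m])[0, 1]   # instrument-exposure covariance
        lace.append(num / den)
    return lace
```

    With a truly linear exposure–outcome relationship, the LACE estimates are roughly constant across strata; curvature in the underlying relationship shows up as a trend in the stratum-specific estimates, which the metaregression step then models.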

  17. Translation of an instrument. The US-Nordic Family Dynamics Nursing Research Project.

    Science.gov (United States)

    White, M; Elander, G

    1992-01-01

    Translation of a research instrument questionnaire from English to another language is analyzed in relation to principles involved, procedures followed, and problems confronted by nurse researchers from the US-Nordic Family Dynamics Nursing Research Project. Of paramount importance in translation are translation equivalency, congruent value orientation, and careful use of colloquialisms. It is important to recognize that copyright guidelines apply in the translation of an instrument. Approaches to solving instrument translation problems are discussed.

  18. Environmental policy instruments and technological change in the energy sector: findings from comparative empirical research

    International Nuclear Information System (INIS)

    Skjaerseth, J.B.; Christiansen, A.C.

    2006-01-01

    This article explores the extent to which, and in what ways, environmental policy instruments may affect patterns of environmentally friendly technological change in the energy sector. Our argument rests on the assumption, however, that technological change is also affected by the political context in which the instruments are applied and by the nature of the problem itself. Comparative empirical research involving different European countries, sectors and policy fields was examined, covering climate change, air pollution and wind power. The relationship between environmental policy instruments and technological change is extremely complex, not least due to the impact of other factors that may be more decisive than environmental ones. Against this backdrop, it was concluded that: 1) a portfolio of policy instruments works to the extent that the different types of instruments affect the different drivers and stages of the technological change needed to solve specific problems, and the need for such a portfolio depends on the technological challenge being faced; 2) voluntary approaches facilitated constructive corporate strategies, but mandatory approaches tended to be more effective in stimulating major technological change in the short term; 3) voluntary approaches work well in the short term when the problem to be solved is characterized by a lack of information and coordination. (author)

  19. Evaluation of Sensor Configurations for Robotic Surgical Instruments.

    Science.gov (United States)

    Gómez-de-Gabriel, Jesús M; Harwin, William

    2015-10-27

    Designing surgical instruments for robotic-assisted minimally-invasive surgery (RAMIS) is challenging due to constraints on the number and type of sensors imposed by considerations such as space or the need for sterilization. A new method for evaluating the usability of virtual teleoperated surgical instruments based on virtual sensors is presented. This method uses virtual prototyping of the surgical instrument with a dual physical interaction, which allows testing of different sensor configurations in a real environment. Moreover, the proposed approach has been applied to the evaluation of prototypes of a two-finger grasper for lump detection by remote pinching. In this example, the usability of a set of five different sensor configurations, with a different number of force sensors, is evaluated in terms of quantitative and qualitative measures in clinical experiments with 23 volunteers. As a result, the smallest number of force sensors needed in the surgical instrument that ensures the usability of the device can be determined. The details of the experimental setup are also included.

  20. Systems approach in energy management

    International Nuclear Information System (INIS)

    Dutta-Choudhury, K.

    1993-01-01

    Several years ago, when the author was working in instrumentation and controls in the chemicals division of a paper company, one experience had a lasting impact on his work approach: the systems approach. The maintenance manager told the author that a very important boiler instrument in the power plant had broken down and that delivery of the replacement needed to be expedited. The instrument was ordered over the phone from another city. The purchase order was personally delivered to the supplier's office and arrangements were made to put the instrument on the next flight. A week later the maintenance manager indicated that the instrument still had not arrived at the plant and he could not run the power plant; the company thus incurred substantial losses. Further inquiries showed that the instrument did indeed arrive at the plant stores on time but, in the absence of any instructions thereon, was not delivered to the power plant. The sense of urgency was lost in the existing delivery process; in other words, the process or system failed. The whole process, from requisitioning to delivery of ordered items, was analyzed and corrective procedures were incorporated to prevent repetitions. This brings up the subject of the systems approach in engineering management in general and in energy management in particular, which involves defining an objective and designing a system for an effective way of getting there.

  1. Definition of the limit of quantification in the presence of instrumental and non-instrumental errors. Comparison among various definitions applied to the calibration of zinc by inductively coupled plasma-mass spectrometry

    Science.gov (United States)

    Badocco, Denis; Lavagnini, Irma; Mondin, Andrea; Favaro, Gabriella; Pastore, Paolo

    2015-12-01

    A limit of quantification (LOQ) in the presence of instrumental and non-instrumental errors is proposed. It is defined theoretically by combining the two-component-variance regression and LOQ schemas already present in the literature, and applied to the calibration of zinc by the ICP-MS technique. At low concentration levels, the two-component-variance LOQ definition should always be used, above all when a clean room is not available. Three LOQ definitions were considered: one in the concentration domain and two in the signal domain. The LOQ computed in the concentration domain, proposed by Currie, was completed by adding the third-order terms of the Taylor expansion, because they are of the same order of magnitude as the second-order terms and therefore cannot be neglected. In this context, the error propagation was simplified by eliminating the correlation contributions through the use of independent random variables. Among the signal-domain definitions, particular attention was devoted to the recently proposed approach based on at least one significant digit in the measurement; the resulting LOQ values were very large, preventing quantitative analysis. It was found that the Currie schemas in the signal and concentration domains gave similar LOQ values, but the former formulation is to be preferred as it is more easily computable.
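
    For orientation, the simplest homoscedastic Currie-style LOQ for a straight-line calibration can be sketched as below. This textbook schema ignores the two-component variance structure and the higher-order Taylor terms that are the point of the paper:

```python
import numpy as np

def loq_currie(conc, signal, k_q=10.0):
    """Currie-style LOQ for a straight-line calibration y = a + b*c.

    Assumes constant (homoscedastic) noise, so LOQ = k_q * s_y / b with
    k_q = 10, i.e. 10% relative standard deviation at the LOQ.  This is
    the textbook schema only; it omits the two-component variance model
    and the higher-order Taylor terms discussed above.
    """
    conc = np.asarray(conc, float)
    signal = np.asarray(signal, float)
    b, a = np.polyfit(conc, signal, 1)                   # slope, intercept
    resid = signal - (a + b * conc)
    s_y = np.sqrt(np.sum(resid ** 2) / (len(conc) - 2))  # residual SD
    return k_q * s_y / b
```

    In the two-component model of the paper, s_y would instead depend on concentration (an instrumental plus a non-instrumental contribution), which is what shifts the LOQ upward when no clean room is available.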

  2. Incorporation of personal computers in a research reactor instrumentation system for data monitoring and analysis

    International Nuclear Information System (INIS)

    Leopando, L.S.

    1998-01-01

    The research contract was implemented by obtaining off-the-shelf personal computer hardware and data acquisition cards, designing the interconnection with the instrumentation system, writing and debugging the software, and assembling and testing the set-up. The hardware was designed to make all variables monitored by the instrumentation system accessible to the computers, without requiring any major modification of the instrumentation system and without compromising reactor safety in any way. The computer hardware addition was also designed to have no effect on any existing function of the instrumentation system. The software was designed to implement only graphical display and automated logging of reactor variables. Additional functionality could easily be added in the future through software revision, because all the reactor variables are already available in the computer. It would even be possible to ''close the loop'' and control the reactor through software. It was found that most of the effort in an undertaking of this sort lies in software development, but the job can be done even by non-computer-specialized reactor people working with programming languages they are already familiar with. It was also found that the continuing rapid advance of personal computer technology makes it essential that such a project be undertaken with the inevitability of future hardware upgrading in mind. The hardware techniques and the software developed may find applicability in other research reactors, especially those with a generic analog TRIGA research reactor console. (author)
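
    The automated-logging function described above amounts to periodically sampling all monitored variables and appending them to a log file. A minimal sketch follows; the channel-reading callable stands in for the data-acquisition-card driver, and none of these names come from the actual system:

```python
import csv
import time

def log_variables(read_channels, path, n_samples, interval_s=1.0):
    """Append timestamped readings of all monitored variables to a CSV log.

    `read_channels` is any callable returning a dict {name: value}; here
    it stands in for the data-acquisition-card driver (an assumption --
    the actual software's interfaces are not described above).
    """
    with open(path, "w", newline="") as f:
        writer = None
        for _ in range(n_samples):
            sample = read_channels()
            if writer is None:
                # Header row: timestamp column followed by variable names.
                writer = csv.DictWriter(f, fieldnames=["t", *sample])
                writer.writeheader()
            writer.writerow({"t": time.time(), **sample})
            time.sleep(interval_s)
```

    Keeping the acquisition behind a single callable is what makes the design read-only with respect to the instrumentation system: the logger can be replaced or extended (e.g. with graphical display) without touching the reactor-side hardware.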

  3. Validation of the Organizational Culture Assessment Instrument

    Science.gov (United States)

    Heritage, Brody; Pollock, Clare; Roberts, Lynne

    2014-01-01

    Organizational culture is a commonly studied area in industrial/organizational psychology due to its important role in workplace behaviour, cognitions, and outcomes. Jung et al.'s [1] review of the psychometric properties of organizational culture measurement instruments noted many instruments have limited validation data despite frequent use in both theoretical and applied situations. The Organizational Culture Assessment Instrument (OCAI) has had conflicting data regarding its psychometric properties, particularly regarding its factor structure. Our study examined the factor structure and criterion validity of the OCAI using robust analysis methods on data gathered from 328 (females = 226, males = 102) Australian employees. Confirmatory factor analysis supported a four factor structure of the OCAI for both ideal and current organizational culture perspectives. Current organizational culture data demonstrated expected reciprocally-opposed relationships between three of the four OCAI factors and the outcome variable of job satisfaction but ideal culture data did not, thus indicating possible weak criterion validity when the OCAI is used to assess ideal culture. Based on the mixed evidence regarding the measure's properties, further examination of the factor structure and broad validity of the measure is encouraged. PMID:24667839

  4. Validation of the organizational culture assessment instrument.

    Directory of Open Access Journals (Sweden)

    Brody Heritage

    Full Text Available Organizational culture is a commonly studied area in industrial/organizational psychology due to its important role in workplace behaviour, cognitions, and outcomes. Jung et al.'s [1] review of the psychometric properties of organizational culture measurement instruments noted many instruments have limited validation data despite frequent use in both theoretical and applied situations. The Organizational Culture Assessment Instrument (OCAI) has had conflicting data regarding its psychometric properties, particularly regarding its factor structure. Our study examined the factor structure and criterion validity of the OCAI using robust analysis methods on data gathered from 328 (females = 226, males = 102) Australian employees. Confirmatory factor analysis supported a four factor structure of the OCAI for both ideal and current organizational culture perspectives. Current organizational culture data demonstrated expected reciprocally-opposed relationships between three of the four OCAI factors and the outcome variable of job satisfaction but ideal culture data did not, thus indicating possible weak criterion validity when the OCAI is used to assess ideal culture. Based on the mixed evidence regarding the measure's properties, further examination of the factor structure and broad validity of the measure is encouraged.

  5. Software for simulation and design of neutron scattering instrumentation

    DEFF Research Database (Denmark)

    Bertelsen, Mads

    Simulation of neutron scattering instrumentation is used when designing instrumentation, but also to understand instrumental effects on the measured scattering data. The Monte Carlo ray-tracing package McStas is among the most popular, capable of simulating the path of each neutron through the instrument using an easy-to-learn language. The subject of the defended thesis is contributions to the McStas language in the form of the software package guide_bot and the Union components. The guide_bot package simplifies the process of optimizing neutron guides by writing the Mc... ...designed using the software. The Union components use a new approach to the simulation of samples in McStas. The properties of a sample are split into geometrical and material, simplifying user input and allowing the construction of complicated geometries such as sample environments. Multiple scattering... ...from conventional choices.

  6. Cost-effective design of economic instruments in nutrition policy

    DEFF Research Database (Denmark)

    Jensen, Jørgen Dejgård; Smed, Sinne

    2007-01-01

    This paper addresses the potential for using economic regulation, e.g. taxes or subsidies, as instruments to combat the increasing problems of inappropriate diets, which lead to health problems such as obesity, type-2 diabetes, cardiovascular diseases etc. in most countries. Such policy measures may... The analyses demonstrate that the average cost-effectiveness with regard to changing the intake of selected nutritional variables can be improved by 10–30 per cent if taxes/subsidies are targeted against these nutrients, compared with targeting selected food categories. Finally, the paper raises a range of issues which need to be investigated further before firm conclusions about the suitability of economic instruments in nutrition policy can be drawn.

  7. Instrumentation

    International Nuclear Information System (INIS)

    Umminger, K.

    2008-01-01

    A proper measurement of the relevant single and two-phase flow parameters is the basis for the understanding of many complex thermal-hydraulic processes. Reliable instrumentation is therefore necessary for the interaction between analysis and experiment, especially in the field of nuclear safety research, where postulated accident scenarios have to be simulated in experimental facilities and predicted by complex computer code systems. The so-called conventional instrumentation for the measurement of e.g. pressures, temperatures, pressure differences and single-phase flow velocities is still a solid basis for the investigation and interpretation of many phenomena and especially for the understanding of the overall system behavior. Measurement data from such instrumentation still serve in many cases as a database for thermal-hydraulic system codes. However, some special instrumentation, such as online concentration measurement for boric acid in the water phase or for non-condensables in the steam atmosphere, as well as flow visualization techniques, were further developed and successfully applied in recent years. Concerning the modeling needs for advanced thermal-hydraulic codes, significant advances have been accomplished in the last few years in local instrumentation technology for two-phase flow through the application of new sensor techniques, optical or beam methods and electronic technology. This paper will give insight into the current state of instrumentation technology for safety-related thermal-hydraulic experiments. Advantages and limitations of some measurement processes and systems will be indicated, as well as trends and possibilities for further development. Aspects of instrumentation in operating reactors will also be mentioned.

  8. Kinetic characteristics of the gait of a musician carrying or not his instrument

    Directory of Open Access Journals (Sweden)

    Carlos Bolli Mota

    2009-01-01

    Full Text Available The integrity of the locomotor system can be compromised by the transport of certain objects, especially when done in an inadequate manner. Due to their weight and size, the transport of musical instruments can contribute to body dysfunctions in musicians who frequently have to carry their instruments, influencing balance and body posture. Thus, the ground reaction force was investigated during the gait of a musician carrying or not carrying his instrument. Two AMTI (Advanced Mechanical Technologies, Inc.) platforms were used for kinetic data acquisition. A total of 40 measurements were obtained for gait and balance: 20 without carrying the instrument and 20 while carrying it. The t test showed significant differences between the two situations for all variables analyzed. The results suggest that the locomotor system suffers alterations when carrying any kind of load, as was the case here, in which the subject carried 7.75% of his own weight.

  9. Minilaparoscopic hysterectomy made easy: first report on alternative instrumentation and new integrated energy platform.

    Science.gov (United States)

    Ng, Ying Woo; Lim, Li Min; Fong, Yoke Fai

    2014-05-01

    Minilaparoscopy is an attractive approach for hysterectomy due to advantages such as reduced morbidities and enhanced cosmesis. However, it has not been popularized due to the lack of suitable instruments and high technical demand. We aim to highlight the first case of minilaparoscopic hysterectomy reported in Asia and the use of a new integrated energy platform, Thunderbeat. We would like to propose an alternative method of instrumentation, so as to improve the feasibility and safety of minilaparoscopic hysterectomy. The first minilaparoscopic hysterectomy in Singapore was successfully completed using the alternative instrumentation and new energy platform. There was no conversion or complication during the surgery. The patient recovered uneventfully. To our knowledge, this is the first report on the use of such alternative instrumentation. This approach in instrumentation and the new energy platform will improve the feasibility and speed of the surgery and ensure safety in our patients. © 2014 The Authors. Journal of Obstetrics and Gynaecology Research © 2014 Japan Society of Obstetrics and Gynecology.

  10. Multi-Institutional Development of a Mastoidectomy Performance Evaluation Instrument.

    Science.gov (United States)

    Kerwin, Thomas; Hittle, Brad; Stredney, Don; De Boeck, Paul; Wiet, Gregory

    A method for rating surgical performance of a mastoidectomy procedure that applies universally across teaching institutions has not yet been devised. This work describes the development of a rating instrument created by a multi-institutional consortium. Using a participatory design and a modified Delphi approach, a multi-institutional group of expert otologists constructed a 15-element task-based checklist for evaluating mastoidectomy performance. This instrument was further refined into a 14-element checklist focusing on the concept of safety after it was used to rate a large and varied population of performances. Twelve otolaryngological surgical training programs in the United States participated. A total of 14 surgeons from 12 different institutions took part in the construction of the instrument. Using these 14 experts and a literature review, individual metrics were identified, rated as to their level of importance, and operationally defined to create a rating scale for mastoidectomy performance. Initial use of the rating scale showed modest rater agreement. The operational definitions of individual metrics were then modified to emphasize "safe" as opposed to "proper" technique, and a second rating instrument was developed based on this feedback. Using a consensus-building approach with multiple rounds of communication between experts is a feasible way to construct a rating instrument for mastoidectomy. Expert opinion alone using a Delphi method provides face and content validity evidence; however, this is not sufficient to develop a universally acceptable rating instrument. A continued process of development and experimentation to demonstrate evidence for reliability and validity, making use of a large population of raters and performances, is necessary to achieve universal acceptance. Copyright © 2017 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  11. Introducing instrumental variables in the LS-SVM based identification framework

    NARCIS (Netherlands)

    Laurain, V.; Zheng, W-X.; Toth, R.

    2011-01-01

    Least-Squares Support Vector Machines (LS-SVM) represent a promising approach to identify nonlinear systems via nonparametric estimation of the nonlinearities in a computationally and stochastically attractive way. All the methods dedicated to the solution of this problem rely on the minimization of
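
    The record above is truncated, but the classical parametric baseline that such nonparametric IV identification methods generalize, two-stage least squares (2SLS), can be sketched in a few lines. This is a generic illustration of the instrumental-variable idea, not the LS-SVM method of the paper:

```python
import numpy as np

def two_stage_least_squares(z, x, y):
    """Classical 2SLS for one endogenous regressor x and one instrument z.

    Stage 1 regresses x on z; stage 2 regresses y on the stage-1 fitted
    values.  Intercepts are included in both stages.  Returns the causal
    slope estimate.
    """
    z, x, y = (np.asarray(a, float) for a in (z, x, y))
    Z = np.column_stack([np.ones_like(z), z])
    g = np.linalg.lstsq(Z, x, rcond=None)[0]   # stage 1 coefficients
    x_hat = Z @ g                              # exogenous part of x
    X = np.column_stack([np.ones_like(x_hat), x_hat])
    b = np.linalg.lstsq(X, y, rcond=None)[0]   # stage 2 coefficients
    return b[1]
```

    When x is correlated with the error term (e.g. through an unobserved confounder), ordinary least squares is biased while the 2SLS slope remains consistent, provided the instrument z affects x but is independent of the error.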

  12. Development and validation of an instrument to assess perceived social influence on health behaviors

    Science.gov (United States)

    HOLT, CHERYL L.; CLARK, EDDIE M.; ROTH, DAVID L.; CROWTHER, MARTHA; KOHLER, CONNIE; FOUAD, MONA; FOUSHEE, RUSTY; LEE, PATRICIA A.; SOUTHWARD, PENNY L.

    2012-01-01

    Assessment of social influence on health behavior is often approached through a situational context. The current study adapted an existing, theory-based instrument from another content domain to assess Perceived Social Influence on Health Behavior (PSI-HB) among African Americans, using an individual-difference approach. The adapted instrument was found to have high internal reliability (α = .81–.84) and acceptable test–retest reliability (r = .68–.85). A measurement model revealed a three-factor structure and supported the theoretical underpinnings. Scores were predictive of health behaviors, particularly among women. Future research using the new instrument may have applied value in assessing social influence in the context of health interventions. PMID:20522506

  13. Instrument accuracy in reactor vessel inventory tracking systems

    International Nuclear Information System (INIS)

    Anderson, J.L.; Anderson, R.L.; Morelock, T.C.; Hauang, T.L.; Phillips, L.E.

    1986-01-01

    Instrumentation needs for detection of inadequate core cooling. Studies of the Three Mile Island accident identified the need for additional instrumentation to detect inadequate core cooling (ICC) in nuclear power plants. Industry studies by plant owners and reactor vendors supported the conclusion that improvements were needed to help operators diagnose the approach to or existence of ICC as well as to provide more complete information for operator control of safety injection flow to minimize the consequences of such an accident. In 1980, the US Nuclear Regulatory Commission (NRC) required further studies by the industry and described ICC instrumentation design requirements that included human factors and environmental considerations. On December 10, 1982, NRC issued to Babcock and Wilcox (B and W) licensees orders for Modification of License and transmitted to pressurized water reactor licensees Generic Letter 82-28 to inform them of the revised NRC requirements. The instrumentation requirements include upgraded subcooling margin monitors (SMM), upgraded core exit thermocouples (CET), and installation of a reactor coolant inventory tracking system. NRC Regulatory Guide 1.97, which covers accident monitoring instrumentation, was revised (Rev. 3) to be consistent with the requirements of item II.F.2 of NUREG-0737

  14. Motivation for Instrument Education: A Study from the Perspective of Expectancy-Value and Flow Theories

    Science.gov (United States)

    Burak, Sabahat

    2014-01-01

    Problem Statement: In the process of instrument education, students being unwilling (lacking motivation) to play an instrument or to practise is a problem that educators frequently face. Recognizing the factors motivating the students will yield useful results for instrument educators in terms of developing correct teaching methods and approaches.…

  15. A state variable approach to the BESSY II local beam-position-feedback system

    International Nuclear Information System (INIS)

    Gilpatrick, J.D.; Khan, S.; Kraemer, D.

    1996-01-01

    At the BESSY II facility, stability of the electron beam position and angle near insertion devices (IDs) is of utmost importance. Disturbances due to ground motion could result in unwanted broad-bandwidth beam-jitter which decreases the electron (and resultant photon) beam's effective brightness. Therefore, feedback techniques must be used. Operating over a frequency range of 100 Hz, a local feedback system will correct these beam-trajectory errors using the four bumps around IDs. This paper reviews how the state-variable feedback approach can be applied to real-time correction of these beam position and angle errors. A frequency-domain solution showing beam jitter reduction is presented. Finally, this paper reports results of a beam-feedback test at BESSY I
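
    The state-variable approach can be illustrated in a few lines: model the position and angle errors as a state vector, apply a feedback gain, and verify that the closed loop drives the jitter to zero. The matrices and gain below are illustrative placeholders, not the BESSY II lattice model.

```python
import numpy as np

# Toy discrete-time state-variable feedback sketch (illustrative matrices,
# not the BESSY II model). State x = [position error, angle error].
A = np.array([[1.0, 0.1],
              [0.0, 1.0]])    # position drifts with angle over one sample
B = np.array([[0.0],
              [0.5]])         # the corrector bump kicks the angle

K = np.array([[20.0, 4.0]])   # gain chosen so (A - B K) is nilpotent (deadbeat)

x = np.array([[1.0], [0.5]])  # initial beam-jitter disturbance
for _ in range(10):
    u = -K @ x                # state-feedback control law u = -K x
    x = A @ x + B @ u

assert np.allclose(x, 0.0)    # the error is removed after two samples
```

    With a deadbeat gain the error vanishes in two samples; a real system would trade such aggressiveness against noise amplification over the correction bandwidth.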

  16. A standardized approach to study human variability in isometric thermogenesis during low-intensity physical activity

    Directory of Open Access Journals (Sweden)

    Delphine eSarafian

    2013-07-01

    Full Text Available Limitations of current methods: The assessment of human variability in various compartments of daily energy expenditure (EE) under standardized conditions is well defined at rest (as basal metabolic rate and thermic effect of feeding, and currently under validation for assessing the energy cost of low-intensity dynamic work. However, because physical activities of daily life consist of a combination of both dynamic and isometric work, there is also a need to develop standardized tests for assessing human variability in the energy cost of low-intensity isometric work. Experimental objectives: Development of an approach to study human variability in isometric thermogenesis by incorporating a protocol of intermittent leg press exercise of varying low-intensity isometric loads with measurements of EE by indirect calorimetry. Results: EE was measured in the seated position with the subject at rest or while intermittently pressing both legs against a press-platform at 5 low-intensity isometric loads (+5, +10, +15, +20 and +25 kg force), each consisting of a succession of 8 cycles of press (30 s) and rest (30 s). EE, integrated over each 8-min period of the intermittent leg press exercise, was found to increase linearly across the 5 isometric loads with a correlation coefficient (r > 0.9) for each individual. The slope of this EE-Load relationship, which provides the energy cost of this standardized isometric exercise expressed per kg force applied intermittently (30 s in every min), was found to show good repeatability when assessed in subjects who repeated the same experimental protocol on 3 separate days: its low intra-individual coefficient of variation (CV) of ~10% contrasted with its much higher inter-individual CV of 35%; the latter being mass-independent but partly explained by height. Conclusion: This standardized approach to study isometric thermogenesis opens up a new avenue for research in EE phenotyping and metabolic predisposition to obesity.
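
    The core analysis described — a linear fit of EE against isometric load, with the slope as the per-kg-force energy cost and its day-to-day CV as the repeatability measure — can be sketched as follows. All numbers are invented for illustration, not taken from the study.

```python
import numpy as np

# Hypothetical data for one subject: EE integrated over each 8-min period
# at 5 intermittent isometric leg-press loads (kg force). Invented values.
loads = np.array([5, 10, 15, 20, 25], dtype=float)
ee = np.array([1.52, 1.60, 1.71, 1.79, 1.90])    # e.g. kcal/min

slope, intercept = np.polyfit(loads, ee, 1)       # energy cost per kg force
r = np.corrcoef(loads, ee)[0, 1]
assert r > 0.9                                    # linearity reported in the study

# Repeatability across 3 test days: intra-individual CV of the slope
slopes_3_days = np.array([slope, slope * 1.08, slope * 0.94])
cv_intra = slopes_3_days.std(ddof=1) / slopes_3_days.mean() * 100
print(f"slope = {slope:.4f} per kg, r = {r:.3f}, intra-individual CV = {cv_intra:.1f}%")
```

    The inter-individual CV would be computed the same way across the slopes of different subjects.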

  17. Instrumentation requirements from the user's view

    International Nuclear Information System (INIS)

    Harsha, P.T.

    1988-01-01

    The use of combustor diagnostics is considered from the point of view of demonstration of performance of an airbreathing hypersonic engine. The basic need is seen to be that of providing the data necessary to verify performance predictions for the engine as installed in the airplane. This necessitates the use of a diagnostics capability that can provide the inputs required by the computational analyses that will be used to assess this performance. Because of the cost of ground test facilities, a premium is placed on measurement technique reliability and redundancy of instrumentation. A mix of nonintrusive optical techniques and probe-based measurements is seen to be the best approach using current diagnostics capability; one such instrument mix is outlined for a ramjet/scramjet test program. 11 references

  18. DOE Handbook: Guide to good practices evaluation instrument examples

    International Nuclear Information System (INIS)

    1997-01-01

    Training evaluation determines a training program's effectiveness in meeting its intended purpose: producing competent employees. Evaluation is the quality assurance component of a systematic approach to training. This guide provides information on evaluation instruments used to gather employee, supervisor, and instructor feedback to identify strengths and weaknesses of training programs at DOE facilities. It should be used in conjunction with ''DOE Training Program Handbook: A Systematic Approach to Training'' and ''DOE Handbook, Alternative Systematic Approaches to Training.''

  20. Variable Lifting Index (VLI): A New Method for Evaluating Variable Lifting Tasks.

    Science.gov (United States)

    Waters, Thomas; Occhipinti, Enrico; Colombini, Daniela; Alvarez-Casado, Enrique; Fox, Robert

    2016-08-01

    We seek to develop a new approach for analyzing the physical demands of highly variable lifting tasks through an adaptation of the Revised NIOSH (National Institute for Occupational Safety and Health) Lifting Equation (RNLE) into a Variable Lifting Index (VLI). Many jobs contain individual lifts that vary from lift to lift due to the task requirements, and the NIOSH Lifting Equation is not suitable in its present form to analyze such variable lifting tasks. In extending the prior work on the VLI, two procedures are presented to allow users to analyze variable lifting tasks. One approach involves sampling the lifting tasks performed by a worker over a shift, calculating the Frequency-Independent Lift Index (FILI) for each sampled lift, and aggregating the FILI values into six categories. The Composite Lift Index (CLI) equation is then used with lifting index (LI) category frequency data to calculate the VLI. The second approach employs a detailed systematic collection of lifting task data from production and/or organizational sources. The data are organized into simplified task parameter categories and further aggregated into six FILI categories, which also use the CLI equation to calculate the VLI. The two procedures allow practitioners to apply the VLI method systematically to a variety of work situations where highly variable lifting tasks are performed. The scientific basis for the VLI procedure is similar to that for the CLI originally presented by NIOSH; however, the VLI method remains to be validated. The VLI method allows an analyst to assess highly variable manual lifting jobs in which the task characteristics vary from lift to lift during a shift. © 2015, Human Factors and Ergonomics Society.
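
    The sampling-based procedure — compute a FILI per sampled lift, then aggregate into six categories — can be sketched as below. The load constant follows the RNLE (23 kg), but the per-lift multiplier products are random placeholders standing in for the NIOSH multiplier tables, and the final CLI aggregation step is omitted.

```python
import numpy as np

# Sketch of the first VLI procedure: sample individual lifts over a shift,
# compute a Frequency-Independent Lift Index (FILI) per lift, and aggregate
# the FILI values into six categories. Placeholder values, not NIOSH tables.
rng = np.random.default_rng(0)

LC = 23.0                                  # RNLE load constant, kg
loads = rng.uniform(3, 18, 200)            # sampled lift weights, kg
multipliers = rng.uniform(0.45, 1.0, 200)  # stand-in for HM*VM*DM*AM*CM
firwl = LC * multipliers                   # frequency-independent RWL
fili = loads / firwl                       # FILI for each sampled lift

# aggregate the FILI values into six equal-width categories
edges = np.linspace(fili.min(), fili.max(), 7)
category = np.clip(np.digitize(fili, edges) - 1, 0, 5)
counts = np.bincount(category, minlength=6)

assert counts.sum() == len(loads)
print("lifts per FILI category:", counts)
```

    The per-category frequency data would then feed the CLI equation to produce the VLI.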

  1. Neural correlates of recognition and naming of musical instruments.

    Science.gov (United States)

    Belfi, Amy M; Bruss, Joel; Karlan, Brett; Abel, Taylor J; Tranel, Daniel

    2016-10-01

    Retrieval of lexical (names) and conceptual (semantic) information is frequently impaired in individuals with neurological damage. One category of items that is often affected is musical instruments. However, distinct neuroanatomical correlates underlying lexical and conceptual knowledge for musical instruments have not been identified. We used a neuropsychological approach to explore the neural correlates of knowledge retrieval for musical instruments. A large sample of individuals with focal brain damage (N = 298) viewed pictures of 16 musical instruments and were asked to name and identify each instrument. Neuroanatomical data were analyzed with a proportional MAP-3 method to create voxelwise lesion proportion difference maps. Impaired naming (lexical retrieval) of musical instruments was associated with damage to the left temporal pole and inferior pre- and postcentral gyri. Impaired recognition (conceptual knowledge retrieval) of musical instruments was associated with a more broadly and bilaterally distributed network of regions, including ventromedial prefrontal cortices, occipital cortices, and superior temporal gyrus. The findings extend our understanding of how musical instruments are processed at the neural systems level, and elucidate factors that may explain why brain damage may or may not produce anomia or agnosia for musical instruments. Our findings also help inform broader understanding of category-related knowledge mapping in the brain, as musical instruments possess several characteristics that are similar to various other categories of items: They are inanimate and highly manipulable (similar to tools), produce characteristic sounds (similar to animals), and require fine-grained visual differentiation between each other (similar to people). (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  2. Instrumental interaction

    OpenAIRE

    Luciani , Annie

    2007-01-01

    International audience; The expression instrumental interaction has been introduced by Claude Cadoz to identify a human-object interaction during which a human manipulates a physical object - an instrument - in order to perform a manual task. Classical examples of instrumental interaction are all the professional manual tasks: playing violin, cutting fabrics by hand, moulding a paste, etc. Instrumental interaction differs from other types of interaction (called symbolic or iconic interactio...

  3. On Darboux's approach to R-separability of variables. Classification of conformally flat 4-dimensional binary metrics

    International Nuclear Information System (INIS)

    Szereszewski, A; Sym, A

    2015-01-01

    The standard method of separation of variables in PDEs called the Stäckel–Robertson–Eisenhart (SRE) approach originated in the papers by Robertson (1928 Math. Ann. 98 749–52) and Eisenhart (1934 Ann. Math. 35 284–305) on separability of variables in the Schrödinger equation defined on a pseudo-Riemannian space equipped with orthogonal coordinates, which in turn were based on the purely classical mechanics results by Paul Stäckel (1891, Habilitation Thesis, Halle). These still fundamental results have been further extended in diverse directions by e.g. Havas (1975 J. Math. Phys. 16 1461–8; J. Math. Phys. 16 2476–89) or Koornwinder (1980 Lecture Notes in Mathematics 810 (Berlin: Springer) pp 240–63). The involved separability is always ordinary (factor R = 1) and regular (maximum number of independent parameters in separation equations). A different approach to separation of variables was initiated by Gaston Darboux (1878 Ann. Sci. E.N.S. 7 275–348) which has been almost completely forgotten in today’s research on the subject. Darboux’s paper was devoted to the so-called R-separability of variables in the standard Laplace equation. At the outset he did not make any specific assumption about the separation equations (this is in sharp contrast to the SRE approach). After impressive calculations Darboux obtained a complete solution of the problem. He found not only eleven cases of ordinary separability Eisenhart (1934 Ann. Math. 35 284–305) but also Darboux–Moutard–cyclidic metrics (Bôcher 1894 Ueber die Reihenentwickelungen der Potentialtheorie (Leipzig: Teubner)) and non-regularly separable Dupin-cyclidic metrics as well. In our previous paper Darboux’s approach was extended to the case of the stationary Schrödinger equation on Riemannian spaces admitting orthogonal coordinates. In particular the class of isothermic metrics was defined (isothermicity of the metric is a necessary condition for its R-separability). An important sub
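
    For orientation, R-separability (as opposed to ordinary separability) means that solutions of the Laplace equation are sought in a multiplicative form with a fixed modulation factor R common to the whole solution family; R ≡ 1 recovers ordinary separability. Schematically:

```latex
\Delta u = 0, \qquad
u(x^1,\dots,x^n) \;=\; R(x^1,\dots,x^n)\,\prod_{i=1}^{n}\varphi_i(x^i), \qquad
R \equiv 1 \;\Rightarrow\; \text{ordinary separability}.
```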

  4. Instrumentation development

    International Nuclear Information System (INIS)

    Ubbes, W.F.; Yow, J.L. Jr.

    1988-01-01

    Instrumentation is developed for the Civilian Radioactive Waste Management Program to meet several different (and sometimes conflicting) objectives. This paper addresses instrumentation development for data needs that are related either directly or indirectly to a repository site, but does not touch on instrumentation for work with waste forms or other materials. Consequently, this implies a relatively large scale for the measurements, and an in situ setting for instrument performance. In this context, instruments are needed for site characterization to define phenomena, develop models, and obtain parameter values, and for later design and performance confirmation testing in the constructed repository. The former set of applications is more immediate, and is driven by the needs of program design and performance assessment activities. A host of general technical and nontechnical issues have arisen to challenge instrumentation development. Instruments can be classed into geomechanical, geohydrologic, or other specialty categories, but these issues cut across artificial classifications. These issues are outlined. Despite this imposing list of issues, several case histories are cited to evaluate progress in the area

  5. Measuring the Quality of Service in the Financial Area of a Public University: Development and Validation of the Instrument

    Directory of Open Access Journals (Sweden)

    Víctor Manuel Alcantar Enríquez

    2014-12-01

    Full Text Available This research, consisting of a descriptive study with a non-experimental design, involved an analysis to determine the validity and reliability of an instrument composed of seventeen items aimed at assessing the quality of service in the financial area of a public university by means of four variables: tangibility, reliability, responsiveness and empathy. The methodological strategy included the design of the measuring instrument; verification of the validity of content and construct; and analysis of internal consistency by means of Cronbach’s alpha. The instrument was applied to 152 users of the service, attaining a reliability coefficient of 0.943. The results show that with respect to specific concepts, the questions were clear; nevertheless we found it necessary to relocate items and rename variables, which resulted in a valid and reliable instrument for measuring the quality of service in the context under study.
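
    The internal-consistency step of such a validation — Cronbach's alpha over the item responses — can be sketched as follows. The responses below are simulated (152 users × 17 Likert items), not the study's data.

```python
import numpy as np

# Simulated 1-5 Likert responses: a shared "true" service-quality score plus
# item-level noise, so the 17 items are positively correlated.
rng = np.random.default_rng(42)
true_score = rng.normal(size=(152, 1))
items = true_score + 0.5 * rng.normal(size=(152, 17))
items = np.clip(np.round(items * 1.2 + 3), 1, 5)   # map onto a 1-5 scale

def cronbach_alpha(x):
    # alpha = k/(k-1) * (1 - sum of item variances / variance of total score)
    k = x.shape[1]
    item_vars = x.var(axis=0, ddof=1).sum()
    total_var = x.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

alpha = cronbach_alpha(items)
print(f"alpha = {alpha:.3f}")
assert 0.7 < alpha < 1.0   # conventionally "acceptable" internal consistency
```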

  6. Nuclear instrumentation for the industrial measuring systems

    International Nuclear Information System (INIS)

    Normand, S.

    2010-01-01

    This work deals with nuclear instrumentation and its applications to industry, power plants, fuel reprocessing plants and, finally, homeland security. The first part concerns reactor instrumentation: in-core and ex-core measurement systems. Uranium fission ionization chambers are introduced together with their acquisition systems, especially Campbell-mode systems. Some progress has been made on anticipating sensor failures. The second part deals with reprocessing plants and the instrumentation associated with nuclear waste management. Proportional counter techniques are discussed, especially Helium-3 counters, along with a new electronics concept for reprocessing-plant waste measurement (a single multipurpose acquisition electronics). Nuclear safety and security for people and the homeland are then introduced: first, a new approach to operational dosimetric measurement; second, a new kind of organic scintillator material and its associated electronics, with embedded real-time signal processing that makes neutron-gamma discrimination possible even in solid organic scintillators. Finally, the conclusion points out future trends in research and development on nuclear instrumentation for the coming years. (author) [fr

  7. Instrument design and optimization using genetic algorithms

    International Nuclear Information System (INIS)

    Hoelzel, Robert; Bentley, Phillip M.; Fouquet, Peter

    2006-01-01

    This article describes the design of highly complex physical instruments by using a canonical genetic algorithm (GA). The procedure can be applied to all instrument designs where performance goals can be quantified. It is particularly suited to the optimization of instrument design where local optima in the performance figure of merit are prevalent. Here, a GA is used to evolve the design of the neutron spin-echo spectrometer WASP which is presently being constructed at the Institut Laue-Langevin, Grenoble, France. A comparison is made between this artificial intelligence approach and the traditional manual design methods. We demonstrate that the search of parameter space is more efficient when applying the genetic algorithm, and the GA produces a significantly better instrument design. Furthermore, it is found that the GA increases flexibility, by facilitating the reoptimization of the design after changes in boundary conditions during the design phase. The GA also allows the exploration of 'nonstandard' magnet coil geometries. We conclude that this technique constitutes a powerful complementary tool for the design and optimization of complex scientific apparatus, without replacing the careful thought processes employed in traditional design methods
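
    The canonical GA loop described — selection, crossover, mutation, with a quantified figure of merit — can be sketched as below. The fitness landscape is a stand-in with many local optima, not the WASP spectrometer's actual merit figure.

```python
import numpy as np

# Minimal canonical genetic algorithm: evolve a 4-parameter design vector to
# maximize a figure of merit riddled with local optima (Rastrigin-style toy
# landscape, global optimum at x = 0). Illustrative only.
rng = np.random.default_rng(1)

def figure_of_merit(x):
    return -np.sum(x**2 - 3.0 * np.cos(2 * np.pi * x), axis=-1)

pop = rng.uniform(-3, 3, size=(60, 4))    # 60 candidate designs
fit0 = figure_of_merit(pop).max()         # best initial design

for gen in range(150):
    fit = figure_of_merit(pop)
    # tournament selection: each child slot picks the fitter of two candidates
    a, b = rng.integers(0, 60, (2, 60))
    parents = pop[np.where(fit[a] > fit[b], a, b)]
    # single-point crossover between consecutive parent pairs
    children = parents.copy()
    cut = rng.integers(1, 4, 30)
    for i in range(30):
        children[2 * i, cut[i]:] = parents[2 * i + 1, cut[i]:]
        children[2 * i + 1, cut[i]:] = parents[2 * i, cut[i]:]
    # mutation: small Gaussian perturbations on ~20% of the genes
    children += rng.normal(0, 0.1, children.shape) * (rng.random(children.shape) < 0.2)
    # elitism: carry the best design of this generation forward unchanged
    children[0] = pop[np.argmax(fit)]
    pop = children

best = pop[np.argmax(figure_of_merit(pop))]
print("best design:", best.round(3), "merit:", round(float(figure_of_merit(best)), 2))
```

    Elitism makes the best merit value monotone across generations, so the final design is never worse than the best random starting point.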

  9. Gender Mainstreaming or Instrumentalization of Women?

    Directory of Open Access Journals (Sweden)

    Marie France Labrecque

    2010-09-01

    Full Text Available This question is discussed on the basis of first hand data collected in Mexico between 2004 and 2007. The research aimed at examining how gender equity policies elaborated at the international level using an approach known as “gender mainstreaming” are transformed within national and local contexts. In a first step, the context of the emergence of the gender mainstreaming approach is reconstituted, and in a second step we try to clarify how and under what circumstances, in a country like Mexico, women can be instrumentalized within this approach, as for example, when gender mainstreaming is applied without any critical vision as it is so under neoliberalism. The main example rests on the case of microcredit for maya women in the state of Yucatan.

  10. Understanding Hydrological Processes in Variable Source Areas in the Glaciated Northeastern US Watersheds under Variable Climate Conditions

    Science.gov (United States)

    Steenhuis, T. S.; Azzaino, Z.; Hoang, L.; Pacenka, S.; Worqlul, A. W.; Mukundan, R.; Stoof, C.; Owens, E. M.; Richards, B. K.

    2017-12-01

    The New York City source watersheds in the Catskill Mountains' humid, temperate climate have long-term hydrological and water quality monitoring data. This is one of the few catchments where implementation of source and landscape management practices has led to decreased phosphorus concentration in the receiving surface waters. One of the reasons is that landscape measures correctly targeted the saturated variable source runoff areas (VSA) in the valley bottoms as the location where most of the runoff and other nonpoint pollutants originated. Measures targeting these areas were instrumental in lowering phosphorus concentration. Further improvements in water quality can be made based on a better understanding of the flow processes and water table fluctuations in the VSA. For that reason, we instrumented a self-contained upland variable source watershed with a landscape characteristic of the Catskill watersheds: soil underlain by glacial till at shallow depth. In this presentation, we will discuss our experimental findings and present a mathematical model. Variable source areas have a small slope, making gravity the driving force for the flow and greatly simplifying the simulation of the flow processes. The experimental data and the model simulations agreed for both outflow and water table fluctuations. We found that while the flows to the outlet were similar throughout the year, the discharge of the VSA varies greatly. This was due to transpiration by the plants, which became active when soil temperatures were above 10 °C. We found that shortly after the temperature increased above 10 °C, the baseflow stopped and surface runoff occurred only when rainstorms exceeded the storage capacity of the soil in at least a portion of the variable source area. Since plant growth in the variable source area was a major variable determining the base flow behavior, changes in temperature in the future - affecting the duration of the growing season - will affect baseflow and

  11. Evaluating environmental policy instruments mixes; a methodology illustrated by noise policy in the Netherlands

    NARCIS (Netherlands)

    Weber, Miriam; Driessen, Peter P J; Runhaar, Hens A C

    2014-01-01

    Environmental policy is characterised by complexity, in causes and effects, resulting in various combinations of policy instruments. However, evaluating these policy instrument mixes and assessing their effectiveness is difficult because of a lack of methodological approaches. This paper therefore

  12. A state-and-transition simulation modeling approach for estimating the historical range of variability

    Directory of Open Access Journals (Sweden)

    Kori Blankenship

    2015-04-01

    Full Text Available Reference ecological conditions offer important context for land managers as they assess the condition of their landscapes and provide benchmarks for desired future conditions. State-and-transition simulation models (STSMs are commonly used to estimate reference conditions that can be used to evaluate current ecosystem conditions and to guide land management decisions and activities. The LANDFIRE program created more than 1,000 STSMs and used them to assess departure from a mean reference value for ecosystems in the United States. While the mean provides a useful benchmark, land managers and researchers are often interested in the range of variability around the mean. This range, frequently referred to as the historical range of variability (HRV, offers model users improved understanding of ecosystem function, more information with which to evaluate ecosystem change and potentially greater flexibility in management options. We developed a method for using LANDFIRE STSMs to estimate the HRV around the mean reference condition for each model state in ecosystems by varying the fire probabilities. The approach is flexible and can be adapted for use in a variety of ecosystems. HRV analysis can be combined with other information to help guide complex land management decisions.
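
    The method's core idea — run a state-and-transition simulation repeatedly while varying fire probability, then take the spread of outcomes as the HRV around the mean reference condition — can be sketched with a toy three-state model. All transition rules and probabilities are invented for illustration.

```python
import numpy as np

# Toy state-and-transition simulation: three vegetation states (Early, Mid,
# Late) with age-based succession and stand-replacing fire. Varying the fire
# probability across Monte Carlo runs yields a range of variability around
# the mean reference condition. Illustrative values only.
rng = np.random.default_rng(7)

def run_stsm(p_fire, n_cells=1000, n_years=300):
    state = np.zeros(n_cells, dtype=int)   # 0=Early, 1=Mid, 2=Late
    age = np.zeros(n_cells, dtype=int)
    for _ in range(n_years):
        burned = rng.random(n_cells) < p_fire   # fire resets a cell to Early
        state[burned] = 0
        age[burned] = 0
        age[~burned] += 1
        promote = (~burned) & (age >= 25) & (state < 2)  # succession every 25 yr
        state[promote] += 1
        age[promote] = 0
    return np.bincount(state, minlength=3) / n_cells     # state proportions

# vary the fire probability to sample the range of variability
props = np.array([run_stsm(p) for p in rng.uniform(0.005, 0.02, 30)])
mean_ref = props.mean(axis=0)
hrv_low, hrv_high = np.percentile(props, [5, 95], axis=0)
print("mean reference condition (Early/Mid/Late):", mean_ref.round(2))
print("HRV, 5th-95th percentile:", hrv_low.round(2), "to", hrv_high.round(2))
```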

  13. A Unified Approach for Reporting ARM Measurement Uncertainties Technical Report

    Energy Technology Data Exchange (ETDEWEB)

    Campos, E [Argonne National Lab. (ANL), Argonne, IL (United States); Sisterson, Douglas [Argonne National Lab. (ANL), Argonne, IL (United States)

    2016-12-01

    The U.S. Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Climate Research Facility is observationally based, and quantifying the uncertainty of its measurements is critically important. With over 300 widely differing instruments providing over 2,500 datastreams, concise expression of measurement uncertainty is quite challenging. The ARM Facility currently provides data and supporting metadata (information about the data or data quality) to its users through a number of sources. Because the continued success of the ARM Facility depends on the known quality of its measurements, the Facility relies on instrument mentors and the ARM Data Quality Office (DQO) to ensure, assess, and report measurement quality. Therefore, an easily accessible, well-articulated estimate of ARM measurement uncertainty is needed. Note that some of the instrument observations require mathematical algorithms (retrievals) to convert a measured engineering variable into a useful geophysical measurement. While those types of retrieval measurements are identified, this study does not address particular methods for retrieval uncertainty. The ARM Facility also provides engineered data products, or value-added products (VAPs), based on multiple instrument measurements; this study does not include uncertainty estimates for those data products. We propose here that a total measurement uncertainty should be calculated as a function of the instrument uncertainty (calibration factors), the field uncertainty (environmental factors), and the retrieval uncertainty (algorithm factors). The study will not expand on methods for computing these uncertainties. Instead, it will focus on the practical identification, characterization, and inventory of the measurement uncertainties already available in the ARM community through the ARM instrument mentors and their ARM instrument handbooks. As a result, this study will address the first steps towards reporting ARM measurement uncertainty.
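
    The report leaves the combination function unspecified; one common convention, assumed here purely for illustration, is to treat the three components as independent and combine them in quadrature (root-sum-square). The values are invented.

```python
import math

# Total uncertainty from instrument (calibration), field (environmental),
# and retrieval (algorithm) components, assuming independence so that the
# components add in quadrature. This convention is an assumption of this
# sketch, not a method stated in the report.
def total_uncertainty(u_instrument, u_field, u_retrieval=0.0):
    return math.sqrt(u_instrument**2 + u_field**2 + u_retrieval**2)

u_total = total_uncertainty(u_instrument=0.3, u_field=0.4)
print(f"total = {u_total:.2f}")   # 0.50 for the 0.3/0.4 example
```

    Correlated components would instead require covariance terms, which this sketch omits.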

  14. Are developing countries ready for first world waste policy instruments?

    CSIR Research Space (South Africa)

    Godfrey, Linda K

    2007-10-01

    Full Text Available (Sardinia 2007, Eleventh International Waste Management and Landfill Symposium) 2.2 Economic instruments: In contrast with command-and-control approaches, which mandate specific behaviours, economic instruments (EIs) (such as taxes, subsidies..., and marketable permits) aim to change behaviour indirectly by changing the prices (and hence incentives) that individuals and businesses face. For example, taxes per unit of waste collected or disposed of create incentives to reduce waste generation...

  15. A Fast and Sensitive New Satellite SO2 Retrieval Algorithm based on Principal Component Analysis: Application to the Ozone Monitoring Instrument

    Science.gov (United States)

    Li, Can; Joiner, Joanna; Krotkov, A.; Bhartia, Pawan K.

    2013-01-01

    We describe a new algorithm to retrieve SO2 from satellite-measured hyperspectral radiances. We employ the principal component analysis technique in regions with no significant SO2 to capture radiance variability caused by both physical processes (e.g., Rayleigh and Raman scattering and ozone absorption) and measurement artifacts. We use the resulting principal components and SO2 Jacobians calculated with a radiative transfer model to directly estimate SO2 vertical column density in one step. Application to the Ozone Monitoring Instrument (OMI) radiance spectra in 310.5-340 nm demonstrates that this approach can greatly reduce biases in the operational OMI product and decrease the noise by a factor of 2, providing greater sensitivity to anthropogenic emissions. The new algorithm is fast, eliminates the need for instrument-specific radiance correction schemes, and can be easily adapted to other sensors. These attributes make it a promising technique for producing long-term, consistent SO2 records for air quality and climate research.
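
    The one-step retrieval idea — derive principal components from radiances over SO2-free regions, then fit a measured spectrum as a linear combination of those PCs plus an SO2 Jacobian — can be sketched with synthetic spectra. The basis vectors, Jacobian shape, and column amounts are all invented.

```python
import numpy as np

# Synthetic sketch of a PCA-based trace-gas retrieval (not the OMI algorithm
# itself): PCs capture background radiance variability; the gas column is the
# coefficient on the Jacobian in a single linear least-squares fit.
rng = np.random.default_rng(3)
n_wl = 120                                   # spectral points in the fit window

# background variability (e.g. scattering, ozone absorption) plus noise
basis = rng.normal(size=(3, n_wl))
clean = rng.normal(size=(500, 3)) @ basis + 0.01 * rng.normal(size=(500, n_wl))

# principal components of the gas-free radiances via SVD
clean -= clean.mean(axis=0)
_, _, vt = np.linalg.svd(clean, full_matrices=False)
pcs = vt[:4]                                 # leading PCs span the background

jacobian = np.sin(np.linspace(0, 6 * np.pi, n_wl))  # synthetic SO2 Jacobian
true_scd = 2.5                                      # "true" column amount
measured = (1.3 * basis[0] - 0.7 * basis[2]
            + true_scd * jacobian + 0.01 * rng.normal(size=n_wl))

# one-step fit: measured = [PCs | jacobian] @ coeffs
design = np.vstack([pcs, jacobian]).T        # shape (n_wl, 5)
coeffs, *_ = np.linalg.lstsq(design, measured, rcond=None)
scd_retrieved = coeffs[-1]
print(f"retrieved column: {scd_retrieved:.2f} (true {true_scd})")
assert abs(scd_retrieved - true_scd) < 0.2
```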

  16. Comparative Study of Three Rotary Instruments for root canal Preparation using Computed Tomography

    International Nuclear Information System (INIS)

    Mohamed, A.M.E.

    2015-01-01

    Cleaning and shaping the root canal is a key to success in root canal treatment. This includes the removal of organic substrate from the root canal system by chemomechanical methods, and the shaping of the root canal system into a continuously tapered preparation. This should be done while maintaining the original path of the root canal. Although instruments for root canal preparation have been progressively developed and optimized, a complete mechanical debridement of the root canal system is rarely achievable. One of the main reasons is the geometrical dissymmetry between the root canal and preparation instruments. Rotary instruments, regardless of their type and form, produce a preparation with a round outline if they are used in a simple linear filing motion, which in most cases does not coincide with the outline of the root canal. Root canal preparation in narrow, curved canals is a challenge even for experienced endodontists. Shaping of curved canals became more effective after the introduction of nickel-titanium (Ni-Ti) endodontic instruments. Despite the advantages of Ni-Ti rotary instruments, intra canal fracture is the most common procedural accident that occurs with these instruments during clinical use. It is a common experience among clinicians that Ni-Ti rotary instruments may undergo unexpected fracture without any visible warning, such as any previous permanent defect or deformation. Pro Taper Ni-Ti instruments were introduced with a unique design of variable taper within one instrument and continuously changing helical angles. Pro Taper rotary instruments are claimed to generate lower torque values during their use because of their modified nonradial landed cross-section that increases the cutting efficiency and reduces contact areas. On the other hand, the variable taper within one instrument is believed to reduce the 'taper lock' effect (torsional failure) in comparison with similarly tapered instruments. Nevertheless, Pro Taper

  17. Authentication of nuclear-material assays made with in-plant instruments

    International Nuclear Information System (INIS)

    Hatcher, C.R.; Hsue, S.T.; Russo, P.A.

    1982-01-01

    This paper develops a general approach for International Atomic Energy Agency (IAEA) authentication of nuclear material assays made with in-plant instruments under facility operator control. The IAEA is evaluating the use of in-plant instruments as a part of international safeguards at large bulk-handling facilities, such as reprocessing plants, fuel fabrication plants, and enrichment plants. One of the major technical problems associated with IAEA use of data from in-plant instruments is the need to show that there has been no tampering with the measurements. Two fundamentally different methods are discussed that can be used by IAEA inspectors to independently verify (or authenticate) measurements made with in-plant instruments. Method 1, called external authentication, uses a protected IAEA measurement technique to compare in-plant instrument results with IAEA results. Method 2, called internal authentication, uses protected IAEA standards, known physical constants, and special test procedures to determine the performance characteristics of the in-plant instrument. The importance of measurement control programs to detect normally expected instrument failures and procedural errors is also addressed. The paper concludes with a brief discussion of factors that should be considered by the designers of new in-plant instruments in order to facilitate IAEA authentication procedures

  18. Varying ultrasound power level to distinguish surgical instruments and tissue.

    Science.gov (United States)

    Ren, Hongliang; Anuraj, Banani; Dupont, Pierre E

    2018-03-01

    We investigate a new framework of surgical instrument detection based on power-varying ultrasound images with simple and efficient pixel-wise intensity processing. Without using complicated feature extraction methods, we identified the instrument at an estimated optimal power level by comparing pixel values across images acquired at varying transducer power levels. The proposed framework exploits the physics of the ultrasound imaging system by varying the transducer power level to effectively distinguish metallic surgical instruments from tissue. This power-varying image guidance is motivated by our observation that ultrasound imaging at different power levels exhibits different contrast enhancement capabilities between tissue and instruments in ultrasound-guided robotic beating-heart surgery. Using lower transducer power levels (ranging from 40 to 75% of the rated lowest ultrasound power levels of the two tested ultrasound scanners) can effectively suppress the strong imaging artifacts from metallic instruments and thus can be utilized together with images at normal transducer power levels to enhance the separability between instrument and tissue, improving intraoperative instrument tracking accuracy from the acquired noisy ultrasound volumetric images. We performed experiments in phantoms and ex vivo hearts in water tank environments. The proposed multi-level power-varying ultrasound imaging approach can identify robotic instruments of high acoustic impedance from low-signal-to-noise-ratio ultrasound images by power adjustments.

  19. Nuclear power plant control and instrumentation activities in Argentina during 1989-1991

    International Nuclear Information System (INIS)

    Lorenzetti, J.R.

    1992-01-01

    A brief resume of the activities in the different areas of control and instrumentation is included. As there was a delay in the construction of the new power plant, most of the effort was dedicated to the plants already in operation. Instrumentation has been added to provide better information in the control room and to check new plant variables, in accordance with experience learned from operation. Special emphasis was placed on the areas of training simulators and in-service inspection. (author)

  20. Error Budget for a Calibration Demonstration System for the Reflected Solar Instrument for the Climate Absolute Radiance and Refractivity Observatory

    Science.gov (United States)

    Thome, Kurtis; McCorkel, Joel; McAndrew, Brendan

    2013-01-01

    A goal of the Climate Absolute Radiance and Refractivity Observatory (CLARREO) mission is to observe high-accuracy, long-term climate change trends over decadal time scales. The key to such a goal is improving the accuracy of SI-traceable absolute calibration across infrared and reflected solar wavelengths, allowing climate change to be separated from the limit of natural variability. The advances required to reach on-orbit absolute accuracy that allows climate change observations to survive data gaps exist at NIST in the laboratory, but it still needs to be demonstrated that these advances can move successfully from NIST to NASA and/or instrument vendor capabilities for spaceborne instruments. The current work describes the radiometric calibration error budget for the Solar, Lunar for Absolute Reflectance Imaging Spectroradiometer (SOLARIS), which is the calibration demonstration system (CDS) for the reflected solar portion of CLARREO. The goal of the CDS is to allow the testing and evaluation of calibration approaches, alternate design and/or implementation approaches, and components for the CLARREO mission. SOLARIS also provides a test-bed for detector technologies, non-linearity determination and uncertainties, and application of future technology developments and suggested spacecraft instrument design modifications. The resulting SI-traceable error budget for reflectance retrieval using solar irradiance as a reference, and methods for laboratory-based absolute calibration suitable for climate-quality data collections, are given. Key components in the error budget are geometry differences between the solar and earth views, knowledge of attenuator behavior when viewing the sun, and sensor behavior such as detector linearity and noise behavior. Methods for demonstrating this error budget are also presented.

  1. Rotatable Small Permanent Magnet Array for Ultra-Low Field Nuclear Magnetic Resonance Instrumentation: A Concept Study.

    Science.gov (United States)

    Vogel, Michael W; Giorni, Andrea; Vegh, Viktor; Pellicer-Guridi, Ruben; Reutens, David C

    2016-01-01

    We studied the feasibility of generating the variable magnetic fields required for ultra-low field nuclear magnetic resonance relaxometry with dynamically adjustable permanent magnets. Our motivation was to substitute traditional electromagnets by distributed permanent magnets, increasing system portability. The finite element method (COMSOL®) was employed for the numerical study of a small permanent magnet array to calculate achievable magnetic field strength, homogeneity, switching time and magnetic forces. A manually operated prototype was simulated and constructed to validate the numerical approach and to verify the generated magnetic field. A concentric small permanent magnet array can be used to generate strong sample pre-polarisation and variable measurement fields for ultra-low field relaxometry via simple prescribed magnet rotations. Using the array, it is possible to achieve a pre-polarisation field strength above 100 mT and variable measurement fields ranging from 20-50 μT with 200 ppm absolute field homogeneity within a field-of-view of 5 × 5 × 5 cm. A dynamic small permanent magnet array can generate multiple highly homogeneous magnetic fields required in ultra-low field nuclear magnetic resonance (NMR) and magnetic resonance imaging (MRI) instruments. This design can significantly reduce the volume and energy requirements of traditional systems based on electromagnets, improving portability considerably.

  2. Vision through afocal instruments: generalized magnification and eye-instrument interaction

    Science.gov (United States)

    Harris, William F.; Evans, Tanya

    2018-04-01

    In Gaussian optics all observers experience the same magnification, the instrument's angular magnification, when viewing distant objects through a telescope or other afocal instruments. However, analysis in linear optics shows that this is not necessarily so in the presence of astigmatism. Because astigmatism may distort and rotate images it is appropriate to work with generalized angular magnification represented by a 2 × 2 matrix. An expression is derived for the generalized magnification for an arbitrary eye looking through an arbitrary afocal instrument. With afocal instruments containing astigmatic refracting elements not all eyes experience the same generalized magnification; there is interaction between eye and instrument. Eye-instrument interaction may change as the instrument is rotated about its longitudinal axis, there being no interaction in particular orientations. A simple numerical example is given. For the sake of completeness, expressions for generalized magnification are also presented in the case of instruments that are not afocal and objects that are not distant.
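
    The 2 × 2 generalized magnification described above can be illustrated with a toy linear-algebra sketch (not taken from the paper): modelling the matrix as unequal meridional scaling followed by a rotation shows how an astigmatic system both distorts and rotates the image. All numbers here are hypothetical.

```python
import math

def generalized_magnification(m1, m2, theta):
    """Toy 2x2 generalized magnification: unequal scaling (m1, m2)
    along the principal meridians, followed by rotation by theta."""
    c, s = math.cos(theta), math.sin(theta)
    # rotation matrix multiplied by a diagonal scaling matrix
    return [[c * m1, -s * m2],
            [s * m1,  c * m2]]

def apply(M, v):
    """Multiply a 2x2 matrix M by a 2-vector v."""
    return [M[0][0] * v[0] + M[0][1] * v[1],
            M[1][0] * v[0] + M[1][1] * v[1]]

M = generalized_magnification(1.2, 0.9, math.radians(10))
ex = apply(M, [1.0, 0.0])   # image of the first principal direction
ey = apply(M, [0.0, 1.0])   # image of the second principal direction
# the two directions are magnified unequally (1.2 vs 0.9) and rotated,
# so no single scalar magnification describes the image
print(math.hypot(*ex), math.hypot(*ey))
```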

  3. On-line scheme for parameter estimation of nonlinear lithium ion battery equivalent circuit models using the simplified refined instrumental variable method for a modified Wiener continuous-time model

    International Nuclear Information System (INIS)

    Allafi, Walid; Uddin, Kotub; Zhang, Cheng; Mazuir Raja Ahsan Sha, Raja; Marco, James

    2017-01-01

    Highlights: •Off-line estimation approach for continuous-time domain for non-invertible function. •Model reformulated to multi-input-single-output; nonlinearity described by sigmoid. •Method directly estimates parameters of nonlinear ECM from the measured data. •Iterative on-line technique leads to smoother convergence. •The model is validated off-line and on-line using NCA battery. -- Abstract: The accuracy of identifying the parameters of models describing lithium ion batteries (LIBs) in typical battery management system (BMS) applications is critical to the estimation of key states such as the state of charge (SoC) and state of health (SoH). In applications such as electric vehicles (EVs) where LIBs are subjected to highly demanding cycles of operation and varying environmental conditions leading to non-trivial interactions of ageing stress factors, this identification is more challenging. This paper proposes an algorithm that directly estimates the parameters of a nonlinear battery model from measured input and output data in the continuous time-domain. The simplified refined instrumental variable method is extended to estimate the parameters of a Wiener model where there is no requirement for the nonlinear function to be invertible. To account for nonlinear battery dynamics, in this paper, the typical linear equivalent circuit model (ECM) is enhanced by a block-oriented Wiener configuration where the nonlinear memoryless block following the typical ECM is defined to be a sigmoid static nonlinearity. The nonlinear Wiener model is reformulated in the form of a multi-input, single-output linear model. This linear form allows the parameters of the nonlinear model to be estimated using any linear estimator such as the well-established least squares (LS) algorithm. In this paper, the recursive least squares (RLS) method is adopted for online parameter estimation. The approach was validated on experimental data measured from an 18650-type Graphite
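
    Once the Wiener model is reformulated as a linear-in-parameters multi-input, single-output form, any linear estimator applies; the paper adopts recursive least squares (RLS) for the online stage. As a hedged sketch, here is a generic two-parameter RLS update on hypothetical noise-free data, standing in for (not reproducing) the paper's battery model:

```python
def rls_update(theta, P, x, y, lam=1.0):
    """One recursive least squares step for y ≈ theta[0]*x[0] + theta[1]*x[1].
    theta: parameter estimate; P: 2x2 covariance matrix; lam: forgetting factor."""
    Px = [P[0][0] * x[0] + P[0][1] * x[1],
          P[1][0] * x[0] + P[1][1] * x[1]]
    denom = lam + x[0] * Px[0] + x[1] * Px[1]
    k = [Px[0] / denom, Px[1] / denom]               # gain vector
    err = y - (theta[0] * x[0] + theta[1] * x[1])    # prediction error
    theta = [theta[0] + k[0] * err, theta[1] + k[1] * err]
    # covariance update: P <- (P - k x^T P) / lam
    P = [[(P[0][0] - k[0] * Px[0]) / lam, (P[0][1] - k[0] * Px[1]) / lam],
         [(P[1][0] - k[1] * Px[0]) / lam, (P[1][1] - k[1] * Px[1]) / lam]]
    return theta, P

# noise-free toy data from y = 2.0*u + 0.5; the two parameters stand in
# for entries of a linearised equivalent-circuit parameter vector
theta, P = [0.0, 0.0], [[1e6, 0.0], [0.0, 1e6]]
for t in range(50):
    u = (t % 10) / 10.0
    theta, P = rls_update(theta, P, [u, 1.0], 2.0 * u + 0.5)
print(theta)  # converges towards [2.0, 0.5]
```

    A large initial P encodes weak confidence in the initial guess, so the estimate is driven almost entirely by the data.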

  4. ERASMUS-F: pathfinder for an E-ELT 3D instrumentation

    Science.gov (United States)

    Kelz, Andreas; Roth, Martin M.; Bacon, Roland; Bland-Hawthorn, Joss; Nicklas, Harald E.; Bryant, Julia J.; Colless, Matthew; Croom, Scott; Ellis, Simon; Fleischmann, Andreas; Gillingham, Peter; Haynes, Roger; Hopkins, Andrew; Kosmalski, Johan; O'Byrne, John W.; Olaya, Jean-Christophe; Rambold, William N.; Robertson, Gordon

    2010-07-01

    ERASMUS-F is a pathfinder study for a possible E-ELT 3D instrumentation, funded by the German Ministry for Education and Research (BMBF). The study investigates the feasibility of combining a broadband optical spectrograph with a new generation of multi-object deployable fibre bundles. The baseline approach is to modify the spectrograph of the Multi-Unit Spectroscopic Explorer (MUSE), which is a VLT integral-field instrument using slicers, with a fibre-fed input. Taking advantage of recent developments in astrophotonics, it is planned to equip such an instrument with fused fibre bundles (hexabundles) that offer larger filling factors than dense-packed classical fibres. The overall project involves an optical and mechanical design study, the specification of a software package for 3D spectrophotometry based upon experience with the P3d Data Reduction Software, and an investigation of the science case for such an instrument. As a proof of concept, the study also involves a pathfinder instrument for the VLT, called the FIREBALL project.

  5. Generic System for Remote Testing and Calibration of Measuring Instruments: Security Architecture

    Science.gov (United States)

    Jurčević, M.; Hegeduš, H.; Golub, M.

    2010-01-01

    Testing and calibration of laboratory instruments and reference standards is a routine activity and a resource- and time-consuming process. Since many modern instruments include communication interfaces, it is possible to create a remote calibration system. This approach addresses a wide range of possible applications and makes it possible to drive a number of different devices. On the other hand, the remote calibration process involves a number of security issues due to recommendations specified in the standard ISO/IEC 17025, since it is not under the total control of the calibration laboratory personnel who will sign the calibration certificate. This approach implies that the traceability and integrity of the calibration process depend directly on the collected measurement data. Reliable and secure remote control and monitoring of instruments is therefore a crucial aspect of an internet-enabled calibration procedure.

  6. Pre-validation methods for developing a patient reported outcome instrument

    Directory of Open Access Journals (Sweden)

    Castillo Mayret M

    2011-08-01

    Full Text Available Abstract Background Measures that reflect patients' assessment of their health are of increasing importance as outcome measures in randomised controlled trials. The methodological approach used in the pre-validation development of new instruments (item generation, item reduction and question formatting) should be robust and transparent. The totality of the content of existing PRO instruments for a specific condition provides a valuable resource (pool of items) that can be utilised to develop new instruments. Such 'top down' approaches are common, but the explicit pre-validation methods are often poorly reported. This paper presents a systematic and generalisable 5-step pre-validation PRO instrument methodology. Methods The method is illustrated using the example of the Aberdeen Glaucoma Questionnaire (AGQ). The five steps are: (1) generation of a pool of items; (2) item de-duplication (three phases); (3) item reduction (two phases); (4) assessment of the remaining items' content coverage against a pre-existing theoretical framework appropriate to the objectives of the instrument and the target population (e.g. ICF); and (5) qualitative exploration of the target population's views of the new instrument and the items it contains. Results The AGQ 'item pool' contained 725 items. Three de-duplication phases removed 91, 225 and 48 items respectively. The two item reduction phases discarded 70 items and 208 items respectively. The draft AGQ contained 83 items with good content coverage. The qualitative exploration ('think aloud' study) resulted in removal of a further 15 items and refinement of the wording of others. The resultant draft AGQ contained 68 items. Conclusions This study presents a novel methodology for developing a PRO instrument, based on three sources: literature reporting what is important to patients; a theoretically coherent framework; and patients' experience of completing the instrument.
By systematically accounting for all items dropped

  7. Evaluation of Sensor Configurations for Robotic Surgical Instruments

    Directory of Open Access Journals (Sweden)

    Jesús M. Gómez-de-Gabriel

    2015-10-01

    Full Text Available Designing surgical instruments for robotic-assisted minimally-invasive surgery (RAMIS is challenging due to constraints on the number and type of sensors imposed by considerations such as space or the need for sterilization. A new method for evaluating the usability of virtual teleoperated surgical instruments based on virtual sensors is presented. This method uses virtual prototyping of the surgical instrument with a dual physical interaction, which allows testing of different sensor configurations in a real environment. Moreover, the proposed approach has been applied to the evaluation of prototypes of a two-finger grasper for lump detection by remote pinching. In this example, the usability of a set of five different sensor configurations, with a different number of force sensors, is evaluated in terms of quantitative and qualitative measures in clinical experiments with 23 volunteers. As a result, the smallest number of force sensors needed in the surgical instrument that ensures the usability of the device can be determined. The details of the experimental setup are also included.

  8. Evaluation of Sensor Configurations for Robotic Surgical Instruments

    Science.gov (United States)

    Gómez-de-Gabriel, Jesús M.; Harwin, William

    2015-01-01

    Designing surgical instruments for robotic-assisted minimally-invasive surgery (RAMIS) is challenging due to constraints on the number and type of sensors imposed by considerations such as space or the need for sterilization. A new method for evaluating the usability of virtual teleoperated surgical instruments based on virtual sensors is presented. This method uses virtual prototyping of the surgical instrument with a dual physical interaction, which allows testing of different sensor configurations in a real environment. Moreover, the proposed approach has been applied to the evaluation of prototypes of a two-finger grasper for lump detection by remote pinching. In this example, the usability of a set of five different sensor configurations, with a different number of force sensors, is evaluated in terms of quantitative and qualitative measures in clinical experiments with 23 volunteers. As a result, the smallest number of force sensors needed in the surgical instrument that ensures the usability of the device can be determined. The details of the experimental setup are also included. PMID:26516863

  9. Instrument calibration reduction through on-line monitoring in the USA. Annex IV

    International Nuclear Information System (INIS)

    Hashemian, H.M.

    2008-01-01

    Nuclear power plants are required to calibrate important instruments once every fuel cycle. This requirement dates back more than 30 years, when commercial nuclear power plants began to operate. Based on calibration data accumulated over this period, it has been determined that some instruments, such as pressure transmitters, do not drift enough to warrant calibration as often as once every fuel cycle. This fact, combined with human resources limitations and reduced maintenance budgets, has provided the motivation for the nuclear industry to develop new technologies for identifying drifting instruments during plant operation. Implementing these technologies allows calibration efforts to be focused on the instruments that have drifted out of tolerance, as opposed to current practice, which calls for calibration verification of almost all instruments every fuel cycle. To date, an array of technologies, referred to collectively as 'on-line calibration monitoring', has been developed to meet this objective. These technologies are based on identifying outlier sensors using techniques that compare a particular sensor's output to a calculated estimate of the actual process the sensor is measuring. If on-line monitoring data are collected during plant startup and/or shutdown periods as well as normal operation, the on-line monitoring approach can help verify the calibration of instruments over their entire operating range. Although on-line calibration monitoring is applicable to most sensors and can cover an entire instrument channel, the main application of this approach in nuclear power plants is currently for pressure transmitters (including level and flow transmitters). (author)

  10. Psychological ownership: Development of an instrument

    Directory of Open Access Journals (Sweden)

    Chantal Olckers

    2013-10-01

    Purpose: The purpose of this study was to develop an instrument to measure psychological ownership in a South African context. Motivation for the study: It was found that previous instruments for the measurement of psychological ownership lacked the ability to grasp the extensive reach of psychological ownership. Research design, approach and method: A quantitative cross-sectional survey was conducted on a non-probability convenience sample of 713 skilled, highly-skilled and professional employees from various organisations in both the private and public sectors in South Africa. Main findings: Although a 69-item measurement instrument was developed in order to capture the proposed seven-dimensional psychological ownership construct, it became evident when analysing the data that a four-factor model comprising 35 items was suitable. Practical/managerial implications: If a sense of psychological ownership toward an organisation could be established amongst its employees by addressing the factors as measured by the South African Psychological Ownership Questionnaire, organisations could become enhanced workplaces and, as a result, sustainable performance could be promoted and staff could be retained. Contribution/value-add: The instrument for measuring psychological ownership in a South African context could serve as a diagnostic tool that would allow human resource professionals and managers to determine employees’ sense of psychological ownership regarding their organisation and to focus specifically on weak dimensional areas that could be improved.

  11. Treating pre-instrumental data as "missing" data: using a tree-ring-based paleoclimate record and imputations to reconstruct streamflow in the Missouri River Basin

    Science.gov (United States)

    Ho, M. W.; Lall, U.; Cook, E. R.

    2015-12-01

    Advances in paleoclimatology in the past few decades have provided opportunities to expand the temporal perspective of the hydrological and climatological variability across the world. The North American region is particularly fortunate in this respect where a relatively dense network of high resolution paleoclimate proxy records have been assembled. One such network is the annually-resolved Living Blended Drought Atlas (LBDA): a paleoclimate reconstruction of the Palmer Drought Severity Index (PDSI) that covers North America on a 0.5° × 0.5° grid based on tree-ring chronologies. However, the use of the LBDA to assess North American streamflow variability requires a model by which streamflow may be reconstructed. Paleoclimate reconstructions have typically used models that first seek to quantify the relationship between the paleoclimate variable and the environmental variable of interest before extrapolating the relationship back in time. In contrast, the pre-instrumental streamflow is here considered as "missing" data. A method of imputing the "missing" streamflow data, prior to the instrumental record, is applied through multiple imputation using chained equations for streamflow in the Missouri River Basin. In this method, the distribution of the instrumental streamflow and LBDA is used to estimate sets of plausible values for the "missing" streamflow data resulting in a ~600 year-long streamflow reconstruction. Past research into external climate forcings, oceanic-atmospheric variability and its teleconnections, and assessments of rare multi-centennial instrumental records demonstrate that large temporal oscillations in hydrological conditions are unlikely to be captured in most instrumental records. The reconstruction of multi-centennial records of streamflow will enable comprehensive assessments of current and future water resource infrastructure and operations under the existing scope of natural climate variability.
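
    The "missing data" framing above amounts to relating streamflow to the proxy over the instrumental overlap and filling the pre-instrumental gap. A minimal single-equation sketch follows (hypothetical numbers; real multiple imputation by chained equations cycles over several variables and draws many stochastic fills rather than one deterministic fill):

```python
def ols(xs, ys):
    """Slope and intercept of the least-squares line through (xs, ys)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b = sxy / sxx
    return b, my - b * mx

def impute_from_proxy(proxy, target):
    """Fill None entries of target by regressing it on the proxy over
    the overlap period -- one link of a chained-equations scheme."""
    obs = [(p, t) for p, t in zip(proxy, target) if t is not None]
    b, a = ols([p for p, _ in obs], [t for _, t in obs])
    return [t if t is not None else a + b * p for p, t in zip(proxy, target)]

# proxy (e.g. a PDSI reconstruction) known everywhere; flow missing early on
pdsi = [-2.0, -1.0, 0.0, 1.0, 2.0, 3.0]
flow = [None, None, 10.0, 12.0, 14.0, 16.0]   # flow = 10 + 2*pdsi on overlap
print(impute_from_proxy(pdsi, flow))
```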

  12. Variable Structure Disturbance Rejection Control for Nonlinear Uncertain Systems with State and Control Delays via Optimal Sliding Mode Surface Approach

    Directory of Open Access Journals (Sweden)

    Jing Lei

    2013-01-01

    Full Text Available The paper considers the problem of variable structure control for nonlinear systems with uncertainty and time delays under persistent disturbance by using the optimal sliding mode surface approach. Through functional transformation, the original time-delay system is transformed into a delay-free one. The approximating sequence method is applied to solve the nonlinear optimal sliding mode surface problem which is reduced to a linear two-point boundary value problem of approximating sequences. The optimal sliding mode surface is obtained from the convergent solutions by solving a Riccati equation, a Sylvester equation, and the state and adjoint vector differential equations of approximating sequences. Then, the variable structure disturbance rejection control is presented by adopting an exponential trending law, where the state and control memory terms are designed to compensate the state and control delays, a feedforward control term is designed to reject the disturbance, and an adjoint compensator is designed to compensate the effects generated by the nonlinearity and the uncertainty. Furthermore, an observer is constructed to make the feedforward term physically realizable, and thus the dynamical observer-based dynamical variable structure disturbance rejection control law is produced. Finally, simulations are demonstrated to verify the effectiveness of the presented controller and the simplicity of the proposed approach.

  13. Direct concurrent comparison of multiple pediatric acute asthma scoring instruments.

    Science.gov (United States)

    Johnson, Michael D; Nkoy, Flory L; Sheng, Xiaoming; Greene, Tom; Stone, Bryan L; Garvin, Jennifer

    2017-09-01

    Appropriate delivery of Emergency Department (ED) treatment to children with acute asthma requires clinician assessment of acute asthma severity. Various clinical scoring instruments exist to standardize assessment of acute asthma severity in the ED, but their selection remains arbitrary due to few published direct comparisons of their properties. Our objective was to test the feasibility of directly comparing properties of multiple scoring instruments in a pediatric ED. Using a novel approach supported by a composite data collection form, clinicians categorized elements of five scoring instruments before and after initial treatment for 48 patients 2-18 years of age with acute asthma seen at the ED of a tertiary care pediatric hospital from August to December 2014. Scoring instruments were compared for inter-rater reliability between clinician types and their ability to predict hospitalization. Inter-rater reliability between clinician types was not different between instruments at any point and was lower (weighted kappa range 0.21-0.55) than values reported elsewhere. Predictive ability of most instruments for hospitalization was higher after treatment than before treatment (p < 0.05) and may vary between instruments after treatment (p = 0.054). We demonstrate the feasibility of comparing multiple clinical scoring instruments simultaneously in ED clinical practice. Scoring instruments had higher predictive ability for hospitalization after treatment than before treatment and may differ in their predictive ability after initial treatment. Definitive conclusions about the best instrument or meaningful comparison between instruments will require a study with a larger sample size.
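
    The inter-rater comparison above reports weighted kappa, which credits partial agreement on an ordinal severity scale. A hedged sketch with linear weights and hypothetical ratings (the study's own data are not reproduced here):

```python
from collections import Counter

def weighted_kappa(r1, r2, k):
    """Linearly weighted Cohen's kappa for two raters scoring ordinal
    categories 0..k-1; disagreement is penalised by its distance."""
    n = len(r1)
    w = [[1 - abs(i - j) / (k - 1) for j in range(k)] for i in range(k)]
    obs = Counter(zip(r1, r2))          # joint rating counts
    p1, p2 = Counter(r1), Counter(r2)   # marginal counts per rater
    po = sum(w[i][j] * obs[(i, j)] for i in range(k) for j in range(k)) / n
    pe = sum(w[i][j] * p1[i] * p2[j] for i in range(k) for j in range(k)) / n ** 2
    return (po - pe) / (1 - pe)

# two hypothetical clinicians scoring eight patients on a 0-3 severity scale
doc = [0, 1, 2, 3, 1, 2, 0, 3]
nurse = [0, 1, 2, 3, 2, 2, 1, 3]
print(round(weighted_kappa(doc, nurse, 4), 3))
```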

  14. Instrumental motives in negative emotion regulation in daily life: Frequency, consistency, and predictors.

    Science.gov (United States)

    Kalokerinos, Elise K; Tamir, Maya; Kuppens, Peter

    2017-06-01

    People regulate their emotions not only for hedonic reasons but also for instrumental reasons, to attain the potential benefits of emotions beyond pleasure and pain. However, such instrumental motives have rarely been examined outside the laboratory as they naturally unfold in daily life. To assess whether and how instrumental motives operate outside the laboratory, it is necessary to examine them in response to real and personally relevant stimuli in ecologically valid contexts. In this research, we assessed the frequency, consistency, and predictors of instrumental motives in negative emotion regulation in daily life. Participants (N = 114) recalled the most negative event of their day each evening for 7 days and reported their instrumental motives and negative emotion goals in that event. Participants endorsed performance motives in approximately 1 in 3 events and social, eudaimonic, and epistemic motives in approximately 1 in 10 events. Instrumental motives had substantially higher within- than between-person variance, indicating that they were context-dependent. Indeed, although we found few associations between instrumental motives and personality traits, relationships between instrumental motives and contextual variables were more extensive. Performance, social, and epistemic motives were each predicted by a unique pattern of contextual appraisals. Our data demonstrate that instrumental motives play a role in daily negative emotion regulation as people encounter situations that pose unique regulatory demands. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  15. An Instrument to Determine the Technological Literacy Levels of Upper Secondary School Students

    Science.gov (United States)

    Luckay, Melanie B.; Collier-Reed, Brandon I.

    2014-01-01

    In this article, an instrument for assessing upper secondary school students' levels of technological literacy is presented. The items making up the instrument emerged from a previous study that employed a phenomenographic research approach to explore students' conceptions of technology in terms of their understanding of the "nature…

  16. Pancreatitis Quality of Life Instrument: Development of a new instrument

    Directory of Open Access Journals (Sweden)

    Wahid Wassef

    2014-02-01

    Full Text Available Objectives: The goal of this project was to develop the first disease-specific instrument for the evaluation of quality of life in chronic pancreatitis. Methods: Focus groups and interview sessions were conducted with chronic pancreatitis patients to identify items felt to impact quality of life, which were subsequently formatted into a paper-and-pencil instrument. This instrument was then used in an online survey of an expert panel of pancreatologists to evaluate its content validity. Finally, the modified instrument was presented to patients during precognitive testing interviews to evaluate its clarity and appropriateness. Results: In total, 10 patients were enrolled in the focus groups and interview sessions, where they identified 50 items. Once redundant items were removed, the 40 remaining items were made into a paper-and-pencil instrument referred to as the Pancreatitis Quality of Life Instrument. Through the processes of content validation and precognitive testing, the number of items in the instrument was reduced to 24. Conclusions: This marks the development of the first disease-specific instrument to evaluate quality of life in chronic pancreatitis. It includes unique features not found in generic instruments (economic factors, stigma, and spiritual factors). Although this marks a giant step forward, psychometric evaluation is still needed prior to its clinical use.

  17. Survey of French spine surgeons reveals significant variability in spine trauma practices in 2013.

    Science.gov (United States)

    Lonjon, G; Grelat, M; Dhenin, A; Dauzac, C; Lonjon, N; Kepler, C K; Vaccaro, A R

    2015-02-01

    In France, attempts to define common ground during spine surgery meetings have revealed significant variability in clinical practices across different schools of surgery and the two specialities involved in spine surgery, namely, neurosurgery and orthopaedic surgery. Our aim was to objectively characterise this variability by performing a survey based on a fictitious spine trauma case. Our working hypothesis was that significant variability existed in trauma practices and that this variability was related to a lack of strong scientific evidence in spine trauma care. We performed a cross-sectional survey based on a clinical vignette describing a 31-year-old male with an L1 burst fracture and neurologic symptoms (numbness). Surgeons received the vignette and a 14-item questionnaire on the management of this patient. For each question, surgeons had to choose among five possible answers. Differences in answers across surgeons were assessed using the Index of Qualitative Variability (IQV), in which 0 indicates no variability and 1 maximal variability. Surgeons also received a questionnaire about their demographics and surgical experience. Of 405 invited spine surgeons, 200 responded to the survey. Five questions had an IQV greater than 0.9, seven an IQV between 0.5 and 0.9, and two an IQV lower than 0.5. Variability was greatest regarding the need for MRI (IQV=0.93), degree of urgency (IQV=0.93), need for fusion (IQV=0.92), need for post-operative bracing (IQV=0.91), and routine removal of instrumentation (IQV=0.94). Variability was lowest for questions about the need for surgery (IQV=0.42) and use of the posterior approach (IQV=0.36). Answers were influenced by surgeon specialty, age, experience level, and type of centre. Clinical practice regarding spine trauma varies widely in France. Little published evidence is available on which to base recommendations that would diminish this variability. Copyright © 2015. Published by Elsevier Masson SAS.
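
    The Index of Qualitative Variability used above is a standard normalisation of categorical dispersion; assuming the usual form IQV = (k/(k−1))(1 − Σ pᵢ²), a short sketch with hypothetical answer counts:

```python
def iqv(counts):
    """Index of Qualitative Variability for categorical answer counts:
    0 when all respondents choose one option, 1 when answers are uniform."""
    k = len(counts)
    n = sum(counts)
    props = [c / n for c in counts]
    return (k / (k - 1)) * (1 - sum(p * p for p in props))

# 200 respondents, five answer options per question (as in the survey)
unanimous = [200, 0, 0, 0, 0]
uniform = [40, 40, 40, 40, 40]
print(iqv(unanimous), iqv(uniform))
```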

  18. Testing of Environmental Satellite Bus-Instrument Interfaces Using Engineering Models

    Science.gov (United States)

    Gagnier, Donald; Hayner, Rick; Nosek, Thomas; Roza, Michael; Hendershot, James E.; Razzaghi, Andrea I.

    2004-01-01

    This paper discusses the formulation and execution of a laboratory test of the electrical interfaces between multiple atmospheric scientific instruments and the spacecraft bus that carries them. The testing, performed in 2002, used engineering models of the instruments and the Aura spacecraft bus electronics. Aura is one of NASA's Earth Observing System missions. The test was designed to evaluate the complex interfaces in the command and data handling subsystems prior to integration of the complete flight instruments on the spacecraft. A problem discovered during the flight integration phase of the observatory can cause significant cost and schedule impacts. The tests successfully revealed problems and led to their resolution before the full-up integration phase, saving significant cost and schedule. This approach could be beneficial for future environmental satellite programs involving the integration of multiple, complex scientific instruments onto a spacecraft bus.

  19. Artificial intelligence programming with LabVIEW: genetic algorithms for instrumentation control and optimization.

    Science.gov (United States)

    Moore, J H

    1995-06-01

    A genetic algorithm for instrumentation control and optimization was developed using the LabVIEW graphical programming environment. The usefulness of this methodology for the optimization of a closed loop control instrument is demonstrated with minimal complexity, and the programming is presented in detail to facilitate its adaptation to other LabVIEW applications. Closed loop control instruments have a variety of applications in the biomedical sciences, including the regulation of physiological processes such as blood pressure. The program presented here should provide a useful starting point for those wishing to incorporate genetic algorithm approaches to LabVIEW-mediated optimization of closed loop control instruments.

  20. Rural Parents’ Perceived Stigma of Seeking Mental Health Services for their Children: Development and Evaluation of a New Instrument

    Science.gov (United States)

    Williams, Stacey L.; Polaha, Jodi

    2014-01-01

    The purpose of this paper was to examine the validity of score interpretations of an instrument developed to measure parents’ perceptions of stigma about seeking mental health services for their children. The validity of the score interpretations of the instrument was tested in two studies. Study 1 examined confirmatory factor analysis (CFA) employing a split half approach, and construct and criterion validity using the entire sample of parents in rural Appalachia whose children were experiencing psychosocial concerns (N=347), while Study 2 further examined CFA, construct and criterion validity, as well as predictive validity of the scores on the new scale using a general sample of parents in rural Appalachia (N=184). Results of exploratory and confirmatory factor analyses revealed support for a two factor model of parents’ perceived stigma, which represented both self and public forms of stigma associated with seeking mental health services for their children, and correlated with existing measures of stigma and other psychosocial variables. Further, the new self and public stigma scale significantly predicted parents’ willingness to seek services for children. PMID:24749752

  1. Supervised pre-processing approaches in multiple class variables classification for fish recruitment forecasting

    KAUST Repository

    Fernandes, José Antonio

    2013-02-01

    A multi-species approach to fisheries management requires taking into account the interactions between species in order to improve recruitment forecasting of the fish species. Recent advances in Bayesian networks allow learning models in which several interrelated variables are forecast simultaneously. These models are known as multi-dimensional Bayesian network classifiers (MDBNs). Pre-processing steps are critical for the posterior learning of the model in these kinds of domains. Therefore, in the present study, a set of 'state-of-the-art' uni-dimensional pre-processing methods, within the categories of missing data imputation, feature discretization and feature subset selection, are adapted to be used with MDBNs. A framework that includes the proposed multi-dimensional supervised pre-processing methods, coupled with an MDBN classifier, is tested with synthetic datasets and the real domain of fish recruitment forecasting. The rate of correctly forecasting three fish species (anchovy, sardine and hake) simultaneously is doubled (from 17.3% to 29.5%) using the multi-dimensional approach in comparison to mono-species models. The probability assessments also show marked improvement, reducing the average error (estimated by means of the Brier score) from 0.35 to 0.27. Finally, these gains also exceed those obtained by forecasting the species in pairs. © 2012 Elsevier Ltd.
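
The Brier score used above to quantify probability error (falling from 0.35 to 0.27) is the mean squared difference between forecast probabilities and observed binary outcomes. A minimal sketch; the per-year forecasts below are made up for illustration, not the study's data:

```python
def brier_score(forecast_probs, outcomes):
    """Mean squared difference between forecast probabilities and observed
    binary outcomes; lower is better (0 = perfect forecasts)."""
    assert len(forecast_probs) == len(outcomes)
    return sum((p - o) ** 2 for p, o in zip(forecast_probs, outcomes)) / len(outcomes)

# Hypothetical yearly probabilities of "high recruitment" for one species,
# against what was actually observed (1 = high, 0 = low).
probs = [0.8, 0.3, 0.6, 0.9, 0.2]
observed = [1, 0, 1, 1, 0]
print(round(brier_score(probs, observed), 3))  # 0.068
```

An uninformative forecaster that always predicts 0.5 scores 0.25, which puts the reported drop from 0.35 to 0.27 in context.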

  2. New Methods for Retrieval of Chlorophyll Red Fluorescence from Hyperspectral Satellite Instruments: Simulations and Application to GOME-2 and SCIAMACHY

    Science.gov (United States)

    Joiner, Joanna; Yoshida, Yasuko; Guanter, Luis; Middleton, Elizabeth M.

    2016-01-01

    Global satellite measurements of solar-induced fluorescence (SIF) from chlorophyll over land and ocean have proven useful for a number of different applications related to physiology, phenology, and productivity of plants and phytoplankton. Terrestrial chlorophyll fluorescence is emitted throughout the red and far-red spectrum, producing two broad peaks near 683 and 736nm. From ocean surfaces, phytoplankton fluorescence emissions are entirely from the red region (683nm peak). Studies using satellite-derived SIF over land have focused almost exclusively on measurements in the far red (wavelengths greater than 712nm), since those are the most easily obtained with existing instrumentation. Here, we examine new ways to use existing hyperspectral satellite data sets to retrieve red SIF (wavelengths less than 712nm) over both land and ocean. Red SIF is thought to provide complementary information to that from the far red for terrestrial vegetation. The satellite instruments that we use were designed to make atmospheric trace-gas measurements and are therefore not optimal for observing SIF; they have coarse spatial resolution and only moderate spectral resolution (0.5nm). Nevertheless, these instruments, the Global Ozone Monitoring Experiment-2 (GOME-2) and the SCanning Imaging Absorption spectroMeter for Atmospheric CHartographY (SCIAMACHY), offer a unique opportunity to compare red and far-red terrestrial SIF at regional spatial scales. Terrestrial SIF has been estimated with ground-, aircraft-, or satellite-based instruments by measuring the filling-in of atmospheric and/or solar absorption spectral features by SIF. Our approach makes use of the oxygen (O2) gamma band that is not affected by SIF. The SIF-free O2 gamma band helps to estimate absorption within the spectrally variable O2 B band, which is filled in by red SIF. 
SIF also fills in the spectrally stable solar Fraunhofer lines (SFLs) at wavelengths both inside and just outside the O2 B band, which further helps

  3. Longitudinal Evaluation of the Integration of Digital Musical Instruments into Existing Compositional Work Processes

    DEFF Research Database (Denmark)

    Gelineck, Steven; Serafin, Stefania

    2012-01-01

    This paper explores a longitudinal approach to the qualitative evaluation of a set of digital musical instruments, which were developed with a focus on creativity and exploration. The instruments were lent to three electronic musicians/composers for a duration of four weeks. Free exploration...

  4. Variable selection in Logistic regression model with genetic algorithm.

    Science.gov (United States)

    Zhang, Zhongheng; Trevino, Victor; Hoseini, Sayed Shahabuddin; Belciug, Smaranda; Boopathi, Arumugam Manivanna; Zhang, Ping; Gorunescu, Florin; Subha, Velappan; Dai, Songshi

    2018-02-01

    Variable or feature selection is one of the most important steps in model specification. Especially in the case of medical decision-making, the direct use of a medical database, without a previous analysis and preprocessing step, is often counterproductive. In this way, variable selection represents the method of choosing the most relevant attributes from the database in order to build robust learning models and, thus, to improve the performance of the models used in the decision process. In biomedical research, the purpose of variable selection is to select clinically important and statistically significant variables, while excluding unrelated or noise variables. A variety of methods exist for variable selection, but none of them is without limitations. For example, the stepwise approach, which is highly used, adds the best variable in each cycle, generally producing an acceptable set of variables. Nevertheless, it is limited by the fact that it commonly gets trapped in local optima. The best subset approach can systematically search the entire covariate pattern space, but the solution pool can be extremely large with tens to hundreds of variables, as is often the case in today's clinical data. Genetic algorithms (GA) are heuristic optimization approaches and can be used for variable selection in multivariable regression models. This tutorial paper aims to provide a step-by-step approach to the use of GA in variable selection. The R code provided in the text can be extended and adapted to other data analysis needs.
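
The GA loop the tutorial describes (bit-string chromosomes, selection, crossover, mutation) can be sketched as below. This is not the paper's R code: the fitness function here is a hypothetical toy stand-in for a cross-validated model score, with a small penalty per selected variable:

```python
import random

def ga_select(n_features, fitness, pop_size=20, generations=30,
              p_crossover=0.8, p_mutation=0.05, seed=0):
    """Toy genetic algorithm for variable selection.

    A chromosome is a bit-list: bit i == 1 means feature i is kept.
    `fitness` maps a chromosome to a score to maximise.
    """
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_features)] for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=fitness, reverse=True)
        next_pop = scored[:2]                   # elitism: carry the best two over
        while len(next_pop) < pop_size:
            a, b = rng.sample(scored[:10], 2)   # pick parents from the better half
            if rng.random() < p_crossover:
                cut = rng.randrange(1, n_features)
                child = a[:cut] + b[cut:]       # single-point crossover
            else:
                child = a[:]
            # bit-flip mutation
            child = [1 - g if rng.random() < p_mutation else g for g in child]
            next_pop.append(child)
        pop = next_pop
    return max(pop, key=fitness)

# Toy fitness: features 0-2 are "clinically important", the rest are noise,
# and every selected variable costs a small complexity penalty.
def fitness(chrom):
    return sum(chrom[:3]) - 0.1 * sum(chrom)

best = ga_select(n_features=10, fitness=fitness)
print(best)
```

In a real application the fitness would refit the regression model on each candidate subset, which is where nearly all of the runtime goes.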

  5. Intercomparison of NO3 radical detection instruments in the atmosphere simulation chamber SAPHIR

    Directory of Open Access Journals (Sweden)

    H.-P. Dorn

    2013-05-01

    The detection of atmospheric NO3 radicals is still challenging owing to their low mixing ratios (≈ 1 to 300 pptv) in the troposphere. While long-path differential optical absorption spectroscopy (DOAS) has been a well-established NO3 detection approach for over 25 yr, new, more sensitive techniques have been developed in the past decade. This publication outlines the results of the first comprehensive intercomparison of seven instruments developed for the spectroscopic detection of tropospheric NO3. Four instruments were based on cavity ring-down spectroscopy (CRDS), two utilised open-path cavity-enhanced absorption spectroscopy (CEAS), and one applied "classical" long-path DOAS. The intercomparison campaign "NO3Comp" was held at the atmosphere simulation chamber SAPHIR in Jülich (Germany) in June 2007. Twelve experiments were performed in the well-mixed chamber for variable concentrations of NO3, N2O5, NO2, hydrocarbons, and water vapour, in the absence and in the presence of inorganic or organic aerosol. The overall precision of the cavity instruments varied between 0.5 and 5 pptv for integration times of 1 s to 5 min; that of the DOAS instrument was 9 pptv for an acquisition time of 1 min. The NO3 data of all instruments correlated excellently with the NOAA-CRDS instrument, which was selected as the common reference because of its superb sensitivity, high time resolution, and most comprehensive data coverage. The median of the coefficient of determination (r2) over all experiments of the campaign (60 correlations) is r2 = 0.981 (quartile 1 (Q1): 0.949; quartile 3 (Q3): 0.994; min/max: 0.540/0.999). The linear regression analysis of the campaign data set yielded very small intercepts (median: 1.1 pptv; Q1/Q3: −1.1/2.6 pptv; min/max: −14.1/28.0 pptv), and the slopes of the regression lines were close to unity (median: 1.01; Q1/Q3: 0.92/1.10; min/max: 0.72/1.36). The deviation of individual regression slopes from unity was always within the combined
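
The intercept, slope, and r² statistics quoted above come from ordinary least-squares fits of each instrument against the reference. A minimal sketch; the mixing-ratio pairs below are hypothetical, not campaign data:

```python
def regress(x, y):
    """Ordinary least-squares fit y = a + b*x, returning (intercept, slope, r2)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    syy = sum((yi - my) ** 2 for yi in y)
    b = sxy / sxx
    a = my - b * mx
    r2 = sxy * sxy / (sxx * syy)
    return a, b, r2

# Hypothetical NO3 mixing ratios (pptv): reference CRDS vs. a second instrument
ref  = [5.0, 20.0, 60.0, 120.0, 250.0]
other = [6.1, 21.5, 59.0, 123.0, 251.0]
a, b, r2 = regress(ref, other)
print(f"intercept={a:.1f} pptv, slope={b:.2f}, r2={r2:.3f}")
```

A slope near 1 with a small intercept, as reported for the campaign medians, indicates that the instruments agree across the whole concentration range rather than only on average.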

  6. A variable stiffness mechanism for steerable percutaneous instruments: integration in a needle.

    Science.gov (United States)

    De Falco, Iris; Culmone, Costanza; Menciassi, Arianna; Dankelman, Jenny; van den Dobbelsteen, John J

    2018-06-04

    Needles are advanced tools commonly used in minimally invasive medical procedures. The accurate manoeuvrability of flexible needles through soft tissues is strongly determined by variations in tissue stiffness, which affects the needle-tissue interaction and thus causes needle deflection. This work presents a variable stiffness mechanism for percutaneous needles capable of compensating for variations in tissue stiffness and undesirable trajectory changes. It is composed of compliant segments and rigid plates alternately connected in series and longitudinally crossed by four cables. The tensioning of the cables allows the omnidirectional steering of the tip and the stiffness tuning of the needle. The mechanism was tested separately under different working conditions, demonstrating a capability to exert up to 3.6 N. Afterwards, the mechanism was integrated into a needle, and the overall device was tested in gelatine phantoms simulating the stiffness of biological tissues. The needle demonstrated the capability to vary deflection (from 11.6 to 4.4 mm) and adapt to the inhomogeneity of the phantoms (from 21 to 80 kPa) depending on the activation of the variable stiffness mechanism.

  7. An innovative statistical approach for analysing non-continuous variables in environmental monitoring: assessing temporal trends of TBT pollution.

    Science.gov (United States)

    Santos, José António; Galante-Oliveira, Susana; Barroso, Carlos

    2011-03-01

    The current work presents an innovative statistical approach to model ordinal variables in environmental monitoring studies. An ordinal variable has values that can only be compared as "less", "equal" or "greater", and it is not possible to have information about the size of the difference between two particular values. The example of an ordinal variable in this study is the vas deferens sequence (VDS) used in imposex (superimposition of male sexual characters onto prosobranch females) field assessment programmes for monitoring tributyltin (TBT) pollution. The statistical methodology presented here is the ordered logit regression model. It assumes that the VDS is an ordinal variable whose values track a process of imposex development that can be considered continuous in both biological and statistical senses and can be described by a latent non-observable continuous variable. This model was applied to the case study of Nucella lapillus imposex monitoring surveys conducted on the Portuguese coast between 2003 and 2008 to evaluate the temporal evolution of TBT pollution in this country. In order to produce more reliable conclusions, the proposed model includes covariates that may influence the imposex response besides TBT (e.g. the shell size). The model also provides an analysis of the environmental risk associated with TBT pollution by estimating the probability of the occurrence of females with VDS ≥ 2 in each year, according to OSPAR criteria. We consider that the proposed application of this statistical methodology has great potential in environmental monitoring whenever there is the need to model variables that can only be assessed through an ordinal scale of values.
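
In an ordered (cumulative) logit model of the kind described, the latent continuous variable is cut into ordinal categories by thresholds, and quantities such as P(VDS ≥ 2) fall out directly. A minimal sketch of the probability calculation; the cutpoints and linear predictor below are hypothetical, not the fitted values from the study:

```python
import math

def ordered_logit_probs(eta, cutpoints):
    """Category probabilities for an ordered (cumulative) logit model.

    eta: linear predictor for one individual (e.g. b1*TBT + b2*shell_size).
    cutpoints: increasing thresholds k_1 < ... < k_{J-1} on the latent scale.
    P(Y <= j) = logistic(k_j - eta); category probabilities are differences.
    """
    logistic = lambda z: 1.0 / (1.0 + math.exp(-z))
    cum = [logistic(k - eta) for k in cutpoints] + [1.0]
    return [cum[0]] + [cum[j] - cum[j - 1] for j in range(1, len(cum))]

# Hypothetical cutpoints for VDS stages 0..4 and a hypothetical linear predictor
p = ordered_logit_probs(eta=0.8, cutpoints=[-1.0, 0.5, 1.5, 3.0])
print([round(x, 3) for x in p], "P(VDS >= 2) =", round(sum(p[2:]), 3))
```

Summing the upper-category probabilities is exactly how the OSPAR-relevant risk measure P(VDS ≥ 2) would be obtained for each survey year.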

  8. Using Longitudinal Scales Assessment for Instrumental Music Students

    Science.gov (United States)

    Simon, Samuel H.

    2014-01-01

    In music education, current assessment trends emphasize student reflection, tracking progress over time, and formative as well as summative measures. This view of assessment requires instrumental music educators to modernize their approaches without interfering with methods that have proven to be successful. To this end, the Longitudinal Scales…

  9. Instrument for measuring magnetic field characteristics of induction acceleration

    International Nuclear Information System (INIS)

    Novikov, V.M.; Romasheva, P.I.

    1976-01-01

    An instrument for measuring instantaneous values of variable and pulsed magnetic fields with an amplitude of 0.005-2.0 and a duration of 5×10⁻⁶-2×10⁻² sec is described. Time resolution is not less than 0.5 μs, and measuring accuracy is about 1%. Induction coils are used as sensors. A digital voltmeter serves as a secondary recorder.
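
An induction-coil sensor outputs a voltage proportional to dB/dt (V = −N·A·dB/dt for N turns of area A), so recovering the instantaneous field requires integrating the voltage record. A minimal numerical sketch with hypothetical coil parameters, not the instrument's actual design values:

```python
def field_from_coil(voltages, dt, turns, area):
    """Recover B(t) from an induction-coil voltage record by trapezoidal
    integration of V = -N*A*dB/dt, i.e. B(t) = -(1/(N*A)) * integral of V dt."""
    b, out = 0.0, [0.0]
    for v0, v1 in zip(voltages, voltages[1:]):
        b -= 0.5 * (v0 + v1) * dt / (turns * area)
        out.append(b)
    return out

# Hypothetical: a constant 1 V coil signal sampled for 10 microseconds
# with a 100-turn, 1 cm^2 coil gives a linearly ramping field.
b = field_from_coil([1.0] * 11, dt=1e-6, turns=100, area=1e-4)
print(b[-1])  # final field ≈ -1.0e-3 (in tesla, for these made-up parameters)
```

In a real instrument of this kind the integration is done in analogue or digital hardware at the stated sub-microsecond time resolution; the sketch only shows the arithmetic.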

  10. Solar Energy Research Center Instrumentation Facility

    Energy Technology Data Exchange (ETDEWEB)

    Meyer, Thomas, J.; Papanikolas, John, P.

    2011-11-11

    SOLAR ENERGY RESEARCH CENTER INSTRUMENTATION FACILITY The mission of the Solar Energy Research Center (UNC SERC) at the University of North Carolina at Chapel Hill (UNC-CH) is to establish a world leading effort in solar fuels research and to develop the materials and methods needed to fabricate the next generation of solar energy devices. We are addressing the fundamental issues that will drive new strategies for solar energy conversion and the engineering challenges that must be met in order to convert discoveries made in the laboratory into commercially available devices. The development of a photoelectrosynthesis cell (PEC) for solar fuels production faces daunting requirements: (1) Absorb a large fraction of sunlight; (2) Carry out artificial photosynthesis, which involves multiple complex reaction steps; (3) Avoid competitive and deleterious side and reverse reactions; (4) Perform 13 million catalytic cycles per year with minimal degradation; (5) Use non-toxic materials; (6) Be cost-effective. PEC efficiency is directly determined by the kinetics of each reaction step. The UNC SERC is addressing this challenge by taking a broad interdisciplinary approach in a highly collaborative setting, drawing on expertise across a broad range of disciplines in chemistry, physics and materials science. By taking a systematic approach toward a fundamental understanding of the mechanism of each step, we will be able to gain unique insight and optimize PEC design. Access to cutting-edge spectroscopic tools is critical to this research effort. We have built professionally-staffed facilities equipped with state-of-the-art instrumentation funded by this award. The combination of staff, facilities, and instrumentation specifically tailored for solar fuels research establishes the UNC Solar Energy Research Center Instrumentation Facility as a unique, world-class capability. This congressionally directed project funded the development of two user facilities: TASK 1: SOLAR

  11. Concurrent variable-interval variable-ratio schedules in a dynamic choice environment.

    Science.gov (United States)

    Bell, Matthew C; Baum, William M

    2017-11-01

    Most studies of operant choice have focused on presenting subjects with a fixed pair of schedules across many experimental sessions. Using these methods, studies of concurrent variable-interval variable-ratio schedules helped to evaluate theories of choice. More recently, a growing literature has focused on dynamic choice behavior. Those dynamic choice studies have analyzed behavior on a number of different time scales using concurrent variable-interval schedules. Following the dynamic choice approach, the present experiment examined performance on concurrent variable-interval variable-ratio schedules in a rapidly changing environment. Our objectives were to compare performance on concurrent variable-interval variable-ratio schedules with extant data on concurrent variable-interval variable-interval schedules using a dynamic choice procedure and to extend earlier work on concurrent variable-interval variable-ratio schedules. We analyzed performances at different time scales, finding strong similarities between concurrent variable-interval variable-interval and concurrent variable-interval variable-ratio performance within dynamic choice procedures. Time-based measures revealed almost identical performance in the two procedures compared with response-based measures, supporting the view that choice is best understood as time allocation. Performance at the smaller time scale of visits accorded with the tendency seen in earlier research toward developing a pattern of strong preference for and long visits to the richer alternative paired with brief "samples" at the leaner alternative ("fix and sample"). © 2017 Society for the Experimental Analysis of Behavior.

  12. Choosing Environmental Policy Instruments in the Real World

    International Nuclear Information System (INIS)

    Greenspan Bell, R.

    2003-01-01

    In their enthusiasm for efficiency over other values, the advocates for market-based instruments for environmental control have reversed the order in which environmental solutions are found. They have written their prescriptions without first doing a physical examination of the patient; in other words, they have first recommended environmental instruments and secondarily tried to bend institutions to support the already identified cure. The engine for environmental regulation consists of the institutions available country by country to carry out environmental policy. Institutional inadequacies such as low functioning legal systems, historical experience (or inexperience) with markets, distorting and often institutionalised corruption, and public acceptance certainly can be fixed. But changing these fundamentals can be a long and arduous process. Those who advise governments to adopt reforms for which the institutional basis does not yet exist put the cart before the horse, a costly mistake that directs weak countries in the direction of solutions they have little hope of implementing. Instead, the donors and advisors should be seeking alternative approaches, for example to encourage incremental improvements and pragmatic goals, by considering a transitional or tiered approach that will take into account existing capabilities and institutions, at the same time acknowledging that a long learning curve lies ahead with inevitably uneven implementation and slippage from time to time. Another approach would be to find examples of small, albeit imperfect, efforts that seem to be working and building on them. The long-term goal should be efficient solutions, but only the most developed countries should be encouraged to attempt difficult environmental policy instruments like taxation and emissions trading schemes

  13. IUE observations of variability in winds from hot stars

    Science.gov (United States)

    Grady, C. A.; Snow, T. P., Jr.

    1981-01-01

    Observations of variability in stellar winds or envelopes provide an important probe of their dynamics. For this purpose a number of O, B, Be, and Wolf-Rayet stars were repeatedly observed with the IUE satellite in high resolution mode. In the course of analysis, instrumental and data handling effects were found to introduce spurious variability in many of the spectra. Software was developed to partially compensate for these effects, but limitations remain on the type of variability that can be identified from IUE spectra. With these constraints, preliminary results of multiple observations of two OB stars, one Wolf-Rayet star, and a Be star are discussed.

  14. Development of Flexible Software Process Lines with Variability Operations

    DEFF Research Database (Denmark)

    Dohrmann, Patrick; Schramm, Joachim; Kuhrmann, Marco

    2016-01-01

    the development of flexible software process lines. Method: We conducted a longitudinal study in which we studied 5 variants of the V-Modell XT process line for 2 years. Results: Our results show that the variability operations instrument is feasible in practice. We analyzed 616 operation exemplars addressing various...

  15. A network-based approach for semi-quantitative knowledge mining and its application to yield variability

    Science.gov (United States)

    Schauberger, Bernhard; Rolinski, Susanne; Müller, Christoph

    2016-12-01

    Variability of crop yields is detrimental for food security. Under climate change its amplitude is likely to increase, thus it is essential to understand the underlying causes and mechanisms. Crop models are the primary tool to project future changes in crop yields under climate change. A systematic overview of drivers and mechanisms of crop yield variability (YV) can thus inform crop model development and facilitate improved understanding of climate change impacts on crop yields. Yet there is a vast body of literature on crop physiology and YV, which makes a prioritization of mechanisms for implementation in models challenging. This paper therefore takes a novel approach to systematically mining and organizing existing knowledge from the literature. The aim is to identify important mechanisms lacking in models, which can help to set priorities in model improvement. We structure knowledge from the literature in a semi-quantitative network. This network consists of complex interactions between growing conditions, plant physiology and crop yield. We utilize the resulting network structure to assign relative importance to causes of YV and related plant physiological processes. As expected, our findings confirm existing knowledge, in particular on the dominant role of temperature and precipitation, but also highlight other important drivers of YV. More importantly, our method allows for identifying the relevant physiological processes that transmit variability in growing conditions to variability in yield. We can identify explicit targets for the improvement of crop models. The network can additionally guide model development by outlining complex interactions between processes and by easily retrieving quantitative information for each of the 350 interactions. We show the validity of our network method as a structured, consistent and scalable dictionary of literature. The method can easily be applied to many other research fields.
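
A semi-quantitative knowledge network of this kind is essentially a weighted directed graph from drivers through physiological processes to yield. The sketch below shows one simple way such a structure could be queried for relative importance; the edges, weights, and damping rule are illustrative assumptions, not the paper's actual 350-interaction network or its scoring method:

```python
from collections import defaultdict

# Hypothetical interactions: (source, target, weight), where the weight
# encodes how strongly the literature links the two nodes.
edges = [
    ("temperature", "phenology", 3), ("temperature", "yield", 5),
    ("precipitation", "soil_moisture", 4), ("soil_moisture", "yield", 4),
    ("pests", "leaf_area", 2), ("leaf_area", "yield", 3),
]

def driver_importance(edges, target="yield"):
    """Rank nodes by total edge weight on paths of length 1 or 2 into `target`."""
    into = defaultdict(list)
    for s, t, w in edges:
        into[t].append((s, w))
    score = defaultdict(float)
    for s, w in into[target]:
        score[s] += w                        # direct effect on the target
        for s2, w2 in into[s]:
            score[s2] += w * w2 / 10.0       # damped indirect effect
    return sorted(score.items(), key=lambda kv: -kv[1])

ranked = driver_importance(edges)
print(ranked)
```

Even this toy version reproduces the qualitative point of the abstract: a direct, heavily cited driver (temperature) outranks drivers whose influence is transmitted through intermediate processes.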

  16. Instrumented Pipeline Initiative

    Energy Technology Data Exchange (ETDEWEB)

    Thomas Piro; Michael Ream

    2010-07-31

    This report summarizes technical progress achieved during the cooperative agreement between Concurrent Technologies Corporation (CTC) and the U.S. Department of Energy to address the need for a low-cost monitoring and inspection sensor system, as identified in the Department of Energy (DOE) National Gas Infrastructure Research & Development (R&D) Delivery Reliability Program Roadmap. The Instrumented Pipeline Initiative (IPI) achieved the objective by researching technologies for the monitoring of pipeline delivery integrity through a ubiquitous network of sensors and controllers to detect and diagnose incipient defects, leaks, and failures. This report is organized by tasks as detailed in the Statement of Project Objectives (SOPO). The sections all state the objective and approach before detailing the results of the work.

  17. In search of control variables : A systems approach

    NARCIS (Netherlands)

    Dalenoort, GJ

    1997-01-01

    Motor processes cannot be modeled by a single (unified) model. Instead, a number of models at different levels of description are needed. The concepts of control and control variable only make sense at the functional level. A clear distinction must be made between external models and internal

  18. Satellite-instrument system engineering best practices and lessons

    Science.gov (United States)

    Schueler, Carl F.

    2009-08-01

    This paper focuses on system engineering development issues driving satellite remote sensing instrumentation cost and schedule. A key best practice is early assessment of mission and instrumentation requirements priorities driving performance trades among major instrumentation measurements: Radiometry, spatial field of view and image quality, and spectral performance. Key lessons include attention to technology availability and applicability to prioritized requirements, care in applying heritage, approaching fixed-price and cost-plus contracts with appropriate attention to risk, and assessing design options with attention to customer preference as well as design performance, and development cost and schedule. A key element of success either in contract competition or execution is team experience. Perhaps the most crucial aspect of success, however, is thorough requirements analysis and flowdown to specifications driving design performance with sufficient parameter margin to allow for mistakes or oversights - the province of system engineering from design inception to development, test and delivery.

  19. An algebraic geometric approach to separation of variables

    CERN Document Server

    Schöbel, Konrad

    2015-01-01

    Konrad Schöbel aims to lay the foundations for a consequent algebraic geometric treatment of variable separation, which is one of the oldest and most powerful methods to construct exact solutions for the fundamental equations in classical and quantum physics. The present work reveals a surprising algebraic geometric structure behind the famous list of separation coordinates, bringing together a great range of mathematics and mathematical physics, from the late 19th century theory of separation of variables to modern moduli space theory, Stasheff polytopes and operads. "I am particularly impressed by his mastery of a variety of techniques and his ability to show clearly how they interact to produce his results.”   (Jim Stasheff)   Contents The Foundation: The Algebraic Integrability Conditions The Proof of Concept: A Complete Solution for the 3-Sphere The Generalisation: A Solution for Spheres of Arbitrary Dimension The Perspectives: Applications and Generalisations   Target Groups Scientists in the fie...

  20. A comparison of RANZCR and Singapore-designed radiation oncology practice audit instruments: how does reproducibility affect future approaches to revalidation

    International Nuclear Information System (INIS)

    Lu, Jiade J.; Wynne, Christopher J.; Kumar, Mahesh B.; Shakespeare, Thomas P.; Mukherjee, Rahul; Back, Michael F.

    2004-01-01

    Physician competency assessment requires the use of validated methods and instruments. The Royal Australian and New Zealand College of Radiologists (RANZCR) developed a draft audit form to be evaluated as a competency assessment instrument for radiation oncologists (ROs) in Australasia. We evaluated the reliability of the RANZCR instrument as well as a separate The Cancer Institute (TCI) Singapore-designed instrument by having two ROs perform an independent chart review of 80 randomly selected patients seen at The Cancer Institute (TCI), Singapore. Both RANZCR and TCI Singapore instruments were used to score each chart. Inter- and intra-observer reliability for both audit instruments were compared using misclassification rates as the primary end-point. Overall, for inter-observer reproducibility, 2.3% of TCI Singapore items were misclassified compared to 22.3% of RANZCR items (P < 0.0001, 100.00% confidence that TCI instrument has less inter-observer misclassification). For intra-observer reproducibility, 2.4% of TCI Singapore items were misclassified compared to 13.6% of RANZCR items (P < 0.0001, 100.00% confidence that TCI instrument has less intra-observer misclassification). The proposed RANZCR RO revalidation audit instrument requires further refinement to improve validity. Several items require modification or removal because of lack of reliability, whereas other important and reproducible items can be incorporated, as demonstrated by the TCI Singapore instrument. The TCI Singapore instrument also has the advantage of incorporating a simple scoring system and criticality index to allow discrimination between ROs and comparisons against future College standards. Copyright (2004) Blackwell Publishing Asia Pty Ltd
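
The primary end-point above, the misclassification rate, is simply the fraction of audit items on which two scorings (two observers, or the same observer twice) disagree. A minimal sketch with hypothetical item scores, not the study's chart data:

```python
def misclassification_rate(scoring_a, scoring_b):
    """Fraction of audit items on which two scorings of the same charts disagree."""
    assert len(scoring_a) == len(scoring_b)
    disagreements = sum(1 for a, b in zip(scoring_a, scoring_b) if a != b)
    return disagreements / len(scoring_a)

# Hypothetical item-level scores from two independent chart reviews (1 = adequate)
ro1 = [1, 1, 0, 1, 1, 0, 1, 1, 1, 1]
ro2 = [1, 1, 0, 1, 0, 0, 1, 1, 1, 1]
print(misclassification_rate(ro1, ro2))  # 0.1
```

Comparing the two instruments then reduces to comparing these rates over all items, which is how the 2.3% vs. 22.3% inter-observer figures arise.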

  1. The interprocess NIR sampling as an alternative approach to multivariate statistical process control for identifying sources of product-quality variability.

    Science.gov (United States)

    Marković, Snežana; Kerč, Janez; Horvat, Matej

    2017-03-01

    We present a new approach to identifying sources of variability within a manufacturing process by NIR measurements of samples of intermediate material after each consecutive unit operation (interprocess NIR sampling technique). In addition, we summarize the development of a multivariate statistical process control (MSPC) model for the production of an enteric-coated pellet product of the proton-pump inhibitor class. By developing provisional NIR calibration models, the identification of critical process points yields results comparable to the established MSPC modeling procedure. Both approaches are shown to lead to the same conclusion, identifying parameters of extrusion/spheronization and characteristics of lactose that have the greatest influence on the end-product's enteric coating performance. The proposed approach enables quicker and easier identification of variability sources during the manufacturing process, especially in cases when historical process data are not readily available. In the presented case, changes in lactose characteristics influenced the performance of the extrusion/spheronization process step. The pellet cores produced using one lactose source (considered less suitable) were on average larger and more fragile, leading to breakage of the cores during subsequent fluid bed operations. These results were confirmed by additional experimental analyses illuminating the underlying mechanism of fracture of oblong pellets during the pellet coating process, which led to compromised film coating.
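
    MSPC charting of the kind summarized above typically monitors a Hotelling T² statistic computed on process variables (in practice, on PCA or PLS scores of the NIR spectra). A minimal two-variable sketch with invented reference data:

```python
def mean(v):
    return sum(v) / len(v)

def cov(x, y):
    mx, my = mean(x), mean(y)
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / (len(x) - 1)

def hotelling_t2(point, X, Y):
    """Hotelling T^2 of a new 2-D observation against reference data X, Y."""
    mx, my = mean(X), mean(Y)
    sxx, syy, sxy = cov(X, X), cov(Y, Y), cov(X, Y)
    det = sxx * syy - sxy * sxy          # determinant of the 2x2 covariance
    dx, dy = point[0] - mx, point[1] - my
    # quadratic form (dx, dy) S^{-1} (dx, dy)^T with S^{-1} written out
    return (dx * (syy * dx - sxy * dy) + dy * (-sxy * dx + sxx * dy)) / det

# Invented scores for two NIR-derived variables under normal operation:
X = [1.0, 2.0, 3.0, 4.0]
Y = [2.0, 1.0, 4.0, 3.0]
hotelling_t2((4.0, 4.0), X, Y)   # larger than for points near the data centre
```

    Observations whose T² exceeds a control limit flag the unit operation at which variability entered the process.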

  2. An in vitro assessment of the physical properties of novel Hyflex nickel-titanium rotary instruments.

    Science.gov (United States)

    Peters, O A; Gluskin, A K; Weiss, R A; Han, J T

    2012-11-01

    To determine several properties, including torsional and fatigue limits, as well as torque during canal preparation, of Hyflex, a rotary instrument manufactured from so-called controlled memory nickel-titanium alloy. The instruments were tested in vitro using a special torque bench that permits both stationary torque tests according to ISO3630-1 and fatigue limit determination, as well as measurement of torque (in Ncm) and apical force (in N) during canal preparation. Fatigue limit (in numbers of cycles to failure) was determined in a 90°, 5 mm radius block-and-rod assembly. Simulated canals in plastic blocks were prepared using both a manufacturer-recommended single-length technique and a generic crown-down approach. ANOVA with Bonferroni post hoc procedures was used for statistical analysis. Torque at failure ranged from 0.47 to 1.38 Ncm for instruments size 20, .04 taper and size 25, .08 taper, respectively, with significant differences between instrument sizes. Torque during canal preparation was significantly higher for small instruments used in the single-length technique but lower for the size 40, .04 taper, compared to a crown-down approach. No instrument fractured; 82% of the instruments used were plastically deformed; however, only 37% of these remained deformed after a sterilization cycle. Hyflex rotary instruments are bendable and flexible and have torsional resistance similar to instruments made of conventional NiTi. Fatigue resistance is much higher, and torque during preparation is less, compared to other rotary instruments tested previously under similar conditions. © 2012 International Endodontic Journal.

  3. Intercomparison of two comparative reactivity method instruments in the Mediterranean basin during summer 2013

    Science.gov (United States)

    Zannoni, N.; Dusanter, S.; Gros, V.; Sarda Esteve, R.; Michoud, V.; Sinha, V.; Locoge, N.; Bonsang, B.

    2015-09-01

    emissions (8-9 July), 2 days of ambient measurements (10-11 July) and 2 days (12-13 July) of plant emissions. We discuss in detail the experimental approach adopted and how the data sets were processed for both instruments. Corrections required for the two instruments lead to higher values of reactivity in ambient air; overall a 20 % increase for CRM-MD and 49 % for CRM-LSCE compared to the raw data. We show that ambient OH reactivity measured by the two instruments agrees very well (correlation described by a linear least squares fit with a slope of 1 and R2 of 0.75). This study highlights that ambient measurements of OH reactivity with differently configured CRM instruments yield consistent results in a low NOx (NO + NO2), terpene-rich environment, despite the different corrections relevant to each instrument. Conducting more intercomparison exercises, involving more CRM instruments operated under different ambient and instrumental settings, will help to further assess the variability induced by instrument-specific corrections.

  4. The OCO-3 Mission: Science Objectives and Instrument Performance

    Science.gov (United States)

    Eldering, A.; Basilio, R. R.; Bennett, M. W.

    2017-12-01

    The Orbiting Carbon Observatory 3 (OCO-3) will continue global CO2 and solar-induced chlorophyll fluorescence (SIF) measurements using the flight spare instrument from OCO-2. The instrument is currently being tested, and will be packaged for installation on the International Space Station (ISS) (launch readiness in early 2018). This talk will focus on the science objectives, updated simulations of the science data products, and the outcome of recent instrument performance tests. The low-inclination ISS orbit lets OCO-3 sample the tropics and sub-tropics across the full range of daylight hours with dense observations at northern and southern mid-latitudes (±52°). The combination of these dense CO2 and SIF measurements provides continuity of data for global flux estimates as well as a unique opportunity to address key deficiencies in our understanding of the global carbon cycle. The instrument utilizes an agile, 2-axis pointing mechanism (PMA), providing the capability to look towards the bright reflection from the ocean and validation targets. The PMA also allows for a snapshot mapping mode to collect dense datasets over 100km by 100km areas. Measurements over urban centers could aid in making estimates of fossil fuel CO2 emissions. Similarly, the snapshot mapping mode can be used to sample regions of interest for the terrestrial carbon cycle. In addition, there is potential to utilize data from ISS instruments ECOSTRESS (ECOsystem Spaceborne Thermal Radiometer Experiment on Space Station) and GEDI (Global Ecosystem Dynamics Investigation), which measure other key variables controlling carbon uptake by plants, to complement OCO-3 data in science analysis. In 2017, the OCO-2 instrument was transformed into the ISS-ready OCO-3 payload. The transformed instrument was thoroughly tested and characterized. Key characteristics, such as instrument ILS, spectral resolution, and radiometric performance will be described. Analysis of direct sun measurements taken during testing

  5. [Development of an evaluation instrument for service quality in nursing homes].

    Science.gov (United States)

    Lee, Jia; Ji, Eun Sun

    2011-08-01

    The purposes of this study were to identify the factors influencing service quality in nursing homes, and to develop an evaluation instrument for service quality. A three-phase process was employed for the study. 1) The important factors to evaluate the service quality in nursing homes were identified through a literature review, panel discussion and focus group interview, 2) the evaluation instrument was developed, and 3) validity and reliability of the study instrument were tested by factor analysis, Pearson correlation coefficient, Cronbach's α and Cohen's Kappa. Factor analysis showed that the factors influencing service quality in nursing homes were healthcare, diet/assistance, therapy, environment and staff. To improve objectivity of the instrument, quantitative as well as qualitative evaluation approaches were adopted. The study instrument was developed with 30 items and showed acceptable construct validity. The criterion-related validity was a Pearson correlation coefficient of .85 in 151 care facilities. The internal consistency was Cronbach's α=.95. The instrument has acceptable validity and a high degree of reliability. Staff in nursing homes can continuously improve and manage their services using the results of the evaluation instrument.
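
    Cronbach's α, reported above as .95, has a simple closed form: α = k/(k−1) · (1 − Σ item variances / variance of the total score). A minimal sketch over hypothetical item scores (one inner list per scale item, aligned across facilities):

```python
def variance(values):
    m = sum(values) / len(values)
    return sum((v - m) ** 2 for v in values) / (len(values) - 1)

def cronbach_alpha(items):
    """items: one list of scores per scale item, aligned across respondents."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]  # total score per respondent
    item_var = sum(variance(item) for item in items)
    return k / (k - 1) * (1 - item_var / variance(totals))

# Two perfectly consistent items give the maximum internal consistency:
cronbach_alpha([[1, 2, 3], [1, 2, 3]])  # 1.0
```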

  6. On-line calibration of process instrumentation channels in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Hashemian, H.M.; Farmer, J.P. [Analysis and Measurement Services Corp., Knoxville, TN (United States)

    1995-04-01

    An on-line instrumentation monitoring system was developed and validated for use in nuclear power plants. This system continuously monitors the calibration status of instrument channels and determines whether or not they require manual calibrations. This is accomplished by comparing the output of each instrument channel to an estimate of the process it is monitoring. If the deviation of the instrument channel from the process estimate is greater than an allowable limit, then the instrument is said to be "out of calibration" and manual adjustments are made to correct the calibration. The success of the on-line monitoring system depends on the accuracy of the process estimation. The system described in this paper incorporates both simple intercomparison techniques as well as analytical approaches in the form of data-driven empirical modeling to estimate the process. On-line testing of the calibration of process instrumentation channels will reduce the number of manual calibrations currently performed, thereby reducing both costs to utilities and radiation exposure to plant personnel.
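
    The deviation check described above can be sketched in a few lines; here the process estimate is simply the median of redundant channels, and the channel names, readings and limit are all invented:

```python
def median(vals):
    s = sorted(vals)
    n = len(s)
    mid = n // 2
    return s[mid] if n % 2 else (s[mid - 1] + s[mid]) / 2

def flag_out_of_calibration(readings, limit):
    """Return the channels whose deviation from the process estimate
    (here, the median of all redundant channels) exceeds the allowable limit."""
    estimate = median(list(readings.values()))
    return [name for name, value in readings.items()
            if abs(value - estimate) > limit]

# Invented redundant level transmitters; LT-4 has drifted.
channels = {"LT-1": 100.2, "LT-2": 100.0, "LT-3": 99.9, "LT-4": 103.1}
flag_out_of_calibration(channels, limit=1.0)  # ["LT-4"]
```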

  7. [Development and validation of an instrument for initial nursing assessment].

    Science.gov (United States)

    Fernández-Sola, Cayetano; Granero-Molina, José; Mollinedo-Mallea, Judith; de Gonzales, María Hilda Peredo; Aguilera-Manrique, Gabriel; Ponce, Mara Luna

    2012-12-01

    The objective of this study, conducted in Bolivia from April to July of 2008, was the design and validation of an initial nursing assessment instrument to be used in clinical and educational environments in Santa Cruz (Bolivia). Twelve Bolivian nurses participated; both document analysis and consensus techniques were used to determine the categories and criteria to be assessed. Categories included in the nursing assessment instrument are a physical assessment and Gordon's eleven Functional Health Patterns. The nursing assessment instrument stands out as being concise, easy to complete and utilizing a nursing approach. It does not include items for advanced nursing assessment. However, it incorporates items regarding lifestyle and the patient's autonomy. The nursing assessment instrument contributes to improving the quality of clinical records, supports the nursing diagnosis and implementation of the nursing process, promotes the nurse's role and helps to standardize practice.

  8. A systematic review of instruments that measure attitudes toward homosexual men.

    Science.gov (United States)

    Grey, Jeremy A; Robinson, Beatrice Bean E; Coleman, Eli; Bockting, Walter O

    2013-01-01

    Scientific interest in the measurement of homophobia and internalized homophobia has grown over the past 30 years, and new instruments and terms have emerged. To help researchers with the challenging task of identifying appropriate measures for studies in sexual-minority health, we reviewed measures of homophobia published in the academic literature from 1970 to 2012. Instruments that measured attitudes toward male homosexuals/homosexuality or measured homosexuals' internalized attitudes toward homosexuality were identified using measurement manuals and a systematic review. A total of 23 instruments met criteria for inclusion, and their features were summarized and compared. All 23 instruments met minimal criteria for adequate scale construction, including scale development, sampling, reliability, and evidence of validity. Validity evidence was diverse and was categorized as interaction with gay men, HIV/AIDS variables, mental health, and conservative religious or political beliefs. Homophobia was additionally correlated with authoritarianism and bias, gender ideology, gender differences, and reactions to homosexual stimuli. Internalized homophobia was validated by examining relationships with disclosing one's homosexuality and level of homosexual identity development. We hope this review will make the process of instrument selection more efficient by allowing researchers to easily locate, evaluate, and choose the proper measure based on their research question and population of interest.

  9. Wireless instrumented klapskates for long-track speed skating

    OpenAIRE

    van der Kruk, E.; den Braver, O.; Schwab, Arend L.; van der Helm, Frans C T; Veeger, H. E J

    2016-01-01

    In the current project, we aim to provide speed skaters with real-time feedback on how to improve their skating performance within an individual stroke. The elite skaters and their coaches wish for a system that determines the mechanical power per stroke. The push-off force of the skater is a crucial variable in this power determination. In this study, we present the construction and calibration of a pair of wireless instrumented klapskates that can continuously and synchronously measure this...

  10. Perception and Modeling of Affective Qualities of Musical Instrument Sounds across Pitch Registers.

    Science.gov (United States)

    McAdams, Stephen; Douglas, Chelsea; Vempala, Naresh N

    2017-01-01

    Composers often pick specific instruments to convey a given emotional tone in their music, partly due to their expressive possibilities, but also due to their timbres in specific registers and at given dynamic markings. Of interest to both music psychology and music informatics from a computational point of view is the relation between the acoustic properties that give rise to the timbre at a given pitch and the perceived emotional quality of the tone. Musician and nonmusician listeners were presented with 137 tones produced at a fixed dynamic marking (forte) playing tones at pitch class D# across each instrument's entire pitch range and with different playing techniques for standard orchestral instruments drawn from the brass, woodwind, string, and pitched percussion families. They rated each tone on six analogical-categorical scales in terms of emotional valence (positive/negative and pleasant/unpleasant), energy arousal (awake/tired), tension arousal (excited/calm), preference (like/dislike), and familiarity. Linear mixed models revealed interactive effects of musical training, instrument family, and pitch register, with non-linear relations between pitch register and several dependent variables. Twenty-three audio descriptors from the Timbre Toolbox were computed for each sound and analyzed in two ways: linear partial least squares regression (PLSR) and nonlinear artificial neural net modeling. These two analyses converged in terms of the importance of various spectral, temporal, and spectrotemporal audio descriptors in explaining the emotion ratings, but some differences also emerged. Different combinations of audio descriptors make major contributions to the three emotion dimensions, suggesting that they are carried by distinct acoustic properties. Valence is more positive with lower spectral slopes, a greater emergence of strong partials, and an amplitude envelope with a sharper attack and earlier decay. Higher tension arousal is carried by brighter sounds
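
    The descriptor-to-rating relationship analyzed above with PLSR can be illustrated, in drastically simplified form, by ordinary least squares on a single descriptor. The data values below are invented; the negative fitted slope mirrors the reported finding that valence is more positive with lower spectral slopes:

```python
def ols_fit(x, y):
    """Least-squares slope and intercept of y regressed on one descriptor x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx

# Hypothetical per-tone values: spectral slope descriptor vs. mean valence rating.
spectral_slope = [-2.0, -1.5, -1.0, -0.5]
valence = [0.8, 0.6, 0.5, 0.3]
slope, intercept = ols_fit(spectral_slope, valence)  # slope < 0
```

    PLSR differs in projecting many collinear descriptors onto a few latent components before regressing, but the fitted sign and magnitude carry the same interpretation.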

  11. A hybrid approach to fault diagnosis of roller bearings under variable speed conditions

    Science.gov (United States)

    Wang, Yanxue; Yang, Lin; Xiang, Jiawei; Yang, Jianwei; He, Shuilong

    2017-12-01

    Rolling element bearings are one of the main elements in rotating machines, whose failure may lead to a fatal breakdown and significant economic losses. Conventional vibration-based diagnostic methods are based on the stationary assumption, thus they are not applicable to the diagnosis of bearings working under varying speeds. This constraint significantly limits the industrial application of bearing diagnosis. A hybrid approach to fault diagnosis of roller bearings under variable speed conditions is proposed in this work, based on computed order tracking (COT) and variational mode decomposition (VMD)-based time frequency representation (VTFR). COT is utilized to resample the non-stationary vibration signal in the angular domain, while VMD is used to decompose the resampled signal into a number of band-limited intrinsic mode functions (BLIMFs). A VTFR is then constructed based on the estimated instantaneous frequency and instantaneous amplitude of each BLIMF. Moreover, the Gini index and time-frequency kurtosis are proposed to quantitatively measure the sparsity and concentration of the time-frequency representation, respectively. The effectiveness of the VTFR for extracting nonlinear components has been verified using a bat signal. Results of this numerical simulation also show that the sparsity and concentration of the VTFR are better than those of short-time Fourier transform, continuous wavelet transform, Hilbert-Huang transform and Wigner-Ville distribution techniques. Several experimental results have further demonstrated that the proposed method can well detect bearing faults under variable speed conditions.
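
    The COT step above resamples a time-series at uniform shaft-angle increments so that speed variation does not smear the order spectrum. A simplified sketch using plain linear interpolation — the real procedure derives the angle signal from a tachometer/keyphasor, and all data here are illustrative:

```python
def interp(x, xp, fp):
    """Linear interpolation of fp(xp) at points x (x and xp increasing)."""
    out = []
    j = 0
    for xi in x:
        while j < len(xp) - 2 and xp[j + 1] < xi:
            j += 1
        t = (xi - xp[j]) / (xp[j + 1] - xp[j])
        out.append(fp[j] + t * (fp[j + 1] - fp[j]))
    return out

def order_track(times, signal, angles, n_samples):
    """Resample a time-domain signal at uniform shaft-angle increments.
    angles[i] is the accumulated shaft angle (revolutions) at times[i]."""
    a0, a1 = angles[0], angles[-1]
    grid = [a0 + (a1 - a0) * k / (n_samples - 1) for k in range(n_samples)]
    # map each target angle back to a time instant, then sample the signal there
    t_at_angle = interp(grid, angles, times)
    return interp(t_at_angle, times, signal)

# Accelerating shaft: equal angle steps fall later in time during the fast half.
order_track([0, 1, 2], [0, 10, 20], [0, 1, 3], 3)
```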

  12. Influence of Loading Rate on the Calibration of Instrumented Charpy Strikers

    Energy Technology Data Exchange (ETDEWEB)

    Lucon, E.; Scibetta, M.; McColskey, D.; McCowan, C.

    2009-01-15

    One of the key factors for obtaining reliable instrumented Charpy results is the calibration of the instrumented striker. The conventional approach for establishing an analytical relationship between strain gage output and force applied to the transducer is the static calibration, which is preferably performed with the striker installed in the pendulum assembly. However, the response of an instrumented striker under static force application may sometimes differ significantly from its dynamic performance during an actual Charpy test. This is typically reflected in a large difference between the absorbed energy returned by the pendulum encoder (KV) and that calculated as the area under the instrumented force/displacement test record (Wt). Such difference can be either minimized by optimizing the striker design or analytically removed by adjusting forces and displacements until KV = Wt (the so-called 'Dynamic Force Adjustment'). This study investigates the influence of increasing force application rates on the force/voltage characteristics of two instrumented strikers, one at NIST in Boulder, CO and one at SCK-CEN in Mol, Belgium.
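
    The KV = Wt condition above can be illustrated with a single multiplicative correction of the force record. This is a simplified sketch with invented numbers; the actual Dynamic Force Adjustment also adjusts displacements iteratively:

```python
def trapz(y, x):
    """Trapezoidal integral of y over x."""
    return sum((y[i] + y[i + 1]) * (x[i + 1] - x[i]) / 2
               for i in range(len(y) - 1))

def dynamic_force_adjustment(forces, displacements, kv):
    """Scale the instrumented force record so that the integrated work Wt
    (area under the force/displacement curve) matches the encoder energy KV."""
    wt = trapz(forces, displacements)
    factor = kv / wt
    return [f * factor for f in forces], factor

# Invented record: Wt = 20 J against an encoder value KV = 22 J -> factor 1.1
adjusted, factor = dynamic_force_adjustment([0, 10, 10, 0], [0, 1, 2, 3], 22.0)
```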

  13. Instrument performance evaluation

    International Nuclear Information System (INIS)

    Swinth, K.L.

    1993-03-01

    Deficiencies exist in both the performance and the quality of health physics instruments. Recognizing the implications of such deficiencies for the protection of workers and the public, in the early 1980s the DOE and the NRC encouraged the development of a performance standard and established a program to test a series of instruments against criteria in the standard. The purpose of the testing was to establish the practicality of the criteria in the standard, to determine the performance of a cross section of available instruments, and to establish a testing capability. Over 100 instruments were tested, resulting in a practical standard and an understanding of the deficiencies in available instruments. In parallel with the instrument testing, a value-impact study clearly established the benefits of implementing a formal testing program. An ad hoc committee also met several times to establish recommendations for the voluntary implementation of a testing program based on the studies and the performance standard. For several reasons, a formal program did not materialize. Ongoing tests and studies have supported the development of specific instruments and have helped specific clients understand the performance of their instruments. The purpose of this presentation is to trace the history of instrument testing to date and suggest the benefits of a centralized formal program

  14. ELABORATING A MEASUREMENT INSTRUMENT FOR THE FLOW EXPERIENCE DURING ONLINE INFORMATION SEARCH

    Directory of Open Access Journals (Sweden)

    Caraivan Luiza

    2012-12-01

    Full Text Available Flow is a construct imported into marketing research from the social sciences in order to examine consumer behavior in the online medium. The construct describes a state of deep involvement in a challenging activity, most frequently characterized by high levels of enjoyment, control and concentration. Researchers found that the degree to which online experience is challenging can be defined, measured, and related well to important marketing variables. As shown by our extensive literature review, flow measurements include antecedents, dimensions and consequences of flow. The present paper represents a detailed description of the construct's operationalization in the context of online information search. In this respect, our main goal is to produce a basic instrument to evaluate the flow experience of online search, in order to capitalize on the premises of an interactive, complex informational medium – the World Wide Web – and on the consequence of an exploratory informational behavior of users. The instrument is conceived to offer an initial means of collecting data. The composition, source and significance of the 11 scales used to measure the multiple factors of the flow experience during online search are detailed in this study with the aim of ensuring compliance with scientific rigor and facilitating correct reporting of data related to the reliability and validity of the measurements. For further research, we propose factor analysis to test the resulting instrument and to ensure that the measures employed are psychometrically sound. Factor analysis refers to a wide range of statistic techniques used to represent a set of variables in concordance with a reduced number of hypothetical variables called factors. Factor analysis is used to solve two types of problems: reducing the number of variables to increase data processing speed and identifying hidden patterns in the existent data relations. However, we expect our scales to perform

  15. Under What Assumptions Do Site-by-Treatment Instruments Identify Average Causal Effects?

    Science.gov (United States)

    Reardon, Sean F.; Raudenbush, Stephen W.

    2013-01-01

    The increasing availability of data from multi-site randomized trials provides a potential opportunity to use instrumental variables methods to study the effects of multiple hypothesized mediators of the effect of a treatment. We derive nine assumptions needed to identify the effects of multiple mediators when using site-by-treatment interactions…
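
    The site-by-treatment construction above generalizes the basic single-instrument estimator, which for one instrument z, treatment x and outcome y is simply cov(z, y)/cov(z, x). A self-contained sketch on synthetic data, where u is an unobserved confounder made uncorrelated with z by construction:

```python
def covariance(a, b):
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    return sum((x - ma) * (y - mb) for x, y in zip(a, b)) / (len(a) - 1)

def iv_estimate(z, x, y):
    """Single-instrument IV (Wald) estimate of the effect of x on y."""
    return covariance(z, y) / covariance(z, x)

z = [0, 1, 2, 3]                           # instrument
u = [1, -1, -1, 1]                         # unobserved confounder, cov(z, u) = 0
x = [zi + ui for zi, ui in zip(z, u)]      # treatment driven by z and u
y = [2 * xi + ui for xi, ui in zip(x, u)]  # true causal effect of x on y is 2
iv_estimate(z, x, y)                       # ~2.0, while the naive OLS slope is biased
```

    Because z shifts x but reaches y only through x, the ratio of covariances isolates the causal slope; the OLS slope cov(x, y)/cov(x, x) on the same data absorbs the confounding through u.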

  16. A REPERTOIRE OF INSTRUMENTS EMPLOYED IN PSYCHOLOGICAL COUNSELING

    Directory of Open Access Journals (Sweden)

    Dorina Maria PASCA

    2014-10-01

    Full Text Available According to Carl Rogers and Albert Ellis [1] [2], a new approach to psychological counseling is needed. Consequently, new and practical means to solve problems that ensue as part of the counseling process are required. From this point of view, this article aims at offering a range of alternatives to approach and involve the client (student in order to achieve the envisaged results of counseling. As such, it offers a concise repertoire of instruments that can be employed in psychological counseling.

  17. Systematization of Instruments of Social and Economic Responsibility of Enterprises: Theoretical Aspect

    Directory of Open Access Journals (Sweden)

    Dielini Maryna M.

    2016-11-01

    Full Text Available The aim of the article is the systematization of instruments for implementation of social and economic responsibility of enterprises in Ukraine and theoretical consideration of the presented instruments. The article studies basic views on instruments of social responsibility of business. It is determined which of them are more traditional, and which ones are the latest, that is, up to date. Thus, the traditional ones include: philanthropy, charity, sponsorship, volunteering, patronship, monetary grants, equivalent financing. Based on the author's understanding of the nature of social and economic responsibility of business, all traditional instruments can be attributed to it, except for volunteering, which does not involve obtaining funds for its activities and has only a social effect. The article also examines modern instruments of business social responsibility, such as social investments, socially responsible investments, social marketing, charity marketing, social programs, social entrepreneurship, social reporting and social expertise, fundraising, socially responsible approaches to doing business and supply chain management. All of them can be regarded as instruments of social and economic responsibility of business.

  18. CLOUDCLOUD : general-purpose instrument monitoring and data managing software

    Science.gov (United States)

    Dias, António; Amorim, António; Tomé, António

    2016-04-01

    monitored with parsing intervals as fast as milliseconds. This software (server+agents+interface+database) comes in easy and ready-to-use packages that can be installed in any operating system, including Android and iOS systems. This software is ideal for use in modular experiments or monitoring stations with large variability in instruments and measuring methods or in large collaborations, where data requires homogenization in order to be effectively transmitted to all involved parties. This work presents the software and provides performance comparison with previously used monitoring systems in the CLOUD experiment at CERN.

  19. New methods of magnet-based instrumentation for NOTES.

    Science.gov (United States)

    Magdeburg, Richard; Hauth, Daniel; Kaehler, Georg

    2013-12-01

    Laparoscopic surgery has displaced open surgery as the standard of care for many clinical conditions. NOTES has been described as the next surgical frontier with the objective of incision-free abdominal surgery. The principal challenge of NOTES procedures is the loss of triangulation and instrument rigidity, which is one of the fundamental concepts of laparoscopic surgery. Overcoming these problems necessitates the development of new instrumentation. Material and methods: We aimed to assess the use of a very simple combination of internal and external magnets that might allow the vigorous multiaxial traction/counter-traction required in NOTES procedures. The magnet retraction system consisted of an external magnetic assembly and either small internal magnets attached by endoscopic clips to the designated tissue (magnet-clip approach) or an endoscopic grasping forceps in a magnetic deflector roll (magnet-trocar approach). We compared both methods regarding precision, time and efficacy by performing transgastric partial uterus resections, with better results for the magnet-trocar approach. This proof-of-principle animal study showed that the combination of external and internal magnets generates sufficient coupling forces at clinically relevant abdominal wall thicknesses, making them suitable for use and evaluation in NOTES procedures, and provides the vigorous multiaxial traction/counter-traction required by the lack of additional abdominal trocars.

  20. On-ground calibration of the BEPICOLOMBO/SIMBIO-SYS at instrument level

    Science.gov (United States)

    Rodriguez-Ferreira, J.; Poulet, F.; Eng, P.; Longval, Y.; Dassas, K.; Arondel, A.; Langevin, Y.; Capaccioni, F.; Filacchione, G.; Palumbo, P.; Cremonese, G.; Dami, M.

    2012-04-01

    The Mercury Planetary Orbiter/BepiColombo carries an integrated suite of instruments, the Spectrometer and Imagers for MPO BepiColombo-Integrated Observatory SYStem (SIMBIO-SYS). SIMBIO-SYS has 3 channels: a stereo imaging system (STC), a high-resolution imager (HRIC) and a visible-near-infrared imaging spectrometer (VIHI). SIMBIO-SYS will scan the surface of Mercury with these three channels and determine the physical, morphological and compositional properties of the entire planet. Before integration on the S/C, an on-ground calibration at the channel and at the instrument levels will be performed so as to describe the instrumental responses as a function of various parameters that might evolve while the instruments are operating [1]. The Institut d'Astrophysique Spatiale (IAS) is responsible for the on-ground calibration at the instrument level. During the 4-week calibration campaign planned for June 2012, the instrument will be maintained in a mechanical and thermal environment simulating space conditions. Four optical stimuli (QTH lamp, integrating sphere, blackbody with variable temperature from 50 to 1200°C, and monochromator) are placed on an optical bench to illuminate the four channels so as to perform radiometric calibration, straylight monitoring, and spectral proofing based on laboratory mineral samples. The instrument will be mounted on a hexapod placed inside a thermal vacuum chamber during the calibration campaign. The hexapod will move the channels within the well-characterized incoming beam. We will present the key activities of the preparation of this calibration: the derivation of the instrument radiometric model, the implementation of the optical, mechanical and software interfaces of the calibration assembly, the characterization of the optical bench and the definition of the calibration procedures.

  1. Online Personalization of Hearing Instruments

    Directory of Open Access Journals (Sweden)

    Bert de Vries

    2008-09-01

    Full Text Available Online personalization of hearing instruments refers to learning preferred tuning parameter values from user feedback through a control wheel (or remote control) during normal operation of the hearing aid. We perform hearing aid parameter steering by applying a linear map from acoustic features to tuning parameters. We formulate personalization of the steering parameters as the maximization of an expected utility function. A sparse Bayesian approach is then investigated for its suitability to find efficient feature representations. The feasibility of our approach is demonstrated in an application to online personalization of a noise reduction algorithm. A patient trial indicates that the acoustic features chosen for learning noise control are meaningful, that environmental steering of noise reduction makes sense, and that our personalization algorithm learns proper values for tuning parameters.
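
    The steering scheme above — a linear map from acoustic features to tuning parameters, adapted from control-wheel feedback — can be caricatured with an LMS-style update. This is a rough sketch under invented assumptions, not the authors' sparse Bayesian / expected-utility formulation:

```python
def steer(weights, features):
    """Linear map from acoustic features to one tuning parameter (w . x)."""
    return sum(w * x for w, x in zip(weights, features))

def learn_step(weights, features, user_adjustment, rate=0.1):
    """After the user turns the control wheel by `user_adjustment` in acoustic
    context `features`, nudge the weights toward producing that correction."""
    return [w + rate * user_adjustment * x for w, x in zip(weights, features)]

# Invented context: two acoustic features, user asks for +2 dB more noise reduction.
w = learn_step([0.0, 0.0], [1.0, 0.5], user_adjustment=2.0)
steer(w, [1.0, 0.5])  # the same context now yields a nonzero steering output
```

    Repeated over many adjustments, the weights converge so that the instrument anticipates the user's preferred setting in each acoustic environment.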

  2. Decoding auditory attention to instruments in polyphonic music using single-trial EEG classification

    DEFF Research Database (Denmark)

    Treder, Matthias S.; Purwins, Hendrik; Miklody, Daniel

    2014-01-01

    . Here, we explore polyphonic music as a novel stimulation approach for future use in a brain-computer interface. In a musical oddball experiment, we had participants shift selective attention to one out of three different instruments in music audio clips, with each instrument occasionally playing one...... 11 participants. This is a proof of concept that attention paid to a particular instrument in polyphonic music can be inferred from ongoing EEG, a finding that is potentially relevant for both brain-computer interface and music research....

  3. Incorporating Basic Optical Microscopy in the Instrumental Analysis Laboratory

    Science.gov (United States)

    Flowers, Paul A.

    2011-01-01

    A simple and versatile approach to incorporating basic optical microscopy in the undergraduate instrumental analysis laboratory is described. Attaching a miniature CCD spectrometer to the video port of a standard compound microscope yields a visible microspectrophotometer suitable for student investigations of fundamental spectrometry concepts,…

  4. Combination of individual tree detection and area-based approach in imputation of forest variables using airborne laser data

    Science.gov (United States)

    Vastaranta, Mikko; Kankare, Ville; Holopainen, Markus; Yu, Xiaowei; Hyyppä, Juha; Hyyppä, Hannu

    2012-01-01

    The two main approaches to deriving forest variables from laser-scanning data are the statistical area-based approach (ABA) and individual tree detection (ITD). With ITD it is feasible to acquire single-tree information, as in field measurements. Here, ITD was used for measuring training data for the ABA. In addition to automatic ITD (ITDauto), we tested a combination of ITDauto and visual interpretation (ITDvisual). ITDvisual had two stages: in the first, ITDauto was carried out and in the second, the results of ITDauto were visually corrected by interpreting three-dimensional laser point clouds. The field data comprised 509 circular plots (r = 10 m) that were divided equally for testing and training. ITD-derived forest variables were used for training the ABA, and the accuracies of the k-most similar neighbor (k-MSN) imputations were evaluated and compared with the ABA trained with traditional measurements. The root-mean-squared error (RMSE) in the mean volume was 24.8%, 25.9%, and 27.2% with the ABA trained with field measurements, ITDauto, and ITDvisual, respectively. When ITD methods were applied in acquiring training data, the mean volume, basal area, and basal area-weighted mean diameter were underestimated in the ABA by 2.7-9.2%. This project constituted a pilot study for using ITD measurements as training data for the ABA. Further studies are needed to reduce the bias and to determine the accuracy obtained in imputation of species-specific variables. The method could be applied in areas with sparse road networks or when the costs of fieldwork must be minimized.
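The k-MSN imputation underlying the ABA can be illustrated with a minimal k-nearest-neighbour sketch: a target plot's volume is imputed as the mean over the k most similar training plots in laser-feature space. The features, coefficients and k below are hypothetical, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical training data: two laser height metrics per plot
# (e.g., mean canopy height, cover proxy) -> field-measured volume.
n_train = 200
features = rng.uniform(5, 30, size=(n_train, 2))
volume = 8.0 * features[:, 0] + 2.0 * features[:, 1] + rng.normal(0, 10, n_train)

def knn_impute(x, k=5):
    """Impute a response as the mean over the k most similar training plots."""
    d = np.linalg.norm(features - x, axis=1)   # Euclidean similarity
    nearest = np.argsort(d)[:k]
    return volume[nearest].mean()

target = np.array([20.0, 15.0])
est = knn_impute(target)        # noise-free expectation is 8*20 + 2*15 = 190
print(round(est, 1))
```

k-MSN generalizes this by imputing several response variables at once from the same neighbours; the principle of borrowing field (or ITD) measurements from similar plots is identical.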

  5. Comparison of Two Methodologies for Calibrating Satellite Instruments in the Visible and Near-Infrared

    Science.gov (United States)

    Barnes, Robert A.; Brown, Steven W.; Lykke, Keith R.; Guenther, Bruce; Butler, James J.; Schwarting, Thomas; Turpie, Kevin; Moyer, David; DeLuccia, Frank; Moeller, Christopher

    2015-01-01

    Traditionally, satellite instruments that measure Earth-reflected solar radiation in the visible and near-infrared wavelength regions have been calibrated for radiance responsivity in a two-step method. In the first step, the relative spectral response (RSR) of the instrument is determined using a nearly monochromatic light source such as a lamp-illuminated monochromator. These sources do not typically fill the field-of-view of the instrument nor act as calibrated sources of light. Consequently, they only provide a relative (not absolute) spectral response for the instrument. In the second step, the instrument views a calibrated source of broadband light, such as a lamp-illuminated integrating sphere. The RSR and the sphere absolute spectral radiance are combined to determine the absolute spectral radiance responsivity (ASR) of the instrument. More recently, a full-aperture absolute calibration approach using widely tunable monochromatic lasers has been developed. Using these sources, the ASR of an instrument can be determined in a single step on a wavelength-by-wavelength basis. From these monochromatic ASRs, the responses of the instrument bands to broadband radiance sources can be calculated directly, eliminating the need for calibrated broadband light sources such as lamp-illuminated integrating spheres. In this work, the traditional broadband source-based calibration of the Suomi National Polar-orbiting Partnership (SNPP) Visible Infrared Imaging Radiometer Suite (VIIRS) sensor is compared with the laser-based calibration of the sensor. Finally, the impact of the new full-aperture laser-based calibration approach on the on-orbit performance of the sensor is considered.
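The laser-based approach yields the ASR wavelength by wavelength, so a band's response to any broadband source follows by integrating the ASR against the source radiance. A minimal numerical sketch, assuming an idealized Gaussian band shape and a spectrally flat source (both hypothetical):

```python
import numpy as np

wl = np.arange(400.0, 501.0, 1.0)                 # wavelength grid, nm
asr = np.exp(-0.5 * ((wl - 450.0) / 10.0) ** 2)   # hypothetical band ASR (peak = 1)
radiance = np.full_like(wl, 2.0)                  # spectrally flat broadband source

def trapz(y, x):
    """Trapezoidal integration, written out explicitly for portability."""
    return float(np.sum((y[1:] + y[:-1]) / 2.0 * np.diff(x)))

signal = trapz(asr * radiance, wl)                # band-integrated response
band_avg_radiance = signal / trapz(asr, wl)       # ASR-weighted band radiance
print(round(band_avg_radiance, 6))                # 2.0 for a flat source
```

For a spectrally flat source the ASR-weighted band radiance equals the source radiance exactly; for structured sources the weighting by the measured ASR is what makes the laser-based calibration self-sufficient.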

  6. Assessment of variable fluorescence fluorometry as an approach for rapidly detecting living photoautotrophs in ballast water

    Science.gov (United States)

    First, Matthew R.; Robbins-Wamsley, Stephanie H.; Riley, Scott C.; Drake, Lisa A.

    2018-03-01

    Variable fluorescence fluorometry, an analytical approach that estimates the fluorescence yield of chlorophyll a (F0, a proximal measure of algal concentration) and the photochemical yield (FV/FM, an indicator of the physiological status of algae), was evaluated as a means to rapidly assess photoautotrophs. Specifically, it was used to gauge the efficacy of ballast water treatment designed to reduce the transport and delivery of potentially invasive organisms. A phytoflagellate, Tetraselmis spp. (10-12 μm), and mixed communities of ambient protists were examined in both laboratory experiments and large-scale field trials simulating 5-d hold times in mock ballast tanks. In laboratory incubations, ambient organisms held in the dark exhibited declining F0 and FV/FM measurements relative to organisms held under lighted conditions. In field experiments, increases and decreases in F0 and FV/FM over the tank hold time corresponded to those of microscope counts of organisms in two of three trials. In the third trial, concentrations of organisms ≥ 10 μm (including protists) increased while F0 and FV/FM decreased. Rapid and sensitive, variable fluorescence fluorometry is appropriate for detecting changes in organism concentrations and physiological status in samples dominated by microalgae. Changes in the heterotrophic community, which may become more prevalent in light-limited ballast tanks, would not be detected via variable fluorescence fluorometry, however.

  7. A SHARIA RETURN AS AN ALTERNATIVE INSTRUMENT FOR MONETARY POLICY

    Directory of Open Access Journals (Sweden)

    Ashief Hamam

    2011-09-01

    Full Text Available Rapid development in the Islamic financial industry has not been supported by sharia monetary policy instruments. This study looks at the possibility of sharia returns as such an instrument. Using both an error correction model and a vector error correction model to estimate the data from 2002(1) to 2010(12), this paper finds that the sharia return has the same effect as the interest rate on the demand for money. The shock effect of the sharia return on broad money supply, Gross Domestic Product, and the Consumer Price Index is greater than that of the interest rate. In addition, these three variables become stable more quickly following a shock to the sharia return. Keywords: sharia return, Islamic financial system, vector error correction model. JEL classification numbers: E52, G15
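The error correction mechanism used in such models can be sketched on synthetic data: two cointegrated series, with deviations from the long-run relation corrected at speed α. The data-generating values below are hypothetical, not estimates from the study:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical cointegrated pair: y tracks x with long-run relation y = 2x;
# deviations are corrected at adjustment speed alpha = -0.5.
T = 2000
x = np.cumsum(rng.normal(size=T))          # random-walk driver (e.g., a return series)
y = np.empty(T)
y[0] = 2 * x[0]
for t in range(1, T):
    ect = y[t - 1] - 2 * x[t - 1]          # equilibrium (error-correction) term
    y[t] = y[t - 1] - 0.5 * ect + 2 * (x[t] - x[t - 1]) + rng.normal(scale=0.1)

# Error correction regression: dy_t = alpha * ect_{t-1} + beta * dx_t + e_t
dy = np.diff(y)
dx = np.diff(x)
ect_lag = y[:-1] - 2 * x[:-1]
X = np.column_stack([ect_lag, dx])
alpha, beta = np.linalg.lstsq(X, dy, rcond=None)[0]
print(round(alpha, 2), round(beta, 2))     # near -0.5 and 2.0
```

A more negative α means faster return to equilibrium after a shock, which is the sense in which the abstract compares adjustment after sharia-return versus interest-rate shocks.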

  8. A Novel Approach for Solving Semidefinite Programs

    Directory of Open Access Journals (Sweden)

    Hong-Wei Jiao

    2014-01-01

    Full Text Available A novel linearizing alternating-direction augmented Lagrangian approach is proposed for effectively solving semidefinite programs (SDP). At every iteration, by fixing the other variables, the proposed approach alternately optimizes the dual variables and the dual slack variables; then the primal variables, that is, the Lagrange multipliers, are updated. In addition, the proposed approach renews all the variables in closed form without solving any system of linear equations. Global convergence of the proposed approach is proved under mild conditions, and two numerical problems are given to demonstrate the effectiveness of the presented approach.
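A key ingredient of closed-form alternating-direction updates for SDPs is projection onto the positive semidefinite cone, which has a closed form via eigendecomposition. A generic sketch of that step (not the paper's full algorithm):

```python
import numpy as np

def project_psd(S):
    """Closed-form projection of a symmetric matrix onto the PSD cone:
    eigendecompose and clip negative eigenvalues at zero."""
    S = (S + S.T) / 2.0                       # symmetrize for numerical safety
    w, V = np.linalg.eigh(S)
    return (V * np.clip(w, 0.0, None)) @ V.T  # V diag(max(w,0)) V^T

A = np.array([[1.0, 2.0],
              [2.0, -3.0]])                   # indefinite example matrix
P = project_psd(A)

eigvals = np.linalg.eigvalsh(P)
print(np.round(eigvals, 6))                   # all non-negative
```

Because this projection is an eigendecomposition rather than a linear solve, each ADMM-style iteration can keep all variable updates in closed form, as the abstract emphasizes.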

  9. Electrical Bioimpedance-Controlled Surgical Instrumentation.

    Science.gov (United States)

    Brendle, Christian; Rein, Benjamin; Niesche, Annegret; Korff, Alexander; Radermacher, Klaus; Misgeld, Berno; Leonhardt, Steffen

    2015-10-01

    A bioimpedance-controlled concept for bone cement milling during revision total hip replacement is presented. Normally, the surgeon removes bone cement manually using a hammer and chisel. However, this procedure is relatively rough and unintended harm to tissue may occur at any time. The proposed bioimpedance-controlled surgical instrumentation improves this process because, for example, most risks associated with bone cement removal are avoided. The electrical bioimpedance measurements enable online process control by using the milling head as both a cutting tool and a measurement electrode at the same time. Furthermore, a novel integrated surgical milling tool is introduced, which allows acquisition of electrical bioimpedance data for online control; these data are used as a process variable. Process identification is based on finite element method simulation and on experimental studies with a rapid control prototyping system. The control loop design includes the identified process model, the characterization of the noise as normally distributed, and the filtering necessary for sufficient accuracy (±0.5 mm). In a comparative study, noise suppression is also investigated in silico with a moving average filter and a Kalman filter. Finally, performance analysis shows that the bioimpedance-controlled surgical instrumentation may also perform effectively at a higher feed rate (e.g., 5 mm/s).
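The two noise-suppression options compared in the study (moving average vs. Kalman filtering) can be sketched on a synthetic measurement stream; the signal, noise level and filter parameters below are hypothetical stand-ins, not the paper's identified model:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical measurement stream: a constant true value of 5.0 mm
# observed with normally distributed noise (sd = 0.5 mm).
true_value = 5.0
z = true_value + rng.normal(scale=0.5, size=200)

# Moving average over a sliding window of 10 samples.
window = 10
ma = np.convolve(z, np.ones(window) / window, mode="valid")

# Scalar Kalman filter for a near-constant state (random-walk model, tiny q).
x, p = 0.0, 1.0          # state estimate and its variance
q, r = 1e-6, 0.25        # process and measurement noise variances
kf = []
for meas in z:
    p += q                            # predict
    k = p / (p + r)                   # Kalman gain
    x += k * (meas - x)               # update with the innovation
    p *= (1 - k)
    kf.append(x)

print(round(ma[-1], 2), round(kf[-1], 2))  # both near 5.0
```

The Kalman filter effectively averages over the whole history (its gain shrinks over time), while the moving average forgets older samples; that trade-off between smoothing and responsiveness is what an in-silico comparison at higher feed rates probes.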

  10. Validating a Computer-Assisted Language Learning Attitude Instrument Used in Iranian EFL Context: An Evidence-Based Approach

    Science.gov (United States)

    Aryadoust, Vahid; Mehran, Parisa; Alizadeh, Mehrasa

    2016-01-01

    A few computer-assisted language learning (CALL) instruments have been developed in Iran to measure EFL (English as a foreign language) learners' attitude toward CALL. However, these instruments have no solid validity argument and accordingly would be unable to provide a reliable measurement of attitude. The present study aimed to develop a CALL…

  11. The hydrodynamic basis of the vacuum cleaner effect in continuous-flow PCNL instruments: an empiric approach and mathematical model.

    Science.gov (United States)

    Mager, R; Balzereit, C; Gust, K; Hüsch, T; Herrmann, T; Nagele, U; Haferkamp, A; Schilling, D

    2016-05-01

    Passive removal of stone fragments in the irrigation stream is one of the characteristics of continuous-flow PCNL instruments. The physical principle of this so-called vacuum cleaner effect has not yet been fully understood. The aim of the study was to empirically prove the existence of the vacuum cleaner effect, to develop a physical hypothesis and to generate a mathematical model for this phenomenon. In an empiric approach, common low-pressure PCNL instruments and conventional PCNL sheaths were tested using an in vitro model. Flow characteristics were visualized by coloring of the irrigation fluid. The influence of irrigation pressure, sheath diameter, sheath design, nephroscope design and position of the nephroscope was assessed. Experiments were digitally recorded for slow-motion analysis to deduce a physical model. In each tested nephroscope design, we could observe the vacuum cleaner effect. Increasing the irrigation pressure and reducing the sheath cross-section enhanced the effect. Slow-motion analysis of the colored flow revealed a synergism of two effects causing suction and transportation of the stone. For the first time, our model showed a flow reversal in the sheath as an integral part of the origin of stone transportation during the vacuum cleaner effect. The application of Bernoulli's equation explained these effects and confirmed our experimental results. We broaden the understanding of PCNL with a conclusive physical model, which explains the fluid mechanics of the vacuum cleaner effect.
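The pressure drop behind the suction can be illustrated with the continuity equation and Bernoulli's equation along a streamline: narrowing the flow cross-section raises velocity and lowers static pressure. All numbers below are hypothetical, chosen only to show the mechanism:

```python
import math

rho = 1000.0                          # irrigation fluid density (water), kg/m^3
Q = 5e-6                              # volumetric flow rate, m^3/s (hypothetical)
A1 = math.pi * 0.004 ** 2             # wide sheath cross-section (r = 4 mm)
A2 = A1 / 4                           # narrowed cross-section around the scope

v1 = Q / A1                           # continuity: A1*v1 = A2*v2
v2 = Q / A2

# Bernoulli along a streamline: p1 + 0.5*rho*v1^2 = p2 + 0.5*rho*v2^2
dp = 0.5 * rho * (v2 ** 2 - v1 ** 2)  # static pressure drop in the narrow section
print(round(v2, 3), round(dp, 1))     # higher velocity, positive pressure drop
```

The lower static pressure in the narrowed section pulls fluid (and entrained fragments) toward it, which is the Venturi-type mechanism consistent with the flow reversal the study describes.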

  12. Post-implementation review of inadequate core cooling instrumentation

    International Nuclear Information System (INIS)

    Anderson, J.L.; Anderson, R.L.; Hagen, E.W.; Morelock, T.C.; Huang, T.L.; Phillips, L.E.

    1988-01-01

    Studies of the Three Mile Island (TMI) accident identified the need for additional instrumentation to detect inadequate core cooling (ICC) in nuclear power plants. Industry studies by plant owners and reactor vendors supported the conclusion that improvements were needed to help operators diagnose the approach to, or existence of, ICC and to provide more complete information for operator control of safety injection flow to minimize the consequences of such an accident. In 1980, the US Nuclear Regulatory Commission (NRC) required further studies by the industry and described ICC instrumentation design requirements that included human factors and environmental considerations. On December 10, 1982, the NRC issued Orders for Modification of License to Babcock and Wilcox (B&W) licensees and transmitted Generic Letter 82-28 to all pressurized water reactor (PWR) licensees to inform them of the revised NRC requirements. The instrumentation requirements for detection of ICC include upgraded subcooling margin monitors (SMMs), upgraded core exit thermocouples (CETs), and installation of a reactor coolant inventory tracking system (RCITS).

  13. Systemic Approach – a Complexity Management Instrument

    Directory of Open Access Journals (Sweden)

    Vadim Dumitrascu

    2006-02-01

    Full Text Available The systemic principle uses deduction and induction, analysis and synthesis, inference and proference, in order to uncover the interdependencies and inner connections that drive complex organized entities. The true strengths of this approach lie neither in the simplistic "input-output" models, nor in the "circular" models that fill Economics and Management handbooks and consecrate another kind of formalism, but in the constructivist-reflexive strategies used to explain economic and social structures.

  14. Emotion rendering in music: range and characteristic values of seven musical variables.

    Science.gov (United States)

    Bresin, Roberto; Friberg, Anders

    2011-10-01

    Many studies on the synthesis of emotional expression in music performance have focused on the effect of individual performance variables on perceived emotional quality by systematically varying those variables. However, most of the studies have used a predetermined small number of levels for each variable, and the selection of these levels has often been done arbitrarily. The main aim of this research work is to improve upon existing methodologies by taking a synthesis approach. In a production experiment, 20 performers were asked to manipulate the values of 7 musical variables simultaneously (tempo, sound level, articulation, phrasing, register, timbre, and attack speed) to communicate 5 different emotional expressions (neutral, happy, scary, peaceful, sad) for each of 4 scores. The scores were compositions communicating four different emotions (happiness, sadness, fear, calmness). Emotional expressions and music scores were presented in combination and in random order for each performer, for a total of 5 × 4 stimuli. The experiment allowed for a systematic investigation of the interaction between the emotion of each score and the emotions performers intended to express. A two-way repeated-measures analysis of variance (ANOVA) with factors emotion and score was conducted on the participants' values separately for each of the seven musical variables. There are two main results. The first is that the musical variables were manipulated in the same direction as reported in previous research on emotionally expressive music performance. The second is the identification, for each of the five emotions, of the mean values and ranges of the five musical variables tempo, sound level, articulation, register, and instrument. These values proved to be independent of the particular score and its emotion.
The results presented in this study therefore allow for both the design and control of emotionally expressive computerized musical stimuli that are more ecologically valid than

  15. Consumer's risk in the EMA and FDA regulatory approaches for bioequivalence in highly variable drugs.

    Science.gov (United States)

    Muñoz, Joel; Alcaide, Daniel; Ocaña, Jordi

    2016-05-30

    The 2010 US Food and Drug Administration and European Medicines Agency regulatory approaches to establishing bioequivalence in highly variable drugs are both based on linearly scaling the bioequivalence limits; both take a 'scaled average bioequivalence' approach. The present paper corroborates previous work suggesting that neither of them adequately controls the type I error, or consumer's risk, so they result in invalid test procedures in the neighbourhood of a within-subject coefficient of variation of 30% for the reference (R) formulation. The problem is particularly serious in the US Food and Drug Administration regulation, but it is also appreciable in the European Medicines Agency one. For the partially replicated TRR/RTR/RRT and the replicated TRTR/RTRT crossover designs, we quantify these type I error problems by means of a simulation study, discuss their possible causes and propose straightforward improvements to both regulatory procedures that improve their type I error control while maintaining adequate power. Copyright © 2015 John Wiley & Sons, Ltd.
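For context, the EMA-style linear scaling of the bioequivalence limits can be sketched as follows. The regulatory constant 0.760 and the 50% CV cap are the published EMA values; the function itself is a simplified illustration, not a full implementation of either agency's procedure:

```python
import math

def ema_scaled_limits(cv_wr):
    """EMA-style scaled average bioequivalence limits (a sketch).
    Up to a 30% within-subject CV the standard 80.00-125.00% limits apply;
    above it the limits widen as exp(+/- 0.760 * s_WR), with the widening
    capped at CV = 50%."""
    cv = min(cv_wr, 0.50)                       # regulatory cap on scaling
    s_wr = math.sqrt(math.log(cv ** 2 + 1))     # CV -> within-subject SD (log scale)
    if cv_wr <= 0.30:
        return (0.80, 1.25)
    return (math.exp(-0.760 * s_wr), math.exp(0.760 * s_wr))

print(ema_scaled_limits(0.30))                        # (0.8, 1.25)
print(tuple(round(v, 4) for v in ema_scaled_limits(0.50)))
```

At the 50% CV cap the limits reach roughly 69.84-143.19%; the type I error inflation the paper studies arises precisely around the 30% CV switch point, where the applicable limits change with the estimated variability.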

  16. Communication: Wigner functions in action-angle variables, Bohr-Sommerfeld quantization, the Heisenberg correspondence principle, and a symmetrical quasi-classical approach to the full electronic density matrix

    International Nuclear Information System (INIS)

    Miller, William H.; Cotton, Stephen J.

    2016-01-01

    It is pointed out that the classical phase space distribution in action-angle (a-a) variables obtained from a Wigner function depends on how the calculation is carried out: if one computes the standard Wigner function in Cartesian variables (p, x), and then replaces p and x by their expressions in terms of a-a variables, one obtains a different result than if the Wigner function is computed directly in terms of the a-a variables. Furthermore, the latter procedure gives a result more consistent with classical and semiclassical theory—e.g., by incorporating the Bohr-Sommerfeld quantization condition (quantum states defined by integer values of the action variable) as well as the Heisenberg correspondence principle for matrix elements of an operator between such states—and has also been shown to be more accurate when applied to electronically non-adiabatic applications as implemented within the recently developed symmetrical quasi-classical (SQC) Meyer-Miller (MM) approach. Moreover, use of the Wigner function (obtained directly) in a-a variables shows how our standard SQC/MM approach can be used to obtain off-diagonal elements of the electronic density matrix by processing in a different way the same set of trajectories already used (in the SQC/MM methodology) to obtain the diagonal elements.
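For reference, the two textbook definitions at play in this comparison are the standard Cartesian Wigner function and the Bohr-Sommerfeld quantization condition (standard results, reproduced here for the reader rather than taken from this record):

```latex
% Standard Cartesian Wigner function of a wavefunction \psi
W(x,p) = \frac{1}{\pi\hbar} \int_{-\infty}^{\infty}
         \psi^{*}(x+y)\,\psi(x-y)\,e^{2ipy/\hbar}\,dy

% Bohr--Sommerfeld quantization: quantum states at integer values
% of the action variable n
\oint p\,dx = 2\pi\hbar\left(n + \tfrac{1}{2}\right), \qquad n = 0, 1, 2, \ldots
```

The abstract's point is that substituting action-angle expressions for (x, p) into W computed this way differs from evaluating the Wigner transform directly in action-angle variables, and only the latter builds in the quantization condition above.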

  17. Communication: Wigner functions in action-angle variables, Bohr-Sommerfeld quantization, the Heisenberg correspondence principle, and a symmetrical quasi-classical approach to the full electronic density matrix.

    Science.gov (United States)

    Miller, William H; Cotton, Stephen J

    2016-08-28

    It is pointed out that the classical phase space distribution in action-angle (a-a) variables obtained from a Wigner function depends on how the calculation is carried out: if one computes the standard Wigner function in Cartesian variables (p, x), and then replaces p and x by their expressions in terms of a-a variables, one obtains a different result than if the Wigner function is computed directly in terms of the a-a variables. Furthermore, the latter procedure gives a result more consistent with classical and semiclassical theory—e.g., by incorporating the Bohr-Sommerfeld quantization condition (quantum states defined by integer values of the action variable) as well as the Heisenberg correspondence principle for matrix elements of an operator between such states—and has also been shown to be more accurate when applied to electronically non-adiabatic applications as implemented within the recently developed symmetrical quasi-classical (SQC) Meyer-Miller (MM) approach. Moreover, use of the Wigner function (obtained directly) in a-a variables shows how our standard SQC/MM approach can be used to obtain off-diagonal elements of the electronic density matrix by processing in a different way the same set of trajectories already used (in the SQC/MM methodology) to obtain the diagonal elements.

  18. Communication: Wigner functions in action-angle variables, Bohr-Sommerfeld quantization, the Heisenberg correspondence principle, and a symmetrical quasi-classical approach to the full electronic density matrix

    Energy Technology Data Exchange (ETDEWEB)

    Miller, William H., E-mail: millerwh@berkeley.edu; Cotton, Stephen J., E-mail: StephenJCotton47@gmail.com [Department of Chemistry and Kenneth S. Pitzer Center for Theoretical Chemistry, University of California, and Chemical Sciences Division, Lawrence Berkeley National Laboratory, Berkeley, California 94720 (United States)

    2016-08-28

    It is pointed out that the classical phase space distribution in action-angle (a-a) variables obtained from a Wigner function depends on how the calculation is carried out: if one computes the standard Wigner function in Cartesian variables (p, x), and then replaces p and x by their expressions in terms of a-a variables, one obtains a different result than if the Wigner function is computed directly in terms of the a-a variables. Furthermore, the latter procedure gives a result more consistent with classical and semiclassical theory—e.g., by incorporating the Bohr-Sommerfeld quantization condition (quantum states defined by integer values of the action variable) as well as the Heisenberg correspondence principle for matrix elements of an operator between such states—and has also been shown to be more accurate when applied to electronically non-adiabatic applications as implemented within the recently developed symmetrical quasi-classical (SQC) Meyer-Miller (MM) approach. Moreover, use of the Wigner function (obtained directly) in a-a variables shows how our standard SQC/MM approach can be used to obtain off-diagonal elements of the electronic density matrix by processing in a different way the same set of trajectories already used (in the SQC/MM methodology) to obtain the diagonal elements.

  19. Drugs as instruments: Describing and testing a behavioral approach to the study of neuroenhancement

    Directory of Open Access Journals (Sweden)

    Ralf Brand

    2016-08-01

    Full Text Available Neuroenhancement (NE) is the non-medical use of psychoactive substances to produce a subjective enhancement in psychological functioning and experience. So far, however, empirical investigations of individuals' motivation for NE have been hampered by the lack of a theoretical foundation. This study aimed to apply drug instrumentalization theory to user motivation for NE. We argue that NE should be defined and analyzed from a behavioral perspective rather than in terms of the characteristics of substances used for NE. In the empirical study we explored user behavior by analyzing relationships between drug options (use of over-the-counter products, prescription drugs, or illicit drugs) and postulated drug instrumentalization goals (e.g. improved cognitive performance, counteracting fatigue, improved social interaction). Questionnaire data from 1,438 university students were subjected to exploratory and confirmatory factor analysis to address the question of whether analysis of drug instrumentalization should be based on the assumption that users are aiming to achieve a certain goal and choose their drug accordingly, or whether NE behavior is more strongly rooted in a decision to try or use a certain drug option. We used factor mixture modeling to explore whether users could be separated into qualitatively different groups defined by a shared 'goal × drug option' configuration. Our results indicate, first, that individuals' decisions about NE are eventually based on personal attitude to drug options (e.g. willingness to use an over-the-counter product but not to abuse prescription drugs) rather than motivated by the desire to achieve a specific goal (e.g. fighting tiredness) for which different drug options might be tried. Second, the data analyses suggested two qualitatively different classes of users. Both predominantly used over-the-counter products, but 'neuroenhancers' might be characterized by a higher propensity to instrumentalize over

  20. Design and Implementation Content Validity Study: Development of an instrument for measuring Patient-Centered Communication

    Directory of Open Access Journals (Sweden)

    Vahid Zamanzadeh

    2015-06-01

    Full Text Available Introduction: The importance of content validity in instrument psychometrics, and its relevance to reliability, have made it an essential step in instrument development. This article attempts to give an overview of the content validity process and to explain the complexity of this process by introducing an example. Methods: We carried out a methodological study to examine the content validity of a patient-centered communication instrument through a two-step process (development and judgment). The first step comprised domain determination, sampling (item generation) and instrument formation; in the second step, the content validity ratio, content validity index and modified kappa statistic were computed. Suggestions of the expert panel and item impact scores were used to examine the instrument's face validity. Results: From a set of 188 items, the content validity process identified seven dimensions: trust building (eight items), informational support (seven items), emotional support (five items), problem solving (seven items), patient activation (10 items), intimacy/friendship (six items) and spirituality strengthening (14 items). The content validity study revealed that this instrument enjoys an appropriate level of content validity. The overall content validity index of the instrument using the universal agreement approach was low; however, the instrument can be advocated with respect to the high number of content experts, which makes consensus difficult, and the high value of the S-CVI with the average approach, which was equal to 0.93. Conclusion: This article illustrates acceptable quantitative indices of content validity for a new instrument and outlines their use during the design and psychometric evaluation of a patient-centered communication measuring instrument.
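The quantitative indices named above (CVR, I-CVI, S-CVI/Ave) have simple standard formulas; a sketch with a hypothetical expert panel (the ratings and item counts below are illustrative, not the study's data):

```python
def content_validity_ratio(n_essential, n_experts):
    """Lawshe's CVR = (n_e - N/2) / (N/2), where n_e of N experts rate the
    item 'essential'. Ranges from -1 to +1."""
    return (n_essential - n_experts / 2) / (n_experts / 2)

def item_cvi(relevance_ratings, cutoff=3):
    """I-CVI: proportion of experts rating the item 3 or 4 on a 4-point
    relevance scale."""
    return sum(r >= cutoff for r in relevance_ratings) / len(relevance_ratings)

# Hypothetical panel of 10 experts rating one item.
ratings = [4, 4, 3, 4, 3, 4, 4, 2, 4, 3]
icvi = item_cvi(ratings)

# S-CVI/Ave: average of the I-CVIs over all items (here, three items).
icvis = [icvi, 0.9, 1.0]
s_cvi_ave = sum(icvis) / len(icvis)
print(content_validity_ratio(8, 10), icvi, round(s_cvi_ave, 2))
```

The universal agreement variant (S-CVI/UA) instead counts only items every expert rates relevant, which is why it falls quickly as the panel grows, matching the low UA value reported above.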

  1. Machine learning search for variable stars

    Science.gov (United States)

    Pashchenko, Ilya N.; Sokolovsky, Kirill V.; Gavras, Panagiotis

    2018-04-01

    Photometric variability detection is often considered as a hypothesis testing problem: an object is variable if the null hypothesis that its brightness is constant can be ruled out given the measurements and their uncertainties. The practical applicability of this approach is limited by uncorrected systematic errors. We propose a new variability detection technique sensitive to a wide range of variability types while being robust to outliers and underestimated measurement uncertainties. We consider variability detection as a classification problem that can be approached with machine learning. Logistic Regression (LR), Support Vector Machines (SVM), k Nearest Neighbours (kNN), Neural Nets (NN), Random Forests (RF), and Stochastic Gradient Boosting classifier (SGB) are applied to 18 features (variability indices) quantifying scatter and/or correlation between points in a light curve. We use a subset of Optical Gravitational Lensing Experiment phase two (OGLE-II) Large Magellanic Cloud (LMC) photometry (30 265 light curves) that was searched for variability using traditional methods (168 known variable objects) as the training set and then apply the NN to a new test set of 31 798 OGLE-II LMC light curves. Among 205 candidates selected in the test set, 178 are real variables, while 13 low-amplitude variables are new discoveries. The machine learning classifiers considered are found to be more efficient (select more variables and fewer false candidates) compared to traditional techniques using individual variability indices or their linear combination. The NN, SGB, SVM, and RF show a higher efficiency compared to LR and kNN.
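The classification framing can be sketched end-to-end in a few lines: compute variability indices for simulated light curves and separate constant from variable objects. The two indices and the nearest-centroid classifier below are simplified stand-ins for the paper's 18 features and six classifiers:

```python
import numpy as np

rng = np.random.default_rng(4)

def variability_indices(mag):
    """Two simple variability indices: overall scatter, and lag-1
    autocorrelation (correlated deviations hint at real variability)."""
    d = mag - mag.mean()
    scatter = d.std()
    acf1 = (d[:-1] * d[1:]).sum() / (d ** 2).sum()
    return scatter, acf1

def simulate(n, variable):
    """Hypothetical light curve: white noise, optionally plus a sinusoid."""
    noise = rng.normal(scale=0.05, size=n)
    if variable:
        phase = rng.uniform(0, 2 * np.pi)
        return 0.2 * np.sin(np.linspace(0, 6 * np.pi, n) + phase) + noise
    return noise

# Training set: indices for labelled constant/variable light curves.
X = np.array([variability_indices(simulate(100, v))
              for v in [False] * 50 + [True] * 50])
y = np.array([0] * 50 + [1] * 50)

# Minimal nearest-centroid classifier in index space.
centroids = np.array([X[y == c].mean(axis=0) for c in (0, 1)])
test = np.array([variability_indices(simulate(100, True)) for _ in range(20)])
pred = np.argmin(((test[:, None, :] - centroids) ** 2).sum(-1), axis=1)
print(pred.mean())   # fraction of variable test curves flagged as variable
```

The paper's gain over threshold-based selection comes from letting a classifier combine many such indices nonlinearly, rather than cutting on any single one.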

  2. Does social trust increase willingness to pay taxes to improve public healthcare? Cross-sectional cross-country instrumental variable analysis.

    Science.gov (United States)

    Habibov, Nazim; Cheung, Alex; Auchynnikava, Alena

    2017-09-01

    The purpose of this paper is to investigate the effect of social trust on the willingness to pay more taxes to improve public healthcare in post-communist countries. The well-documented association between higher levels of social trust and better health has traditionally been assumed to reflect the notion that social trust is positively associated with support for the public healthcare system through its encouragement of cooperative behaviour, social cohesion, social solidarity, and collective action. Hence, in this paper, we have explicitly tested the notion that social trust contributes to an increased willingness to financially support public healthcare. We use micro data from the 2010 Life-in-Transition survey (N = 29,526). Classic binomial probit and instrumental-variable ivprobit regressions are estimated to model the relationship between social trust and paying more taxes to improve public healthcare. We found that an increase in social trust is associated with a greater willingness to pay more taxes to improve public healthcare. From the perspective of policy-making, healthcare administrators, policy-makers, and international donors should be aware that social trust is an important factor in determining the willingness of the population to provide much-needed financial resources to support public healthcare. From a theoretical perspective, we found that estimating the effect of trust on support for healthcare without taking confounding and measurement error problems into consideration will likely lead to an underestimation of the true effect of trust. Copyright © 2017 Elsevier Ltd. All rights reserved.
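The logic of the instrumental-variable correction can be sketched with a linear two-stage least squares toy model (the study itself uses ivprobit for a binary outcome): an unobserved confounder biases the naive regression, while an instrument that moves trust but affects the outcome only through trust recovers the true effect. All coefficients below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical data: trust is endogenous (shares confounder u with the
# outcome); z is a valid instrument. True causal effect of trust = 1.0.
n = 100_000
u = rng.normal(size=n)                 # unobserved confounder
z = rng.normal(size=n)                 # instrument
trust = 0.8 * z + u + rng.normal(size=n)
outcome = 1.0 * trust - 1.5 * u + rng.normal(size=n)

def ols(X, y):
    return np.linalg.lstsq(X, y, rcond=None)[0]

X = np.column_stack([np.ones(n), trust])
naive = ols(X, outcome)[1]             # biased toward zero by the confounder

# Two-stage least squares: first stage predicts trust from the instrument,
# second stage regresses the outcome on the fitted values.
Z = np.column_stack([np.ones(n), z])
trust_hat = Z @ ols(Z, trust)
iv = ols(np.column_stack([np.ones(n), trust_hat]), outcome)[1]

print(round(naive, 2), round(iv, 2))   # naive is biased low; IV is near 1.0
```

This is exactly the underestimation pattern the abstract reports: the confounded estimate attenuates the effect of trust, and instrumenting restores it.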

  3. Evaluation of energy efficiency policy instruments effectiveness : case study Croatia

    International Nuclear Information System (INIS)

    Bukarica, V.

    2007-01-01

    This paper proposed a theoretical basis for evaluating energy efficiency policy in the Republic of Croatia and corroborated it with an analysis of energy efficiency market development and transformation. The current status of the market was evaluated and policy instruments were adapted to achieve optimal results. In particular, the energy efficiency market in Croatia was discussed in terms of the micro- and macro-environment factors that influence policy-making processes and the choice of policy instruments. The macro environment for the energy efficiency market in Croatia is the European Union pre-integration process, with all related national and international legislation, political and economic factors and the potential to use financial funds. The micro environment consists of government institutions, local financing institutions and a range of market players on the supply and demand sides. Energy efficiency is one of the most powerful and cost-effective ways of achieving the goals of sustainable development, and policy instruments developed to improve energy efficiency are oriented towards a cleaner environment, a better standard of living, more competitive industry and improved security of energy supply. Energy efficiency is, however, hard to implement and requires policy interventions. In response to recent trends in the energy sector, such as deregulation and open competition, policy measures aimed at improving energy efficiency should shift from an end-user-oriented approach towards a whole-market approach. The optimal policy instrument mix should be designed to meet defined targets; however, market dynamics must be taken into consideration. 9 refs., 4 figs

  4. Improved sample size determination for attributes and variables sampling

    International Nuclear Information System (INIS)

    Stirpe, D.; Picard, R.R.

    1985-01-01

    Earlier INMM papers have addressed the attributes/variables problem and, under conservative/limiting approximations, have reported analytical solutions for the attributes and variables sample sizes. Through computer simulation of this problem, we have calculated attributes and variables sample sizes as a function of falsification, measurement uncertainties, and required detection probability without using approximations. Using realistic assumptions for the measurement uncertainty parameters, the simulation results support two conclusions: (1) previously used conservative approximations can be expensive because they lead to larger sample sizes than needed; and (2) the optimal verification strategy, as well as the falsification strategy, is highly dependent on the underlying uncertainty parameters of the measurement instruments. 1 ref., 3 figs
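    The attributes side of this problem can be computed exactly rather than by approximation: for a population of N items of which d are falsified, the probability that a random sample of n items contains at least one falsified item is hypergeometric, and the required sample size is the smallest n meeting the target detection probability. A sketch with illustrative numbers (not taken from the paper):

```python
from math import comb

def detection_prob(N, d, n):
    """P(at least one falsified item appears in a sample of n drawn
    without replacement from N items, d of which are falsified)."""
    if n > N - d:
        return 1.0  # sample is too large to avoid every falsified item
    return 1.0 - comb(N - d, n) / comb(N, n)

def min_sample_size(N, d, target=0.95):
    """Smallest sample size n achieving the target detection probability."""
    for n in range(1, N + 1):
        if detection_prob(N, d, n) >= target:
            return n
    return N

# Example: 1000 items, 20 falsified, 95 % required detection probability.
print(min_sample_size(1000, 20))
```

The conservative analytical approximations the paper mentions replace the hypergeometric ratio with simpler bounds, which is exactly what inflates the sample size.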

  5. Instrumentation

    International Nuclear Information System (INIS)

    Decreton, M.

    2000-01-01

    SCK-CEN's research and development programme on instrumentation aims at evaluating the potentials of new instrumentation technologies under the severe constraints of a nuclear application. It focuses on the tolerance of sensors to high radiation doses, including optical fibre sensors, and on the related intelligent data processing needed to cope with the nuclear constraints. Main achievements in these domains in 1999 are summarised.

  6. Localification of variable-basis topological systems | Solovyov ...

    African Journals Online (AJOL)

    The paper provides another approach to the notion of variable-basis topological system generalizing the fixed-basis concept of S. Vickers, considers functorial relationships between the categories of modified variable-basis topological systems and variable-basis fuzzy topological spaces in the sense of S.E. Rodabaugh ...

  7. Small Scale Variability and the Problem of Data Validation

    Science.gov (United States)

    Sparling, L. C.; Avallone, L.; Einaudi, Franco (Technical Monitor)

    2000-01-01

    Numerous measurements taken with a variety of airborne, balloon borne and ground based instruments over the past decade have revealed a complex multiscaled 3D structure in both chemical and dynamical fields in the upper troposphere/lower stratosphere. The variability occurs on scales that are well below the resolution of satellite measurements, leading to problems in measurement validation. We discuss some statistical ideas that can shed some light on the contribution of the natural variability to the inevitable differences in correlative measurements that are not strictly colocated, or that have different spatial resolution.

  8. Remote Access to Instrumental Analysis for Distance Education in Science

    Directory of Open Access Journals (Sweden)

    Dietmar Kennepohl

    2005-11-01

    Remote access to experiments offers distance educators another tool to integrate a strong laboratory component within a science course. Since virtually all modern chemical instrumental analysis in industry now uses devices operated by a computer interface, remote control of instrumentation is not only relatively facile, it enhances students' opportunity to learn the subject matter and be exposed to "real world" content. Northern Alberta Institute of Technology (NAIT) and Athabasca University are developing teaching laboratories based on the control of analytical instruments in real-time via an Internet connection. Students perform real-time analysis using equipment, methods, and skills that are common to modern analytical laboratories (or sophisticated teaching laboratories). Students obtain real results using real substances to arrive at real conclusions, just as they would in a physical laboratory with the equipment; this approach allows students to conduct instrumental science experiments at a distance, providing an advantageous route to upgrade their laboratory skills while learning remotely.

  9. Importance of the macroeconomic variables for variance prediction: A GARCH-MIDAS approach

    DEFF Research Database (Denmark)

    Asgharian, Hossein; Hou, Ai Jun; Javed, Farrukh

    2013-01-01

    This paper aims to examine the role of macroeconomic variables in forecasting the return volatility of the US stock market. We apply the GARCH-MIDAS (Mixed Data Sampling) model to examine whether information contained in macroeconomic variables can help to predict short-term and long-term components of the return variance.
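    The MIDAS part of such a model aggregates low-frequency macroeconomic observations into the long-term variance component through a flexible lag polynomial; a common choice is beta-polynomial weights. A minimal sketch (the parameter values are illustrative, not estimates from the paper):

```python
import numpy as np

def beta_weights(K, theta1=1.0, theta2=5.0):
    """Normalized beta-polynomial lag weights, as used in MIDAS models to
    aggregate K lagged low-frequency (e.g. monthly macro) observations.
    theta2 > 1 puts more weight on the most recent lags."""
    k = np.arange(1, K + 1) / K
    w = k ** (theta1 - 1) * (1 - k) ** (theta2 - 1)
    return w / w.sum()

# Long-term variance component: weighted sum of 12 lagged macro observations.
w = beta_weights(12)
macro_lags = np.linspace(0.8, 1.2, 12)  # hypothetical lagged values
long_term_component = w @ macro_lags
```

The GARCH part then scales a short-term (daily) GARCH process by this slowly moving long-term component.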

  10. Performing the Super Instrument

    DEFF Research Database (Denmark)

    Kallionpaa, Maria

    2016-01-01

    The genre of contemporary classical music has seen significant innovation and research related to new super, hyper, and hybrid instruments, which opens up a vast palette of expressive potential. An increasing number of composers, performers, instrument designers, engineers, and computer programmers have become interested in different ways of "supersizing" acoustic instruments in order to open up previously-unheard instrumental sounds. Super instruments vary a great deal, but each has a transformative effect on the identity and performance practice of the performing musician. Furthermore, composers can empower performers by producing super instrument works that allow the concert instrument to become an ensemble controlled by a single player; the existing instrumental skills of the performer can be multiplied and the qualities of regular acoustic instruments extended or modified.

  11. Stopped-pipe wind instruments: Acoustics of the panpipes

    Science.gov (United States)

    Fletcher, N. H.

    2005-01-01

    Stopped-pipe jet-excited musical instruments are known in many cultures, those best-known today being the panpipes or syrinx of Eastern Europe and of the Peruvian Andes. Although the playing style differs, in each case the instrument consists of a set of graduated bamboo pipes excited by blowing across the open tops. Details of the excitation aerodynamics warrant examination, particularly as the higher notes contain even-harmonic amplitudes approaching those of the odd harmonics expected from a stopped pipe. Analysis shows that the jet offset is controlled by the fluid dynamics of the jet, and is such that appreciable even-harmonic excitation is generated. The theory is largely confirmed by measurements on a player.
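    The odd-harmonic expectation for a stopped pipe follows from the quarter-wavelength resonance condition: an ideal pipe closed at one end resonates at f_n = (2n − 1)c / 4L. A small sketch (ideal pipe, no end correction or bore effects):

```python
def stopped_pipe_modes(length_m, n_modes=4, c=343.0):
    """Resonance frequencies (Hz) of an ideal stopped pipe of length L:
    odd harmonics only, f_n = (2n - 1) * c / (4 * L)."""
    return [(2 * n - 1) * c / (4.0 * length_m) for n in range(1, n_modes + 1)]

# A 0.343 m pipe: fundamental near 250 Hz, then ~750, ~1250, ~1750 Hz.
print(stopped_pipe_modes(0.343))
```

The paper's point is precisely that real panpipe spectra deviate from this ideal odd-only series because of the jet aerodynamics at the open top.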

  12. Managing the Complexity of Human/Machine Interactions in Computerized Learning Environments: Guiding Students' Command Process through Instrumental Orchestrations

    Science.gov (United States)

    Trouche, Luc

    2004-01-01

    After an introduction which addresses some basic questions, this article is organized around three points: (1) The theoretical framework of the so-called "instrumental approach" which has been a theme in the last two CAME symposia; (2) A consideration of two processes ("instrumentalization" and "instrumentation") which interact in the…

  13. A Radar/Radiometer Instrument for Mapping Soil Moisture and Ocean Salinity

    Science.gov (United States)

    Hildebrand, Peter H.; Hilliard, Laurence; Rincon, Rafael; LeVine, David; Mead, James

    2003-01-01

    The RadSTAR instrument combines an L-band, digital beam-forming radar with an L-band synthetic aperture, thinned array (STAR) radiometer. The RadSTAR development will support NASA Earth science goals by developing a novel L-band scatterometer/radiometer that measures Earth surface bulk material properties (surface emissions and backscatter) as well as surface characteristics (backscatter). Present real-aperture airborne L-band active/passive measurement systems such as the JPUPALS (Wilson et al., 2000) provide excellent sampling characteristics, but have no scanning capabilities and are extremely large; the huge JPUPALS horn requires the C-130 airborne platform, operated with the aft loading door open during flight. The approaches used for the upcoming Aquarius ocean salinity mission and the proposed Hydros soil moisture mission use real apertures with multiple fixed or scanning beams. For real-aperture instruments, there is no upgrade path to scanning over a broad swath except rotation of the whole aperture, an approach with obvious difficulties as aperture size increases. RadSTAR will provide polarimetric scatterometer and radiometer measurements over a wide swath in a highly space-efficient configuration. The electronic scanning approaches provided through STAR technology and digital beam forming will enable the large L-band aperture to scan efficiently over a very wide swath. RadSTAR technology development, which merges an interferometric radiometer with a digital beam-forming scatterometer, is an important step on the path to space for an L-band scatterometer/radiometer. RadSTAR couples a patch array antenna with a 1.26 GHz digital beam-forming radar scatterometer and a 1.4 GHz STAR radiometer to provide Earth surface backscatter and emission measurements in a compact, cross-track scanning instrument with no moving parts. This technology will provide the first L-band emission and backscatter measurements in a compact aircraft instrument.

  14. Analysis of key technologies for virtual instruments metrology

    Science.gov (United States)

    Liu, Guixiong; Xu, Qingui; Gao, Furong; Guan, Qiuju; Fang, Qiang

    2008-12-01

    Virtual instruments (VIs) require metrological verification when applied as measuring instruments. Owing to the software-centered architecture, metrological evaluation of VIs includes two aspects: measurement functions and software characteristics. The complexity of software imposes difficulties on metrological testing of VIs. Key approaches and technologies for metrological evaluation of virtual instruments are investigated and analyzed in this paper. The principal issue is evaluation of measurement uncertainty. The nature and regularity of measurement uncertainty caused by software and algorithms can be evaluated by modeling, simulation, analysis, testing and statistics, with the support of the powerful computing capability of the PC. Another concern is evaluation of software features such as correctness, reliability, stability, security, and real-time behaviour of VIs. Technologies from the software engineering, software testing and computer security domains can be used for these purposes. For example, a variety of black-box testing, white-box testing and modeling approaches can be used to evaluate the reliability of modules, components, applications and the whole VI software. The security of a VI can be assessed by methods like vulnerability scanning and penetration analysis. To enable metrology institutions to perform metrological verification of VIs efficiently, an automatic tool for the above validation is essential. Based on technologies of numerical simulation, software testing and system benchmarking, a framework for the automatic tool is proposed in this paper. Investigation of existing automatic tools that perform calculation of measurement uncertainty, software testing and security assessment demonstrates the feasibility of the proposed framework.
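    The simulation-based uncertainty evaluation the paper mentions can be sketched in the style of GUM Supplement 1: propagate input probability distributions through the measurement model by Monte Carlo and summarize the output distribution. The measurement model and uncertainty values below are hypothetical, chosen only to illustrate the technique:

```python
import numpy as np

rng = np.random.default_rng(1)
M = 200_000  # number of Monte Carlo trials

# Hypothetical VI measurement model: power P = V^2 / R, where the digitized
# voltage V and the reference resistance R are the uncertain inputs.
V = rng.normal(10.0, 0.02, M)   # volts, standard uncertainty 0.02 V
R = rng.normal(50.0, 0.10, M)   # ohms,  standard uncertainty 0.10 ohm

P = V ** 2 / R
p_mean = P.mean()                       # best estimate of the measurand
p_u = P.std(ddof=1)                     # standard uncertainty of the result
lo, hi = np.percentile(P, [2.5, 97.5])  # 95 % coverage interval

print(f"P = {p_mean:.4f} W, u(P) = {p_u:.4f} W, 95 % CI [{lo:.4f}, {hi:.4f}]")
```

Unlike the analytic law of propagation of uncertainty, this approach needs no linearization, which is why it suits software- and algorithm-induced uncertainty.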

  15. A multivariate and stochastic approach to identify key variables to rank dairy farms on profitability.

    Science.gov (United States)

    Atzori, A S; Tedeschi, L O; Cannas, A

    2013-05-01

    The economic efficiency of dairy farms is the main goal of farmers. The objective of this work was to use routinely available information at the dairy farm level to develop an index of profitability to rank dairy farms and to assist the decision-making process of farmers to increase the economic efficiency of the entire system. A stochastic modeling approach was used to study the relationships between inputs and profitability (i.e., income over feed cost; IOFC) of dairy cattle farms. The IOFC was calculated as: milk revenue + value of male calves + culling revenue - herd feed costs. Two databases were created. The first one was a development database, which was created from technical and economic variables collected in 135 dairy farms. The second one was a synthetic database (sDB) created from 5,000 synthetic dairy farms using the Monte Carlo technique and based on the characteristics of the development database data. The sDB was used to develop a ranking index as follows: (1) principal component analysis (PCA), excluding IOFC, was used to identify principal components (sPC); and (2) coefficient estimates of a multiple regression of the IOFC on the sPC were obtained. Then, the eigenvectors of the sPC were used to compute the principal component values for the original 135 dairy farms that were used with the multiple regression coefficient estimates to predict IOFC (dRI; ranking index from development database). The dRI was used to rank the original 135 dairy farms. The PCA explained 77.6% of the sDB variability and 4 sPC were selected. The sPC were associated with herd profile, milk quality and payment, poor management, and reproduction based on the significant variables of the sPC. The mean IOFC in the sDB was 0.1377 ± 0.0162 euros per liter of milk (€/L). The dRI explained 81% of the variability of the IOFC calculated for the 135 original farms. When the number of farms below and above 1 standard deviation (SD) of the dRI were calculated, we found that 21
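    Steps (1) and (2) of the ranking-index construction can be sketched with synthetic data; the data-generating process, the variable count, and the use of 4 retained components are illustrative stand-ins for the paper's databases, not its actual variables:

```python
import numpy as np

rng = np.random.default_rng(2)
n_farms, n_vars = 135, 8

# Hypothetical farm-level input variables and a profitability measure
# (IOFC) driven by the inputs plus noise.
X = rng.normal(size=(n_farms, n_vars))
iofc = X @ rng.normal(size=n_vars) + 0.1 * rng.normal(size=n_farms)

# (1) PCA on standardized inputs, via SVD of the centered/scaled matrix.
Xc = (X - X.mean(0)) / X.std(0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 4                      # retained principal components, as in the paper
scores = Xc @ Vt[:k].T     # principal-component values per farm

# (2) Regress IOFC on the retained components; the fitted values serve
# as the ranking index.
A = np.column_stack([np.ones(n_farms), scores])
beta = np.linalg.lstsq(A, iofc, rcond=None)[0]
ri = A @ beta

ranking = np.argsort(-ri)  # farms ordered from most to least profitable
```

In the paper the eigenvectors come from the 5,000-farm synthetic database and are then applied to the 135 real farms; here a single dataset plays both roles for brevity.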

  16. Long-term results using LigaSure™ 5 mm instrument for treatment of Zenker's diverticulum

    DEFF Research Database (Denmark)

    Andersen, Michelle Fog; Trolle, Waldemar; Anthonsen, Kristian

    2017-01-01

    The purpose of the present study was to evaluate the long-term results and patient satisfaction of a new approach using the LigaSure™ 5 mm instrument for treatment of Zenker's diverticulum (ZD) and to compare with other long-term results using traditional treatment modalities. Between December … %) reported no symptoms at all. Our results suggest that endoscopic management of ZD with the LigaSure™ 5 mm instrument is a minimally invasive, fast and safe method with solid long-term outcomes, relief of symptoms and patient satisfaction. This new operative instrument was not found inferior to traditional endoscopic techniques and is now the standard treatment method for ZD in our departments.

  17. The regulatory instruments for the correction of energy-related environmental externalities

    International Nuclear Information System (INIS)

    Labanderia Villot, X.; Lopez Otero, X.; Rodriguez Mendez, M.

    2007-01-01

    In this paper we deal with the different regulatory instruments for the correction of energy-related environmental externalities. This objective is justified by the size and general occurrence of this type of externalities in contemporary societies. In this sense, we distinguish between three main generations of instruments: conventional regulations, market mechanisms and voluntary approaches. In all cases, some practical examples of their application are presented, albeit emphasizing the experience with the so-called market instruments and the results of hypothetical simulations for the Spanish case. As a general conclusion we underline the role of economic analysis in the design, choice and evaluation of those mechanisms, which also explains the structure and contents of the article. (Author)

  18. Instrumentation

    Energy Technology Data Exchange (ETDEWEB)

    Decreton, M

    2000-07-01

    SCK-CEN's research and development programme on instrumentation aims at evaluating the potentials of new instrumentation technologies under the severe constraints of a nuclear application. It focuses on the tolerance of sensors to high radiation doses, including optical fibre sensors, and on the related intelligent data processing needed to cope with the nuclear constraints. Main achievements in these domains in 1999 are summarised.

  19. Changes of regional climate variability in central Europe during the past 250 years

    Science.gov (United States)

    Böhm, R.

    2012-05-01

    The paper uses the data potential of very long and homogenized instrumental climate time series in south central Europe to analyze one feature that is very dominant in the climate change debate: whether anthropogenic climate warming causes, or goes along with, an increase of climate extremes. The monthly resolved data of the HISTALP data collection provide 58 single series for three climate elements (air pressure, air temperature and precipitation) that start earlier than 1831 and extend back to 1760 in some cases. Trends and low-frequency long-term climate evolution are only briefly touched on in the paper. The main goal is the analysis of trends or changes of high-frequency interannual and interseasonal variability; in other words, the study aims at features like extremely hot summers, very cold winters, and excessively dry or wet seasons. The methods are based on detrended high-pass series whose variance is analyzed in discrete 30-year windows moving over the entire instrumental period. The analysis of discrete subintervals relies on the unique number of 8 (for precipitation 7) such "normal periods". The second approach uses the same 30-year subintervals, but in moving rather than fixed windows over the entire instrumental period. The first result of the study is the clear evidence that there has been no increase of variability during the past 250 years in the region. The second finding is similar but concentrates on the recent three decades, which are of particular interest because they are the first 30 years with dominating anthropogenic greenhouse gas forcing. We can show that this recent anthropogenic normal period also shows no widening of the PDF (probability density function) compared to the preceding ones. The third finding, based on the moving-window technique, shows that interannual variability changes follow a clear centennial oscillating structure for all three climate elements in the region.
For the time being we have no explanation
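    The moving-window variance analysis described above can be sketched as follows; the synthetic series, the choice of a running-mean high-pass filter, and the window length are illustrative assumptions rather than the HISTALP processing chain:

```python
import numpy as np

rng = np.random.default_rng(3)
years = np.arange(1760, 2011)

# Synthetic annual temperature anomalies: a slow warming trend plus
# interannual noise whose amplitude we want to track through time.
series = 0.004 * (years - years[0]) + rng.normal(0.0, 0.8, years.size)

# High-pass filter: subtract a 30-year running mean so that only the
# high-frequency interannual variability remains.
window = 30
kernel = np.ones(window) / window
trend = np.convolve(series, kernel, mode="same")
highpass = series - trend

# Variance of the detrended series in moving 30-year windows.
var_moving = np.array([
    highpass[i:i + window].var(ddof=1)
    for i in range(series.size - window + 1)
])
```

A trend or oscillation in `var_moving` would indicate changing interannual variability; for this stationary-noise example it stays roughly flat, which is the null result the paper reports for the real series.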

  20. VIRUS instrument enclosures

    Science.gov (United States)

    Prochaska, T.; Allen, R.; Mondrik, N.; Rheault, J. P.; Sauseda, M.; Boster, E.; James, M.; Rodriguez-Patino, M.; Torres, G.; Ham, J.; Cook, E.; Baker, D.; DePoy, Darren L.; Marshall, Jennifer L.; Hill, G. J.; Perry, D.; Savage, R. D.; Good, J. M.; Vattiat, Brian L.

    2014-08-01

    The Visible Integral-Field Replicable Unit Spectrograph (VIRUS) instrument will be installed at the Hobby-Eberly Telescope† in the near future. The instrument will be housed in two enclosures that are mounted adjacent to the telescope, via the VIRUS Support Structure (VSS). We have designed the enclosures to support and protect the instrument, to enable servicing of the instrument, and to cool the instrument appropriately while not adversely affecting the dome environment. The system uses simple HVAC air handling techniques in conjunction with thermoelectric and standard glycol heat exchangers to provide efficient heat removal. The enclosures also provide power and data transfer to and from each VIRUS unit, liquid nitrogen cooling to the detectors, and environmental monitoring of the instrument and dome environments. In this paper, we describe the design and fabrication of the VIRUS enclosures and their subsystems.