WorldWideScience

Sample records for instrumental variables approach

  1. The productivity of mental health care: an instrumental variable approach.

    Lu, Mingshan

    1999-06-01

    BACKGROUND: Like many other medical technologies and treatments, there is a lack of reliable evidence on the treatment effectiveness of mental health care. Increasingly, data from non-experimental settings are being used to study the effect of treatment. However, as in a number of studies using non-experimental data, a simple regression of outcome on treatment shows a puzzling negative and significant impact of mental health care on the improvement of mental health status, even after including a large number of potential control variables. The central problem in interpreting evidence from real-world or non-experimental settings is, therefore, the potential "selection bias" problem in observational data sets. In other words, the choice/quantity of mental health care may be correlated with other variables, particularly unobserved variables, that influence the outcome, and this may bias the estimate of the effect of care in conventional models. AIMS OF THE STUDY: This paper addresses the issue of estimating treatment effects using an observational data set. The information in a mental health data set obtained from two waves of data in Puerto Rico is explored. The results from conventional models - in which the potential selection bias is not controlled for - are compared with those from instrumental variable (IV) models, proposed in this study to correct the contaminated estimates of the conventional models. METHODS: Treatment effectiveness is estimated in a production function framework. Effectiveness is measured as the improvement in mental health status. To control for the potential selection bias problem, IV approaches are employed. The essence of the IV method is to use one or more instruments - observable factors that influence treatment but do not directly affect patient outcomes - to isolate the effect of treatment variation that is independent of unobserved patient characteristics. The data used in this study are the first (1992…
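
    The selection problem and the IV remedy described above can be made concrete with a small simulation. The sketch below is purely illustrative - variable names and coefficients are invented, not taken from the Puerto Rico data - but it reproduces the pattern the abstract reports: naive OLS turns negative because sicker patients receive more care, while two-stage least squares (2SLS) recovers the true effect.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 5_000
severity = rng.normal(size=n)                 # unobserved illness severity
z = rng.binomial(1, 0.5, size=n)              # instrument: shifts care, not outcome
care = 0.8 * z - 0.9 * severity + rng.normal(size=n)        # sicker -> more care
improve = 0.5 * care + 1.5 * severity + rng.normal(size=n)  # true effect = 0.5

# Naive regression of outcome on treatment: biased negative by selection.
ols = sm.OLS(improve, sm.add_constant(care)).fit()
print("OLS estimate:", round(ols.params[1], 2))

# 2SLS: stage 1 projects care on the instrument; stage 2 uses the fitted values.
stage1 = sm.OLS(care, sm.add_constant(z)).fit()
stage2 = sm.OLS(improve, sm.add_constant(stage1.fittedvalues)).fit()
print("2SLS estimate:", round(stage2.params[1], 2))   # close to 0.5
# Note: second-stage standard errors from this manual version are invalid;
# a dedicated 2SLS routine (e.g. linearmodels.iv.IV2SLS) corrects them.
```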

  2. Social interactions and college enrollment: A combined school fixed effects/instrumental variables approach.

    Fletcher, Jason M

    2015-07-01

    This paper provides some of the first evidence of peer effects in college enrollment decisions. There are several empirical challenges in assessing the influences of peers in this context, including the endogeneity of high school, shared group-level unobservables, and identifying policy-relevant parameters of social interactions models. This paper addresses these issues by using an instrumental variables/fixed effects approach that compares students in the same school but different grade-levels who are thus exposed to different sets of classmates. In particular, plausibly exogenous variation in peers' parents' college expectations are used as an instrument for peers' college choices. Preferred specifications indicate that increasing a student's exposure to college-going peers by ten percentage points is predicted to raise the student's probability of enrolling in college by 4 percentage points. This effect is roughly half the magnitude of growing up in a household with married parents (vs. an unmarried household). Copyright © 2015 Elsevier Inc. All rights reserved.

  3. The effect of patient satisfaction with pharmacist consultation on medication adherence: an instrumental variable approach

    Gu NY

    2008-12-01

    There are limited studies quantifying the impact of patient satisfaction with pharmacist consultation on patient medication adherence. Objectives: The objective of this study is to evaluate the effect of patient satisfaction with pharmacist consultation services on medication adherence in a large managed care organization. Methods: We analyzed data from a patient satisfaction survey of 6,916 patients who had used pharmacist consultation services in Kaiser Permanente Southern California from 1993 to 1996. We compared treating patient satisfaction as exogenous, in a single-equation probit model, with a bivariate probit model where patient satisfaction was treated as endogenous. Different sets of instrumental variables were employed, including measures of patients' emotional well-being and patients' propensity to fill their prescriptions at a non-Kaiser Permanente (KP) pharmacy. The Smith-Blundell test was used to test whether patient satisfaction was endogenous. Over-identification tests were used to test the validity of the instrumental variables. The Staiger-Stock weak instrument test was used to evaluate the explanatory power of the instrumental variables. Results: All tests indicated that the instrumental variables method was valid and that the instrumental variables used have significant explanatory power. The single-equation probit model indicated that the effect of patient satisfaction with pharmacist consultation was significant (p<0.01). However, the bivariate probit models revealed that the marginal effect of pharmacist consultation on medication adherence was significantly greater than in the single-equation probit: the effect increased from 7% to 30% (p<0.01) after controlling for endogeneity bias. Conclusion: After appropriate adjustment for endogeneity bias, patients satisfied with their pharmacy services are substantially more likely to adhere to their medication. The results have important policy implications given the increasing focus…
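
    The Smith-Blundell procedure referred to above is essentially a control-function (residual-inclusion) test. Below is a hedged sketch of that logic with invented variable names and simulated data; note that the paper itself fits a bivariate probit, which this simpler two-stage probit only approximates.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 6_916
disposition = rng.normal(size=n)               # unobserved; drives both equations
wellbeing = rng.normal(size=n)                 # instrument: emotional well-being
satisfaction = 0.7 * wellbeing + disposition + rng.normal(size=n)
adherent = (0.4 * satisfaction + 0.8 * disposition
            + rng.normal(size=n) > 0).astype(float)

# Stage 1: regress the endogenous regressor on the instrument.
stage1 = sm.OLS(satisfaction, sm.add_constant(wellbeing)).fit()

# Stage 2: probit including the first-stage residual as a control function.
X = sm.add_constant(np.column_stack([satisfaction, stage1.resid]))
probit = sm.Probit(adherent, X).fit(disp=0)
print(probit.params)      # index 2 = residual coefficient
print(probit.pvalues[2])  # small p-value -> satisfaction is endogenous
```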

  4. Is foreign direct investment good for health in low and middle income countries? An instrumental variable approach.

    Burns, Darren K; Jones, Andrew P; Goryakin, Yevgeniy; Suhrcke, Marc

    2017-05-01

    There is a scarcity of quantitative research into the effect of FDI on population health in low and middle income countries (LMICs). This paper investigates the relationship using annual panel data from 85 LMICs between 1974 and 2012. When controlling for time trends, country fixed effects, correlation between repeated observations, relevant covariates, and endogeneity via a novel instrumental variable approach, we find FDI to have a beneficial effect on overall health, proxied by life expectancy. When investigating age-specific mortality rates, we find a stronger beneficial effect of FDI on adult mortality, yet no association with either infant or child mortality. Notably, FDI effects on health remain undetected in all models which do not control for endogeneity. Exploring the effect of sector-specific FDI on health in LMICs, we provide preliminary evidence of a weak inverse association between secondary (i.e. manufacturing) sector FDI and overall life expectancy. Our results thus suggest that FDI has provided an overall benefit to population health in LMICs, particularly in adults, yet investments into the secondary sector could be harmful to health. Copyright © 2017 Elsevier Ltd. All rights reserved.

  5. Combining fixed effects and instrumental variable approaches for estimating the effect of psychosocial job quality on mental health: evidence from 13 waves of a nationally representative cohort study.

    Milner, Allison; Aitken, Zoe; Kavanagh, Anne; LaMontagne, Anthony D; Pega, Frank; Petrie, Dennis

    2017-06-23

    Previous studies suggest that poor psychosocial job quality is a risk factor for mental health problems, but they use conventional regression analytic methods that cannot rule out reverse causation, unmeasured time-invariant confounding and reporting bias. This study combines two quasi-experimental approaches to improve causal inference by better accounting for these biases: (i) linear fixed effects regression analysis and (ii) linear instrumental variable analysis. We extract 13 annual waves of national cohort data including 13,260 working-age (18-64 years) employees. The exposure variable is self-reported level of psychosocial job quality. The instruments used are two common workplace entitlements. The outcome variable is the Mental Health Inventory (MHI-5). We adjust for measured time-varying confounders. In the fixed effects regression analysis adjusted for time-varying confounders, a 1-point increase in psychosocial job quality is associated with a 1.28-point improvement in mental health on the MHI-5 scale (95% CI: 1.17, 1.40; P < 0.001). In the instrumental variable analysis, a 1-point increase in psychosocial job quality is related to a 1.62-point improvement on the MHI-5 scale (95% CI: -0.24, 3.48; P = 0.088). Our quasi-experimental results provide evidence to confirm job stressors as risk factors for mental ill health using methods that improve causal inference. © The Author 2017. Published by Oxford University Press on behalf of Faculty of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com
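
    Mechanically, the combination works by first demeaning each person's repeated observations (removing time-invariant confounding) and then instrumenting the demeaned exposure. The sketch below is a stand-in with invented names and coefficients - a binary workplace entitlement as the instrument - not the study's data or code.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(2)
people, waves = 1_000, 13
df = pd.DataFrame({"pid": np.repeat(np.arange(people), waves)})
person = np.repeat(rng.normal(size=people), waves)   # fixed person effect
shock = rng.normal(size=len(df))                     # time-varying confounder
df["entitlement"] = rng.binomial(1, 0.5, len(df))    # instrument
df["job_quality"] = (0.6 * df["entitlement"] + person + shock
                     + rng.normal(size=len(df)))
df["mhi5"] = (1.3 * df["job_quality"] + 2 * person - 2 * shock
              + rng.normal(size=len(df)))            # true effect = 1.3

# Fixed effects: subtract person means from every variable.
dm = df.groupby("pid")[["entitlement", "job_quality", "mhi5"]].transform(
    lambda s: s - s.mean())

# IV on the demeaned data (manual 2SLS; use a 2SLS routine for valid SEs).
s1 = sm.OLS(dm["job_quality"], sm.add_constant(dm["entitlement"])).fit()
s2 = sm.OLS(dm["mhi5"], sm.add_constant(s1.fittedvalues)).fit()
fe = sm.OLS(dm["mhi5"], sm.add_constant(dm["job_quality"])).fit()
print("FE estimate   :", fe.params.iloc[1])          # still confounded by shock
print("FE-IV estimate:", s2.params.iloc[1])          # close to the true 1.3
```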

  6. A review of instrumental variable estimators for Mendelian randomization.

    Burgess, Stephen; Small, Dylan S; Thompson, Simon G

    2017-10-01

    Instrumental variable analysis is an approach for obtaining causal inferences on the effect of an exposure (risk factor) on an outcome from observational data. It has gained in popularity over the past decade with the use of genetic variants as instrumental variables, known as Mendelian randomization. An instrumental variable is associated with the exposure, but not associated with any confounder of the exposure-outcome association, nor is there any causal pathway from the instrumental variable to the outcome other than via the exposure. Under the assumption that a single instrumental variable or a set of instrumental variables for the exposure is available, the causal effect of the exposure on the outcome can be estimated. There are several methods available for instrumental variable estimation; we consider the ratio method, two-stage methods, likelihood-based methods, and semi-parametric methods. Techniques for obtaining statistical inferences and confidence intervals are presented. The statistical properties of estimates from these methods are compared, and practical advice is given about choosing a suitable analysis method. In particular, bias and coverage properties of estimators are considered, especially with weak instruments. Settings particularly relevant to Mendelian randomization are prioritized in the paper, notably the scenario of a continuous exposure and a continuous or binary outcome.
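
    For a single instrument, the ratio (Wald) method listed first is compact enough to show end-to-end. A generic sketch with simulated data - not tied to any particular study - using a delta-method standard error that ignores the covariance between the two regression coefficients:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 20_000
g = rng.binomial(2, 0.3, size=n)              # allele count 0/1/2
u = rng.normal(size=n)                        # unmeasured confounder
exposure = 0.25 * g + u + rng.normal(size=n)
outcome = 0.4 * exposure + u + rng.normal(size=n)   # true causal effect = 0.4

bx = sm.OLS(exposure, sm.add_constant(g)).fit()     # gene-exposure association
by = sm.OLS(outcome, sm.add_constant(g)).fit()      # gene-outcome association
ratio = by.params[1] / bx.params[1]
# Delta-method SE, dropping the covariance term between the two estimates.
se = np.sqrt(by.bse[1] ** 2 / bx.params[1] ** 2
             + by.params[1] ** 2 * bx.bse[1] ** 2 / bx.params[1] ** 4)
print(f"Wald ratio estimate: {ratio:.3f} (SE {se:.3f})")
```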

  7. Instrumental Variables in the Long Run

    Casey, Gregory; Klemp, Marc Patrick Brag

    2017-01-01

    In the study of long-run economic growth, it is common to use historical or geographical variables as instruments for contemporary endogenous regressors. We study the interpretation of these conventional instrumental variable (IV) regressions in a general, yet simple, framework. Our aim is to estimate the long-run causal effect of changes in the endogenous explanatory variable. We find that conventional IV regressions generally cannot recover this parameter of interest. To estimate this parameter, therefore, we develop an augmented IV estimator that combines the conventional regression … We also use our framework to examine related empirical techniques. We find that two prominent regression methodologies - using gravity-based instruments for trade and including ancestry-adjusted variables in linear regression models - have quantitative implications for the field of long-run economic growth.

  8. Disability as deprivation of capabilities: Estimation using a large-scale survey in Morocco and Tunisia and an instrumental variable approach.

    Trani, Jean-Francois; Bakhshi, Parul; Brown, Derek; Lopez, Dominique; Gall, Fiona

    2018-05-25

    The capability approach pioneered by Amartya Sen and Martha Nussbaum offers a new paradigm to examine disability, poverty and their complex associations. Disability is hence defined as a situation in which a person with an impairment faces various forms of restrictions in functionings and capabilities. Additionally, poverty is not the mere absence of income but a lack of ability to achieve essential functionings; disability is consequently the poverty of capabilities of persons with impairment. It is the lack of opportunities in a given context and agency that leads to persons with disabilities being poorer than other social groups. Consequently, the poverty of people with disabilities comprises complex processes of social exclusion and disempowerment. Despite growing evidence that persons with disabilities face higher levels of poverty, the literature from low and middle-income countries that analyzes the causal link between disability and poverty remains limited. Drawing on data from a large case-control field survey carried out between December 24th, 2013 and February 16th, 2014 in Tunisia and between November 4th, 2013 and June 12th, 2014 in Morocco, we examined the effect of impairment on various basic capabilities, health-related quality of life and multidimensional poverty - indicators of poor wellbeing - in Morocco and Tunisia. To demonstrate a causal link between impairment and deprivation of capabilities, we used instrumental variable regression analyses. In both countries, we found lower access to jobs for persons with impairment. Health-related quality of life was also lower for this group, who also faced a higher risk of multidimensional poverty. There was no significant direct effect of impairment on access to school and acquiring literacy in either country, or on access to health care and expenses in Tunisia, while having an impairment reduced access to healthcare facilities and out-of-pocket expenditures in Morocco. These results suggest that…

  9. Instrumental variables I: instrumental variables exploit natural variation in nonexperimental data to estimate causal relationships.

    Rassen, Jeremy A; Brookhart, M Alan; Glynn, Robert J; Mittleman, Murray A; Schneeweiss, Sebastian

    2009-12-01

    The gold standard of study design for treatment evaluation is widely acknowledged to be the randomized controlled trial (RCT). Trials allow for the estimation of causal effect by randomly assigning participants either to an intervention or comparison group; through the assumption of "exchangeability" between groups, comparing the outcomes will yield an estimate of causal effect. In the many cases where RCTs are impractical or unethical, instrumental variable (IV) analysis offers a nonexperimental alternative based on many of the same principles. IV analysis relies on finding a naturally varying phenomenon, related to treatment but not to outcome except through the effect of treatment itself, and then using this phenomenon as a proxy for the confounded treatment variable. This article demonstrates how IV analysis arises from an analogous but potentially impossible RCT design, and outlines the assumptions necessary for valid estimation. It gives examples of instruments used in clinical epidemiology and concludes with an outline on estimation of effects.

  10. Instrumental variable methods in comparative safety and effectiveness research.

    Brookhart, M Alan; Rassen, Jeremy A; Schneeweiss, Sebastian

    2010-06-01

    Instrumental variable (IV) methods have been proposed as a potential approach to the common problem of uncontrolled confounding in comparative studies of medical interventions, but IV methods are unfamiliar to many researchers. The goal of this article is to provide a non-technical, practical introduction to IV methods for comparative safety and effectiveness research. We outline the principles and basic assumptions necessary for valid IV estimation, discuss how to interpret the results of an IV study, provide a review of instruments that have been used in comparative effectiveness research, and suggest some minimal reporting standards for an IV analysis. Finally, we offer our perspective on the role of IV estimation vis-à-vis more traditional approaches based on statistical modeling of the exposure or outcome. We anticipate that IV methods will often be underpowered for drug safety studies of very rare outcomes, but may be useful in studies of intended effects where uncontrolled confounding may be substantial.

  11. Sensitivity analysis and power for instrumental variable studies.

    Wang, Xuran; Jiang, Yang; Zhang, Nancy R; Small, Dylan S

    2018-03-31

    In observational studies to estimate treatment effects, unmeasured confounding is often a concern. The instrumental variable (IV) method can control for unmeasured confounding when there is a valid IV. To be a valid IV, a variable needs to be independent of unmeasured confounders and only affect the outcome through affecting the treatment. When applying the IV method, there is often concern that a putative IV is invalid to some degree. We present an approach to sensitivity analysis for the IV method which examines the sensitivity of inferences to violations of IV validity. Specifically, we consider sensitivity when the magnitude of association between the putative IV and the unmeasured confounders and the direct effect of the IV on the outcome are limited in magnitude by a sensitivity parameter. Our approach is based on extending the Anderson-Rubin test and is valid regardless of the strength of the instrument. A power formula for this sensitivity analysis is presented. We illustrate its usage via examples about Mendelian randomization studies and its implications via a comparison of using rare versus common genetic variants as instruments. © 2018, The International Biometric Society.
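
    The Anderson-Rubin construction that the sensitivity analysis extends can be sketched directly: under the null hypothesis that the causal effect equals b0, the adjusted outcome y - b0*d must be unrelated to the instrument, however weak the instrument is; inverting the test over a grid of b0 values yields a confidence set. A toy version (not the authors' extended procedure, which adds the sensitivity parameter):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 2_000
z = rng.binomial(1, 0.5, size=n)          # instrument
u = rng.normal(size=n)                    # unmeasured confounder
d = 0.15 * z + u + rng.normal(size=n)     # deliberately weak first stage
y = 0.5 * d + u + rng.normal(size=n)      # true effect = 0.5

def ar_pvalue(b0):
    # F-test of the instrument in a regression of y - b0*d on z.
    return sm.OLS(y - b0 * d, sm.add_constant(z)).fit().f_pvalue

grid = np.linspace(-2.0, 3.0, 501)
accepted = [b for b in grid if ar_pvalue(b) > 0.05]
print("95% AR confidence set on the grid:",
      (round(min(accepted), 2), round(max(accepted), 2)) if accepted else "empty")
```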

  12. Falsification Testing of Instrumental Variables Methods for Comparative Effectiveness Research.

    Pizer, Steven D

    2016-04-01

    To demonstrate how falsification tests can be used to evaluate instrumental variables methods applicable to a wide variety of comparative effectiveness research questions. Brief conceptual review of instrumental variables and falsification testing principles and techniques accompanied by an empirical application. Sample STATA code related to the empirical application is provided in the Appendix. Comparative long-term risks of sulfonylureas and thiazolidinediones for management of type 2 diabetes. Outcomes include mortality and hospitalization for an ambulatory care-sensitive condition. Prescribing pattern variations are used as instrumental variables. Falsification testing is an easily computed and powerful way to evaluate the validity of the key assumption underlying instrumental variables analysis. If falsification tests are used, instrumental variables techniques can help answer a multitude of important clinical questions. © Health Research and Educational Trust.

  13. A statistical approach to instrument calibration

    Robert R. Ziemer; David Strauss

    1978-01-01

    Summary - It has been found that two instruments will yield different numerical values when used to measure identical points. A statistical approach is presented that can be used to approximate the error associated with the calibration of instruments. Included are standard statistical tests that can be used to determine if a number of successive calibrations of the...

  14. Instrumental variable estimation in a survival context

    Tchetgen Tchetgen, Eric J; Walter, Stefan; Vansteelandt, Stijn

    2015-01-01

    The IV approach is very well developed in the context of linear regression and also for certain generalized linear models with a nonlinear link function. However, IV methods are not as well developed for regression analysis with a censored survival outcome. In this article, we develop the IV approach for regression analysis in a survival context, primarily under an additive hazards model, for which we describe 2 simple methods for estimating causal effects. The first method is a straightforward 2-stage regression approach analogous to 2-stage least squares commonly used for IV analysis in linear regression. In this approach, the fitted value from a first-stage regression of the exposure on the IV is entered in place of the exposure in the second-stage hazard model to recover a valid estimate of the treatment effect of interest. The second method is a so-called control function approach, which entails adding the residual from the first-stage regression as a covariate in the second-stage hazard model.

  15. Power calculator for instrumental variable analysis in pharmacoepidemiology.

    Walker, Venexia M; Davies, Neil M; Windmeijer, Frank; Burgess, Stephen; Martin, Richard M

    2017-10-01

    Instrumental variable analysis, for example with physicians' prescribing preferences as an instrument for medications issued in primary care, is an increasingly popular method in the field of pharmacoepidemiology. Existing power calculators for studies using instrumental variable analysis, such as Mendelian randomization power calculators, do not allow for the structure of research questions in this field. This is because the analysis in pharmacoepidemiology will typically have stronger instruments and detect larger causal effects than in other fields. Consequently, there is a need for dedicated power calculators for pharmacoepidemiological research. The formula for calculating the power of a study using instrumental variable analysis in the context of pharmacoepidemiology is derived before being validated by a simulation study. The formula is applicable for studies using a single binary instrument to analyse the causal effect of a binary exposure on a continuous outcome. An online calculator, as well as packages in both R and Stata, are provided for the implementation of the formula by others. The statistical power of instrumental variable analysis in pharmacoepidemiological studies to detect a clinically meaningful treatment effect is an important consideration. Research questions in this field have distinct structures that must be accounted for when calculating power. The formula presented differs from existing instrumental variable power formulae due to its parametrization, which is designed specifically for ease of use by pharmacoepidemiologists. © The Author 2017. Published by Oxford University Press on behalf of the International Epidemiological Association
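
    The paper derives a parametrization aimed at pharmacoepidemiologists; as a rough, generic stand-in (not the paper's formula), a textbook asymptotic power approximation for a Wald-type estimate with a single binary instrument, binary exposure and continuous outcome can be written as:

```python
from scipy.stats import norm

def iv_power(n, p_z, gamma, beta, sigma_y, alpha=0.05):
    """n: sample size; p_z: P(instrument = 1);
    gamma: first-stage risk difference P(X=1|Z=1) - P(X=1|Z=0);
    beta: causal effect; sigma_y: residual SD of the outcome."""
    se = sigma_y / (abs(gamma) * (n * p_z * (1 - p_z)) ** 0.5)
    return norm.cdf(abs(beta) / se - norm.ppf(1 - alpha / 2))

# e.g. 10,000 patients, balanced instrument, a 20-point prescribing
# difference between preference groups, and a 0.25 SD treatment effect:
print(f"power = {iv_power(10_000, 0.5, 0.2, 0.25, 1.0):.2f}")
```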

  16. Management Approach for Earth Venture Instrument

    Hope, Diane L.; Dutta, Sanghamitra

    2013-01-01

    The Earth Venture Instrument (EVI) element of the Earth Venture Program calls for developing instruments for participation on a NASA-arranged spaceflight mission of opportunity to conduct innovative, integrated, hypothesis or scientific question-driven approaches to pressing Earth system science issues. This paper discusses the EVI element and the management approach being used to manage both an instrument development activity as well as the host accommodations activity. In particular the focus will be on the approach being used for the first EVI (EVI-1) selected instrument, Tropospheric Emissions: Monitoring of Pollution (TEMPO), which will be hosted on a commercial GEO satellite and some of the challenges encountered to date and corresponding mitigations that are associated with the management structure for the TEMPO Mission and the architecture of EVI.

  17. Comparison of variance estimators for meta-analysis of instrumental variable estimates

    Schmidt, A. F.; Hingorani, A. D.; Jefferis, B. J.; White, J.; Groenwold, R. H H; Dudbridge, F.; Ben-Shlomo, Y.; Chaturvedi, N.; Engmann, J.; Hughes, A.; Humphries, S.; Hypponen, E.; Kivimaki, M.; Kuh, D.; Kumari, M.; Menon, U.; Morris, R.; Power, C.; Price, J.; Wannamethee, G.; Whincup, P.

    2016-01-01

    Background: Mendelian randomization studies perform instrumental variable (IV) analysis using genetic IVs. Results of individual Mendelian randomization studies can be pooled through meta-analysis. We explored how different variance estimators influence the meta-analysed IV estimate. Methods: Two
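
    The pooling step that such a meta-analysis rests on is standard inverse-variance weighting of the study-level IV estimates; which variance estimator feeds the weights is exactly the paper's question. A fixed-effect sketch with made-up numbers:

```python
import numpy as np

beta = np.array([0.12, 0.08, 0.15, 0.05])   # per-study IV estimates (invented)
se = np.array([0.05, 0.04, 0.07, 0.06])     # their standard errors (invented)

w = 1.0 / se**2                             # inverse-variance weights
pooled = np.sum(w * beta) / np.sum(w)
pooled_se = np.sqrt(1.0 / np.sum(w))
print(f"pooled estimate: {pooled:.3f} (SE {pooled_se:.3f})")
```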

  18. Econometrics in outcomes research: the use of instrumental variables.

    Newhouse, J P; McClellan, M

    1998-01-01

    We describe an econometric technique, instrumental variables, that can be useful in estimating the effectiveness of clinical treatments in situations when a controlled trial has not or cannot be done. This technique relies upon the existence of one or more variables that induce substantial variation in the treatment variable but have no direct effect on the outcome variable of interest. We illustrate the use of the technique with an application to aggressive treatment of acute myocardial infarction in the elderly.

  19. Eliminating Survivor Bias in Two-stage Instrumental Variable Estimators.

    Vansteelandt, Stijn; Walter, Stefan; Tchetgen Tchetgen, Eric

    2018-07-01

    Mendelian randomization studies commonly focus on elderly populations. This makes the instrumental variables analysis of such studies sensitive to survivor bias, a type of selection bias. A particular concern is that the instrumental variable conditions, even when valid for the source population, may be violated for the selective population of individuals who survive the onset of the study. This is potentially very damaging because Mendelian randomization studies are known to be sensitive to bias due to even minor violations of the instrumental variable conditions. Interestingly, the instrumental variable conditions continue to hold within certain risk sets of individuals who are still alive at a given age when the instrument and unmeasured confounders exert additive effects on the exposure, and moreover, the exposure and unmeasured confounders exert additive effects on the hazard of death. In this article, we will exploit this property to derive a two-stage instrumental variable estimator for the effect of exposure on mortality, which is insulated against the above described selection bias under these additivity assumptions.

  1. Instrumental variable estimation of treatment effects for duration outcomes

    G.E. Bijwaard (Govert)

    2007-01-01

    In this article we propose and implement an instrumental variable estimation procedure to obtain treatment effects on duration outcomes. The method can handle the typical complications that arise with duration data: time-varying treatment and censoring. The treatment effect we…

  2. Instrumental variables estimation under a structural Cox model

    Martinussen, Torben; Nørbo Sørensen, Ditte; Vansteelandt, Stijn

    2017-01-01

    Instrumental variable (IV) analysis is an increasingly popular tool for inferring the effect of an exposure on an outcome, as witnessed by the growing number of IV applications in epidemiology, for instance. The majority of IV analyses of time-to-event endpoints are, however, dominated by heuristic…

  3. Instrumental variables estimates of peer effects in social networks.

    An, Weihua

    2015-03-01

    Estimating peer effects with observational data is very difficult because of contextual confounding, peer selection, simultaneity bias, and measurement error, etc. In this paper, I show that instrumental variables (IVs) can help to address these problems in order to provide causal estimates of peer effects. Based on data collected from over 4000 students in six middle schools in China, I use the IV methods to estimate peer effects on smoking. My design-based IV approach differs from previous ones in that it helps to construct potentially strong IVs and to directly test possible violations of exogeneity of the IVs. I show that measurement error in smoking can lead to both underestimated and imprecise estimates of peer effects. Based on a refined measure of smoking, I find consistent evidence for peer effects on smoking. If a student's best friend smoked within the past 30 days, the student was about one fifth (as indicated by the OLS estimate) or 40 percentage points (as indicated by the IV estimate) more likely to smoke in the same time period. The findings are robust to a variety of robustness checks. I also show that sharing cigarettes may be a mechanism for peer effects on smoking. A 10% increase in the number of cigarettes smoked by a student's best friend is associated with about a 4% increase in the number of cigarettes smoked by the student in the same time period. Copyright © 2014 Elsevier Inc. All rights reserved.

  4. Instrumented Impact Testing: Influence of Machine Variables and Specimen Position

    Lucon, E.; McCowan, C. N.; Santoyo, R. A.

    2008-09-15

    An investigation has been conducted on the influence of impact machine variables and specimen positioning on characteristic forces and absorbed energies from instrumented Charpy tests. Brittle and ductile fracture behavior has been investigated by testing NIST reference samples of low, high and super-high energy levels. Test machine variables included tightness of foundation, anvil and striker bolts, and the position of the center of percussion with respect to the center of strike. For specimen positioning, we tested samples which had been moved away or sideways with respect to the anvils. In order to assess the influence of the various factors, we compared mean values in the reference (unaltered) and altered conditions; for machine variables, t-test analyses were also performed in order to evaluate the statistical significance of the observed differences. Our results indicate that the only circumstance which resulted in variations larger than 5 percent for both brittle and ductile specimens is when the sample is not in contact with the anvils. These findings should be taken into account in future revisions of instrumented Charpy test standards.
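
    The reference-versus-altered comparisons described above follow the usual two-sample pattern; a sketch with invented absorbed-energy values (not the NIST reference data):

```python
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(5)
reference = rng.normal(100.0, 3.0, size=15)   # J, machine in reference condition
altered = rng.normal(96.0, 3.0, size=15)      # J, e.g. specimen off the anvils

t, p = ttest_ind(reference, altered, equal_var=False)   # Welch's t-test
change = 100 * (altered.mean() - reference.mean()) / reference.mean()
print(f"t = {t:.2f}, p = {p:.4f}, mean change = {change:.1f}%")
```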

  5. On the Interpretation of Instrumental Variables in the Presence of Specification Errors

    P.A.V.B. Swamy

    2015-01-01

    The method of instrumental variables (IV) and the generalized method of moments (GMM), and their applications to the estimation of errors-in-variables and simultaneous equations models in econometrics, require data on a sufficient number of instrumental variables that are both exogenous and relevant. We argue that, in general, such instruments (weak or strong) cannot exist.

  6. Evaluating disease management programme effectiveness: an introduction to instrumental variables.

    Linden, Ariel; Adams, John L

    2006-04-01

    This paper introduces the concept of instrumental variables (IVs) as a means of providing an unbiased estimate of treatment effects in evaluating disease management (DM) programme effectiveness. Model development is described using zip codes as the IV. Three diabetes DM outcomes were evaluated: annual diabetes costs, emergency department (ED) visits and hospital days. Both ordinary least squares (OLS) and IV estimates showed a significant treatment effect for diabetes costs (P = 0.011) but neither model produced a significant treatment effect for ED visits. However, the IV estimate showed a significant treatment effect for hospital days (P = 0.006) whereas the OLS model did not. These results illustrate the utility of IV estimation when the OLS model is sensitive to the confounding effect of hidden bias.

  7. Important variables for parents' postnatal sense of security: evaluating a new Swedish instrument (the PPSS instrument).

    Persson, Eva K; Dykes, Anna-Karin

    2009-08-01

    To evaluate dimensions of both parents' postnatal sense of security during the first week after childbirth, and to determine associations between the PPSS instrument and different sociodemographic and situational background variables. The study used an evaluative, cross-sectional design with 113 mothers and 99 fathers of children live born at term, from five hospitals in southern Sweden. Mothers and fathers had similar feelings concerning postnatal sense of security. Of the dimensions in the PPSS instrument, a sense of midwives'/nurses' empowering behaviour, a sense of one's own general well-being and a sense of the mother's well-being as experienced by the father were the most important dimensions for parents' experienced security. A sense of affinity within the family (for both parents) and a sense of manageable breast feeding (for mothers) were not significantly associated with their experienced security. A sense of participation during pregnancy and general anxiety were significantly associated background variables for postnatal sense of security for both parents. For the mothers, parity and a sense that the father was participating during pregnancy were also significantly associated. More focus on parents' participation during pregnancy, as well as on midwives'/nurses' empowering behaviour during the postnatal period, will be beneficial for both parents' postnatal sense of security.

  8. Instrument Variables for Reducing Noise in Parallel MRI Reconstruction

    Yuchou Chang

    2017-01-01

    Generalized autocalibrating partially parallel acquisition (GRAPPA) has been a widely used parallel MRI technique. However, noise deteriorates the reconstructed image when the reduction factor increases, or even at a low reduction factor for some noisy datasets. Noise, initially generated by the scanner, propagates noise-related errors through the fitting and interpolation procedures of GRAPPA and distorts the final reconstructed image quality. The basic idea we propose to improve GRAPPA is to remove noise from a system-identification perspective. In this paper, we first analyze the GRAPPA noise problem from a noisy input-output system perspective; then, a new framework based on an errors-in-variables (EIV) model is developed for analyzing the noise generation mechanism in GRAPPA and designing a concrete method - instrumental variables (IV) GRAPPA - to remove noise. The proposed EIV framework opens the possibility that noiseless GRAPPA reconstruction could be achieved by existing methods that solve the EIV problem other than the IV method. Experimental results show that the proposed reconstruction algorithm removes noise better than conventional GRAPPA, as validated with both phantom and in vivo brain data.

  9. Bias and Bias Correction in Multi-Site Instrumental Variables Analysis of Heterogeneous Mediator Effects

    Reardon, Sean F.; Unlu, Fatih; Zhu, Pei; Bloom, Howard

    2013-01-01

    We explore the use of instrumental variables (IV) analysis with a multi-site randomized trial to estimate the effect of a mediating variable on an outcome in cases where it can be assumed that the observed mediator is the only mechanism linking treatment assignment to outcomes, an assumption known in the instrumental variables literature as the exclusion restriction.

  10. 26 CFR 1.1275-5 - Variable rate debt instruments.

    2010-04-01

    ... nonpublicly traded property. A debt instrument (other than a tax-exempt obligation) that would otherwise... variations in the cost of newly borrowed funds in the currency in which the debt instrument is denominated... on the yield of actively traded personal property (within the meaning of section 1092(d)(1)). (ii...

  11. Eutrophication Modeling Using Variable Chlorophyll Approach

    Abdolabadi, H.; Sarang, A.; Ardestani, M.; Mahjoobi, E.

    2016-01-01

    In this study, eutrophication was investigated in Lake Ontario to identify the interactions among effective drivers. The complexity of the phenomenon was modeled using a system dynamics approach based on a consideration of constant and variable stoichiometric ratios. The system dynamics approach is a powerful tool for developing object-oriented models to simulate complex phenomena that involve feedback effects. Utilizing stoichiometric ratios is a method for converting the concentrations of state variables. For the physical segmentation of the model, Lake Ontario was divided into two layers, i.e., the epilimnion and hypolimnion, and differential equations were developed for each layer. The model structure included 16 state variables related to phytoplankton, herbivorous zooplankton, carnivorous zooplankton, ammonium, nitrate, dissolved phosphorus, and particulate and dissolved carbon in the epilimnion and hypolimnion over a time horizon of one year. Several verification tests - a Nash-Sutcliffe coefficient close to 1 (0.98), a data correlation coefficient of 0.98, and a low standard error (0.96) - indicated that the model performs efficiently. The results revealed significant differences in the concentrations of the state variables between the constant and variable stoichiometry simulations. Consequently, considering variable stoichiometric ratios in algae and nutrient concentration simulations may be applied in future modeling studies to enhance the accuracy of results and reduce the likelihood of inefficient control policies.
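
    The Nash-Sutcliffe coefficient used in the verification above is one minus the ratio of residual variance to the variance of the observations; values near 1 indicate close agreement between model and data. A small sketch with placeholder values:

```python
import numpy as np

def nash_sutcliffe(observed, simulated):
    observed, simulated = np.asarray(observed), np.asarray(simulated)
    return 1.0 - np.sum((observed - simulated) ** 2) / np.sum(
        (observed - observed.mean()) ** 2)

obs = np.array([2.1, 2.9, 4.2, 5.0, 3.8])   # e.g. observed state variable
sim = np.array([2.0, 3.1, 4.0, 5.2, 3.7])   # corresponding model output
print(f"NSE = {nash_sutcliffe(obs, sim):.3f}")
```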

  12. Sistemic Approach - a Complexity Management Instrument

    Vadim Dumitrascu

    2006-04-01

    The systemic principle uses deduction and induction, analysis and synthesis, and inference and proference in order to uncover the interdependencies and inner connections that set complex organized entities in motion. The true value of this approach can be found neither in the simplistic "in-out" models, nor in the "circular" models that fill the Economics and Management handbooks and consecrate another kind of formalism, but in the constructivist-reflexive strategies used to explain economic and social structures.

  13. Instrumentation

    Decreton, M.

    2002-01-01

    SCK-CEN's R and D programme on instrumentation involves the development of advanced instrumentation systems for nuclear applications as well as the assessment of the performance of these instruments in a radiation environment. Particular emphasis is on the use of optical fibres as umbilical links of a remote handling unit for use during maintenance of a fusion reactor; studies on the radiation hardening of plasma diagnostic systems; investigations on new instrumentation for the future MYRRHA accelerator-driven system; space applications related to radiation-hardened lenses; the development of new approaches for dose, temperature and strain measurements; the assessment of radiation-hardened sensors and motors for remote handling tasks; and studies of dose measurement systems including the use of optical fibres. Progress and achievements in these areas for 2001 are described.

  14. Risk exposure mitigation: Approaches and recognised instruments (5)

    Matić Vesna

    2014-01-01

    The development of the risk management function in banks, along with the development of tools that banks can use throughout this process, has had strong support in international standards, not only in the recommended approaches for calculating economic capital requirements, but also in the qualitatively new treatment of risk exposure mitigation instruments (Basel Accord II). The array of instruments eligible for exposure mitigation under the recommended approaches for their treatment becomes an essential element of the economic capital requirements calculation, both in relation to certain types of risk and in relation to aggregate exposure.

  15. Risk exposure mitigation: Approaches and recognised instruments (3)

    Matić Vesna

    2014-01-01

    The development of the risk management function in banks, along with the development of tools that banks can use throughout this process, has had strong support in international standards, not only in the recommended approaches for calculating economic capital requirements, but also in the qualitatively new treatment of risk exposure mitigation instruments (Basel Accord II). The array of instruments eligible for exposure mitigation under the recommended approaches for their treatment becomes an essential element of the economic capital requirements calculation, both in relation to certain types of risk and in relation to aggregate exposure.

  16. Risk exposure mitigation: Approaches and recognised instruments (6)

    Matić Vesna

    2015-01-01

    The development of the risk management function in banks, along with the development of tools that banks can use throughout this process, has had strong support in international standards, not only in the recommended approaches for calculating economic capital requirements, but also in the qualitatively new treatment of risk exposure mitigation instruments (Basel Accord II). The array of instruments eligible for exposure mitigation under the recommended approaches for their treatment becomes an essential element of the economic capital requirements calculation, both in relation to certain types of risk and in relation to aggregate exposure.

  17. Censored Quantile Instrumental Variable Estimates of the Price Elasticity of Expenditure on Medical Care.

    Kowalski, Amanda

    2016-01-02

    Efforts to control medical care costs depend critically on how individuals respond to prices. I estimate the price elasticity of expenditure on medical care using a censored quantile instrumental variable (CQIV) estimator. CQIV allows estimates to vary across the conditional expenditure distribution, relaxes traditional censored model assumptions, and addresses endogeneity with an instrumental variable. My instrumental variable strategy uses a family member's injury to induce variation in an individual's own price. Across the conditional deciles of the expenditure distribution, I find elasticities that vary from -0.76 to -1.49, which are an order of magnitude larger than previous estimates.

  1. Invited Commentary: Using Financial Credits as Instrumental Variables for Estimating the Causal Relationship Between Income and Health.

    Pega, Frank

    2016-05-01

    Social epidemiologists are interested in determining the causal relationship between income and health. Natural experiments in which individuals or groups receive income randomly or quasi-randomly from financial credits (e.g., tax credits or cash transfers) are increasingly being analyzed using instrumental variable analysis. For example, in this issue of the Journal, Hamad and Rehkopf (Am J Epidemiol. 2016;183(9):775-784) used an in-work tax credit called the Earned Income Tax Credit as an instrument to estimate the association between income and child development. However, under certain conditions, the use of financial credits as instruments could violate 2 key instrumental variable analytic assumptions. First, some financial credits may directly influence health, for example, through increasing a psychological sense of welfare security. Second, financial credits and health may have several unmeasured common causes, such as politics, other social policies, and the motivation to maximize the credit. If epidemiologists pursue such instrumental variable analyses, using the amount of an unconditional, universal credit that an individual or group has received as the instrument may produce the most conceptually convincing and generalizable evidence. However, other natural income experiments (e.g., lottery winnings) and other methods that allow better adjustment for confounding might be more promising approaches for estimating the causal relationship between income and health. © The Author 2016. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  2. Causal null hypotheses of sustained treatment strategies: What can be tested with an instrumental variable?

    Swanson, Sonja A; Labrecque, Jeremy; Hernán, Miguel A

    2018-05-02

    Sometimes instrumental variable methods are used to test whether a causal effect is null rather than to estimate the magnitude of a causal effect. However, when instrumental variable methods are applied to time-varying exposures, as in many Mendelian randomization studies, it is unclear what causal null hypothesis is tested. Here, we consider different versions of causal null hypotheses for time-varying exposures, show that the instrumental variable conditions alone are insufficient to test some of them, and describe additional assumptions that can be made to test a wider range of causal null hypotheses, including both sharp and average causal null hypotheses. Implications for interpretation and reporting of instrumental variable results are discussed.

  3. Optimal Inference for Instrumental Variables Regression with non-Gaussian Errors

    Cattaneo, Matias D.; Crump, Richard K.; Jansson, Michael

    This paper is concerned with inference on the coefficient on the endogenous regressor in a linear instrumental variables model with a single endogenous regressor, nonrandom exogenous regressors and instruments, and i.i.d. errors whose distribution is unknown. It is shown that under mild smoothness conditions…

  4. Bias and Bias Correction in Multisite Instrumental Variables Analysis of Heterogeneous Mediator Effects

    Reardon, Sean F.; Unlu, Fatih; Zhu, Pei; Bloom, Howard S.

    2014-01-01

    We explore the use of instrumental variables (IV) analysis with a multisite randomized trial to estimate the effect of a mediating variable on an outcome in cases where it can be assumed that the observed mediator is the only mechanism linking treatment assignment to outcomes, an assumption known in the IV literature as the exclusion restriction.…

  5. Bayesian methods for meta-analysis of causal relationships estimated using genetic instrumental variables

    Burgess, Stephen; Thompson, Simon G; Thompson, Grahame

    2010-01-01

    Genetic markers can be used as instrumental variables, in an analogous way to randomization in a clinical trial, to estimate the causal relationship between a phenotype and an outcome variable. Our purpose is to extend the existing methods for such Mendelian randomization studies to the context of…

  6. Testing concordance of instrumental variable effects in generalized linear models with application to Mendelian randomization

    Dai, James Y.; Chan, Kwun Chuen Gary; Hsu, Li

    2014-01-01

    Instrumental variable regression is one way to overcome unmeasured confounding and estimate causal effects in observational studies. Built on structural mean models, there has been considerable work recently developed for consistent estimation of causal relative risks and causal odds ratios. Such models can sometimes suffer from identification issues for weak instruments. This has hampered the applicability of Mendelian randomization analysis in genetic epidemiology. When there are multiple genetic variants available as instrumental variables, and the causal effect is defined in a generalized linear model in the presence of unmeasured confounders, we propose to test concordance between instrumental variable effects on the intermediate exposure and instrumental variable effects on the disease outcome, as a means to test the causal effect. We show that a class of generalized least squares estimators provides valid and consistent tests of causality. For the causal effect of a continuous exposure on a dichotomous outcome in logistic models, the proposed estimators are shown to be asymptotically conservative. When the disease outcome is rare, such estimators are consistent due to the log-linear approximation of the logistic function. Optimality of such estimators relative to the well-known two-stage least squares estimator and the double-logistic structural mean model is further discussed. PMID:24863158

  7. Statistical Analysis for Multisite Trials Using Instrumental Variables with Random Coefficients

    Raudenbush, Stephen W.; Reardon, Sean F.; Nomi, Takako

    2012-01-01

    Multisite trials can clarify the average impact of a new program and the heterogeneity of impacts across sites. Unfortunately, in many applications, compliance with treatment assignment is imperfect. For these applications, we propose an instrumental variable (IV) model with person-specific and site-specific random coefficients. Site-specific IV…

  8. Finite-sample instrumental variables inference using an asymptotically pivotal statistic

    Bekker, P; Kleibergen, F

    2003-01-01

    We consider the K-statistic, Kleibergen's (2002, Econometrica 70, 1781-1803) adaptation of the Anderson-Rubin (AR) statistic in instrumental variables regression. Whereas Kleibergen (2002) especially analyzes the asymptotic behavior of the statistic, we focus on finite-sample properties in a…

  9. Finite-sample instrumental variables Inference using an Asymptotically Pivotal Statistic

    Bekker, P.; Kleibergen, F.R.

    2001-01-01

    The paper considers the K-statistic, Kleibergen's (2000) adaptation of the Anderson-Rubin (AR) statistic in instrumental variables regression. Compared to the AR-statistic, this K-statistic shows improved asymptotic efficiency in terms of degrees of freedom in overidentified models, and yet it shares…

  10. Health insurance and the demand for medical care: Instrumental variable estimates using health insurer claims data.

    Dunn, Abe

    2016-07-01

    This paper takes a different approach to estimating demand for medical care that uses the negotiated prices between insurers and providers as an instrument. The instrument is viewed as a textbook "cost shifting" instrument that impacts plan offerings, but is unobserved by consumers. The paper finds a price elasticity of demand of around -0.20, matching the elasticity found in the RAND Health Insurance Experiment. The paper also studies within-market variation in demand for prescription drugs and other medical care services and obtains comparable price elasticity estimates. Published by Elsevier B.V.

  11. Gene Variants Associated with Antisocial Behaviour: A Latent Variable Approach

    Bentley, Mary Jane; Lin, Haiqun; Fernandez, Thomas V.; Lee, Maria; Yrigollen, Carolyn M.; Pakstis, Andrew J.; Katsovich, Liliya; Olds, David L.; Grigorenko, Elena L.; Leckman, James F.

    2013-01-01

    Objective: The aim of this study was to determine if a latent variable approach might be useful in identifying shared variance across genetic risk alleles that is associated with antisocial behaviour at age 15 years. Methods: Using a conventional latent variable approach, we derived an antisocial phenotype in 328 adolescents utilizing data from a…

  12. The contextual effects of social capital on health: a cross-national instrumental variable analysis.

    Kim, Daniel; Baum, Christopher F; Ganz, Michael L; Subramanian, S V; Kawachi, Ichiro

    2011-12-01

    Past research on the associations between area-level/contextual social capital and health has produced conflicting evidence. However, interpreting this rapidly growing literature is difficult because estimates using conventional regression are prone to major sources of bias including residual confounding and reverse causation. Instrumental variable (IV) analysis can reduce such bias. Using data on up to 167,344 adults in 64 nations in the European and World Values Surveys and applying IV and ordinary least squares (OLS) regression, we estimated the contextual effects of country-level social trust on individual self-rated health. We further explored whether these associations varied by gender and individual levels of trust. Using OLS regression, we found higher average country-level trust to be associated with better self-rated health in both women and men. Instrumental variable analysis yielded qualitatively similar results, although the estimates were more than double in size in both sexes when country population density and corruption were used as instruments. The estimated health effects of raising the percentage of a country's population that trusts others by 10 percentage points were at least as large as the estimated health effects of an individual developing trust in others. These findings were robust to alternative model specifications and instruments. Conventional regression and to a lesser extent IV analysis suggested that these associations are more salient in women and in women reporting social trust. In a large cross-national study, our findings, including those using instrumental variables, support the presence of beneficial effects of higher country-level trust on self-rated health. Previous findings for contextual social capital using traditional regression may have underestimated the true associations. Given the close linkages between self-rated health and all-cause mortality, the public health gains from raising social capital within and across

  14. Association of Body Mass Index with Depression, Anxiety and Suicide-An Instrumental Variable Analysis of the HUNT Study.

    Johan Håkon Bjørngaard

    While high body mass index is associated with an increased risk of depression and anxiety, cumulative evidence indicates that it is a protective factor for suicide. The associations from conventional observational studies of body mass index with mental health outcomes are likely to be influenced by reverse causality or confounding by ill-health. In the present study, we investigated the associations between offspring body mass index and parental anxiety, depression and suicide in order to avoid problems with reverse causality and confounding by ill-health. We used data from 32,457 mother-offspring and 27,753 father-offspring pairs from the Norwegian HUNT Study. Anxiety and depression were assessed using the Hospital Anxiety and Depression Scale and suicide death from national registers. Associations between offspring and own body mass index and symptoms of anxiety and depression and suicide mortality were estimated using logistic and Cox regression. Causal effect estimates were estimated with a two-sample instrumental variable approach using offspring body mass index as an instrument for parental body mass index. Both own and offspring body mass index were positively associated with depression, while the results did not indicate any substantial association between body mass index and anxiety. Although precision was low, suicide mortality was inversely associated with own body mass index and the results from the analysis using offspring body mass index supported these results. Adjusted odds ratios per standard deviation body mass index from the instrumental variable analysis were 1.22 (95% CI: 1.05, 1.43) for depression and 1.10 (95% CI: 0.95, 1.27) for anxiety, and the instrumental variable estimated hazard ratio for suicide was 0.69 (95% CI: 0.30, 1.63). The present study's results indicate that suicide mortality is inversely associated with body mass index. We also found support for a positive association between body mass index and depression, but not
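
    The two-sample logic used here can be made concrete with a minimal numerical sketch (Python). The data, coefficients and variable names below are invented for illustration and do not reproduce the HUNT analysis: the instrument-outcome (reduced-form) slope is divided by the instrument-exposure (first-stage) slope to give a Wald-type IV estimate.

        # Two-sample IV (Wald ratio) sketch: offspring BMI as an instrument for
        # parental BMI. Simulated, illustrative data only.
        import numpy as np

        rng = np.random.default_rng(0)
        n = 20_000

        genes = rng.normal(size=n)                     # shared genetic component
        parent_bmi = 25.0 + 2.0 * genes + rng.normal(scale=3.0, size=n)
        offspring_bmi = 25.0 + 1.0 * genes + rng.normal(scale=3.0, size=n)  # instrument

        # Outcome depends on true parental BMI (effect 0.2 per unit); ill-health
        # both raises the outcome and distorts observed parental BMI (confounding).
        ill_health = rng.normal(size=n)
        depression = 0.2 * parent_bmi + 1.0 * ill_health + rng.normal(size=n)
        parent_bmi_obs = parent_bmi - 1.5 * ill_health

        def slope(y, x):
            """OLS slope of y on x (with intercept)."""
            return np.cov(x, y)[0, 1] / np.var(x, ddof=1)

        beta_ols = slope(depression, parent_bmi_obs)   # confounded estimate
        # Two-sample IV: reduced-form slope divided by first-stage slope.
        beta_iv = (slope(depression, offspring_bmi)
                   / slope(parent_bmi_obs, offspring_bmi))

        print(f"conventional estimate: {beta_ols:+.3f}  (true effect +0.200)")
        print(f"two-sample IV (Wald):  {beta_iv:+.3f}")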

  15. Virtual instrumentation: a new approach for control and instrumentation - application in containment studies facility

    Gole, N.V.; Shanware, V.M.; Sebastian, A.; Subramaniam, K.

    2001-01-01

    PC-based data acquisition has emerged as a rapidly developing area, particularly with respect to process instrumentation. Computer-based data acquisition in process instrumentation, combined with Supervisory Control and Data Acquisition (SCADA) software, has introduced extensive possibilities with respect to formats for the presentation of information. The concept of presenting data in any instrument format, using software tools to simulate the instrument on screen, needs to be understood in order to make use of its vast potential. The purpose of this paper is to present the significant features of the Virtual Instrumentation concept and discuss its application in the instrumentation and control system of the containment studies facility (CSF). Factors involved in the development of the virtual-instrumentation-based I and C system for CSF are detailed and a functional overview of the system configuration is given. (author)

  16. Institution, Financial Sector, and Economic Growth: Use The Institutions As An Instrument Variable

    Albertus Girik Allo

    2016-01-01

    Institutions have been investigated as having an indirect role in economic growth. This paper aims to evaluate whether the quality of institutions matters for economic growth. By applying institutions as an instrumental variable for Foreign Direct Investment (FDI), the quality of institutions is found to significantly influence economic growth. This study applies two data periods, namely 1985-2013 and 2000-2013, available online from the World Bank (WB). The first data set, 1985-2013, is used to estimate the role of fin...

  17. The XRF spectrometer and the selection of analysis conditions (instrumental variables)

    Willis, J.P.

    2002-01-01

    This presentation will begin with a brief discussion of EDXRF and flat- and curved-crystal WDXRF spectrometers, contrasting the major differences between the three types. The remainder of the presentation will contain a detailed overview of the choice and settings of the many instrumental variables contained in a modern WDXRF spectrometer, and will discuss critically the choices facing the analyst in setting up a WDXRF spectrometer for different elements and applications. In particular it will discuss the choice of tube target (when a choice is possible), the kV and mA settings, tube filters, collimator masks, collimators, analyzing crystals, secondary collimators, detectors, pulse height selection, X-ray path medium (air, nitrogen, vacuum or helium), counting times for peak and background positions and their effect on counting statistics and lower limit of detection (LLD). The use of Figure of Merit (FOM) calculations to objectively choose the best combination of instrumental variables also will be discussed. This presentation will be followed by a shorter session on a subsequent day entitled - A Selection of XRF Conditions - Practical Session, where participants will be given the opportunity to discuss in groups the selection of the best instrumental variables for three very diverse applications. Copyright (2002) Australian X-ray Analytical Association Inc
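
    As a small illustration of the counting-statistics trade-offs mentioned above, the sketch below (Python) scores hypothetical setting combinations with a commonly quoted lower-limit-of-detection approximation, LLD = (3/m)*sqrt(Rb/Tb), where m is the count rate per unit concentration, Rb the background count rate and Tb the background counting time, together with a simple figure of merit m/sqrt(Rb). All setting combinations and numbers are invented.

        # Compare hypothetical WDXRF setting combinations by counting statistics.
        # LLD: a standard approximation, LLD = (3 / m) * sqrt(Rb / Tb).
        import math

        # (label, sensitivity m in cps per %, background rate Rb in cps)
        candidate_settings = [
            ("LiF200 crystal, fine collimator",   120.0,  8.0),
            ("LiF200 crystal, coarse collimator", 310.0, 40.0),
            ("LiF220 crystal, fine collimator",    70.0,  3.0),
        ]

        t_background = 40.0  # background counting time, seconds (assumed)

        for label, m, rb in candidate_settings:
            lld = (3.0 / m) * math.sqrt(rb / t_background)  # concentration in %
            fom = m / math.sqrt(rb)   # one common figure of merit: higher is better
            print(f"{label:38s} LLD = {lld * 1e4:6.1f} ppm   FOM = {fom:6.1f}")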

  18. Instrumental variables estimation of exposure effects on a time-to-event endpoint using structural cumulative survival models

    Martinussen, Torben; Vansteelandt, Stijn; Tchetgen Tchetgen, Eric J.

    2017-01-01

    The use of instrumental variables for estimating the effect of an exposure on an outcome is popular in econometrics, and increasingly so in epidemiology. This increasing popularity may be attributed to the natural occurrence of instrumental variables in observational studies that incorporate elem...

  19. Exploring mouthfeel in model wines: Sensory-to-instrumental approaches.

    Laguna, Laura; Sarkar, Anwesha; Bryant, Michael G; Beadling, Andrew R; Bartolomé, Begoña; Victoria Moreno-Arribas, M

    2017-12-01

    /absence of tannins. It is therefore necessary to use an integrative approach that combines complementary instrumental techniques for mouthfeel perception characterization. Copyright © 2017 Elsevier Ltd. All rights reserved.

  20. Instrumentation

    Prieur, G.; Nadi, M.; Hedjiedj, A.; Weber, S.

    1995-01-01

    This second chapter on instrumentation gives a little general consideration to the history and classification of instrumentation, followed by two specific states of the art. The first concerns NMR (block diagram of the instrumentation chain with details on the magnets, gradients, probes and reception unit). The second concerns precision instrumentation (optical fibre gyrometer and scanning electron microscope) and its data processing tools (programmability, the VXI standard and its history). The chapter ends with future trends on smart sensors and Field Emission Displays. (D.L.). Refs., figs

  1. Knowledge based expert system approach to instrumentation selection (INSEL)

    S. Barai

    2004-08-01

    The selection of appropriate instrumentation for any structural measurement of a civil engineering structure is a complex task. Recent developments in Artificial Intelligence (AI) can help in an organized use of the experiential knowledge available on instrumentation for laboratory and in-situ measurement. Usually, the instrumentation decision is based on the experience and judgment of experimentalists. The heuristic knowledge available for different types of measurement is domain dependent and the information is scattered in varied knowledge sources. Knowledge engineering techniques can help in capturing this experiential knowledge. This paper demonstrates a prototype knowledge based system for INstrument SELection (INSEL) assistant, where the experiential knowledge for various structural domains can be captured and utilized for making instrumentation decisions. In particular, this Knowledge Based Expert System (KBES) encodes the heuristics on measurement and demonstrates the instrument selection process with reference to steel bridges. INSEL runs on a microcomputer and uses an INSIGHT 2+ environment.

  2. Instrumentation

    Decreton, M.

    2000-01-01

    SCK-CEN's research and development programme on instrumentation aims at evaluating the potentials of new instrumentation technologies under the severe constraints of a nuclear application. It focuses on the tolerance of sensors to high radiation doses, including optical fibre sensors, and on the related intelligent data processing needed to cope with the nuclear constraints. Main achievements in these domains in 1999 are summarised

  3. Robust best linear estimation for regression analysis using surrogate and instrumental variables.

    Wang, C Y

    2012-04-01

    We investigate methods for regression analysis when covariates are measured with errors. In a subset of the whole cohort, a surrogate variable is available for the true unobserved exposure variable. The surrogate variable satisfies the classical measurement error model, but it may not have repeated measurements. In addition to the surrogate variables that are available among the subjects in the calibration sample, we assume that there is an instrumental variable (IV) that is available for all study subjects. An IV is correlated with the unobserved true exposure variable and hence can be useful in the estimation of the regression coefficients. We propose a robust best linear estimator that uses all the available data, which is the most efficient among a class of consistent estimators. The proposed estimator is shown to be consistent and asymptotically normal under very weak distributional assumptions. For Poisson or linear regression, the proposed estimator is consistent even if the measurement error from the surrogate or IV is heteroscedastic. Finite-sample performance of the proposed estimator is examined and compared with other estimators via intensive simulation studies. The proposed method and other methods are applied to a bladder cancer case-control study.

  4. Instrumentation

    Decreton, M

    2001-04-01

    SCK-CEN's research and development programme on instrumentation involves the assessment and the development of sensitive measurement systems used within a radiation environment. Particular emphasis is on the assessment of optical fibre components and their adaptability to radiation environments. The evaluation of ageing processes of instrumentation in fission plants, the development of specific data evaluation strategies to compensate for ageing induced degradation of sensors and cable performance form part of these activities. In 2000, particular emphasis was on in-core reactor instrumentation applied to fusion, accelerator driven and water-cooled fission reactors. This involved the development of high performance instrumentation for irradiation experiments in the BR2 reactor in support of new instrumentation needs for MYRRHA, and for diagnostic systems for the ITER reactor.

  5. Instrumentation

    Decreton, M.

    2001-01-01

    SCK-CEN's research and development programme on instrumentation involves the assessment and the development of sensitive measurement systems used within a radiation environment. Particular emphasis is on the assessment of optical fibre components and their adaptability to radiation environments. The evaluation of ageing processes of instrumentation in fission plants, the development of specific data evaluation strategies to compensate for ageing induced degradation of sensors and cable performance form part of these activities. In 2000, particular emphasis was on in-core reactor instrumentation applied to fusion, accelerator driven and water-cooled fission reactors. This involved the development of high performance instrumentation for irradiation experiments in the BR2 reactor in support of new instrumentation needs for MYRRHA, and for diagnostic systems for the ITER reactor

  6. LARF: Instrumental Variable Estimation of Causal Effects through Local Average Response Functions

    Weihua An

    2016-07-01

    LARF is an R package that provides instrumental variable estimation of treatment effects when both the endogenous treatment and its instrument (i.e., the treatment inducement) are binary. The method (Abadie 2003) involves two steps. First, pseudo-weights are constructed from the probability of receiving the treatment inducement. By default LARF estimates the probability by a probit regression. It also provides semiparametric power series estimation of the probability and allows users to employ other external methods to estimate the probability. Second, the pseudo-weights are used to estimate the local average response function conditional on treatment and covariates. LARF provides both least squares and maximum likelihood estimates of the conditional treatment effects.
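
    The two-step scheme can be sketched as follows (in Python rather than R, on simulated data). This is our reading of the Abadie (2003) kappa-weighting step, not the LARF package itself; the probit first stage mirrors the package default only schematically.

        # Sketch of Abadie (2003)-style kappa weighting for a binary treatment D
        # and binary instrument Z: (1) model P(Z=1|X), (2) weighted least squares.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(1)
        n = 5_000
        x = rng.normal(size=n)
        z = rng.binomial(1, 1 / (1 + np.exp(-0.5 * x)))     # instrument
        u = rng.normal(size=n)                              # unobserved confounder
        d = ((0.3 * x + 1.5 * z + u) > 0.8).astype(float)   # endogenous treatment
        y = 1.0 * d + 0.5 * x + u + rng.normal(size=n)      # true effect = 1.0

        # Step 1: probability of the instrument given covariates (probit).
        pz = sm.Probit(z, sm.add_constant(x)).fit(disp=0).predict()

        # Step 2: Abadie's kappa pseudo-weights, then least squares of y on
        # (1, D, X). Kappa can be negative, so solve the normal equations directly.
        kappa = 1.0 - d * (1 - z) / (1 - pz) - (1 - d) * z / pz
        w = np.column_stack([np.ones(n), d, x])
        theta = np.linalg.solve(w.T @ (kappa[:, None] * w), w.T @ (kappa * y))
        print("estimated complier treatment effect:", round(theta[1], 3))  # ~1.0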

  7. Data, instruments, and theory a dialectical approach to understanding science

    Ackermann, Robert John

    1985-01-01

    Robert John Ackermann deals decisively with the problem of relativism that has plagued post-empiricist philosophy of science. Recognizing that theory and data are mediated by data domains (bordered data sets produced by scientific instruments), he argues that the use of instruments breaks the dependency of observation on theory and thus creates a reasoned basis for scientific objectivity.

  8. International Approaches to Financial Instruments and Their Application in Ukraine

    Viktor Zamlynskyy

    2013-01-01

    Introduction of International Financial Reporting Standards in Ukraine requires scientific and methodological study of their specific use in national practice. The essence and types of financial instruments have been researched. The regulatory support for their accounting in Ukraine has been established. The authors have analyzed the provisions of the International Financial Reporting Standards governing the financial instruments accounting, worked out characteristics of existing methodology ...

  9. Impact of instrumental response on observed ozonesonde profiles: First-order estimates and implications for measures of variability

    Clifton, G. T.; Merrill, J. T.; Johnson, B. J.; Oltmans, S. J.

    2009-12-01

    Ozonesondes provide information on the ozone distribution up to the middle stratosphere. Ozone profiles often feature layers, with vertically discrete maxima and minima in the mixing ratio. Layers are especially common in the UT/LS regions and originate from wave breaking, shearing and other transport processes. ECC sondes, however, have a moderate response time to significant changes in ozone. A sonde can ascend over 350 meters before it responds fully to a step change in ozone. This results in an overestimate of the altitude assigned to layers and an underestimate of the underlying variability in the amount of ozone. An estimate of the response time is made for each instrument during the preparation for flight, but the profile data are typically not processed to account for the response. Here we present a method of categorizing the response time of ECC instruments and an analysis of a low-pass filter approximation to the effects on profile data. Exponential functions were fit to the step-up and step-down responses using laboratory data. The resulting response time estimates were consistent with results from standard procedures, with the up-step response time exceeding the down-step value somewhat. A single-pole Butterworth filter that approximates the instrumental effect was used with synthetic layered profiles to make first-order estimates of the impact of the finite response time. Using a layer analysis program previously applied to observed profiles we find that instrumental effects can attenuate ozone variability by 20-45% in individual layers, but that the vertical offset in layer altitudes is moderate, up to about 150 meters. We will present results obtained using this approach, coupled with data on the distribution of layer characteristics found using the layer analysis procedure on profiles from Narragansett, Rhode Island and other US sites to quantify the impact on overall variability estimates given ambient distributions of layer occurrence, thickness
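
    The first-order response model described above can be illustrated with a short sketch (Python). The ascent rate, response time and layer geometry below are assumed values for illustration, not calibrated instrument constants.

        # First-order (exponential) response of an ECC ozonesonde to a layered
        # ozone profile; parameter values are illustrative, not calibrated.
        import numpy as np

        ascent_rate = 5.0   # m/s, assumed balloon ascent rate
        tau = 25.0          # s, assumed e-folding response time (~125 m of ascent)
        dt = 1.0            # s, sampling interval
        alpha = 1.0 - np.exp(-dt / tau)   # discrete single-pole filter coefficient

        t = np.arange(0.0, 1200.0, dt)
        altitude = ascent_rate * t

        # Synthetic "true" profile: 50 units background plus a 200-m-thick layer.
        true_o3 = 50.0 + 40.0 * ((altitude >= 3000.0) & (altitude < 3200.0))

        measured = np.empty_like(true_o3)
        measured[0] = true_o3[0]
        for i in range(1, len(true_o3)):
            # The reading relaxes toward the true value with time constant tau.
            measured[i] = measured[i - 1] + alpha * (true_o3[i] - measured[i - 1])

        attenuation = 1.0 - (measured.max() - 50.0) / 40.0
        base_true = altitude[np.argmax(true_o3 > 70.0)]    # true layer base
        base_meas = altitude[np.argmax(measured > 70.0)]   # apparent layer base
        print(f"layer amplitude attenuated by {100 * attenuation:.0f}%")
        print(f"apparent layer base displaced upward by {base_meas - base_true:.0f} m")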

  10. The Effect of Birth Weight on Academic Performance: Instrumental Variable Analysis.

    Lin, Shi Lin; Leung, Gabriel Matthew; Schooling, C Mary

    2017-05-01

    Observationally, lower birth weight is usually associated with poorer academic performance; whether this association is causal or the result of confounding is unknown. To investigate this question, we obtained an effect estimate, which can have a causal interpretation under specific assumptions, of birth weight on educational attainment using instrumental variable analysis based on single nucleotide polymorphisms determining birth weight combined with results from the Social Science Genetic Association Consortium study of 126,559 Caucasians. We similarly obtained an estimate of the effect of birth weight on academic performance in 4,067 adolescents from Hong Kong's (Chinese) Children of 1997 birth cohort (1997-2016), using twin status as an instrumental variable. Birth weight was not associated with years of schooling (per 100-g increase in birth weight, -0.006 years, 95% confidence interval (CI): -0.02, 0.01) or college completion (odds ratio = 1.00, 95% CI: 0.96, 1.03). Birth weight was also unrelated to academic performance in adolescents (per 100-g increase in birth weight, -0.004 grade, 95% CI: -0.04, 0.04) using instrumental variable analysis, although conventional regression gave a small positive association (0.02 higher grade, 95% CI: 0.01, 0.03). Observed associations of birth weight with academic performance may not be causal, suggesting that interventions should focus on the contextual factors generating this correlation. © The Author 2017. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  11. Instrumentation

    Decreton, M

    2000-07-01

    SCK-CEN's research and development programme on instrumentation aims at evaluating the potentials of new instrumentation technologies under the severe constraints of a nuclear application. It focuses on the tolerance of sensors to high radiation doses, including optical fibre sensors, and on the related intelligent data processing needed to cope with the nuclear constraints. Main achievements in these domains in 1999 are summarised.

  12. Climate Informed Economic Instruments to Enhance Urban Water Supply Resilience to Hydroclimatological Variability and Change

    Brown, C.; Carriquiry, M.; Souza Filho, F. A.

    2006-12-01

    Hydroclimatological variability presents acute challenges to urban water supply providers. The impact is often most severe in developing nations where hydrologic and climate variability can be very high, water demand is unmet and increasing, and the financial resources to mitigate the social effects of that variability are limited. Furthermore, existing urban water systems face a reduced solution space, constrained by competing and conflicting interests, such as irrigation demand, recreation and hydropower production, and new (relative to system design) demands to satisfy environmental flow requirements. These constraints magnify the impacts of hydroclimatic variability and increase the vulnerability of urban areas to climate change. The high economic and social costs of structural responses to hydrologic variability, such as groundwater utilization and the construction or expansion of dams, create a need for innovative alternatives. Advances in hydrologic and climate forecasting, and the increasing sophistication and acceptance of incentive-based mechanisms for achieving economically efficient water allocation offer potential for improving the resilience of existing water systems to the challenge of variable supply. This presentation will explore the performance of a system of climate informed economic instruments designed to facilitate the reduction of hydroclimatologic variability-induced impacts on water-sensitive stakeholders. The system is comprised of bulk water option contracts between urban water suppliers and agricultural users and insurance indexed on reservoir inflows designed to cover the financial needs of the water supplier in situations where the option is likely to be exercised. Contract and insurance parameters are linked to forecasts and the evolution of seasonal precipitation and streamflow and designed for financial and political viability. A simulation of system performance is presented based on ongoing work in Metro Manila, Philippines. The

  13. Institution, Financial Sector, and Economic Growth: Use The Institutions As An Instrument Variable

    Albertus Girik Allo

    2016-06-01

    Institutions have been investigated as having an indirect role in economic growth. This paper aims to evaluate whether the quality of institutions matters for economic growth. By applying institutions as an instrumental variable for Foreign Direct Investment (FDI), the quality of institutions is found to significantly influence economic growth. This study applies two data periods, namely 1985-2013 and 2000-2013, available online from the World Bank (WB). The first data set, 1985-2013, is used to estimate the role of the financial sector in economic growth and focuses on 67 countries. The second data set, 2000-2013, determines the role of institutions in the financial sector and economic growth by applying the 2SLS estimation method. We define the institutional variables as a set of indicators (Control of Corruption, Political Stability and Absence of Violence, and Voice and Accountability), which indicate a declining impact of FDI on economic growth.
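
    For readers unfamiliar with the 2SLS step, a generic two-stage least squares sketch (Python, simulated data) is given below; the variable names are placeholders echoing the paper's setting and the numbers do not come from the paper.

        # Generic 2SLS: instrument Z (institutional quality) for an endogenous
        # regressor X (FDI) in a growth equation. Simulated, illustrative data.
        import numpy as np

        rng = np.random.default_rng(2)
        n = 2_000
        inst_quality = rng.normal(size=n)                  # instrument Z
        shock = rng.normal(size=n)                         # unobserved common shock
        fdi = 0.8 * inst_quality + shock + rng.normal(size=n)   # endogenous X
        growth = 0.5 * fdi + shock + rng.normal(size=n)         # true effect 0.5

        def add_const(x):
            return np.column_stack([np.ones(len(x)), x])

        # Stage 1: regress the endogenous regressor on the instrument.
        Z = add_const(inst_quality)
        fdi_hat = Z @ np.linalg.lstsq(Z, fdi, rcond=None)[0]

        # Stage 2: regress the outcome on the fitted values.
        beta_2sls = np.linalg.lstsq(add_const(fdi_hat), growth, rcond=None)[0]
        beta_ols = np.linalg.lstsq(add_const(fdi), growth, rcond=None)[0]

        print(f"OLS slope:  {beta_ols[1]:.3f}  (biased upward by the common shock)")
        print(f"2SLS slope: {beta_2sls[1]:.3f}  (close to the true 0.5)")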

  14. Instrumentation

    Umminger, K.

    2008-01-01

    A proper measurement of the relevant single and two-phase flow parameters is the basis for the understanding of many complex thermal-hydraulic processes. Reliable instrumentation is therefore necessary for the interaction between analysis and experiment especially in the field of nuclear safety research where postulated accident scenarios have to be simulated in experimental facilities and predicted by complex computer code systems. The so-called conventional instrumentation for the measurement of e. g. pressures, temperatures, pressure differences and single phase flow velocities is still a solid basis for the investigation and interpretation of many phenomena and especially for the understanding of the overall system behavior. Measurement data from such instrumentation still serves in many cases as a database for thermal-hydraulic system codes. However some special instrumentation such as online concentration measurement for boric acid in the water phase or for non-condensibles in steam atmosphere as well as flow visualization techniques were further developed and successfully applied during the recent years. Concerning the modeling needs for advanced thermal-hydraulic codes, significant advances have been accomplished in the last few years in the local instrumentation technology for two-phase flow by the application of new sensor techniques, optical or beam methods and electronic technology. This paper will give insight into the current state of instrumentation technology for safety-related thermohydraulic experiments. Advantages and limitations of some measurement processes and systems will be indicated as well as trends and possibilities for further development. Aspects of instrumentation in operating reactors will also be mentioned.

  15. A selective review of the first 20 years of instrumental variables models in health-services research and medicine.

    Cawley, John

    2015-01-01

    The method of instrumental variables (IV) is useful for estimating causal effects. Intuitively, it exploits exogenous variation in the treatment, sometimes called natural experiments or instruments. This study reviews the literature in health-services research and medical research that applies the method of instrumental variables, documents trends in its use, and offers examples of various types of instruments. A literature search of the PubMed and EconLit research databases for English-language journal articles published after 1990 yielded a total of 522 original research articles. Citations counts for each article were derived from the Web of Science. A selective review was conducted, with articles prioritized based on number of citations, validity and power of the instrument, and type of instrument. The average annual number of papers in health services research and medical research that apply the method of instrumental variables rose from 1.2 in 1991-1995 to 41.8 in 2006-2010. Commonly-used instruments (natural experiments) in health and medicine are relative distance to a medical care provider offering the treatment and the medical care provider's historic tendency to administer the treatment. Less common but still noteworthy instruments include randomization of treatment for reasons other than research, randomized encouragement to undertake the treatment, day of week of admission as an instrument for waiting time for surgery, and genes as an instrument for whether the respondent has a heritable condition. The use of the method of IV has increased dramatically in the past 20 years, and a wide range of instruments have been used. Applications of the method of IV have in several cases upended conventional wisdom that was based on correlations and led to important insights about health and healthcare. Future research should pursue new applications of existing instruments and search for new instruments that are powerful and valid.

  16. Instruments

    Buehrer, W.

    1996-01-01

    The present paper conveys a basic knowledge of the most commonly used experimental techniques. We discuss the principles and concepts necessary to understand what one is doing when performing an experiment on a certain instrument. (author) 29 figs., 1 tab., refs

  17. Developing an instrument to identify MBChB students' approaches ...

    The constructs of deep, surface and achieving approaches to learning are well defined in the literature and amply supported by research. Quality learning results from a deep approach to learning, and a deep-achieving approach to learning is regarded as the most adaptive approach institutionally. It is therefore felt that ...

  18. A novel variable selection approach that iteratively optimizes variable space using weighted binary matrix sampling.

    Deng, Bai-chuan; Yun, Yong-huan; Liang, Yi-zeng; Yi, Lun-zhao

    2014-10-07

    In this study, a new optimization algorithm called the Variable Iterative Space Shrinkage Approach (VISSA) that is based on the idea of model population analysis (MPA) is proposed for variable selection. Unlike most of the existing optimization methods for variable selection, VISSA statistically evaluates the performance of variable space in each step of optimization. Weighted binary matrix sampling (WBMS) is proposed to generate sub-models that span the variable subspace. Two rules are highlighted during the optimization procedure. First, the variable space shrinks in each step. Second, the new variable space outperforms the previous one. The second rule, which is rarely satisfied in most of the existing methods, is the core of the VISSA strategy. Compared with some promising variable selection methods such as competitive adaptive reweighted sampling (CARS), Monte Carlo uninformative variable elimination (MCUVE) and iteratively retaining informative variables (IRIV), VISSA showed better prediction ability for the calibration of NIR data. In addition, VISSA is user-friendly; only a few insensitive parameters are needed, and the program terminates automatically without any additional conditions. The Matlab codes for implementing VISSA are freely available on the website: https://sourceforge.net/projects/multivariateanalysis/files/VISSA/.
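
    The weighted binary matrix sampling loop can be caricatured in a few lines (Python; the authors' released code is MATLAB, and this toy version makes several simplifying assumptions, e.g., OLS sub-models scored by holdout RMSE and a fixed number of iterations).

        # Toy version of iterative variable-space shrinkage via weighted binary
        # matrix sampling (WBMS). Sub-models are OLS fits scored by holdout RMSE.
        import numpy as np

        rng = np.random.default_rng(3)
        n, p, informative = 200, 30, 5
        X = rng.normal(size=(n, p))
        beta = np.zeros(p)
        beta[:informative] = [3.0, -2.0, 2.0, 1.5, -1.0]
        y = X @ beta + rng.normal(size=n)
        X_tr, X_te, y_tr, y_te = X[:150], X[150:], y[:150], y[150:]

        def holdout_rmse(mask):
            """Fit OLS on the training rows using the masked-in variables."""
            if not mask.any():
                return np.inf
            A = np.column_stack([np.ones(len(y_tr)), X_tr[:, mask]])
            coef, *_ = np.linalg.lstsq(A, y_tr, rcond=None)
            pred = np.column_stack([np.ones(len(y_te)), X_te[:, mask]]) @ coef
            return float(np.sqrt(np.mean((pred - y_te) ** 2)))

        weights = np.full(p, 0.5)              # P(include variable j) in a sub-model
        for iteration in range(15):
            B = rng.random((500, p)) < weights # weighted binary matrix: 500 sub-models
            scores = np.array([holdout_rmse(row) for row in B])
            best = B[np.argsort(scores)[:50]]  # keep the best 10% of sub-models
            weights = best.mean(axis=0)        # updated inclusion frequencies
            if np.all((weights == 0.0) | (weights == 1.0)):
                break                          # the variable space has converged

        print("selected variables:", np.flatnonzero(weights > 0.5))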

  19. Instrumental Approach and Diagnosis of Total Inorganics in a ...

    Typical carbonaceous matter is often widespread and reveals a relatively wide range of dominant organic types. Instrumental diagnosis, subjecting samples to oxidation, combustion and/or incineration, serves to date as a mandatory fundamental requirement in the further pursuit of their mineralogical and wet chemical ...

  20. Instrumental approach and diagnosis of total inorganics in a typical ...

    The instrumental diagnosis, subjecting samples to oxidation, combustion and/or incineration, serves to date as a mandatory fundamental requirement in the further pursuit of their mineralogical and wet chemical analysis. The results of the ash content from three (3) carbonaceous coal and shaly coal samples from Northeastern ...

  1. Instrumentation

    Muehllehner, G.; Colsher, J.G.

    1982-01-01

    This chapter reviews the parameters which are important to positron-imaging instruments. It summarizes the options which various groups have explored in designing tomographs and the methods which have been developed to overcome some of the limitations inherent in the technique as well as in present instruments. The chapter is not presented as a defense of positron imaging versus single-photon or other imaging modalities, nor does it contain a description of various existing instruments, but rather stresses their common properties and problems. Design parameters which are considered are resolution, sampling requirements, sensitivity, methods of eliminating scattered radiation, random coincidences and attenuation. The implementation of these parameters is considered, with special reference to sampling, choice of detector material, detector ring diameter and shielding and variations in point spread function. Quantitation problems discussed are normalization, and attenuation and random corrections. Present developments mentioned are noise reduction through time-of-flight-assisted tomography and signal to noise improvements through high intrinsic resolution. Extensive bibliography. (U.K.)

  2. Control approach development for variable recruitment artificial muscles

    Jenkins, Tyler E.; Chapman, Edward M.; Bryant, Matthew

    2016-04-01

    This study characterizes hybrid control approaches for the variable recruitment of fluidic artificial muscles with double acting (antagonistic) actuation. Fluidic artificial muscle actuators have been explored by researchers due to their natural compliance, high force-to-weight ratio, and low cost of fabrication. Previous studies have attempted to improve system efficiency of the actuators through variable recruitment, i.e. using discrete changes in the number of active actuators. While current variable recruitment research utilizes manual valve switching, this paper details the current development of an online variable recruitment control scheme. By continuously controlling applied pressure and discretely controlling the number of active actuators, operation in the lowest possible recruitment state is ensured and working fluid consumption is minimized. Results provide insight into switching control scheme effects on working fluids, fabrication material choices, actuator modeling, and controller development decisions.
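
    The recruitment logic described above reduces to a small controller sketch (Python). The pressure limit, per-muscle force and the linear force-pressure assumption are invented for illustration.

        # Toy variable-recruitment controller: discrete choice of active muscles,
        # continuous choice of common supply pressure. Numbers are illustrative.
        P_MAX = 620e3             # Pa, maximum working pressure (assumed)
        F_MAX_PER_MUSCLE = 400.0  # N, one actuator's force at P_MAX (assumed)
        N_MUSCLES = 4

        def recruit(force_demand: float) -> tuple[int, float]:
            """Return (active muscle count, supply pressure) for a force command."""
            for n_active in range(1, N_MUSCLES + 1):      # lowest state first
                capacity = n_active * F_MAX_PER_MUSCLE
                if force_demand <= capacity:
                    # Linear force-pressure assumption within a recruitment state.
                    pressure = P_MAX * force_demand / capacity
                    return n_active, pressure
            return N_MUSCLES, P_MAX                       # saturate at full state

        for demand in (150.0, 500.0, 1100.0, 2000.0):
            n, p = recruit(demand)
            print(f"demand {demand:6.0f} N -> {n} active muscle(s), {p / 1e3:5.0f} kPa")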

  3. VIMOS Instrument Control Software Design: an Object Oriented Approach

    Brau-Nogué, Sylvie; Lucuix, Christian

    2002-12-01

    The Franco-Italian VIMOS instrument is a VIsible imaging Multi-Object Spectrograph with outstanding multiplex capabilities, allowing spectra of more than 800 objects to be taken simultaneously, or integral-field spectroscopy in a 54x54 arcsec area. VIMOS is being installed at the Nasmyth focus of the third Unit Telescope of the European Southern Observatory Very Large Telescope (VLT) at Mount Paranal in Chile. This paper will describe the analysis, the design and the implementation of the VIMOS Instrument Control System, using UML notation. Our Control group followed an Object Oriented software process while keeping in mind the ESO VLT standard control concepts. At ESO VLT a complete software library is available. Rather than applying a waterfall lifecycle, the ICS project used iterative development, a lifecycle consisting of several iterations. Each iteration consisted of: capturing and evaluating the requirements, visual modeling for analysis and design, implementation, testing, and deployment. Depending on the project phase, iterations focused more or less on specific activities. The result is an object model (the design model), including use-case realizations. An implementation view and a deployment view complement this product. An extract of the VIMOS ICS UML model will be presented and some implementation, integration and test issues will be discussed.

  4. Instrumental variables estimation of exposure effects on a time-to-event endpoint using structural cumulative survival models.

    Martinussen, Torben; Vansteelandt, Stijn; Tchetgen Tchetgen, Eric J; Zucker, David M

    2017-12-01

    The use of instrumental variables for estimating the effect of an exposure on an outcome is popular in econometrics, and increasingly so in epidemiology. This increasing popularity may be attributed to the natural occurrence of instrumental variables in observational studies that incorporate elements of randomization, either by design or by nature (e.g., random inheritance of genes). Instrumental variables estimation of exposure effects is well established for continuous outcomes and to some extent for binary outcomes. It is, however, largely lacking for time-to-event outcomes because of complications due to censoring and survivorship bias. In this article, we make a novel proposal under a class of structural cumulative survival models which parameterize time-varying effects of a point exposure directly on the scale of the survival function; these models are essentially equivalent to a semi-parametric variant of the instrumental variables additive hazards model. We propose a class of recursive instrumental variable estimators for these exposure effects, and derive their large sample properties along with inferential tools. We examine the performance of the proposed method in simulation studies and illustrate it in a Mendelian randomization study to evaluate the effect of diabetes on mortality using data from the Health and Retirement Study. We further use the proposed method to investigate potential benefit from breast cancer screening on subsequent breast cancer mortality based on the HIP study. © 2017, The International Biometric Society.

  5. [Severe idiopathic scoliosis. Does the approach and the instruments used modify the results?].

    Sánchez-Márquez, J M; Sánchez Pérez-Grueso, F J; Pérez Martín-Buitrago, M; Fernández-Baíllo, N; García-Fernández, A; Quintáns-Rodríguez, J

    2014-01-01

    The aim of this work is to evaluate and compare the radiographic results and complications of the surgical treatment of adolescents with idiopathic scoliosis greater than 75 degrees, using a double approach (DA) or an isolated posterior approach with hybrid instruments (posterior hybrid [PH]), or with «all-pedicle screws» (posterior screws [PS]). A retrospective review was performed on 69 patients with idiopathic scoliosis greater than 75°, with a follow-up of more than 2 years, to analyze the flexibility of the curves, the correction obtained, and the complications depending on the type of surgery. The Kruskal-Wallis test for non-parametric variables was used for the statistical analysis. There were no statistically significant differences between the 3 patient groups in the pre-surgical Cobb angle values (DA=89°, PH=83°, PS=83°), in the immediate post-surgical (DA=34°, PH=33°, PS=30°), nor at the end of follow-up (DA=36°, PH=36°, PS=33°) (P>.05). The percentage correction (DA=60%, PH=57%, PS=60%) was similar between groups (P>.05). The percentage of complications associated with the procedure was 20.8% in DA, 10% in PH and 20% in PS. Two patients in the PS group showed changes, with no neurological lesions, in the spinal cord monitoring, and one patient in the same group suffered a delayed and transient incomplete lesion. No significant differences were observed in the correction of severe idiopathic scoliosis between patients operated using the double or isolated posterior approach, regardless of the type of instrumentation used. Copyright © 2013 SECOT. Published by Elsevier Espana. All rights reserved.

  6. Gene variants associated with antisocial behaviour: a latent variable approach.

    Bentley, Mary Jane; Lin, Haiqun; Fernandez, Thomas V; Lee, Maria; Yrigollen, Carolyn M; Pakstis, Andrew J; Katsovich, Liliya; Olds, David L; Grigorenko, Elena L; Leckman, James F

    2013-10-01

    The aim of this study was to determine if a latent variable approach might be useful in identifying shared variance across genetic risk alleles that is associated with antisocial behaviour at age 15 years. Using a conventional latent variable approach, we derived an antisocial phenotype in 328 adolescents utilizing data from a 15-year follow-up of a randomized trial of a prenatal and infancy nurse-home visitation programme in Elmira, New York. We then investigated, via a novel latent variable approach, 450 informative genetic polymorphisms in 71 genes previously associated with antisocial behaviour, drug use, affiliative behaviours and stress response in 241 consenting individuals for whom DNA was available. Haplotype and Pathway analyses were also performed. Eight single-nucleotide polymorphisms (SNPs) from eight genes contributed to the latent genetic variable that in turn accounted for 16.0% of the variance within the latent antisocial phenotype. The number of risk alleles was linearly related to the latent antisocial variable scores. Haplotypes that included the putative risk alleles for all eight genes were also associated with higher latent antisocial variable scores. In addition, 33 SNPs from 63 of the remaining genes were also significant when added to the final model. Many of these genes interact on a molecular level, forming molecular networks. The results support a role for genes related to dopamine, norepinephrine, serotonin, glutamate, opioid and cholinergic signalling as well as stress response pathways in mediating susceptibility to antisocial behaviour. This preliminary study supports use of relevant behavioural indicators and latent variable approaches to study the potential 'co-action' of gene variants associated with antisocial behaviour. It also underscores the cumulative relevance of common genetic variants for understanding the aetiology of complex behaviour. If replicated in future studies, this approach may allow the identification of a

  7. Using and Developing Measurement Instruments in Science Education: A Rasch Modeling Approach. Science & Engineering Education Sources

    Liu, Xiufeng

    2010-01-01

    This book meets a demand in the science education community for a comprehensive and introductory measurement book in science education. It describes measurement instruments reported in refereed science education research journals, and introduces the Rasch modeling approach to developing measurement instruments in common science assessment domains,…

  8. Approaches of High School Instrumental Music Educators in Response to Student Challenges

    Edgar, Scott N.

    2016-01-01

    The purpose of this multiple instrumental case study was to explore approaches of four high school instrumental music educators assuming the role of facilitative teacher in responding to challenges affecting the social and emotional well-being of their students. This study utilized the framework of social emotional learning as a lens to view the…

  9. Fasting Glucose and the Risk of Depressive Symptoms: Instrumental-Variable Regression in the Cardiovascular Risk in Young Finns Study.

    Wesołowska, Karolina; Elovainio, Marko; Hintsa, Taina; Jokela, Markus; Pulkki-Råback, Laura; Pitkänen, Niina; Lipsanen, Jari; Tukiainen, Janne; Lyytikäinen, Leo-Pekka; Lehtimäki, Terho; Juonala, Markus; Raitakari, Olli; Keltikangas-Järvinen, Liisa

    2017-12-01

    Type 2 diabetes (T2D) has been associated with depressive symptoms, but the causal direction of this association and the underlying mechanisms, such as increased glucose levels, remain unclear. We used instrumental-variable regression with a genetic instrument (Mendelian randomization) to examine a causal role of increased glucose concentrations in the development of depressive symptoms. Data were from the population-based Cardiovascular Risk in Young Finns Study (n = 1217). Depressive symptoms were assessed in 2012 using a modified Beck Depression Inventory (BDI-I). Fasting glucose was measured concurrently with depressive symptoms. A genetic risk score for fasting glucose (with 35 single nucleotide polymorphisms) was used as an instrumental variable for glucose. Glucose was not associated with depressive symptoms in the standard linear regression (B = -0.04, 95% CI [-0.12, 0.04], p = .34), but the instrumental-variable regression showed an inverse association between glucose and depressive symptoms (B = -0.43, 95% CI [-0.79, -0.07], p = .020). The difference between the estimates of standard linear regression and instrumental-variable regression was significant (p = .026). CONCLUSION: Our results suggest that the association between T2D and depressive symptoms is unlikely to be caused by increased glucose concentrations. It seems possible that T2D might be linked to depressive symptoms due to low glucose levels.
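
    A schematic version of such an analysis is shown below (Python, simulated data), using the control-function variant of IV regression, which for linear models gives the same point estimate as two-stage least squares. The coefficients are invented and chosen so that confounding masks the true inverse association, echoing (but not reproducing) the study's pattern.

        # Mendelian-randomization-style IV via a control function: regress glucose
        # on a genetic risk score (GRS), then include the first-stage residual in
        # the outcome regression. Simulated, illustrative data only.
        import numpy as np

        rng = np.random.default_rng(4)
        n = 1_217                              # sample size echoing the cohort
        grs = rng.normal(size=n)               # genetic risk score (instrument)
        confounder = rng.normal(size=n)        # unmeasured factor, e.g. ill-health
        glucose = 0.3 * grs + confounder + rng.normal(size=n)
        bdi = -0.4 * glucose + 1.2 * confounder + rng.normal(size=n)  # true -0.4

        def ols(X, y):
            """Least squares with intercept; returns [intercept, slopes...]."""
            X1 = np.column_stack([np.ones(len(y)), X])
            return np.linalg.lstsq(X1, y, rcond=None)[0]

        # Stage 1: first-stage regression of exposure on instrument; keep residual.
        a = ols(grs, glucose)
        resid = glucose - (a[0] + a[1] * grs)

        # Stage 2: outcome on exposure plus the control function (the residual).
        b = ols(np.column_stack([glucose, resid]), bdi)
        naive = ols(glucose, bdi)
        print(f"naive OLS effect:    {naive[1]:+.3f}  (confounded)")
        print(f"control-function IV: {b[1]:+.3f}  (true -0.4; noisy at this n)")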

  10. Bayesian approach to errors-in-variables in regression models

    Rozliman, Nur Aainaa; Ibrahim, Adriana Irawati Nur; Yunus, Rossita Mohammad

    2017-05-01

    In many applications and experiments, data sets are often contaminated with error or mismeasured covariates. When at least one of the covariates in a model is measured with error, Errors-in-Variables (EIV) model can be used. Measurement error, when not corrected, would cause misleading statistical inferences and analysis. Therefore, our goal is to examine the relationship of the outcome variable and the unobserved exposure variable given the observed mismeasured surrogate by applying the Bayesian formulation to the EIV model. We shall extend the flexible parametric method proposed by Hossain and Gustafson (2009) to another nonlinear regression model which is the Poisson regression model. We shall then illustrate the application of this approach via a simulation study using Markov chain Monte Carlo sampling methods.

  11. Estimating variability in functional images using a synthetic resampling approach

    Maitra, R.; O'Sullivan, F.

    1996-01-01

    Functional imaging of biologic parameters like in vivo tissue metabolism is made possible by Positron Emission Tomography (PET). Many techniques, such as mixture analysis, have been suggested for extracting such images from dynamic sequences of reconstructed PET scans. Methods for assessing the variability in these functional images are of scientific interest. The nonlinearity of the methods used in the mixture analysis approach makes analytic formulae for estimating variability intractable. The usual resampling approach is infeasible because of the prohibitive computational effort in simulating a number of sinogram datasets, applying image reconstruction, and generating parametric images for each replication. Here we introduce an approach that approximates the distribution of the reconstructed PET images by a Gaussian random field and generates synthetic realizations in the imaging domain. This eliminates the reconstruction steps in generating each simulated functional image and is therefore practical. Results of experiments done to evaluate the approach on a model one-dimensional problem are very encouraging. Post-processing of the estimated variances is seen to improve the accuracy of the estimation method. Mixture analysis is used to estimate functional images; however, the suggested approach is general enough to extend to other parametric imaging methods
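
    A one-dimensional toy version of the synthetic resampling idea is given below (Python); the Gaussian-correlation covariance, its parameters and the choice of functional are our assumptions, and mixture analysis is replaced by a simpler nonlinear functional (peak location).

        # Toy 1-D synthetic resampling: model the reconstructed image as a Gaussian
        # random field and estimate variability of a nonlinear functional from
        # synthetic realizations, with no repeated reconstructions needed.
        import numpy as np

        rng = np.random.default_rng(8)
        m = 128
        x = np.linspace(0.0, 1.0, m)
        mean_img = np.exp(-((x - 0.4) / 0.1) ** 2)    # "reconstructed" image

        # Assumed stationary covariance: variance sigma^2, Gaussian correlation.
        sigma, corr_len = 0.05, 0.03
        dist = np.abs(x[:, None] - x[None, :])
        cov = sigma**2 * np.exp(-(dist / corr_len) ** 2)
        cov += 1e-10 * np.eye(m)                      # jitter for numerical PSD

        # Draw synthetic realizations directly in the imaging domain.
        draws = rng.multivariate_normal(mean_img, cov, size=2_000)

        # Nonlinear functional of the image: position of the maximum.
        peak_pos = x[np.argmax(draws, axis=1)]
        print(f"peak position: {peak_pos.mean():.4f} +/- {peak_pos.std():.4f}")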

  12. Assessing Mucoadhesion in Polymer Gels: The Effect of Method Type and Instrument Variables

    Jéssica Bassi da Silva

    2018-03-01

    The process of mucoadhesion has been widely studied using a wide variety of methods, which are influenced by instrumental variables and experiment design, making the comparison between the results of different studies difficult. The aim of this work was to standardize the conditions of the detachment test and the rheological methods of mucoadhesion assessment for semisolids, and introduce a texture profile analysis (TPA) method. A factorial design was developed to suggest standard conditions for performing the detachment force method. To evaluate the method, binary polymeric systems were prepared containing poloxamer 407 and Carbopol 971P®, Carbopol 974P®, or Noveon® Polycarbophil. The mucoadhesion of these systems was evaluated, and the reproducibility of these measurements investigated. This detachment force method was demonstrated to be reproducible, and gave different adhesion results when a mucin disk or ex vivo oral mucosa was used. The factorial design demonstrated that all evaluated parameters had an effect on measurements of mucoadhesive force, but the same was not observed for the work of adhesion. It was suggested that the work of adhesion is a more appropriate metric for evaluating mucoadhesion. Oscillatory rheology was more capable of investigating adhesive interactions than flow rheology. The TPA method was demonstrated to be reproducible and can evaluate the adhesiveness interaction parameter. This investigation demonstrates the need for standardized methods to evaluate mucoadhesion and makes suggestions for a standard study design.

  13. RECOVERY OF LARGE ANGULAR SCALE CMB POLARIZATION FOR INSTRUMENTS EMPLOYING VARIABLE-DELAY POLARIZATION MODULATORS

    Miller, N. J.; Marriage, T. A.; Appel, J. W.; Bennett, C. L.; Eimer, J.; Essinger-Hileman, T.; Harrington, K.; Rostem, K.; Watts, D. J. [Department of Physics and Astronomy, Johns Hopkins University, 3400 N. Charles St., Baltimore, MD 21218 (United States); Chuss, D. T. [Department of Physics, Villanova University, 800 E Lancaster, Villanova, PA 19085 (United States); Wollack, E. J.; Fixsen, D. J.; Moseley, S. H.; Switzer, E. R., E-mail: Nathan.J.Miller@nasa.gov [Observational Cosmology Laboratory, Code 665, NASA Goddard Space Flight Center, Greenbelt, MD 20771 (United States)

    2016-02-20

    Variable-delay Polarization Modulators (VPMs) are currently being implemented in experiments designed to measure the polarization of the cosmic microwave background on large angular scales because of their capability for providing rapid, front-end polarization modulation and control over systematic errors. Despite the advantages provided by the VPM, it is important to identify and mitigate any time-varying effects that leak into the synchronously modulated component of the signal. In this paper, the effect of emission from a 300 K VPM on the system performance is considered and addressed. Though instrument design can greatly reduce the influence of modulated VPM emission, some residual modulated signal is expected. VPM emission is treated in the presence of rotational misalignments and temperature variation. Simulations of time-ordered data are used to evaluate the effect of these residual errors on the power spectrum. The analysis and modeling in this paper guides experimentalists on the critical aspects of observations using VPMs as front-end modulators. By implementing the characterizations and controls as described, front-end VPM modulation can be very powerful for mitigating 1/f noise in large angular scale polarimetric surveys. None of the systematic errors studied fundamentally limit the detection and characterization of B-modes on large scales for a tensor-to-scalar ratio of r = 0.01. Indeed, r < 0.01 is achievable with commensurately improved characterizations and controls.

  14. Inverse Ising problem in continuous time: A latent variable approach

    Donner, Christian; Opper, Manfred

    2017-12-01

    We consider the inverse Ising problem: the inference of network couplings from observed spin trajectories for a model with continuous time Glauber dynamics. By introducing two sets of auxiliary latent random variables we render the likelihood into a form which allows for simple iterative inference algorithms with analytical updates. The variables are (1) Poisson variables to linearize an exponential term which is typical for point process likelihoods and (2) Pólya-Gamma variables, which make the likelihood quadratic in the coupling parameters. Using the augmented likelihood, we derive an expectation-maximization (EM) algorithm to obtain the maximum likelihood estimate of network parameters. Using a third set of latent variables we extend the EM algorithm to sparse couplings via L1 regularization. Finally, we develop an efficient approximate Bayesian inference algorithm using a variational approach. We demonstrate the performance of our algorithms on data simulated from an Ising model. For data which are simulated from a more biologically plausible network with spiking neurons, we show that the Ising model captures well the low order statistics of the data and how the Ising couplings are related to the underlying synaptic structure of the simulated network.
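
    The data-generating side is easy to make concrete: the sketch below (Python) runs a Gillespie-type simulation of continuous-time Glauber dynamics, producing the kind of spin trajectories the inference algorithms above consume. The rate convention w_i = (1/2)(1 - s_i tanh(h_i)) and all parameters are our assumptions for illustration.

        # Gillespie simulation of continuous-time Glauber dynamics for an Ising
        # network; flip rate w_i = 0.5 * (1 - s_i * tanh(h_i)), h_i = sum_j J_ij s_j.
        import numpy as np

        rng = np.random.default_rng(5)
        N, T = 10, 200.0                  # number of spins, total simulated time
        J = rng.normal(scale=0.3, size=(N, N))
        J = (J + J.T) / 2                 # symmetric couplings
        np.fill_diagonal(J, 0.0)

        s = rng.choice([-1, 1], size=N)   # initial spin configuration
        t, events = 0.0, []
        while t < T:
            rates = 0.5 * (1.0 - s * np.tanh(J @ s))  # per-spin flip rates
            total = rates.sum()
            t += rng.exponential(1.0 / total)         # waiting time to next flip
            i = rng.choice(N, p=rates / total)        # which spin flips
            s[i] = -s[i]
            events.append((t, i))

        print(f"{len(events)} flips in {T} time units;")
        print("per-spin flip counts:",
              np.bincount([i for _, i in events], minlength=N))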

  15. A new approach for modelling variability in residential construction projects

    Mehrdad Arashpour

    2013-06-01

    The construction industry is plagued by long cycle times caused by variability in the supply chain. Variations or undesirable situations are the result of factors such as non-standard practices, work site accidents, inclement weather conditions and faults in design. This paper uses a new approach for modelling variability in construction by linking relative variability indicators to processes. The mass homebuilding sector was chosen as the scope of the analysis because data is readily available. Numerous simulation experiments were designed by varying the size of capacity buffers in front of trade contractors, the availability of trade contractors, and the level of variability in homebuilding processes. The measurements were shown to lead to an accurate determination of relationships between these factors and production parameters. The variability indicator was found to dramatically affect tangible performance measures such as home completion rates. This study provides for future analysis of the production homebuilding sector, which may lead to improvements in performance and a faster product delivery to homebuyers.

  17. Cross-cultural adaptation of instruments assessing breastfeeding determinants: a multi-step approach

    2014-01-01

    Background Cross-cultural adaptation is a necessary process to effectively use existing instruments in other cultural and language settings. The process of cross-culturally adapting, including translation, of existing instruments is considered a critical step in establishing a meaningful instrument for use in another setting. Using a multi-step approach is considered best practice in achieving cultural and semantic equivalence of the adapted version. We aimed to ensure the content validity of our instruments in the cultural context of KwaZulu-Natal, South Africa. Methods The Iowa Infant Feeding Attitudes Scale, Breastfeeding Self-Efficacy Scale-Short Form and additional items comprise our consolidated instrument, which was cross-culturally adapted utilizing a multi-step approach during August 2012. Cross-cultural adaptation was achieved through steps to maintain content validity and attain semantic equivalence in the target version. Specifically, Lynn’s recommendation to apply an item-level content validity index score was followed. The revised instrument was translated and back-translated. To ensure semantic equivalence, Brislin’s back-translation approach was utilized followed by the committee review to address any discrepancies that emerged from translation. Results Our consolidated instrument was adapted to be culturally relevant and translated to yield more reliable and valid results for use in our larger research study to measure infant feeding determinants effectively in our target cultural context. Conclusions Undertaking rigorous steps to effectively ensure cross-cultural adaptation increases our confidence that the conclusions we make based on our self-report instrument(s) will be stronger. In this way, our aim to achieve strong cross-cultural adaptation of our consolidated instruments was achieved while also providing a clear framework for other researchers choosing to utilize existing instruments for work in other cultural, geographic and population

  18. A variable resolution right TIN approach for gridded oceanographic data

    Marks, David; Elmore, Paul; Blain, Cheryl Ann; Bourgeois, Brian; Petry, Frederick; Ferrini, Vicki

    2017-12-01

    Many oceanographic applications require multi-resolution representation of gridded data, such as bathymetric data. Although triangular irregular networks (TINs) allow for variable resolution, they do not provide a gridded structure. Right TINs (RTINs) are compatible with a gridded structure. We explored the use of two approaches for RTINs termed top-down and bottom-up implementations. We illustrate why the latter is most appropriate for gridded data and describe for this technique how the data can be thinned. While both the top-down and bottom-up approaches accurately preserve the surface morphology of any given region, the top-down method of vertex placement can fail to match the actual vertex locations of the underlying grid in many instances, resulting in obscured topology/bathymetry. Finally, we describe the use of the bottom-up approach and data thinning in two applications. The first is to provide thinned, variable resolution bathymetry data for tests of storm surge and inundation modeling, in particular hurricane Katrina. Secondly we consider the use of the approach for an application to an oceanographic data grid of 3-D ocean temperature.

  19. Consumer Perception of Competitiveness – Theoretical-Instrumental Approach

    Duralia Oana

    2016-04-01

    The behaviorist approach to economics has made a quantum leap in a relatively short period of time: studying the relationship between consumer behavior and companies' strategic decisions based on market competitiveness is no longer unexplored territory. However, this issue remains topical, given that during the purchase decision process consumers do not always behave rationally, as they are the only ones who can appreciate whether the offer of the company, in terms of range, quality, price and auxiliary services, meets their needs or not. In this context, this paper aims to deepen the existing interconnection between the market decisions of the enterprise and consumer behavior, as a measure of the competitiveness of a firm in a certain market.

  20. A Novel Approach to model EPIC variable background

    Marelli, M.; De Luca, A.; Salvetti, D.; Belfiore, A.

    2017-10-01

    One of the main aims of the EXTraS (Exploring the X-ray Transient and variable Sky) project is to characterise the variability of serendipitous XMM-Newton sources within each single observation. Unfortunately, 164 Ms of the 774 Ms of cumulative exposure considered (21%) are badly affected by soft proton flares, hampering any classical analysis of field sources. De facto, the latest releases of the 3XMM catalog, as well as most analyses in the literature, simply exclude these 'high background' periods from analysis. We implemented a novel, SAS-independent approach to produce background-subtracted light curves, which makes it possible to treat the case of very faint sources and very bright proton flares. EXTraS light curves of 3XMM-DR5 sources will soon be released to the community, together with new tools we are developing.

  1. Variability of floods, droughts and windstorms over the past 500 years in Central Europe based on documentary and instrumental data

    Brazdil, Rudolf

    2016-04-01

    Hydrological and meteorological extremes (HMEs) in Central Europe during the past 500 years can be reconstructed from instrumental and documentary data. Documentary data about weather and related phenomena represent the basic source of information for historical climatology and hydrology, which deal with the reconstruction of past climate and HMEs, their perception and their impacts on human society. The paper presents the basic classification of documentary data into (i) direct descriptions of HMEs and their proxies on the one hand and (ii) individual and institutional data sources on the other. Several groups of documentary evidence, such as narrative written records (annals, chronicles, memoirs), visual daily weather records, official and personal correspondence, special prints, financial and economic records (with particular attention to taxation data), newspapers, pictorial documentation, chronograms, epigraphic data, early instrumental observations, early scientific papers and communications, are demonstrated with respect to the extraction of information about HMEs, usually concerning their occurrence, severity, seasonality, meteorological causes, perception and human impacts. The paper further presents an analysis of the 500-year variability of floods, droughts and windstorms based on series created by combining documentary and instrumental data. Results, advantages and drawbacks of this approach are documented with examples from the Czech Lands. The analysis of floods concentrates on the River Vltava (Prague) and the River Elbe (Děčín), which show the highest frequency of floods in the 19th century (mainly of the winter synoptic type) and in the second half of the 16th century (summer synoptic type). Also reported are the most disastrous floods (August 1501, March and August 1598, February 1655, June 1675, February 1784, March 1845, February 1862, September 1890, August 2002) and the European context of floods in the severe winter of 1783/84. Drought

  2. Reconstructing pre-instrumental streamflow in Eastern Australia using a water balance approach

    Tozer, C. R.; Kiem, A. S.; Vance, T. R.; Roberts, J. L.; Curran, M. A. J.; Moy, A. D.

    2018-03-01

    Streamflow reconstructions based on paleoclimate proxies provide much longer records than the short instrumental-period records on which water resource management plans are currently based. In Australia there is a lack of in-situ high-resolution paleoclimate proxy records, but remote proxies with teleconnections to Australian climate have utility in producing streamflow reconstructions. Here we investigate, via a case study for a catchment in eastern Australia, the novel use of an Antarctic ice-core-based rainfall reconstruction within a Budyko framework to reconstruct ∼1000 years of annual streamflow. The resulting streamflow reconstruction captures interannual to decadal variability in the instrumental streamflow, validating both the use of the ice-core rainfall proxy record and the Budyko-framework method. In the pre-instrumental era the streamflow reconstruction shows longer wet and dry epochs, and periods of streamflow variability higher than observed in the instrumental era. Importantly, for both the instrumental record and pre-instrumental reconstructions, the wet (dry) epochs in the rainfall record are shorter (longer) in the streamflow record, and this non-linearity must be considered when inferring hydroclimatic risk or historical water availability directly from rainfall proxy records alone. These insights provide a better understanding of present infrastructure vulnerability in the context of past climate variability for eastern Australia. The streamflow reconstruction presented here also provides a better understanding of the range of hydroclimatic variability possible, and therefore represents a more realistic baseline on which to quantify the potential impacts of anthropogenic climate change on water security.
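
    A worked sketch of the water-balance step, assuming Fu's one-parameter Budyko curve as the concrete functional form (the paper applies a Budyko-type framework, but its exact curve, inputs and calibration are not reproduced here); `budyko_fu_streamflow`, `w` and the input series below are illustrative.

```python
import numpy as np

def budyko_fu_streamflow(p_annual, pet_annual, w=2.6):
    """Annual streamflow from annual rainfall via a Budyko water balance.

    Fu's one-parameter curve gives the evaporative fraction E/P from the
    aridity index PET/P; at the annual scale, storage change is taken as
    negligible so Q = P - E. `w` is a catchment parameter that would be
    calibrated against instrumental streamflow.
    """
    phi = pet_annual / p_annual                               # aridity index
    evap_ratio = 1.0 + phi - (1.0 + phi ** w) ** (1.0 / w)    # E/P (Fu, 1981)
    return np.maximum(p_annual * (1.0 - evap_ratio), 0.0)     # Q = P - E

# Example: turn a (hypothetical) ~1000-year proxy rainfall series into flow.
p = np.random.default_rng(0).normal(900.0, 180.0, size=1000).clip(min=200.0)
q = budyko_fu_streamflow(p, pet_annual=1100.0)
```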

  3. Job demands and job strain as risk factors for employee wellbeing in elderly care: an instrumental-variables analysis.

    Elovainio, Marko; Heponiemi, Tarja; Kuusio, Hannamaria; Jokela, Markus; Aalto, Anna-Mari; Pekkarinen, Laura; Noro, Anja; Finne-Soveri, Harriet; Kivimäki, Mika; Sinervo, Timo

    2015-02-01

    The association between the psychosocial work environment and employee wellbeing has been shown repeatedly. However, as environmental evaluations have typically been self-reported, the observed associations may be attributable to reporting bias. Applying instrumental-variable regression, we used staffing level (the ratio of staff to residents) as an unconfounded instrument for self-reported job demands and job strain to predict various indicators of wellbeing (perceived stress, psychological distress and sleeping problems) among 1525 registered nurses, practical nurses and nursing assistants working in elderly care wards. In ordinary regression, higher self-reported job demands and job strain were associated with increased risk of perceived stress, psychological distress and sleeping problems. The effect estimates for the associations of these psychosocial factors with perceived stress and psychological distress were greater, but less precisely estimated, in an instrumental-variables analysis which took into account only the variation in self-reported job demands and job strain that was explained by staffing level. No association between psychosocial factors and sleeping problems was observed with the instrumental-variable analysis. These results support a causal interpretation of high self-reported job demands and job strain being risk factors for impaired employee wellbeing.
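
    A minimal sketch of the estimation idea on simulated data, with staffing level instrumenting self-reported job demands; `two_stage_least_squares` and all variable names are illustrative stand-ins, not the authors' code, and a real analysis would use the proper 2SLS standard-error correction.

```python
import numpy as np
import statsmodels.api as sm

def two_stage_least_squares(y, x_endog, z_instr):
    """Plain 2SLS: instrument the endogenous exposure, then regress."""
    const = np.ones((len(y), 1))
    # Stage 1: endogenous exposure on the instrument (plus a constant).
    stage1 = sm.OLS(x_endog, np.column_stack([const, z_instr])).fit()
    # Stage 2: outcome on the predicted exposure. Note: the second-stage
    # standard errors shown by OLS need the usual 2SLS correction.
    return sm.OLS(y, np.column_stack([const, stage1.fittedvalues])).fit()

rng = np.random.default_rng(0)
staffing = rng.normal(size=1525)                       # instrument
u = rng.normal(size=1525)                              # unobserved reporting style
demands = -0.8 * staffing + u + rng.normal(size=1525)  # self-reported exposure
stress = 0.5 * demands + u + rng.normal(size=1525)     # wellbeing outcome
print(two_stage_least_squares(stress, demands, staffing).params[1])  # ~0.5
```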

  4. Integrated variable projection approach (IVAPA) for parallel magnetic resonance imaging.

    Zhang, Qiao; Sheng, Jinhua

    2012-10-01

    Parallel magnetic resonance imaging (pMRI) is a fast method which requires algorithms for reconstructing an image from a small number of measured k-space lines. The accurate estimation of the coil sensitivity functions is still a challenging problem in parallel imaging. Joint estimation of the coil sensitivity functions and the desired image has recently been proposed to improve the situation by iteratively optimizing both the coil sensitivity functions and the image reconstruction; it regards both the coil sensitivities and the desired image as unknowns to be solved for jointly. In this paper, we propose an integrated variable projection approach (IVAPA) for pMRI, which integrates the two individual processing steps (coil sensitivity estimation and image reconstruction) into a single processing step to improve the accuracy of the coil sensitivity estimation using the variable projection approach. The method is demonstrated to give an optimal solution with considerably reduced artifacts for high reduction factors and a low number of auto-calibration signal (ACS) lines, and our implementation has a fast convergence rate. The performance of the proposed method is evaluated using a set of in vivo experimental data.
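
    The variable projection principle can be sketched on a toy separable model: for every trial value of the nonlinear parameters, the linear coefficients are "projected out" by ordinary least squares, so the outer optimization runs over the nonlinear parameters alone. This is a generic VARPRO sketch under assumed toy functions, not the IVAPA algorithm itself; the pMRI formulation is analogous but couples coil sensitivities and image coefficients through k-space data.

```python
import numpy as np
from scipy.optimize import minimize

def varpro_objective(theta, t, y, basis):
    """For nonlinear params `theta`, solve the linear coefficients by
    least squares and return the squared residual norm."""
    A = basis(t, theta)                          # design matrix A(theta)
    c, *_ = np.linalg.lstsq(A, y, rcond=None)    # linear part, projected out
    return float(np.sum((y - A @ c) ** 2))

# Toy separable model: y = c0 * exp(-theta * t) + c1.
rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 50)
y = 2.0 * np.exp(-3.0 * t) + 0.5 + 0.01 * rng.normal(size=50)
basis = lambda t, th: np.column_stack([np.exp(-th[0] * t), np.ones_like(t)])
fit = minimize(varpro_objective, x0=[1.0], args=(t, y, basis))  # theta ~ 3
```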

  5. A pilot's opinion - VTOL control design requirements for the instrument approach task.

    Patton, J. M., Jr.

    1972-01-01

    This paper presents pilot opinion, supported by test data, concerning flight control and display concepts and control system design requirements for VTOL aircraft in the instrument approach task. The material presented is drawn from research flights in the following aircraft: Dornier DO-31, Short SC-1, LTV XC-142A, and Boeing-Vertol CH-46. The control system concepts and mechanizations employed in these aircraft are discussed, and the effect of control system augmentation on performance is shown. Operational procedures required in the instrument approach task are described, with comments on the need for automation and the combining of control functions.

  6. Pilot dynamics for instrument approach tasks: Full panel multiloop and flight director operations

    Weir, D. H.; Mcruer, D. T.

    1972-01-01

    Measurements and interpretations of single and multiloop pilot response properties during simulated instrument approach are presented. Pilot subjects flew Category 2-like ILS approaches in a fixed-base DC-8 simulator. A conventional instrument panel and controls were used, with simulated vertical gust and glide slope beam bend forcing functions. Reduced and interpreted pilot describing functions and remnant are given for pitch attitude, flight director, and multiloop (longitudinal) control tasks. The response data are correlated with simultaneously recorded eye scanning statistics, previously reported in NASA CR-1535. The resulting combined response and scanning data and their interpretations provide a basis for validating and extending the theory of manual control displays.

  7. S-variable approach to LMI-based robust control

    Ebihara, Yoshio; Arzelier, Denis

    2015-01-01

    This book shows how the use of S-variables (SVs) in enhancing the range of problems that can be addressed with the already-versatile linear matrix inequality (LMI) approach to control can, in many cases, be put on a more unified, methodical footing. Beginning with the fundamentals of the SV approach, the text shows how the basic idea can be used for each problem (and when it should not be employed at all). The specific adaptations of the method necessitated by each problem are also detailed. The problems dealt with in the book have the common traits that: analytic closed-form solutions are not available; and LMIs can be applied to produce numerical solutions with a certain amount of conservatism. Typical examples are robustness analysis of linear systems affected by parametric uncertainties and the synthesis of a linear controller satisfying multiple, often conflicting, design specifications. For problems in which LMI methods produce conservative results, the SV approach is shown to achieve greater accuracy...

  8. XML for nuclear instrument control and monitoring: an approach towards standardisation

    Bharade, S.K.; Ananthakrishnan, T.S.; Kataria, S.K.; Singh, S.K.

    2004-01-01

    Communication among heterogeneous systems, with applications running under different operating systems and developed on different platforms, has undergone rapid change due to the adoption of XML standards. These are being developed for different industries, such as the chemical, medical and commercial sectors. The High Energy Physics community already has a standard for the exchange of data among different applications under heterogeneous distributed systems, such as the CMS Data Acquisition System. There are a large number of nuclear instruments supplied by different manufacturers which are increasingly getting connected. This approach is gaining wider acceptance in instruments at reactor sites, accelerator sites and complex nuclear experiments, especially at centres like CERN. In order for these instruments to be able to describe the data available from them in a platform-independent manner, an XML approach has been developed. This paper is the Electronics Division's first attempt at proposing an XML standard for the control, monitoring, data acquisition and analysis data generated by nuclear instruments at accelerator sites, nuclear reactor plants and laboratories. The gamut of nuclear instruments includes multichannel analysers, health physics instruments, accelerator control systems, reactor regulating systems, flux mapping systems, etc. (author)
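
    A hypothetical fragment of what such a platform-independent instrument description could look like, parsed here with Python's standard library. The schema is invented for illustration; it is not the standard the paper proposes.

```python
import xml.etree.ElementTree as ET

# Invented XML description of a multichannel analyser (MCA) reading.
doc = """
<instrument type="MCA" id="mca-01" location="accelerator-hall">
  <setting name="channels" value="1024"/>
  <setting name="live-time" value="300" unit="s"/>
  <spectrum unit="counts">12 18 35 110 96 41</spectrum>
</instrument>
"""
root = ET.fromstring(doc)
counts = [int(c) for c in root.find("spectrum").text.split()]
live_time = root.find("setting[@name='live-time']").get("value")
```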

  9. Decoupling Solar Variability and Instrument Trends Using the Multiple Same-Irradiance-Level (MuSIL) Analysis Technique

    Woods, Thomas N.; Eparvier, Francis G.; Harder, Jerald; Snow, Martin

    2018-05-01

    The solar spectral irradiance (SSI) dataset is a key record for studying and understanding the energetics and radiation balance in Earth's environment. Understanding the long-term variations of the SSI over timescales of the 11-year solar activity cycle and longer is critical for many Sun-Earth research topics. Satellite measurements of the SSI have been made since the 1970s, most of them in the ultraviolet, but recently also in the visible and near-infrared. A limiting factor for the accuracy of previous solar variability results is the uncertainty in the instrument degradation corrections, which at some wavelengths are fairly large relative to the amount of solar cycle variability. The primary objective of this investigation has been to separate solar cycle variability from any residual uncorrected instrumental trends in the SSI measurements from the Solar Radiation and Climate Experiment (SORCE) mission and the Thermosphere, Ionosphere, Mesosphere Energetics and Dynamics (TIMED) mission. A new technique called the Multiple Same-Irradiance-Level (MuSIL) analysis has been developed, which examines an SSI time series at different levels of solar activity to extract long-term trends in an SSI record; the most common result is a downward trend that most likely stems from uncorrected instrument degradation. This technique has been applied to each wavelength in the SSI records from SORCE (2003 - present) and TIMED (2002 - present) to provide new solar cycle variability results between 27 nm and 1600 nm with a resolution of about 1 nm at most wavelengths. The technique, which was validated against the highly accurate total solar irradiance (TSI) record, has an estimated relative uncertainty of about 5% of the measured solar cycle variability. The MuSIL results are further validated by comparing the new solar cycle variability results from different solar cycles.
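
    The core idea can be sketched as follows, under simplifying assumptions: bin the observation epochs by a solar-activity proxy, fit a time trend within each same-irradiance-level bin (where solar variability is roughly constant), and read the pooled slope as residual instrument degradation. The real MuSIL algorithm differs in detail; `musil_trend` and its inputs are illustrative.

```python
import numpy as np

def musil_trend(time, ssi, activity, n_levels=5):
    """Pooled within-level time trend of an SSI series (per time unit)."""
    edges = np.quantile(activity, np.linspace(0.0, 1.0, n_levels + 1))
    slopes = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        sel = (activity >= lo) & (activity <= hi)
        if sel.sum() >= 2:                        # need two points for a fit
            slopes.append(np.polyfit(time[sel], ssi[sel], 1)[0])
    return float(np.mean(slopes))                 # residual instrument trend
```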

  10. Does the Early Bird Catch the Worm? Instrumental Variable Estimates of Educational Effects of Age of School Entry in Germany

    Puhani, Patrick A.; Weber, Andrea M.

    2006-01-01

    We estimate the effect of age of school entry on educational outcomes using two different data sets for Germany, sampling pupils at the end of primary school and in the middle of secondary school. Results are obtained based on instrumental variable estimation exploiting the exogenous variation in month of birth. We find robust and significant positive effects on educational outcomes for pupils who enter school at seven instead of six years of age: Test scores at the end of primary school incr...

  11. The Structure of Character Strengths: Variable- and Person-Centered Approaches

    Małgorzata Najderska

    2018-02-01

    This article examines the structure of character strengths (Peterson and Seligman, 2004) following both variable-centered and person-centered approaches. We used the International Personality Item Pool-Values in Action (IPIP-VIA) questionnaire. The IPIP-VIA measures 24 character strengths and consists of 213 direct and reversed items. The present study was conducted in a heterogeneous group of N = 908 Poles (aged 18–78, M = 28.58). It was part of a validation project of a Polish version of the IPIP-VIA questionnaire. The variable-centered approach was used to examine the structure of character strengths on both the scale and item levels. The scale-level results indicated a four-factor structure that can be interpreted based on four of the five personality traits from the Big Five theory (excluding neuroticism). The item-level analysis suggested a slightly different and limited set of character strengths (17, not 24). After conducting a second-order analysis, a four-factor structure emerged, and three of the factors could be interpreted as consistent with the scale-level factors. Three character-strength profiles were found using the person-centered approach. Two of them were consistent with the alpha and beta personality metatraits. The structure of character strengths can be described using categories from the Five Factor Model of personality and metatraits: they form factors similar to some personality traits and occur in constellations similar to metatraits. The main contributions of this paper are: (1) the validation of the IPIP-VIA conducted with a variable-centered approach in a new research group (Poles) using a different measurement instrument; (2) introducing the person-centered approach to the study of the structure of character strengths.
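
    The two analytic perspectives map onto standard tooling, sketched below with random data standing in for the IPIP-VIA scale scores (the authors' actual analyses were considerably richer): a variable-centered four-factor solution over the 24 scales, and a person-centered three-profile clustering of respondents.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
scores = rng.normal(size=(908, 24))     # respondents x 24 strength scales
z = StandardScaler().fit_transform(scores)

# Variable-centered: factor loadings of the 24 scales on four factors.
loadings = FactorAnalysis(n_components=4, random_state=0).fit(z).components_

# Person-centered: three profiles (clusters) of respondents.
profiles = KMeans(n_clusters=3, n_init=10, random_state=0).fit(z).labels_
```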

  12. 8 years of Solar Spectral Irradiance Variability Observed from the ISS with the SOLAR/SOLSPEC Instrument

    Damé, Luc; Bolsée, David; Meftah, Mustapha; Irbah, Abdenour; Hauchecorne, Alain; Bekki, Slimane; Pereira, Nuno; Cessateur, Gaël; Marchand, Marion; et al.

    2016-10-01

    Accurate measurements of Solar Spectral Irradiance (SSI) are of primary importance for a better understanding of solar physics and of the impact of solar variability on climate (via Earth's atmospheric photochemistry). The acquisition of a top-of-atmosphere reference solar spectrum and of its temporal and spectral variability during the unusual solar cycle 24 is of prime interest for these studies. These measurements have been performed since April 2008 with the SOLSPEC spectro-radiometer, from the far ultraviolet to the infrared (166 nm to 3088 nm). This instrument, developed under a fruitful LATMOS/BIRA-IASB collaboration, is part of the Solar Monitoring Observatory (SOLAR) payload, externally mounted on the Columbus module of the International Space Station (ISS). The SOLAR mission, with its current duration of 8 years, covers almost the entire solar cycle 24. We present here the in-flight operations and performance of the SOLSPEC instrument, including the engineering corrections, calibrations and improved procedures for aging corrections. Accordingly, an SSI reference spectrum from the UV to the NIR will be presented, together with its variability in the UV as measured by SOLAR/SOLSPEC over 8 years. Uncertainties on these measurements and comparisons with other instruments will be briefly discussed.

  13. Spectral Kernel Approach to Study Radiative Response of Climate Variables and Interannual Variability of Reflected Solar Spectrum

    Jin, Zhonghai; Wielicki, Bruce A.; Loukachine, Constantin; Charlock, Thomas P.; Young, David; Noël, Stefan

    2011-01-01

    The radiative kernel approach provides a simple way to separate the radiative response to different climate parameters and to decompose a feedback into radiative and climate response components. Using CERES/MODIS/geostationary data, we calculated and analyzed the solar spectral reflectance kernels for various climate parameters on zonal, regional, and global spatial scales. The kernel linearity is tested. Errors in the kernel due to nonlinearity can vary strongly depending on climate parameter, wavelength, surface, and solar elevation; they are large in some absorption bands for some parameters but are negligible in most conditions. The spectral kernels are used to calculate the radiative responses to different climate parameter changes in different latitudes. The results show that the radiative response in high latitudes is sensitive to the coverage of snow and sea ice. The radiative response in low latitudes is contributed mainly by cloud property changes, especially cloud fraction and optical depth. The large cloud height effect is confined to absorption bands, while the cloud particle size effect is found mainly in the near-infrared. The kernel approach, which is based on calculations using CERES retrievals, is then tested by direct comparison with spectral measurements from the Scanning Imaging Absorption Spectrometer for Atmospheric Chartography (SCIAMACHY) (a different instrument on a different spacecraft). The monthly mean interannual variability of spectral reflectance based on the kernel technique is consistent with satellite observations over the ocean, but not over land, where both model and data have large uncertainty. RMS errors in kernel-derived monthly global mean reflectance over the ocean compared to observations are about 0.001, and the sampling error is likely a major component.
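
    At its core, the kernel technique linearizes the radiative response: the reflectance change is approximated as the sum of each parameter's kernel times that parameter's perturbation. A toy scalar illustration with made-up numbers (real kernels are spectrally and spatially resolved, and the abstract notes where this linearity breaks down):

```python
# Linearized radiative response: dR ~ sum_i K_i * dX_i.
kernels = {"cloud_fraction": 0.30, "optical_depth": 0.02, "snow_cover": 0.15}
changes = {"cloud_fraction": -0.05, "optical_depth": 1.2, "snow_cover": -0.10}
d_reflectance = sum(kernels[k] * changes[k] for k in kernels)  # -0.006
```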

  14. Modern spinal instrumentation. Part 2: Multimodality imaging approach for assessment of complications

    Allouni, A.K.; Davis, W.; Mankad, K.; Rankine, J.; Davagnanam, I.

    2013-01-01

    Radiologists frequently encounter studies demonstrating spinal instrumentation, either as part of the patient's postoperative evaluation, or as incidental to a study performed for another purpose. It is important for the reporting radiologist to identify potential complications of commonly used spinal implants. Part 1 of this review examined both the surgical approaches used and the normal appearances of these spinal implants and bone grafting techniques. This second part of the review will focus on the multimodal imaging strategy adopted in the assessment of the instrumented spine and the demonstration of imaging findings of common postoperative complications.

15. SOCIOECONOMIC STATUS AND FERTILITY: A Latent Variable Approach

    Suandi -

    2012-11-01

    The main problems faced by developing countries, including Indonesia, are not only economic problems that tend to be harmful, but also persistently high fertility rates. The purpose of this paper is to examine the relationship between socioeconomic status and the level of fertility through a latent variable approach. The study adopts an economic-development approach to fertility. Economic development is grounded in the theories of Malthus: an increase in income is slower than the increase in births (fertility), and this is the root of people falling into poverty. However, Becker modelled the linkage through the income and price effects of children; according to Becker, viewed from the demand side, the price effect of children is greater than the income effect. The study shows that (1) level of education correlates positively with income and negatively affects fertility; (2) the age structure of women (a control for contraceptive use) negatively affects fertility, that is, the older the age, the lower the individual's productivity and fertility; and (3) the husband's employment status correlates positively with earnings (income). Permanent income, or household income, negatively influences fertility. There are differences in the value orientation of children between developed (rich) and backward (poor) societies. For the poor, the value of children lies more in the production of goods: children are valued more for their number (quantity), and children born to the poor are expected to help their parents at retirement age, when the parents are no longer productive, assisting them economically and providing security and social insurance. For the developed (rich), children have more of a consumption value, with emphasis on the quality of the child.

  16. How to get rid of W: a latent variables approach to modelling spatially lagged variables

    Folmer, H.; Oud, J.

    2008-01-01

    In this paper we propose a structural equation model (SEM) with latent variables to model spatial dependence. Rather than using the spatial weights matrix W, we propose to use latent variables to represent spatial dependence and spillover effects, of which the observed spatially lagged variables are

  17. How to get rid of W : a latent variables approach to modelling spatially lagged variables

    Folmer, Henk; Oud, Johan

    2008-01-01

    In this paper we propose a structural equation model (SEM) with latent variables to model spatial dependence. Rather than using the spatial weights matrix W, we propose to use latent variables to represent spatial dependence and spillover effects, of which the observed spatially lagged variables are

  18. Financial and testamentary capacity evaluations: procedures and assessment instruments underneath a functional approach.

    Sousa, Liliana B; Simões, Mário R; Firmino, Horácio; Peisah, Carmelle

    2014-02-01

    Mental health professionals are frequently involved in mental capacity determinations. However, there is a lack of specific measures and well-defined procedures for these evaluations. The main purpose of this paper is to provide a review of financial and testamentary capacity evaluation procedures, covering not only traditional neuropsychological and functional assessment but also the more recently developed forensic assessment instruments (FAIs), which have been designed to provide a specialized answer to legal systems regarding civil competencies. The main guidelines, papers, and other references are reviewed in order to achieve a complete and comprehensive selection of instruments used in the assessment of financial and testamentary capacity. Although some specific measures for financial abilities have been developed recently, the same is not true for testamentary capacity. Several instruments and methodologies for assessing financial and testamentary capacity are presented, including neuropsychological assessment, functional assessment scales, performance-based functional assessment instruments, and specific FAIs. FAIs are the only instruments intended to provide a specific and direct answer to the assessment of financial capacity grounded in legal systems. Considering the need to move from a diagnostic to a functional approach in financial and testamentary capacity evaluations, it is essential to consider both the general functional examination and cognitive functioning.

  19. The radiation budget of stratocumulus clouds measured by tethered balloon instrumentation: Variability of flux measurements

    Duda, David P.; Stephens, Graeme L.; Cox, Stephen K.

    1990-01-01

    Measurements of longwave and shortwave radiation were made using an instrument package on the NASA tethered balloon during the FIRE Marine Stratocumulus experiment. Radiation data from two pairs of pyranometers were used to obtain vertical profiles of the near-infrared and total solar fluxes through the boundary layer, while a pair of pyrgeometers supplied measurements of the longwave fluxes in the cloud layer. The radiation observations were analyzed to determine heating rates and to measure the radiative energy budget inside the stratocumulus clouds during several tethered balloon flights. The radiation fields in the cloud layer were also simulated by a two-stream radiative transfer model, which used cloud optical properties derived from microphysical measurements and Mie scattering theory.

  20. Measuring Instrument Constructs of Return Factors for Green Office Building Investments Variables Using Rasch Measurement Model

    Isa Mona

    2016-01-01

    This paper is a preliminary study on rationalising green office building investments in Malaysia. It attempts to introduce the application of Rasch measurement model analysis to determine the validity and reliability of each construct in the questionnaire. To achieve this objective, a questionnaire survey consisting of 6 sections was developed, and a total of 106 responses were received from various investors who own and lease office buildings in Kuala Lumpur. Rasch measurement analysis is used for quality control of the item constructs in the instrument: it measures specific objectivity within the same dimension, reduces ambiguous measures, and gives a realistic estimation of precision and implicit quality. The Rasch analysis consists of the summary statistics, item unidimensionality and item measures. Results show that item and respondent (person) reliability is 0.91 and 0.95, respectively.
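
    For reference, the dichotomous Rasch model underlying such an analysis relates person ability and item difficulty through a logistic function; a minimal sketch is below (the reliability, unidimensionality and fit statistics reported above come from dedicated estimation software and are not reproduced here).

```python
import numpy as np

def rasch_prob(theta, b):
    """Dichotomous Rasch model: probability that a person of ability
    `theta` (logits) endorses an item of difficulty `b` (logits)."""
    return 1.0 / (1.0 + np.exp(-(theta - b)))

# A person 1 logit above an item's difficulty endorses it ~73% of the time.
print(rasch_prob(theta=0.5, b=-0.5))  # 0.731...
```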

  1. ExoMars Trace Gas Orbiter Instrument Modelling Approach to Streamline Science Operations

    Munoz Fernandez, Michela; Frew, David; Ashman, Michael; Cardesin Moinelo, Alejandro; Garcia Beteta, Juan Jose; Geiger, Bernhard; Metcalfe, Leo; Nespoli, Federico; Muniz Solaz, Carlos

    2018-05-01

    ExoMars Trace Gas Orbiter (TGO) science operations activities are centralised at ESAC's Science Operations Centre (SOC). The SOC receives the inputs from the principal investigators (PIs) in order to implement and deliver the spacecraft pointing requests and instrument timelines to the Mission Operations Centre (MOC). The high number of orbits per planning cycle has made it necessary to abstract the planning interactions between the SOC and the PI teams at the observation level. This paper describes the modelling approach we have conducted for TGO's instruments to streamline science operations. We have created dynamic observation types that scale to adapt to the conditions specified by the PI teams, including observation timing, and pointing block parameters calculated from observation geometry. This approach is considered an improvement with respect to previous missions, where the generation of the observation pointing and commanding requests was performed manually by the instrument teams. Automation software assists us in effectively handling the high density of planned orbits with an increasing volume of scientific data, and in successfully meeting opportunistic scientific goals and objectives. Our planning tool combines the instrument observation definition files provided by the PIs together with the flight dynamics products to generate the pointing requests and the instrument timeline (ITL). The ITL contains all the validated commands at the TC sequence level and computes the resource envelopes (data rate, power, data volume) within the constraints. At the SOC, our main goal is to maximise the science output while minimising the number of iterations among the teams, ensuring that the timeline does not violate the state transitions allowed in the Mission Operations Rules and Constraints Document.

  2. A Diagrammatic Exposition of Regression and Instrumental Variables for the Beginning Student

    Foster, Gigi

    2009-01-01

    Some beginning students of statistics and econometrics have difficulty with traditional algebraic approaches to explaining regression and related techniques. For these students, a simple and intuitive diagrammatic introduction as advocated by Kennedy (2008) may prove a useful framework to support further study. The author presents a series of…

  3. Testing Causal Impacts of a School-Based SEL Intervention Using Instrumental Variable Techniques

    Torrente, Catalina; Nathanson, Lori; Rivers, Susan; Brackett, Marc

    2015-01-01

    Children's social-emotional skills, such as conflict resolution and emotion regulation, have been linked to a number of highly regarded academic and social outcomes. The current study presents preliminary results from a causal test of the theory of change of RULER, a universal school-based approach to social and emotional learning (SEL).…

  4. Introducing instrumental variables in the LS-SVM based identification framework

    Laurain, V.; Zheng, W-X.; Toth, R.

    2011-01-01

    Least-Squares Support Vector Machines (LS-SVM) represent a promising approach to identify nonlinear systems via nonparametric estimation of the nonlinearities in a computationally and stochastically attractive way. All the methods dedicated to the solution of this problem rely on the minimization of

  5. A variable stiffness mechanism for steerable percutaneous instruments: integration in a needle.

    De Falco, Iris; Culmone, Costanza; Menciassi, Arianna; Dankelman, Jenny; van den Dobbelsteen, John J

    2018-06-04

    Needles are advanced tools commonly used in minimally invasive medical procedures. The accurate manoeuvrability of flexible needles through soft tissues is strongly determined by variations in tissue stiffness, which affects the needle-tissue interaction and thus causes needle deflection. This work presents a variable stiffness mechanism for percutaneous needles capable of compensating for variations in tissue stiffness and undesirable trajectory changes. It is composed of compliant segments and rigid plates alternately connected in series and longitudinally crossed by four cables. The tensioning of the cables allows the omnidirectional steering of the tip and the stiffness tuning of the needle. The mechanism was tested separately under different working conditions, demonstrating a capability to exert up to 3.6 N. Afterwards, the mechanism was integrated into a needle, and the overall device was tested in gelatine phantoms simulating the stiffness of biological tissues. The needle demonstrated the capability to vary deflection (from 11.6 to 4.4 mm) and adapt to the inhomogeneity of the phantoms (from 21 to 80 kPa) depending on the activation of the variable stiffness mechanism.

  6. On the shape of posterior densities and credible sets in instrumental variable regression models with reduced rank: an application of flexible sampling methods using neural networks

    Hoogerheide, L.F.; Kaashoek, J.F.; van Dijk, H.K.

    2007-01-01

    Likelihoods and posteriors of instrumental variable (IV) regression models with strong endogeneity and/or weak instruments may exhibit rather non-elliptical contours in the parameter space. This may seriously affect inference based on Bayesian credible sets. When approximating posterior

  7. On the shape of posterior densities and credible sets in instrumental variable regression models with reduced rank: an application of flexible sampling methods using neural networks

    L.F. Hoogerheide (Lennart); J.F. Kaashoek (Johan); H.K. van Dijk (Herman)

    2005-01-01

    Likelihoods and posteriors of instrumental variable regression models with strong endogeneity and/or weak instruments may exhibit rather non-elliptical contours in the parameter space. This may seriously affect inference based on Bayesian credible sets. When approximating such contours

  8. DESIGNING AFFECTIVE INSTRUMENT BASED ON SCIENTIFIC APPROACH FOR ENGLISH LANGUAGE LEARNING

    Maisarah Ira

    2018-01-01

    This research describes the design of an instrument for affective assessment in English language teaching. The focus of the design was only the observation sheet to be used by English teachers during the teaching and learning process. The instrument was designed based on the scientific approach, which has five stages, namely observing, questioning, experimenting, associating, and communicating. In the design process, the ADDIE model was used as the research method. The design took into account the gap between current practice and the teachers' needs. The result showed that the design also attends to the affective taxonomy of receiving, responding, valuing, organization, and characterization. Three key phrases were then used as indicators of the five levels of the affective taxonomy: 'seriously', 'voluntarily', and 'without being asked by the teacher'. Furthermore, eighteen affective types were placed at each stage of the scientific approach: religious, honesty, responsibility, discipline, hard work, self-confidence, logical thinking, critical thinking, creativity, innovativeness, independence, curiosity, love of knowledge, respect, politeness, democracy, emotional intelligence, and pluralism. It is hoped that the instrument can be implemented in all contexts of English language teaching at schools and can assess students' affective domain comprehensively.

  9. Breastfeeding and the risk of childhood asthma: A two-stage instrumental variable analysis to address endogeneity.

    Sharma, Nivita D

    2017-09-01

    Several explanations for the inconsistent results on the effects of breastfeeding on childhood asthma have been suggested. The purpose of this study was to investigate one unexplored explanation: the presence of a potential endogenous relationship between breastfeeding and childhood asthma. Endogeneity exists when an explanatory variable is correlated with the error term for reasons such as selection bias, reverse causality, and unmeasured confounders. Unadjusted endogeneity will bias the estimated effect of breastfeeding on childhood asthma. To investigate potential endogeneity, a cross-sectional study of breastfeeding practices and incidence of childhood asthma in 87 pediatric patients in Georgia, USA, was conducted using generalized linear modeling and a two-stage instrumental variable analysis. First, the relationship between breastfeeding and childhood asthma was analyzed without considering endogeneity. Second, tests for the presence of endogeneity were performed and, having detected endogeneity between breastfeeding and childhood asthma, a two-stage instrumental variable analysis was performed. The first stage of this analysis estimated the duration of breastfeeding and the second stage estimated the risk of childhood asthma. When endogeneity was not taken into account, duration of breastfeeding was found to significantly increase the risk of childhood asthma (relative risk ratio [RR]=2.020, 95% confidence interval [CI]: [1.143-3.570]). After adjusting for endogeneity, duration of breastfeeding significantly reduced the risk of childhood asthma (RR=0.003, 95% CI: [0.000-0.240]). The findings suggest that researchers should consider evaluating how the presence of endogeneity could affect the relationship between duration of breastfeeding and the risk of childhood asthma.
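
    A sketch of the two-stage logic on simulated data, using a two-stage residual inclusion (control-function) variant for a binary outcome; all variable names and numbers are hypothetical, and the study's actual GLM specification, instrument and estimates may differ.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 87
z = rng.normal(size=n)                              # instrument
u = rng.normal(size=n)                              # unmeasured confounder
duration = 6.0 + 2.0 * z + u + rng.normal(size=n)   # breastfeeding months
asthma = (0.5 * u - 0.3 * duration + rng.logistic(size=n) > 0).astype(float)

# Stage 1: endogenous exposure on the instrument; keep the residual.
stage1 = sm.OLS(duration, sm.add_constant(z)).fit()
resid = duration - stage1.fittedvalues
# Stage 2: outcome model including the residual to absorb endogeneity;
# the coefficient on `duration` is the endogeneity-adjusted effect.
stage2 = sm.Logit(asthma, sm.add_constant(np.column_stack([duration, resid]))).fit(disp=0)
```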

  10. An integrated approach to endoscopic instrument tracking for augmented reality applications in surgical simulation training.

    Loukas, Constantinos; Lahanas, Vasileios; Georgiou, Evangelos

    2013-12-01

    Despite the popular use of virtual and physical reality simulators in laparoscopic training, the educational potential of augmented reality (AR) has not received much attention. A major challenge is the robust tracking and three-dimensional (3D) pose estimation of the endoscopic instrument, which are essential for achieving interaction with the virtual world and for realistic rendering when the virtual scene is occluded by the instrument. In this paper we propose a method that addresses these issues, based solely on visual information obtained from the endoscopic camera. Two different tracking algorithms are combined for estimating the 3D pose of the surgical instrument with respect to the camera. The first tracker creates an adaptive model of a colour strip attached to the distal part of the tool (close to the tip). The second algorithm tracks the endoscopic shaft, using a combined Hough-Kalman approach. The 3D pose is estimated with perspective geometry, using appropriate measurements extracted by the two trackers. The method has been validated on several complex image sequences for its tracking efficiency, pose estimation accuracy and applicability in AR-based training. Using a standard endoscopic camera, the absolute average error of the tip position was 2.5 mm for working distances commonly found in laparoscopic training. The average error of the instrument's angle with respect to the camera plane was approximately 2°. The results are also supplemented by video segments of laparoscopic training tasks performed in a physical and an AR environment. The experiments yielded promising results regarding the potential of applying AR technologies for laparoscopic skills training, based on a computer vision framework. The issue of occlusion handling was adequately addressed. The estimated trajectory of the instruments may also be used for surgical gesture interpretation and assessment.
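
    The Kalman component of such a tracker can be sketched generically as a constant-velocity filter smoothing noisy 2-D tip detections. This stands in for the paper's combined Hough-Kalman scheme (whose measurement model comes from the detected shaft line); `kalman_track_tip` and its tuning constants are illustrative.

```python
import numpy as np

def kalman_track_tip(measurements, dt=1.0 / 30.0, q=1e-3, r=4.0):
    """Constant-velocity Kalman filter over 2-D pixel detections."""
    F = np.eye(4); F[0, 2] = F[1, 3] = dt            # state: [x, y, vx, vy]
    H = np.zeros((2, 4)); H[0, 0] = H[1, 1] = 1.0    # we observe (x, y)
    Q, R = q * np.eye(4), r * np.eye(2)
    x, P = np.zeros(4), 100.0 * np.eye(4)
    track = []
    for z in measurements:
        x, P = F @ x, F @ P @ F.T + Q                # predict
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R) # Kalman gain
        x = x + K @ (np.asarray(z, float) - H @ x)   # update with detection
        P = (np.eye(4) - K @ H) @ P
        track.append(x[:2].copy())
    return np.array(track)

smoothed = kalman_track_tip([(320, 240), (322, 243), (325, 247), (329, 252)])
```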

  11. Autonomous optical navigation using nanosatellite-class instruments: a Mars approach case study

    Enright, John; Jovanovic, Ilija; Kazemi, Laila; Zhang, Harry; Dzamba, Tom

    2018-02-01

    This paper examines the effectiveness of small star trackers for orbital estimation. Autonomous optical navigation has been used for some time to provide local estimates of orbital parameters during close approach to celestial bodies. These techniques have been used extensively on spacecraft dating back to the Voyager missions, but often rely on long exposures and large instrument apertures. Using a hyperbolic Mars approach as a reference mission, we present an EKF-based navigation filter suitable for nanosatellite missions. Observations of Mars and its moons allow the estimator to correct initial errors in both position and velocity. Our results show that nanosatellite-class star trackers can produce good-quality navigation solutions with low position (<300 m) and velocity (<0.15 m/s) errors as the spacecraft approaches periapse.

  12. An automated approach for finding variable-constant pairing bugs

    Lawall, Julia; Lo, David

    2010-01-01

    program-analysis and data-mining based approach to identify the uses of named constants and to identify anomalies in these uses. We have applied our approach to a recent version of the Linux kernel and have found a number of bugs affecting both correctness and software maintenance. Many of these bugs have been validated by the Linux developers.

  13. In search of control variables : A systems approach

    Dalenoort, GJ

    1997-01-01

    Motor processes cannot be modeled by a single (unified) model. Instead, a number of models at different levels of description are needed. The concepts of control and control variable only make sense at the functional level. A clear distinction must be made between external models and internal

  14. A Systematic Review of Statistical Methods Used to Test for Reliability of Medical Instruments Measuring Continuous Variables

    Rafdzah Zaki

    2013-06-01

    Objective(s): Reliability measures precision, or the extent to which test results can be replicated. This is the first systematic review to identify statistical methods used to measure the reliability of equipment measuring continuous variables. This study also aims to highlight inappropriate statistical methods used in reliability analyses and their implications for medical practice. Materials and Methods: In 2010, five electronic databases were searched for reliability studies published between 2007 and 2009. A total of 5,795 titles were initially identified. Only 282 titles were potentially related, and 42 finally fitted the inclusion criteria. Results: The Intra-class Correlation Coefficient (ICC) is the most popular method, used by 25 (60%) studies, followed by comparing means (8, or 19%). Of the 25 studies using the ICC, only 7 (28%) reported the confidence intervals and the type of ICC used. Most studies (71%) also tested the agreement of instruments. Conclusion: This study finds that the Intra-class Correlation Coefficient is the most popular method used to assess the reliability of medical instruments measuring continuous outcomes. There are also inappropriate applications and interpretations of statistical methods in some studies. It is important for medical researchers to be aware of this issue and to be able to correctly perform analyses in reliability studies.
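
    For reference, one widely used variant, ICC(2,1) in the Shrout and Fleiss scheme (two-way random effects, absolute agreement, single measurement), can be computed directly from the ANOVA mean squares; a minimal sketch follows (reporting the ICC type and its confidence interval, as the review recommends, is left to a full analysis).

```python
import numpy as np

def icc_2_1(ratings):
    """ICC(2,1): two-way random effects, absolute agreement, single
    measurement. `ratings` is an n-subjects x k-raters array."""
    n, k = ratings.shape
    grand = ratings.mean()
    ms_r = k * ((ratings.mean(axis=1) - grand) ** 2).sum() / (n - 1)  # subjects
    ms_c = n * ((ratings.mean(axis=0) - grand) ** 2).sum() / (k - 1)  # raters
    sse = ((ratings
            - ratings.mean(axis=1, keepdims=True)
            - ratings.mean(axis=0, keepdims=True)
            + grand) ** 2).sum()
    ms_e = sse / ((n - 1) * (k - 1))
    return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)
```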

  15. The Formation of Instruments of Management of Industrial Enterprises According to the Theoretical and Functional Approaches

    Raiko Diana V.

    2018-03-01

    The article aims to substantiate, based on an analysis of theories of the firm, the basic theoretical provisions for the formation of industrial enterprise management instruments. The article determines that in these theories the subject of research is the enterprise; the object is the process of managing its potential according to the forms of business organization and the technology of partnership relations; and the goal is high financial results, stabilization of activity, and social responsibility. The publication analyses enterprise theories that determine the enterprise's essence as a socio-economic system in the following directions: technical preparation of production, economic theory and law, systems theory, and marketing-management. As a result of the research, a general set of functions has been identified: the socio-economic functions of the enterprise by groups (information-legal, production, marketing-management, and social responsibility). When building management instruments, it is suggested to take into consideration the direct and inverse relationships of the enterprise at all levels of management (micro, meso and macro). On this ground, the authors have developed provisions on the formation of instruments for managing industrial enterprises according to two approaches, the theoretical and the functional.

  16. An algebraic geometric approach to separation of variables

    Schöbel, Konrad

    2015-01-01

    Konrad Schöbel aims to lay the foundations for a consequent algebraic geometric treatment of variable separation, which is one of the oldest and most powerful methods to construct exact solutions for the fundamental equations in classical and quantum physics. The present work reveals a surprising algebraic geometric structure behind the famous list of separation coordinates, bringing together a great range of mathematics and mathematical physics, from the late 19th century theory of separation of variables to modern moduli space theory, Stasheff polytopes and operads. "I am particularly impressed by his mastery of a variety of techniques and his ability to show clearly how they interact to produce his results.”   (Jim Stasheff)   Contents The Foundation: The Algebraic Integrability Conditions The Proof of Concept: A Complete Solution for the 3-Sphere The Generalisation: A Solution for Spheres of Arbitrary Dimension The Perspectives: Applications and Generalisations   Target Groups Scientists in the fie...

  17. Approaches for modeling within subject variability in pharmacometric count data analysis: dynamic inter-occasion variability and stochastic differential equations.

    Deng, Chenhui; Plan, Elodie L; Karlsson, Mats O

    2016-06-01

    Parameter variation in pharmacometric analysis studies can be characterized as within subject parameter variability (WSV) in pharmacometric models. WSV has previously been successfully modeled using inter-occasion variability (IOV), but also stochastic differential equations (SDEs). In this study, two approaches, dynamic inter-occasion variability (dIOV) and adapted stochastic differential equations, were proposed to investigate WSV in pharmacometric count data analysis. These approaches were applied to published count models for seizure counts and Likert pain scores. Both approaches improved the model fits significantly. In addition, stochastic simulation and estimation were used to explore further the capability of the two approaches to diagnose and improve models where existing WSV is not recognized. The results of simulations confirmed the gain in introducing WSV as dIOV and SDEs when parameters vary randomly over time. Further, the approaches were also informative as diagnostics of model misspecification, when parameters changed systematically over time but this was not recognized in the structural model. The proposed approaches in this study offer strategies to characterize WSV and are not restricted to count data.
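
    The within-subject variability idea can be illustrated with a toy simulation in which a subject's log seizure rate drifts as a random walk across days, the discrete analogue of the SDE/dIOV notion of parameters varying randomly over time; all numbers are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(0)
n_days, sigma_w = 180, 0.05             # follow-up length, daily drift s.d.
# Log-rate starts at log(2) counts/day and drifts as a random walk.
log_lam = np.log(2.0) + np.cumsum(rng.normal(0.0, sigma_w, n_days))
counts = rng.poisson(np.exp(log_lam))   # observed daily seizure counts
```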

18. An Instrumental Variable Probit (IVP) Analysis on Depressed Mood in Korea: The Impact of Gender Differences and Other Socio-Economic Factors

    Lara Gitto

    2015-08-01

    Background Depression is a mental health state whose frequency has been increasing in modern societies. It imposes a great burden, because of its strong impact on people's quality of life and happiness. Depression can be reliably diagnosed and treated in primary care: if more people could get effective treatments earlier, the costs related to depression would be reversed. The aim of this study was to examine the influence of socio-economic factors and gender on depressed mood, focusing on Korea. In fact, in spite of the great amount of empirical studies carried out for other countries, few epidemiological studies have examined the socio-economic determinants of depression in Korea, and they were either limited to samples of employed women or did not control for individual health status. Moreover, the likely endogeneity of the data (i.e. the possibility of correlation between the dependent variable and the error term as a result of autocorrelation or simultaneity; here, depressed mood due to health factors that, in turn, might be caused by depression) might bias the results, so the present study proposes an empirical approach, based on instrumental variables, to deal with this problem. Methods Data for the year 2008 from the Korea National Health and Nutrition Examination Survey (KNHANES) were employed. About seven thousand people (N = 6,751, of which 43% were males and 57% females), aged from 19 to 75 years old, were included in the sample considered in the analysis. In order to take into account the possible endogeneity of some explanatory variables, two Instrumental Variables Probit (IVP) regressions were estimated; the variables for which instrumental equations were estimated were related to the participation of women in the workforce and to good health, as reported by people in the sample. Explanatory variables were related to age, gender, family factors (such as the number of family members and marital status) and socio

  19. An Instrumental Variable Probit (IVP) analysis on depressed mood in Korea: the impact of gender differences and other socio-economic factors.

    Gitto, Lara; Noh, Yong-Hwan; Andrés, Antonio Rodríguez

    2015-04-16

    Depression is a mental health state whose frequency has been increasing in modern societies. It imposes a great burden, because of the strong impact on people's quality of life and happiness. Depression can be reliably diagnosed and treated in primary care: if more people could get effective treatments earlier, the costs related to depression would be reversed. The aim of this study was to examine the influence of socio-economic factors and gender on depressed mood, focusing on Korea. In fact, in spite of the great amount of empirical studies carried out for other countries, few epidemiological studies have examined the socio-economic determinants of depression in Korea, and they were either limited to samples of employed women or did not control for individual health status. Moreover, as the likely data endogeneity (i.e. the possibility of correlation between the dependent variable and the error term as a result of autocorrelation or simultaneity, such as, in this case, the depressed mood due to health factors that, in turn, might be caused by depression) might bias the results, the present study proposes an empirical approach, based on instrumental variables, to deal with this problem. Data for the year 2008 from the Korea National Health and Nutrition Examination Survey (KNHANES) were employed. About seven thousand people (N = 6,751, of which 43% were males and 57% females), aged from 19 to 75 years old, were included in the sample considered in the analysis. In order to take into account the possible endogeneity of some explanatory variables, two Instrumental Variables Probit (IVP) regressions were estimated; the variables for which instrumental equations were estimated were related to the participation of women in the workforce and to good health, as reported by people in the sample. Explanatory variables were related to age, gender, family factors (such as the number of family members and marital status) and socio-economic factors (such as education

  20. An integrated approach for integrated intelligent instrumentation and control system (I3CS)

    Jung, C.H.; Kim, J.T.; Kwon, K.C.

    1997-01-01

    To guarantee public safety, nuclear power plants should be designed to reduce the operator intervention that results in operating human errors, to identify the process states in transients, and to aid decision-making and guide operator actions. For this purpose, the MMIS (Man-Machine Interface System) in NPPs should follow an integrated top-down approach tightly focused on function-based task analysis, including advanced digital technology, operator support functions, and so on. The advanced I and C research team at KAERI has embarked on developing an Integrated Intelligent Instrumentation and Control System (I3CS) for Korea's next-generation nuclear power plants. I3CS bases the integrated top-down approach on function-based task analysis, modern digital technology, standardization and simplification, availability and reliability, and protection of investment. (author). 4 refs, 6 figs

  1. An integrated approach for integrated intelligent instrumentation and control system (I³CS)

    Jung, C H; Kim, J T; Kwon, K C [Korea Atomic Energy Research Inst., Yusong, Taejon (Korea, Republic of)

    1997-07-01

    To guarantee public safety, nuclear power plants should be designed to reduce the operator intervention that results in operating human errors, to identify the process states in transients, and to aid decision-making and guide operator actions. For this purpose, the MMIS (Man-Machine Interface System) in NPPs should follow an integrated top-down approach tightly focused on function-based task analysis, including advanced digital technology, operator support functions, and so on. The advanced I and C research team at KAERI has embarked on developing an Integrated Intelligent Instrumentation and Control System (I³CS) for Korea's next-generation nuclear power plants. I³CS bases the integrated top-down approach on function-based task analysis, modern digital technology, standardization and simplification, availability and reliability, and protection of investment. (author). 4 refs, 6 figs.

  2. College quality and hourly wages: evidence from the self-revelation model, sibling models and instrumental variables.

    Borgen, Nicolai T

    2014-11-01

    This paper addresses the recent discussion on confounding in the returns to college quality literature using the Norwegian case. The main advantage of studying Norway is the quality of the data. Norwegian administrative data provide information on college applications, family relations and a rich set of control variables for all Norwegian citizens applying to college between 1997 and 2004 (N = 141,319) and their succeeding wages between 2003 and 2010 (676,079 person-year observations). With these data, this paper uses a subset of the models that have rendered mixed findings in the literature in order to investigate to what extent confounding biases the returns to college quality. I compare estimates obtained using standard regression models to estimates obtained using the self-revelation model of Dale and Krueger (2002), a sibling fixed effects model and the instrumental variable model used by Long (2008). Using these methods, I consistently find increasing returns to college quality over the course of students' work careers, with positive returns only later in students' work careers. I conclude that the standard regression estimate provides a reasonable estimate of the returns to college quality.

  3. The effects of competition on premiums: using United Healthcare's 2015 entry into Affordable Care Act's marketplaces as an instrumental variable.

    Agirdas, Cagdas; Krebs, Robert J; Yano, Masato

    2018-01-08

    One goal of the Affordable Care Act is to increase insurance coverage by improving competition and lowering premiums. To facilitate this goal, the federal government enacted online marketplaces in the 395 rating areas spanning 34 states that chose not to establish their own state-run marketplaces. The few multivariate regression studies analyzing the effects of competition on premiums suffer from endogeneity, due to simultaneity and omitted-variable biases. However, United Healthcare's decision to enter these marketplaces in 2015 provides the researcher with an opportunity to address this endogeneity problem. Exploiting the variation caused by United Healthcare's entry decision as an instrument for competition, we study the impact of competition on premiums during the first 2 years of these marketplaces. Combining panel data from five different sources and controlling for 12 variables, we find that one more insurer in a rating area leads to a 6.97% reduction in the second-lowest-priced silver plan premium, which is larger than the estimated effects in the existing literature. Furthermore, we run a threshold analysis and find that competition's effects on premiums become statistically insignificant if there are four or more insurers in a rating area. These findings are robust to alternative measures of premiums, inclusion of a non-linear term in the regression models and a county-level analysis.

  4. A Variable Flow Modelling Approach To Military End Strength Planning

    2016-12-01

    function. The MLRPS is more complex than the variable flow model as it has to cater for a force structure that is much larger than just the MT branch...essential positions in a Ship’s complement, or by the biggest current deficit in forecast end strength. The model can be adjusted to cater for any of these...is unlikely that the RAN will be able to cater for such an increase in hires, so this scenario is not likely to solve their problem. Each transition

  5. Thermodynamic approach to the inelastic state variable theories

    Dashner, P.A.

    1978-06-01

    A continuum model is proposed as a theoretical foundation for the inelastic state variable theory of Hart. The model is based on the existence of a free energy function and the assumption that a strained material element recalls two other local configurations which are, in some specified manner, descriptive of prior deformation. A precise formulation of these material hypotheses within the classical thermodynamical framework leads to the recovery of a generalized elastic law and the specification of evolutionary laws for the remembered configurations which are frame invariant and formally valid for finite strains. Moreover, the precise structure of Hart's theory is recovered when strains are assumed to be small.

  6. Variability in personality expression across contexts: a social network approach.

    Clifton, Allan

    2014-04-01

    The current research investigated how the contextual expression of personality differs across interpersonal relationships. Two related studies were conducted with college samples (Study 1: N = 52, 38 female; Study 2: N = 111, 72 female). Participants in each study completed a five-factor measure of personality and constructed a social network detailing their 30 most important relationships. Participants used a brief Five-Factor Model scale to rate their personality as they experience it when with each person in their social network. Multiple informants selected from each social network then rated the target participant's personality (Study 1: N = 227, Study 2: N = 777). Contextual personality ratings demonstrated incremental validity beyond standard global self-report in predicting specific informants' perceptions. Variability in these contextualized personality ratings was predicted by the position of the other individuals within the social network. Across both studies, participants reported being more extraverted and neurotic, and less conscientious, with more central members of their social networks. Dyadic social network-based assessments of personality provide incremental validity in understanding personality, revealing dynamic patterns of personality variability unobservable with standard assessment techniques. © 2013 Wiley Periodicals, Inc.

  7. New variable separation approach: application to nonlinear diffusion equations

    Zhang Shunli; Lou, S Y; Qu Changzheng

    2003-01-01

    The concept of the derivative-dependent functional separable solution (DDFSS), as a generalization of the functional separable solution, is proposed. As an application, it is used to discuss the generalized nonlinear diffusion equations based on the generalized conditional symmetry approach. As a consequence, a complete list of canonical forms for such equations which admit the DDFSS is obtained and some exact solutions to the resulting equations are described.
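
    As a rough orientation (generic forms suggested by the terminology, not necessarily the paper's exact ansatz), an additive functional separable solution (FSS) and its derivative-dependent generalization can be written as

        % FSS: some function of u separates into a space part and a time part
        f\bigl(u(x,t)\bigr) = \varphi(x) + \psi(t)
        % DDFSS: the separating function may also depend on the derivative u_x
        f\bigl(u(x,t), u_x(x,t)\bigr) = \varphi(x) + \psi(t)

    so the DDFSS reduces to the ordinary FSS when f is independent of u_x.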

  8. A thermodynamic approach to fatigue damage accumulation under variable loading

    Naderi, M.; Khonsari, M.M.

    2010-01-01

    We put forward a general procedure for the assessment of damage evolution based on the concept of entropy production. The procedure is applicable to both constant- and variable-amplitude loading. The results of a series of bending fatigue tests under both two-stage and three-stage loadings are reported to investigate the validity of the proposed methodology. Also presented are the results of experiments involving bending, torsion, and tension-compression fatigue tests with Al 6061-T6 and SS 304 specimens. It is shown that, within the range of parameters tested, the evolution of fatigue damage for these materials in terms of entropy production is independent of load, frequency, size, loading sequence and loading history. Furthermore, the entropy production fractions of the individual amplitudes sum to unity.
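
    The closing claim can be restated compactly (a sketch consistent with the abstract, not the authors' own notation): writing \gamma for the accumulated entropy production and \gamma_f for its value at fracture, damage evolves as

        D = \frac{\gamma}{\gamma_f}, \qquad \gamma(t) = \int_0^t \dot{s}_g \, d\tau,

    and under multi-stage loading the fractions produced at the individual amplitudes satisfy

        \sum_i \frac{\gamma_i}{\gamma_f} = 1 \quad \text{at failure},

    an entropy-based analogue of the Palmgren-Miner linear damage rule.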

  9. Characteristics of quantum open systems: free random variables approach

    Gudowska-Nowak, E.; Papp, G.; Brickmann, J.

    1998-01-01

    Random Matrix Theory provides an interesting tool for modelling a number of phenomena where noises (fluctuations) play a prominent role. Various applications range from the theory of mesoscopic systems in nuclear and atomic physics to biophysical models, like Hopfield-type models of neural networks and protein folding. Random Matrix Theory is also used to study dissipative systems with broken time-reversal invariance, providing a setup for the analysis of dynamic processes in condensed, disordered media. In the paper we use Random Matrix Theory (RMT) within the formalism of Free Random Variables (alias Blue's functions), which allows one to characterize the spectral properties of non-Hermitian "Hamiltonians". The relevance of the Blue's function method is discussed in connection with the application of non-Hermitian operators in various problems of physical chemistry. (author)

  10. Instrumental variable analysis

    Stel, Vianda S.; Dekker, Friedo W.; Zoccali, Carmine; Jager, Kitty J.

    2013-01-01

    The main advantage of the randomized controlled trial (RCT) is the random assignment of treatment that prevents selection by prognosis. Nevertheless, only few RCTs can be performed given their high cost and the difficulties in conducting such studies. Therefore, several analytical methods for

  11. Treatment of thoracolumbar burst fractures with variable screw placement or Isola instrumentation and arthrodesis: case series and literature review.

    Alvine, Gregory F; Swain, James M; Asher, Marc A; Burton, Douglas C

    2004-08-01

    The controversy of burst fracture surgical management is addressed in this retrospective case study and literature review. The series consisted of 40 consecutive patients, index included, with 41 fractures treated with stiff, limited-segment transpedicular bone-anchored instrumentation and arthrodesis from 1987 through 1994. No major acute complications such as death, paralysis, or infection occurred. For the 30 fractures with pre- and postoperative computed tomography studies, spinal canal compromise was 61% and 32%, respectively. Neurologic function improved in 7 of 14 patients (50%) and did not worsen in any. The principal problem encountered was screw breakage, which occurred in 16 of the 41 (39%) instrumented fractures. As we have previously reported, transpedicular anterior bone graft augmentation significantly decreased variable screw placement (VSP) implant breakage. However, it did not prevent Isola implant breakage in two-motion-segment constructs. Compared with VSP, Isola provided better sagittal plane realignment and constructs that have been found to be significantly stiffer. Unplanned reoperation was necessary in 9 of the 40 patients (23%). At 1- and 2-year follow-up, 95% and 79% of patients were available for study, and a satisfactory outcome was achieved in 84% and 79%, respectively. These satisfaction and reoperation rates are consistent with the literature of the time. Based on these observations and the loads to which implant constructs are exposed following posterior realignment and stabilization of burst fractures, we recommend that three- or four-motion-segment constructs, rather than two-motion constructs, be used. To save valuable motion segments, planned construct shortening can be used. An alternative is sequential or staged anterior corpectomy and structural grafting.

  12. Software design for the VIS instrument onboard the Euclid mission: a multilayer approach

    Galli, E.; Di Giorgio, A. M.; Pezzuto, S.; Liu, S. J.; Giusi, G.; Li Causi, G.; Farina, M.; Cropper, M.; Denniston, J.; Niemi, S.

    2014-07-01

    The Euclid mission scientific payload is composed of two instruments: a VISible Imaging Instrument (VIS) and a Near Infrared Spectrometer and Photometer instrument (NISP). Each instrument has its own control unit. The Instrument Command and Data Processing Unit (VI-CDPU) is the control unit of the VIS instrument. The VI-CDPU is connected directly to the spacecraft by means of a MIL-STD-1553B bus and to the satellite Mass Memory Unit via a SpaceWire link. All the internal interfaces are implemented via SpaceWire links and include 12 high-speed lines for the data provided by the 36 focal plane CCDs readout electronics (ROEs) and one link to the Power and Mechanisms Control Unit (VI-PMCU). VI-CDPU is in charge of distributing commands to the instrument sub-systems, collecting their housekeeping parameters and monitoring their health status. Moreover, the unit has the task of acquiring, reordering, compressing and transferring the science data to the satellite Mass Memory. This last feature is probably the most challenging one for the VI-CDPU, since stringent constraints about the minimum lossless compression ratio, the maximum time for the compression execution and the maximum power consumption have to be satisfied. Therefore, an accurate performance analysis at the hardware layer is necessary, which could unduly delay the design and development of the software. In order to mitigate this risk, in the multilayered design of the software we decided to design a middleware layer that provides a set of APIs with the aim of hiding the implementation of the hardware-connected layer from the application one. The middleware is built on top of the Operating System layer (which includes the Real-Time OS that will be adopted) and the onboard Computer Hardware. The middleware itself has a multi-layer architecture composed of 4 layers: the Abstract RTOS Adapter Layer (AOSAL), the Specific RTOS Adapter Layer (SOSAL), the Common Patterns Layer (CPL), the Service Layer composed of two subgroups which

  13. Innovative approach to implementation of FPGA-based NPP instrumentation and control systems

    Andrashov, Anton; Kharchenko, Vyacheslav; Sklyar, Volodymir; Siora, Alexander

    2011-01-01

    Advantages of the application of Field Programmable Gate Array (FPGA) technology for the implementation of Instrumentation and Control (I and C) systems for Nuclear Power Plants (NPPs) are outlined. Specific features of FPGA technology in the context of cyber security threats for NPP I and C systems are analyzed. A description of an FPGA-based platform used for the implementation of different safety I and C systems for NPPs is presented. The typical architecture of an NPP safety I and C system based on the platform, as well as an approach to implementing I and C systems using the FPGA-based platform, are discussed. Data on experience from applying the platform in NPP safety I and C system modernization projects conclude the paper. (author)

  14. An automated calibration laboratory for flight research instrumentation: Requirements and a proposed design approach

    Oneill-Rood, Nora; Glover, Richard D.

    1990-01-01

    NASA's Dryden Flight Research Facility (Ames-Dryden) operates a diverse fleet of research aircraft which are heavily instrumented to provide both real-time data for in-flight monitoring and recorded data for postflight analysis. Ames-Dryden's existing automated calibration (AUTOCAL) laboratory is a computerized facility which tests aircraft sensors to certify accuracy for anticipated harsh flight environments. Recently, a major AUTOCAL lab upgrade was initiated; the goal of this modernization is to enhance productivity and improve configuration management for both software and test data. The new system will have multiple testing stations employing distributed processing linked by a local area network to a centralized database. The baseline requirements for the new AUTOCAL lab and the design approach being taken for its mechanization are described.

  15. Semiparametric methods for estimation of a nonlinear exposure-outcome relationship using instrumental variables with application to Mendelian randomization

    Staley, James R.

    2017-01-01

    Mendelian randomization, the use of genetic variants as instrumental variables (IVs), can test for and estimate the causal effect of an exposure on an outcome. Most IV methods assume that the function relating the exposure to the expected value of the outcome (the exposure-outcome relationship) is linear. However, in practice, this assumption may not hold. Indeed, often the primary question of interest is to assess the shape of this relationship. We present two novel IV methods for investigating the shape of the exposure-outcome relationship: a fractional polynomial method and a piecewise linear method. We divide the population into strata using the exposure distribution, and estimate a causal effect, referred to as a localized average causal effect (LACE), in each stratum of the population. The fractional polynomial method performs meta-regression on these LACE estimates. The piecewise linear method estimates a continuous piecewise linear function, the gradient of which is the LACE estimate in each stratum. Both methods were demonstrated in a simulation study to estimate the true exposure-outcome relationship well, particularly when the relationship was a fractional polynomial (for the fractional polynomial method) or was piecewise linear (for the piecewise linear method). The methods were used to investigate the shape of the relationship of body mass index with systolic blood pressure and diastolic blood pressure. PMID:28317167
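
    To make the stratification idea concrete, the stratum-specific step can be sketched in Python (a deliberately simplified toy, not the published code; real analyses stratify more carefully and then meta-regress the stratum estimates):

        import numpy as np

        def stratified_lace(g, x, y, n_strata=5):
            """Stratum-specific IV (Wald ratio) estimates: within each
            stratum, cov(outcome, instrument) / cov(exposure, instrument)
            approximates the localized average causal effect (LACE).
            Stratifying on the IV-free exposure (exposure minus its
            instrument-explained component) avoids inducing a correlation
            between the strata and the instrument."""
            slope = np.cov(x, g, bias=True)[0, 1] / np.var(g)
            x_free = x - slope * g
            edges = np.quantile(x_free, np.linspace(0, 1, n_strata + 1))
            laces = []
            for lo, hi in zip(edges[:-1], edges[1:]):
                m = (x_free >= lo) & (x_free <= hi)
                laces.append(np.cov(y[m], g[m])[0, 1] / np.cov(x[m], g[m])[0, 1])
            return np.array(laces)  # one LACE per stratum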

  16. Infectious complications in head and neck cancer patients treated with cetuximab: propensity score and instrumental variable analysis.

    Ching-Chih Lee

    Full Text Available BACKGROUND: To compare the infection rates between cetuximab-treated patients with head and neck cancers (HNC) and untreated patients. METHODOLOGY: A national cohort of 1083 HNC patients identified in 2010 from the Taiwan National Health Insurance Research Database was established. After patients were followed for one year, propensity score analysis and instrumental variable analysis were performed to assess the association between cetuximab therapy and the infection rates. RESULTS: HNC patients receiving cetuximab (n = 158) were older, had lower SES, and resided more frequently in rural areas as compared to those without cetuximab therapy. In total, 125 patients presented infections: 32 (20.3%) in the group using cetuximab and 93 (10.1%) in the group not using it. The propensity score analysis revealed a 2.3-fold (adjusted odds ratio [OR] = 2.27; 95% CI, 1.46-3.54; P = 0.001) increased risk of infection in HNC patients treated with cetuximab. However, using instrumental variable analysis, the average treatment effect of cetuximab was not statistically associated with an increased risk of infection (OR, 0.87; 95% CI, 0.61-1.14). CONCLUSIONS: Cetuximab therapy was not statistically associated with the infection rate in HNC patients. However, older HNC patients using cetuximab may incur an infection rate of up to 33% during one year. Particular attention should be given to older HNC patients treated with cetuximab.
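
    For orientation, the propensity-score step of such an analysis can be sketched as follows (a toy with fabricated inputs, using inverse-probability-of-treatment weighting as one common way to use the score; the study's exact adjustment may differ):

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        def iptw_odds_ratio(X, treated, infected):
            """Model Pr(treatment | covariates), weight each patient by the
            inverse probability of the treatment actually received, and
            compare weighted infection odds between the two groups."""
            ps = LogisticRegression(max_iter=1000).fit(X, treated).predict_proba(X)[:, 1]
            w = np.where(treated == 1, 1.0 / ps, 1.0 / (1.0 - ps))
            p1 = np.average(infected[treated == 1], weights=w[treated == 1])
            p0 = np.average(infected[treated == 0], weights=w[treated == 0])
            return (p1 / (1 - p1)) / (p0 / (1 - p0))  # weighted odds ratio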

  17. Instrumental variable estimation of the causal effect of plasma 25-hydroxy-vitamin D on colorectal cancer risk: a mendelian randomization analysis.

    Evropi Theodoratou

    Full Text Available Vitamin D deficiency has been associated with several common diseases, including cancer, and is being investigated as a possible risk factor for these conditions. We reported the striking prevalence of vitamin D deficiency in Scotland. Previous epidemiological studies have reported an association between low dietary vitamin D and colorectal cancer (CRC). Using a case-control study design, we tested the association between plasma 25-hydroxy-vitamin D (25-OHD) and CRC (2,001 cases, 2,237 controls). To determine whether plasma 25-OHD levels are causally linked to CRC risk, we applied the control function instrumental variable (IV) method of the Mendelian randomization (MR) approach using four single nucleotide polymorphisms (rs2282679, rs12785878, rs10741657, rs6013897) previously shown to be associated with plasma 25-OHD. Low plasma 25-OHD levels were associated with CRC risk in the crude model (odds ratio (OR): 0.76, 95% confidence interval (CI): 0.71, 0.81, p: 1.4×10⁻¹⁴) and after adjusting for age, sex and other confounding factors. Using an allele score that combined all four SNPs as the IV, the estimated causal effect was OR 1.16 (95% CI 0.60, 2.23), whilst it was 0.94 (95% CI 0.46, 1.91) and 0.93 (95% CI 0.53, 1.63) when using an upstream (rs12785878, rs10741657) and a downstream (rs2282679, rs6013897) allele score, respectively. 25-OHD levels were inversely associated with CRC risk, in agreement with recent meta-analyses. The fact that this finding was not replicated when the MR approach was employed might be due to weak instruments, giving low power to demonstrate an effect (<0.35). The prevalence and degree of vitamin D deficiency amongst individuals living in northerly latitudes is of considerable importance because of its relationship to disease. To elucidate the effect of vitamin D on CRC risk, additional large studies of vitamin D and CRC risk are required and/or the application of alternative methods that are less sensitive to weak instruments.
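
    The control function IV method named above can be sketched for the linear case (an illustration of the two-step logic only, with fabricated names; the study's actual implementation handles a binary outcome and an allele score built from the four SNPs):

        import numpy as np

        def control_function_iv(allele_score, exposure, outcome):
            """Two-step control function: (1) regress the exposure (e.g.,
            plasma 25-OHD) on the allele score; (2) include the first-stage
            residual as an extra covariate in the outcome model, where it
            absorbs the unobserved confounding. The coefficient on the
            exposure is then the IV estimate."""
            n = len(outcome)
            const = np.ones(n)
            Z = np.column_stack([const, allele_score])
            gamma, *_ = np.linalg.lstsq(Z, exposure, rcond=None)
            resid = exposure - Z @ gamma
            X = np.column_stack([const, exposure, resid])
            beta, *_ = np.linalg.lstsq(X, outcome, rcond=None)
            return beta[1]  # causal effect estimate under the CF assumptions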

  18. Assessment of the quality and variability of health information on chronic pain websites using the DISCERN instrument

    Buckley Norman

    2010-10-01

    Full Text Available Abstract Background The Internet is used increasingly by providers as a tool for disseminating pain-related health information and by patients as a resource about health conditions and treatment options. However, health information on the Internet remains unregulated and varies in quality, accuracy and readability. The objective of this study was to determine the quality of pain websites and to explain the variability in quality and readability between pain websites. Methods Five key terms (pain, chronic pain, back pain, arthritis, and fibromyalgia) were entered into the Google, Yahoo and MSN search engines. Websites were assessed using the DISCERN instrument as a quality index. Grade-level readability ratings were assessed using the Flesch-Kincaid Readability Algorithm. Univariate (alpha = 0.20) and multivariable regression (alpha = 0.05) analyses were used to explain the variability in DISCERN scores and grade-level readability using potential for commercial gain, health-related seals of approval, language(s), and multimedia features as independent variables. Results A total of 300 websites were assessed; 21 were excluded in accordance with the exclusion criteria and 110 were duplicates, leaving 161 unique sites. About 6.8% (11/161) of the websites offered patients commercial products for their pain condition, 36.0% (58/161) had a health-related seal of approval, 75.8% (122/161) presented information in English only and 40.4% (65/161) offered an interactive multimedia experience. In assessing the quality of the unique websites, of a maximum score of 80, the overall average DISCERN score was 55.9 (13.6) and the average readability (grade level) was 10.9 (3.9). The multivariable regressions demonstrated that website seals of approval (P = 0.015) and potential for commercial gain (P = 0.189) were contributing factors to higher DISCERN scores, while seals of approval (P = 0.168) and interactive multimedia (P = 0.244) contributed to
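
    The Flesch-Kincaid grade level used above is a simple closed-form formula, so a minimal implementation is easy to sketch (the syllable counter below is a crude vowel-group heuristic; published tools use better ones):

        import re

        def flesch_kincaid_grade(text):
            """Flesch-Kincaid grade = 0.39*(words/sentences)
            + 11.8*(syllables/words) - 15.59."""
            sentences = max(1, len(re.findall(r"[.!?]+", text)))
            words = re.findall(r"[A-Za-z']+", text)
            syllables = sum(max(1, len(re.findall(r"[aeiouy]+", w.lower())))
                            for w in words)
            return 0.39 * (len(words) / sentences) + 11.8 * (syllables / len(words)) - 15.59

        print(round(flesch_kincaid_grade("Health information on the Internet varies in quality."), 1))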

  1. A socio-cultural instrumental approach to emotion regulation: Culture and the regulation of positive emotions.

    Ma, Xiaoming; Tamir, Maya; Miyamoto, Yuri

    2018-02-01

    We propose a sociocultural instrumental approach to emotion regulation. According to this approach, cultural differences in the tendency to savor rather than dampen positive emotions should be more pronounced when people are actively pursuing goals (i.e., contexts requiring higher cognitive effort) than when they are not (i.e., contexts requiring lower cognitive efforts), because cultural beliefs about the utility of positive emotions should become most relevant when people are engaging in active goal pursuit. Four studies provided support for our theory. First, European Americans perceived more utility and less harm of positive emotions than Japanese did (Study 1). Second, European Americans reported a stronger relative preference for positive emotions than Asians, but this cultural difference was larger in high cognitive effort contexts than in moderate or low cognitive effort contexts (Study 2). Third, European Americans reported trying to savor rather than dampen positive emotions more than Asians did when preparing to take an exam, a typical high cognitive effort context (Studies 3-4), but these cultural differences were attenuated when an exam was not expected (Study 3) and disappeared when participants expected to interact with a stranger (Study 4). These findings suggest that cultural backgrounds and situational demands interact to shape how people regulate positive emotions. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  2. Instrumental variable analysis as a complementary analysis in studies of adverse effects : venous thromboembolism and second-generation versus third-generation oral contraceptives

    Boef, Anna G C; Souverein, Patrick C; Vandenbroucke, Jan P; van Hylckama Vlieg, Astrid; de Boer, Anthonius; le Cessie, Saskia; Dekkers, Olaf M

    2016-01-01

    PURPOSE: A potentially useful role for instrumental variable (IV) analysis may be as a complementary analysis to assess the presence of confounding when studying adverse drug effects. There has been discussion on whether the observed increased risk of venous thromboembolism (VTE) for

  3. On the Integrity of Online Testing for Introductory Statistics Courses: A Latent Variable Approach

    Alan Fask

    2015-04-01

    Full Text Available There has been a remarkable growth in distance learning courses in higher education. Despite indications that distance learning courses are more vulnerable to cheating behavior than traditional courses, there has been little research studying whether online exams facilitate a relatively greater level of cheating. This article examines this issue by developing an approach using a latent variable to measure student cheating. This latent variable is linked both to known variables related to student mastery and to variables unrelated to it. Grade scores from a proctored final exam and an unproctored final exam are used to test for increased cheating behavior in the unproctored exam.

  4. The Matrix model, a driven state variables approach to non-equilibrium thermodynamics

    Jongschaap, R.J.J.

    2001-01-01

    One of the new approaches in non-equilibrium thermodynamics is the so-called matrix model of Jongschaap. In this paper some features of this model are discussed. We indicate the differences from the more common approach based upon internal variables and from the more sophisticated Hamiltonian and GENERIC approaches.

  5. MODERN TENDENCIES OF USING INTEGRATIVE APPROACH TO ORGANIZING ART TEACHERS’ INSTRUMENTAL AND PERFORMING TRAINING

    Zhanna Kartashova

    2016-04-01

    Full Text Available In the article, modern tendencies in using an integrative approach to organizing art teachers' instrumental and performing training are considered. The concept of "integration" is singled out and defined as the process of recovery, replenishment, and combination of previously isolated parts, moving the system toward a greater organic integrity. Integration means considering the multidimensionality of the features of the elements being integrated: quantitative features accumulate until a new quality emerges, while the individual features of the integrated elements are preserved. It is argued that, in this context, integration is the interrelation of varieties of art in the process of their aesthetic and educational impact on pupils: the whole perception of a work (its content and its emotional, rational, ethical and aesthetic form) in the unity of the tasks of developing pupils' artistic and aesthetic senses, thoughts, tastes and ideals. Integration in art pedagogy is held at three levels: internal artistic and aesthetic synthesis of various arts organically combined with students' creative activity; interdisciplinary humanistic synthesis of the arts, the native language, literature, and folklore; and the search for semantic blocks, images, and concepts that have universal meaning and that, entering all spheres of human consciousness, including science and mathematics, seamlessly combine them into a coherent system. It is noted that the most efficient approach is an appeal to the learning subjects of the humanities cycle: music, literature and art. It is concluded that the design of training should start from an analysis of the prospective art teacher's activity: what the teacher has to do should be understood not in a general formulation, but at the level of actions and operations.

  6. MUG-OBS - Multiparameter Geophysical Ocean Bottom System : a new instrumental approach to monitor earthquakes.

    Hello, Yann; Charvis, Philippe; Yegikyan, Manuk; Verfaillie, Romain; Rivet, Diane

    2016-04-01

    Real-time monitoring of seismic activity is a major issue for early warning of earthquakes and tsunamis. It can be done using regional-scale wired nodes, such as Neptune in Canada and in the U.S., or DONET in Japan. Another approach to monitoring seismic activity at sea is to deploy OBS arrays repeatedly, as during the amphibious Cascadia Initiative (four successive 1-year deployments), the Japanese Pacific Array (broadband OBSs, "ocean-bottom broadband dispersion survey", with 2-year autonomy), the Obsismer program in the French Lesser Antilles (eight successive 6-month deployments) and the Osisec program in Ecuador (four successive 6-month deployments). These autonomous OBSs are self-recovered or recovered using an ROV. These systems are costly, including ship time, and require recovering the OBS before work on the data can start. Among the most recent alternatives, we developed an ocean bottom system with 3-4 years of autonomy and 9 channels allowing the acquisition of different seismic or environmental parameters. MUG-OBS is a free-falling instrument rated down to 6000 m. The installation of the sensor is monitored by acoustic commands from the surface, and a health bulletin with data checking is recovered acoustically during the installation. The major innovation is that it is possible to recover the data at any time on demand (regularly every 6 months or after a crisis) using one of the 6 data shuttles released from the surface by acoustic command, using a one-day fast cruise on a boat of opportunity. Since the sensors stay at the same location for 3 years, it is a perfect tool to monitor large seismic events, background seismic activity and aftershock distribution. Clock drift measurement and GPS localization are automatic when the shuttle reaches the surface. For remote areas, shuttles are released automatically and a bulletin of seismic events is transmitted. Selected data can be recovered by two-way Iridium satellite communication. After a period of 3 years the main station is self-recovered.

  7. Characterization of Whole Grain Pasta: Integrating Physical, Chemical, Molecular, and Instrumental Sensory Approaches.

    Marti, Alessandra; Cattaneo, Stefano; Benedetti, Simona; Buratti, Susanna; Abbasi Parizad, Parisa; Masotti, Fabio; Iametti, Stefania; Pagani, Maria Ambrogina

    2017-11-01

    The consumption of whole-grain foods, including pasta, has been increasing steadily. In the case of whole-grain pasta, given the many different producers, it seems important to have some objective parameters to define its overall quality. In this study, commercial whole-grain pasta samples representative of the Italian market have been characterized from both the molecular and the electronic-senses (electronic nose and electronic tongue) standpoints in order to provide a survey of the properties of different commercial samples. Only 1 pasta product showed very low levels of heat-damage markers (furosine and pyrraline), suggesting that this sample underwent a low-temperature drying treatment. In all samples, the furosine content was directly correlated to protein structural indices, since protein structure compactness increased with increasing levels of heat-damage markers. The electronic senses were able to discriminate among pasta samples according to the intensity of heat treatment during the drying step. The pasta sample with low furosine content was discriminated by umami taste and by sensors responding to aliphatic and inorganic compounds. Data obtained with this multidisciplinary approach are meant to provide hints for identifying useful indices of pasta quality. As observed for semolina pasta, objective parameters based on heat damage were best suited to define the overall quality of whole-grain pasta, almost independently of compositional differences among commercial samples. Drying treatments of different intensity also had an impact on instrumental sensory traits, which may provide a reliable alternative to the analytical determination of chemical markers of heat damage whenever time-consuming procedures must be avoided. © 2017 Institute of Food Technologists®.

  8. Improved installation approach for variable spring setting on a pipe yet to be insulated

    Shah, H.H.; Chitnis, S.S.; Rencher, D.

    1993-01-01

    This paper provides an approach to setting variable spring supports for non-insulated or partially insulated piping systems so that resetting these supports is not required once the insulation is fully installed. The approach shows a method of deriving spring cold-load setting tolerance values that can be readily utilized by craft personnel. The method is based on the percentage of the weight of the insulation relative to the total weight of the pipe, together with the applicable tolerance. Use of these setting tolerances eliminates reverification of the original cold-load settings for the majority of variable springs when the insulation is fully installed.

  9. Effect of corruption on healthcare satisfaction in post-soviet nations: A cross-country instrumental variable analysis of twelve countries.

    Habibov, Nazim

    2016-03-01

    There is a lack of consensus about the effect of corruption on healthcare satisfaction in transitional countries. Interpreting the burgeoning literature on this topic has proven difficult due to reverse causality and omitted-variable bias. In this study, the effect of corruption on healthcare satisfaction is investigated in a set of 12 post-socialist countries using instrumental variable regression on a sample from the 2010 Life in Transition survey (N = 8655). The results indicate that experiencing corruption significantly reduces healthcare satisfaction. Copyright © 2016 The Author. Published by Elsevier Ltd. All rights reserved.

  10. Fair Value Accounting for Financial Instruments – Conceptual Approach and Implications

    Dumitru MATIS; Carmen Giorgiana BONACI

    2008-01-01

    This study complements the growing literature on the value relevance of fair value by examining the validity of the hypothesis that fair value is more informative than historical cost as a financial reporting standard for financial instruments. We therefore compare the relative explanatory power of fair value and historical cost in explaining equity values. In order to reflect fair value's role in offering a fair view where financial instruments are concerned, we briefly reviewed capital mar...

  11. Emerging adulthood features and criteria for adulthood : Variable- and person-centered approaches

    Tagliabue, Semira; Crocetti, Elisabetta; Lanz, Margherita

    Reaching adulthood is the aim of the transition to adulthood; however, emerging adults define both adulthood and the transitional period they are living through in different ways. Variable-centered and person-centered approaches were integrated in the present paper to investigate if the criteria used to define

  12. Cognitive Preconditions of Early Reading and Spelling: A Latent-Variable Approach with Longitudinal Data

    Preßler, Anna-Lena; Könen, Tanja; Hasselhorn, Marcus; Krajewski, Kristin

    2014-01-01

    The aim of the present study was to empirically disentangle the interdependencies of the impact of nonverbal intelligence, working memory capacities, and phonological processing skills on early reading decoding and spelling within a latent variable approach. In a sample of 127 children, these cognitive preconditions were assessed before the onset…

  13. Confidence Intervals for a Semiparametric Approach to Modeling Nonlinear Relations among Latent Variables

    Pek, Jolynn; Losardo, Diane; Bauer, Daniel J.

    2011-01-01

    Compared to parametric models, nonparametric and semiparametric approaches to modeling nonlinearity between latent variables have the advantage of recovering global relationships of unknown functional form. Bauer (2005) proposed an indirect application of finite mixtures of structural equation models where latent components are estimated in the…

  14. The Relationship between Executive Functions and Language Abilities in Children: A Latent Variables Approach

    Kaushanskaya, Margarita; Park, Ji Sook; Gangopadhyay, Ishanti; Davidson, Meghan M.; Weismer, Susan Ellis

    2017-01-01

    Purpose: We aimed to outline the latent variables approach for measuring nonverbal executive function (EF) skills in school-age children, and to examine the relationship between nonverbal EF skills and language performance in this age group. Method: Seventy-one typically developing children, ages 8 through 11, participated in the study. Three EF…

  15. Integrating Cost as an Independent Variable Analysis with Evolutionary Acquisition - A Multiattribute Design Evaluation Approach

    2003-03-01

    within the Automated Cost Estimating Integrated Tools (ACEIT) software suite (version 5.x). With this capability, one can set cost targets or time...not allow the user to vary more than one decision variable. This limitation of the ACEIT approach thus hinders a holistic view when attempting to

  16. A labview approach to instrumentation for the TFTR bumper limiter alignment project

    Skelly, G.N.; Owens, D.K.

    1992-01-01

    This paper reports on a project recently undertaken to measure the alignment of the TFTR bumper limiter in relation to the toroidal magnetic field axis. The process involved the measurement of the toroidal magnetic field and the positions of the tiles that make up the bumper limiter. The basis for the instrument control and data acquisition system was National Instruments' LabVIEW 2. LabVIEW is a graphical programming system for developing scientific and engineering applications on a Macintosh. For this project, a Macintosh IIci controlled the IEEE-488 GPIB programmable instruments via an interface box connected to the SCSI port of the computer. With LabVIEW, users create graphical software modules called virtual instruments instead of writing conventional text-based code. To measure the magnetic field, the control system acquired data from two nuclear magnetic resonance magnetometers while the toroidal field coils were pulsed. To measure the position of the tiles on the limiter, an instrumented mechanical arm was used inside the vessel.

  17. Physical Interactions with Digital Strings - A hybrid approach to a digital keyboard instrument

    Dahlstedt, Palle

    2017-01-01

    of stopping and muting the strings at arbitrary positions. The parameters of the string model are controlled through TouchKeys multitouch sensors on each key, combined with MIDI data and acoustic signals from the digital keyboard frame, using a novel mapping. The instrument is evaluated from a performing... of control. The contributions are two-fold. First, the use of acoustic sounds from a physical keyboard for excitations and resonances results in a novel hybrid keyboard instrument in itself. Second, the digital model of "inside piano" playing, using multitouch keyboard data, allows for performance techniques...

  18. One-stage posterior approaches for treatment of thoracic spinal infection: Transforaminal and costotransversectomy, compared with anterior approach with posterior instrumentation.

    Kao, Fu-Cheng; Tsai, Tsung-Ting; Niu, Chi-Chien; Lai, Po-Liang; Chen, Lih-Huei; Chen, Wen-Jer

    2017-10-01

    Treating thoracic infective spondylodiscitis with anterior surgical approaches carries a relatively high risk of perioperative and postoperative complications. Posterior approaches have been reported to result in lower complication rates than anterior procedures, but more evidence is needed to demonstrate the safety and efficacy of 1-stage posterior approaches for treating infectious thoracic spondylodiscitis. Preoperative and postoperative clinical data of 18 patients who underwent 2 types of 1-stage posterior procedures (costotransversectomy and transforaminal thoracic interbody debridement and fusion) and of 7 patients who underwent anterior debridement and reconstruction with posterior instrumentation were retrospectively assessed. The clinical outcomes of patients treated with 1-stage posterior approaches were generally good, with good infection control, back pain relief, kyphotic angle correction, and either partial or solid union for fusion status. Furthermore, they achieved shorter surgical time, fewer postoperative complications, and shorter hospital stay than the patients who underwent anterior debridement with posterior instrumentation. The results suggest that treating thoracic spondylodiscitis with a single-stage posterior approach might prevent postoperative complications and avoid the respiratory problems associated with anterior approaches. Single-stage posterior approaches would be recommended for thoracic spine infection, especially for patients with medical comorbidities.

  19. 76 FR 16689 - Standard Instrument Approach Procedures, and Takeoff Minimums and Obstacle Departure Procedures...

    2011-03-25

    ... safe and efficient use of the navigable airspace and to promote safe flight operations under instrument... Reading, PA, Reading Rgnl/Carl A Spaatz Field, ILS OR LOC RWY 13, Amdt 1A Reading, PA, Reading Rgnl/Carl A Spaatz Field, ILS OR LOC RWY 36, Amdt 30A Reading, PA, Reading Rgnl/Carl A Spaatz Field, NDB RWY 36, Amdt...

  20. New approaches for examining associations with latent categorical variables: applications to substance abuse and aggression.

    Feingold, Alan; Tiberio, Stacey S; Capaldi, Deborah M

    2014-03-01

    Assessments of substance use behaviors often include categorical variables that are frequently related to other measures using logistic regression or chi-square analysis. When the categorical variable is latent (e.g., extracted from a latent class analysis [LCA]), classification of observations is often used to create an observed nominal variable from the latent one for use in a subsequent analysis. However, recent simulation studies have found that this classical 3-step analysis championed by the pioneers of LCA produces underestimates of the associations of latent classes with other variables. Two preferable but underused alternatives for examining such linkages, each of which is most appropriate under certain conditions, are (a) adjusted 3-step analysis, which corrects the underestimation bias of the classical approach, and (b) 1-step analysis. The purpose of this article is to dissuade researchers from conducting classical 3-step analysis and to promote the use of the 2 newer approaches, which are described and compared. In addition, the applications of these newer models, for use when the independent, the dependent, or both categorical variables are latent, are illustrated through substantive analyses relating classes of substance abusers to classes of intimate partner aggressors.
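
    The attenuation that motivates these corrections is easy to reproduce in a toy simulation (a sketch only: real LCA uses categorical indicators, and a Gaussian mixture stands in here to keep the code short):

        import numpy as np
        from sklearn.mixture import GaussianMixture

        # Two latent classes whose outcome means differ by exactly 1.0.
        rng = np.random.default_rng(1)
        n = 5000
        true_class = rng.integers(0, 2, size=n)
        indicators = rng.normal(loc=1.2 * true_class[:, None], size=(n, 3))
        outcome = 1.0 * true_class + rng.normal(size=n)

        # Classical 3-step ("classify-analyze"): assign modal classes, then
        # relate them to the outcome as if class membership were known.
        gm = GaussianMixture(n_components=2, random_state=0).fit(indicators)
        modal = gm.predict(indicators)
        if outcome[modal == 1].mean() < outcome[modal == 0].mean():
            modal = 1 - modal  # mixture labels are arbitrary; align them

        print(outcome[true_class == 1].mean() - outcome[true_class == 0].mean())  # ~1.0
        print(outcome[modal == 1].mean() - outcome[modal == 0].mean())  # attenuated (< 1.0)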

  1. A new approach to hazardous materials transportation risk analysis: decision modeling to identify critical variables.

    Clark, Renee M; Besterfield-Sacre, Mary E

    2009-03-01

    We take a novel approach to analyzing hazardous materials transportation risk in this research. Previous studies analyzed this risk from an operations research (OR) or quantitative risk assessment (QRA) perspective by minimizing or calculating risk along a transport route. Further, even though the majority of incidents occur when containers are unloaded, the research has not focused on transportation-related activities, including container loading and unloading. In this work, we developed a decision model of a hazardous materials release during unloading using actual data and an exploratory data modeling approach. Previous studies have had a theoretical perspective in terms of identifying and advancing the key variables related to this risk, and there has not been a focus on probability and statistics-based approaches for doing this. Our decision model empirically identifies the critical variables using an exploratory methodology for a large, highly categorical database involving latent class analysis (LCA), loglinear modeling, and Bayesian networking. Our model identified the most influential variables and countermeasures for two consequences of a hazmat incident, dollar loss and release quantity, and is one of the first models to do this. The most influential variables were found to be related to the failure of the container. In addition to analyzing hazmat risk, our methodology can be used to develop data-driven models for strategic decision making in other domains involving risk.

  2. A Quantitative Approach to Variables Affecting Production of Short Films in Turkey

    Vedat Akman

    2011-08-01

    Full Text Available This study aims to explore the influence of various variables affecting the production of migration-themed short films in Turkey. We proceeded with our analysis using descriptive statistics to describe the main features of the sample data quantitatively. Due to the non-uniformity of the available data, we were unable to use inductive statistics. Our basic sample statistics indicated that short film producers preferred to produce short films on domestic rather than international migration themes. Gender and university seemed, on the surface, to be significant determinants of the production of migration-themed short films in Turkey. We also looked at demographic variables to provide more insight into our quantitative approach.

  3. Alternative approaches to pollution control and waste management: Regulatory and economic instruments

    Bernstein, J.D.

    1993-01-01

    The purpose of the paper is to present an overview of the most common strategies and policy instruments (that is, regulatory and economic) used in developed and developing countries to achieve pollution control and waste management objectives. Although this topic has been at the center of theoretical controversy both within and outside the World Bank, the paper is not intended to contribute to this debate. Rather, its purpose is to explore how regulatory and economic instruments are used to control air and water pollution, protect ground water, and manage solid and hazardous wastes. The paper is directed to policy makers at the national, state, and local levels of government, as well as to other parties responsible for pollution control and waste management programs

  4. Principles relating to the digital instrumentation and control design approach 2017

    2017-01-01

    The design of the instrumentation and control of nuclear facilities uses digital systems that offer increasing computation and interconnection capabilities. They enable advanced functions to be carried out, such as calculation of the critical heat flux ratio, help to detect hardware failures in real time and provide operators with rich, flexible interfaces. However, these evolved functions may be affected by faults that make their logic systematically inadequate in certain cases, which introduces sources of failure other than random hardware failures and raises questions about the informal concept of the increased 'complexity' of instrumentation and control. Appropriate design principles shall therefore be applied so that this logic is as fault-free as possible and can be assessed by an independent body such as IRSN. This document presents the main problems associated with the design of the digital instrumentation and control of a complex facility, as well as the general principles to follow to demonstrate that a satisfactory safety level has been achieved. The doctrine elements presented in this document are the result of the experience acquired during assessments carried out for the French nuclear power plants, enhanced by exchanges with experts from the nuclear sector, and reflect French practice; they apply in other sectors in which a high level of confidence can be attributed to instrumentation and control. The normative texts cited in this document provide detailed requirements that are open to considerable interpretation, as the nature of the problem posed does not enable relevant and measurable criteria to be defined in all cases. This document aims to explain the principles underlying these detailed requirements and to give the means for interpreting them in each situation. (authors)

  5. Novel Harmonic Regularization Approach for Variable Selection in Cox’s Proportional Hazards Model

    Ge-Jin Chu

    2014-01-01

    Full Text Available Variable selection is an important issue in regression, and a number of variable selection methods have been proposed involving nonconvex penalty functions. In this paper, we investigate a novel harmonic regularization method, which can approximate nonconvex Lq (1/2 < q < 1) regularization, and which can produce solutions that closely approximate those for the convex loss function and the nonconvex regularization. Simulation results based on artificial datasets and four real microarray gene expression datasets, such as the diffuse large B-cell lymphoma (DLBCL), lung cancer, and AML datasets, show that the harmonic regularization method can be more accurate for variable selection than existing Lasso-series methods.

  6. R Package multiPIM: A Causal Inference Approach to Variable Importance Analysis

    Stephan J Ritter

    2014-04-01

    Full Text Available We describe the R package multiPIM, including statistical background, functionality and user options. The package is for variable importance analysis, and is meant primarily for analyzing data from exploratory epidemiological studies, though it could certainly be applied in other areas as well. The approach taken to variable importance comes from the causal inference field, and is different from approaches taken in other R packages. By default, multiPIM uses a double robust targeted maximum likelihood estimator (TMLE of a parameter akin to the attributable risk. Several regression methods/machine learning algorithms are available for estimating the nuisance parameters of the models, including super learner, a meta-learner which combines several different algorithms into one. We describe a simulation in which the double robust TMLE is compared to the graphical computation estimator. We also provide example analyses using two data sets which are included with the package.

  7. A Comprehensive Approach Towards Optimizing the Xenon Plasma Focused Ion Beam Instrument for Semiconductor Failure Analysis Applications.

    Subramaniam, Srinivas; Huening, Jennifer; Richards, John; Johnson, Kevin

    2017-08-01

    The xenon plasma focused ion beam instrument (PFIB) holds significant promise in expanding the applications of focused ion beams in new technology thrust areas. In this paper, we have explored the operational characteristics of a Tescan FERA3 XMH PFIB instrument with the aim of meeting current and future challenges in the semiconductor industry. A two-part approach, with the first part aimed at optimizing the ion column and the second at optimizing specimen preparation, has been undertaken. Detailed studies characterizing the ion column, optimizing for high-current/high-mill-rate activities, have been described to support a better understanding of the PFIB. In addition, a novel single-crystal sacrificial mask method has been developed and implemented for use in the PFIB. Using this combined approach, we have achieved high-quality images with minimal artifacts, while retaining the shorter throughput times of the PFIB. Although the work presented in this paper has been performed on a specific instrument, the authors hope that these studies will provide general insight to direct further improvement of PFIB design and applications.

  8. On the relationship between optical variability, visual saliency, and eye fixations: a computational approach.

    Garcia-Diaz, Antón; Leborán, Víctor; Fdez-Vidal, Xosé R; Pardo, Xosé M

    2012-06-12

    A hierarchical definition of optical variability is proposed that links physical magnitudes to visual saliency and yields a more reductionist interpretation than previous approaches. This definition is shown to be grounded on the classical efficient coding hypothesis. Moreover, we propose that a major goal of contextual adaptation mechanisms is to ensure the invariance of the behavior that the contribution of an image point to optical variability elicits in the visual system. This hypothesis and the necessary assumptions are tested through the comparison with human fixations and state-of-the-art approaches to saliency in three open access eye-tracking datasets, including one devoted to images with faces, as well as in a novel experiment using hyperspectral representations of surface reflectance. The results on faces yield a significant reduction of the potential strength of semantic influences compared to previous works. The results on hyperspectral images support the assumptions to estimate optical variability. As well, the proposed approach explains quantitative results related to a visual illusion observed for images of corners, which does not involve eye movements.

  9. Non-invasive identification of organic materials in historical stringed musical instruments by reflection infrared spectroscopy: a methodological approach.

    Invernizzi, Claudia; Daveri, Alessia; Vagnini, Manuela; Malagodi, Marco

    2017-05-01

    The analysis of historical musical instruments is becoming more relevant, and interest is increasingly moving toward non-invasive reflection FTIR spectroscopy, especially for the analysis of varnishes. In this work, a specific infrared reflectance spectral library of organic compounds was created with the aim of identifying musical instrument materials in a totally non-invasive way. The analyses were carried out on pure organic compounds, as bulk samples and laboratory wooden models, to evaluate the diagnostic reflection mid-infrared (MIR) bands of proteins, polysaccharides, lipids, and resins by comparing reflection spectra before and after the Kramers-Kronig (KK) correction. This methodological approach was applied to real case studies represented by four Stradivari violins and a Neapolitan mandolin.

  10. Drugs as instruments: Describing and testing a behavioral approach to the study of neuroenhancement

    Ralf Brand

    2016-08-01

    Full Text Available Neuroenhancement (NE) is the non-medical use of psychoactive substances to produce a subjective enhancement in psychological functioning and experience. So far, empirical investigations of individuals' motivation for NE have, however, been hampered by the lack of a theoretical foundation. This study aimed to apply drug instrumentalization theory to user motivation for NE. We argue that NE should be defined and analyzed from a behavioral perspective rather than in terms of the characteristics of substances used for NE. In the empirical study we explored user behavior by analyzing relationships between drug options (use of over-the-counter products, prescription drugs, or illicit drugs) and postulated drug instrumentalization goals (e.g., improved cognitive performance, counteracting fatigue, improved social interaction). Questionnaire data from 1,438 university students were subjected to exploratory and confirmatory factor analysis to address the question of whether analysis of drug instrumentalization should be based on the assumption that users are aiming to achieve a certain goal and choose their drug accordingly, or whether NE behavior is more strongly rooted in a decision to try or use a certain drug option. We used factor mixture modeling to explore whether users could be separated into qualitatively different groups defined by a shared 'goal × drug option' configuration. Our results indicate, first, that individuals' decisions about NE are eventually based on personal attitudes to drug options (e.g., willingness to use an over-the-counter product but not to abuse prescription drugs) rather than motivated by the desire to achieve a specific goal (e.g., fighting tiredness) for which different drug options might be tried. Second, the data analyses suggested two qualitatively different classes of users. Both predominantly used over-the-counter products, but 'neuroenhancers' might be characterized by a higher propensity to instrumentalize over-the-counter products.

  11. Tackling regional health inequalities in france by resource allocation : a case for complementary instrumental and process-based approaches?

    Bellanger, Martine M; Jourdain, Alain

    2004-01-01

    This article aims to evaluate the results of two different approaches underlying the attempts to reduce health inequalities in France. In the 'instrumental' approach, resource allocation is based on an indicator to assess the well-being or the quality of life associated with healthcare provision, the argument being that additional resources would respond to needs that could then be treated quickly and efficiently. This governs the distribution of regional hospital budgets. In the second approach, health professionals and users in a given region are involved in a consensus process to define those priorities to be included in programme formulation. This 'procedural' approach is employed in the case of the regional health programmes. In this second approach, the evaluation of the results runs parallel with an analysis of the process using Rawlsian principles, whereas the first approach is based on the classical economic model.At this stage, a pragmatic analysis based on both the comparison of regional hospital budgets during the period 1992-2003 (calculated using a 'RAWP [resource allocation working party]-like' formula) and the evolution of regional health policies through the evaluation of programmes for the prevention of suicide, alcohol-related diseases and cancers provides a partial assessment of the impact of the two types of approaches, the second having a greater effect on the reduction of regional inequalities.

  12. Assessing compositional variability through graphical analysis and Bayesian statistical approaches: case studies on transgenic crops.

    Harrigan, George G; Harrison, Jay M

    2012-01-01

    New transgenic (GM) crops are subjected to extensive safety assessments that include compositional comparisons with conventional counterparts as a cornerstone of the process. The influence of germplasm, location, environment, and agronomic treatments on compositional variability is, however, often obscured in these pair-wise comparisons. Furthermore, classical statistical significance testing can often provide an incomplete and over-simplified summary of highly responsive variables such as crop composition. In order to more clearly describe the influence of the numerous sources of compositional variation, we present an introduction to two alternative but complementary approaches to data analysis and interpretation. These include i) exploratory data analysis (EDA) with its emphasis on visualization and graphics-based approaches and ii) Bayesian statistical methodology that provides easily interpretable and meaningful evaluations of data in terms of probability distributions. The EDA case studies include analyses of herbicide-tolerant GM soybean and insect-protected GM maize and soybean. Bayesian approaches are presented in an analysis of herbicide-tolerant GM soybean. Advantages of these approaches over classical frequentist significance testing include the more direct interpretation of results in terms of probabilities pertaining to quantities of interest and no confusion over the application of corrections for multiple comparisons. It is concluded that a standardized framework for these methodologies could provide specific advantages through enhanced clarity of presentation and interpretation in comparative assessments of crop composition.
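
    To illustrate the kind of direct probability statement a Bayesian evaluation affords over classical significance testing, here is a minimal sketch using a normal approximation to the posterior of a mean difference between a GM line and its conventional counterpart. The analyte values and equivalence margin below are hypothetical placeholders, not data from the study.

        import numpy as np
        from scipy import stats

        # Hypothetical analyte measurements (% dry weight); not data from the study.
        gm   = np.array([38.1, 37.6, 39.0, 38.4, 37.9])   # GM line
        conv = np.array([37.2, 38.0, 37.5, 36.9, 37.8])   # conventional counterpart

        # Under vague priors, the posterior of the mean difference is roughly
        # normal (large-sample simplification) with these moments.
        diff = gm.mean() - conv.mean()
        se = np.sqrt(gm.var(ddof=1) / len(gm) + conv.var(ddof=1) / len(conv))
        posterior = stats.norm(loc=diff, scale=se)

        delta = 1.0  # hypothetical equivalence margin (% dry weight)
        p_equiv = posterior.cdf(delta) - posterior.cdf(-delta)
        print(f"P(|mean difference| < {delta}) = {p_equiv:.2f}")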

  13. Optimal speech motor control and token-to-token variability: a Bayesian modeling approach.

    Patri, Jean-François; Diard, Julien; Perrier, Pascal

    2015-12-01

    The remarkable capacity of the speech motor system to adapt to various speech conditions is due to an excess of degrees of freedom, which enables producing similar acoustical properties with different sets of control strategies. To explain how the central nervous system selects one of the possible strategies, a common approach, in line with optimal motor control theories, is to model speech motor planning as the solution of an optimality problem based on cost functions. Despite the success of this approach, one of its drawbacks is the intrinsic contradiction between the concept of optimality and the observed experimental intra-speaker token-to-token variability. The present paper proposes an alternative approach by formulating feedforward optimal control in a probabilistic Bayesian modeling framework. This is illustrated by controlling a biomechanical model of the vocal tract for speech production and by comparing it with an existing optimal control model (GEPPETO). The essential elements of this optimal control model are presented first. From them the Bayesian model is constructed in a progressive way. Performance of the Bayesian model is evaluated based on computer simulations and compared to the optimal control model. This approach is shown to be appropriate for solving the speech planning problem while accounting for variability in a principled way.

  14. Comparing Diagnostic Accuracy of Cognitive Screening Instruments: A Weighted Comparison Approach

    A.J. Larner

    2013-03-01

    Full Text Available Background/Aims: There are many cognitive screening instruments available to clinicians when assessing patients' cognitive function, but the best way to compare the diagnostic utility of these tests is uncertain. One method is to undertake a weighted comparison which takes into account the difference in sensitivity and specificity of two tests, the relative clinical misclassification costs of true- and false-positive diagnosis, and also disease prevalence. Methods: Data were examined from four pragmatic diagnostic accuracy studies from one clinic which compared the Mini-Mental State Examination (MMSE with the Addenbrooke's Cognitive Examination-Revised (ACE-R, the Montreal Cognitive Assessment (MoCA, the Test Your Memory (TYM test, and the Mini-Mental Parkinson (MMP, respectively. Results: Weighted comparison calculations suggested a net benefit for ACE-R, MoCA, and MMP compared to MMSE, but a net loss for TYM test compared to MMSE. Conclusion: Routine incorporation of weighted comparison or other similar net benefit measures into diagnostic accuracy studies merits consideration to better inform clinicians of the relative value of cognitive screening instruments.
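
    The abstract does not give the exact weighted-comparison formula used in the study; the sketch below assumes one common net-benefit form, in which the specificity difference is weighted by the relative misclassification cost and the (1 - prevalence)/prevalence odds. The accuracy figures are hypothetical.

        def weighted_comparison(sens_a, spec_a, sens_b, spec_b, prevalence, cost_ratio):
            """Net benefit of test A over test B (positive values favour A).

            cost_ratio: clinical cost of a false positive relative to a false negative.
            Assumed form: weight the specificity difference by cost and pre-test odds.
            """
            sens_term = sens_a - sens_b
            spec_term = (spec_a - spec_b) * cost_ratio * (1 - prevalence) / prevalence
            return sens_term + spec_term

        # Hypothetical accuracies for an ACE-R-like test vs an MMSE-like test,
        # in a clinic where half of the referred patients have the condition.
        print(weighted_comparison(0.92, 0.89, 0.85, 0.90,
                                  prevalence=0.5, cost_ratio=1.0))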

  15. Development of Authentic Assessment instruments for Critical Thinking skills in Global Warming with a Scientific Approach

    R. Surya Damayanti

    2017-12-01

    Full Text Available This study aims to develop an authentic assessment instrument to measure critical thinking skills in global warming learning and to describe the suitability, ease of use, and usefulness of the developed instruments based on teachers' opinions. The development design follows the Borg & Gall (2003) model, which is conducted in seven stages: information gathering, planning, product development, product testing, product revision, field trial, and final product. The test subjects are students and teachers in SMA Lampung Tengah, selected using a purposive sampling technique. Global warming learning using authentic assessment consists of a series of learning activities, including observing, discussing, exploring, associating and communicating. The results show that the authentic assessment techniques for global warming, intended to measure and cultivate critical thinking skills, consist of written tests, performance, portfolios, projects, and attitudes. The developed assessment model meets content and construct validity, effectively improves students' critical thinking skills, and has a high level of suitability, ease of use, and usefulness. The assessment techniques used in global warming learning are performance assessment, portfolios, projects, products, and attitude assessment, which together contribute to the improvement of critical thinking skills in 97.4% of global warming learning.

  16. Assessing variable rate nitrogen fertilizer strategies within an extensively instrumented field site using the MicroBasin model

    Ward, N. K.; Maureira, F.; Yourek, M. A.; Brooks, E. S.; Stockle, C. O.

    2014-12-01

    The current use of synthetic nitrogen fertilizers in agriculture has many negative environmental and economic costs, necessitating improved nitrogen management. In the highly heterogeneous landscape of the Palouse region in eastern Washington and northern Idaho, crop nitrogen needs vary widely within a field. Site-specific nitrogen management is a promising strategy to reduce excess nitrogen lost to the environment while maintaining current yields by matching crop needs with inputs. This study used in-situ hydrologic, nutrient, and crop yield data from a heavily instrumented field site in the high precipitation zone of the wheat-producing Palouse region to assess the performance of the MicroBasin model. MicroBasin is a high-resolution watershed-scale ecohydrologic model with nutrient cycling and cropping algorithms based on the CropSyst model. Detailed soil mapping conducted at the site was used to parameterize the model and the model outputs were evaluated with observed measurements. The calibrated MicroBasin model was then used to evaluate the impact of various nitrogen management strategies on crop yield and nitrate losses. The strategies include uniform application as well as delineating the field into multiple zones of varying nitrogen fertilizer rates to optimize nitrogen use efficiency. We present how coupled modeling and in-situ data sets can inform agricultural management and policy to encourage improved nitrogen management.

  17. Current status of the Essential Variables as an instrument to assess the Earth Observation Networks in Europe

    Blonda, Palma; Maso, Joan; Bombelli, Antonio; Plag, Hans Peter; McCallum, Ian; Serral, Ivette; Nativi, Stefano

    2016-04-01

    ConnectinGEO ("Coordinating an Observation Network of Networks EnCompassing saTellite and IN-situ to fill the Gaps in European Observations") is an H2020 Coordination and Support Action with the primary goal of linking existing Earth Observation networks with science and technology (S&T) communities, the industry sector, the Group on Earth Observations (GEO), and Copernicus. The project will end in February 2017. Essential Variables (EVs) are defined by ConnectinGEO as "a minimal set of variables that determine the system's state and developments, are crucial for predicting system developments, and allow us to define metrics that measure the trajectory of the system". Specific application-dependent characteristics, such as spatial and temporal resolution of observations and data quality thresholds, are not generally included in the EV definition. This definition and the present status of EV developments in different societal benefit areas were elaborated at the ConnectinGEO workshop "Towards a sustainability process for GEOSS Essential Variables (EVs)," which was held in Bari on June 11-12, 2015 (http://www.gstss.org/2015_Bari/). Presentations and reports contributed by a wide range of communities provided important inputs from different sectors for assessing the status of EV development. In most thematic areas, the development of sets of EVs is a community process leading to an agreement on what is essential for the goals of the community. While there are many differences across the communities in the details of the criteria, methodologies and processes used to develop sets of EVs, there is also a considerable common core across the communities, particularly those with a more advanced discussion. In particular, there is some level of overlap between different topics (e.g., Climate and Water), and there is a potential to develop an integrated set of EVs common to several thematic areas as well as specific ones that satisfy only one community. The thematic areas with

  18. Approaches for developing a sizing method for stand-alone PV systems with variable demand

    Posadillo, R. [Grupo de Investigacion en Energias y Recursos Renovables, Dpto. de Fisica Aplicada, E.P.S., Universidad de Cordoba, Avda. Menendez Pidal s/n, 14004 Cordoba (Spain); Lopez Luque, R. [Grupo de Investigacion de Fisica para las Energias y Recursos Renovables, Dpto. de Fisica Aplicada. Edificio C2 Campus de Rabanales, 14071 Cordoba (Spain)

    2008-05-15

    Accurate sizing is one of the most important aspects to take into consideration when designing a stand-alone photovoltaic system (SAPV). Various methods, which differ in terms of their simplicity or reliability, have been developed for this purpose. Analytical methods, which seek functional relationships between variables of interest to the sizing problem, are one of these approaches. A series of rational considerations are presented in this paper with the aim of shedding light upon the basic principles and results of various sizing methods proposed by different authors. These considerations set the basis for a new analytical method designed for systems with variable monthly energy demands. Following previous approaches, the method proposed is based on the concept of loss of load probability (LLP), a parameter used to characterize system design. The method includes information on the standard deviation of loss of load probability (σ_LLP) and on two new parameters: the annual number of system failures (f) and the standard deviation of the annual number of failures (σ_f). The method proves useful for sizing a PV system in a reliable manner and serves to explain the discrepancies found in the research on systems with LLP < 10^-2. We demonstrate that reliability depends not only on the sizing variables and on the distribution function of solar radiation, but also on the minimum value of total solar radiation achieved on the receiver surface in a given location with a given monthly average clearness index. (author)
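
    As a rough illustration of how an LLP figure and the annual number of failures f can be obtained by simulation (a minimal sketch, not the analytical method of the paper; the array size, battery capacity, demand and synthetic radiation series are all hypothetical):

        import numpy as np

        rng = np.random.default_rng(0)
        days = 365 * 20
        # Synthetic daily radiation series (kWh/m2/day); a real study would use
        # measured data for the location of interest.
        radiation = np.clip(rng.normal(5.0, 1.8, days), 0.2, None)

        pv_area, eff = 10.0, 0.15   # m2 of modules, overall efficiency (hypothetical)
        capacity = 12.0             # usable battery capacity, kWh (hypothetical)
        demand = 6.0                # daily demand, kWh (could be varied monthly)

        soc, failures = capacity, 0
        for g in radiation:
            soc = min(soc + pv_area * eff * g - demand, capacity)
            if soc < 0:             # demand not met: a loss-of-load day
                failures += 1
                soc = 0.0

        print(f"LLP = {failures / days:.3f}, "
              f"mean annual failures f = {failures / 20:.1f}")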

  19. Application of Safety Instrumented System (SIS) approach in older nuclear power plants

    Nasimi, Elnara; Gabbar, Hossam A., E-mail: hossam.gabbar@uoit.ca

    2016-05-15

    Highlights: • Study of Safety Instrumented System (SIS) design for older nuclear power plants. • Application of SIS to a Reheater Drains (RD) system. • Application of IEC 61508/61511 to safety system design. • Evaluation of risk reduction based on the proposed SIS design. - Abstract: In order to remain economically effective and financially profitable, modern industries have to take their safety culture to a higher level and consider production losses in addition to simple accident prevention techniques. Ideally, compliance with safety requirements starts during the early design stages, but in some older facilities provisions for Safety Instrumented Systems (SIS) may not have been originally included. In this paper, a case study of a Reheater Drains (RD) system is used to illustrate such an example. Frequent failures of the tank level controller lead to transients in which shutting down the RD pumps requires operators to manually isolate the quenching water and to close the main steam admission valves. Water in this system is at saturation temperature for the reheater steam side pressure, and any manual operation of the system is highly undesirable due to the hazards of working with wet steam at approximately 758 kPa(g) pressure, preheated to 237 °C. Losses of inventory are also highly undesirable and challenge other systems in the plant. In this paper, it is suggested that the RD system can benefit from the installation of an independent SIS in order to address these challenges. This idea is explored using the IEC 61508 framework for "Functional safety of electrical/electronic/programmable electronic safety-related systems" to provide assurance that the SIS will offer the risk reduction necessary to achieve the required safety for the equipment.

  20. Density dependence and climate effects in Rocky Mountain elk: an application of regression with instrumental variables for population time series with sampling error.

    Creel, Scott; Creel, Michael

    2009-11-01

    1. Sampling error in annual estimates of population size creates two widely recognized problems for the analysis of population growth. First, if sampling error is mistakenly treated as process error, one obtains inflated estimates of the variation in true population trajectories (Staples, Taper & Dennis 2004). Second, treating sampling error as process error is thought to overestimate the importance of density dependence in population growth (Viljugrein et al. 2005; Dennis et al. 2006). 2. In ecology, state-space models are used to account for sampling error when estimating the effects of density and other variables on population growth (Staples et al. 2004; Dennis et al. 2006). In econometrics, regression with instrumental variables is a well-established method that addresses the problem of correlation between regressors and the error term, but requires fewer assumptions than state-space models (Davidson & MacKinnon 1993; Cameron & Trivedi 2005). 3. We used instrumental variables to account for sampling error and fit a generalized linear model to 472 annual observations of population size for 35 Elk Management Units in Montana, from 1928 to 2004. We compared this model with state-space models fit with the likelihood function of Dennis et al. (2006). We discuss the general advantages and disadvantages of each method. Briefly, regression with instrumental variables is valid with fewer distributional assumptions, but state-space models are more efficient when their distributional assumptions are met. 4. Both methods found that population growth was negatively related to population density and winter snow accumulation. Summer rainfall and wolf (Canis lupus) presence had much weaker effects on elk (Cervus elaphus) dynamics [though limitation by wolves is strong in some elk populations with well-established wolf populations (Creel et al. 2007; Creel & Christianson 2008)]. 5. Coupled with predictions for Montana from global and regional climate models, our results
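
    A minimal sketch of the two-stage least squares (2SLS) idea behind regression with instrumental variables, with a synthetic example in which a lagged count serves as the instrument for an error-prone log-abundance regressor. The instrument choice and data-generating values here are illustrative; the paper's actual instrument set and model may differ.

        import numpy as np

        def two_sls(y, X_endog, X_exog, Z):
            """Two-stage least squares. y: (n,); X_endog: (n, k) regressors measured
            with error; X_exog: (n, m) exogenous covariates; Z: (n, p) instruments,
            p >= k. Returns coefficients ordered [endogenous, exogenous, intercept]."""
            n = len(y)
            const = np.ones((n, 1))
            W = np.hstack([Z, X_exog, const])                  # first-stage design
            proj, *_ = np.linalg.lstsq(W, X_endog, rcond=None)
            X_hat = W @ proj                                   # fitted endogenous vars
            X2 = np.hstack([X_hat, X_exog, const])             # second-stage design
            beta, *_ = np.linalg.lstsq(X2, y, rcond=None)
            return beta

        # Synthetic check: growth rate depends on true log-abundance, but we only
        # observe counts with sampling error; the lag-1 observed count is the
        # instrument (its error is independent of the current survey's error).
        rng = np.random.default_rng(1)
        T = 400
        logN = np.cumsum(rng.normal(0, 0.05, T)) + 7           # true log abundance
        obs = logN + rng.normal(0, 0.2, T)                     # counts with sampling error
        snow = rng.normal(0, 1, T)                             # exogenous covariate
        r = -0.3 * logN - 0.1 * snow + rng.normal(0, 0.05, T)  # growth rates

        t = np.arange(1, T)
        beta = two_sls(r[t], obs[t, None], snow[t, None], obs[t - 1, None])
        print(beta)  # density coefficient near -0.3; naive OLS on obs is attenuated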

  1. Study of the role of latent variables in lost working days by a structural equation modeling approach

    Meysam Heydari

    2016-12-01

    Full Text Available Background: Based on estimations, each year about 250 million work-related injuries and many temporary or permanent disabilities occur, most of which are preventable. The oil and gas industries are among the industries with the highest incidence of injuries in the world. The aim of this study was to investigate the role and effect of different risk management variables on lost working days (LWD) in seismic projects. Methods: This study was a retrospective, cross-sectional and systematic analysis carried out on occupational accidents between 2008 and 2015 (an 8-year period) in different seismic projects for oilfield exploration at Dana Energy (Iranian Seismic Company). The preliminary sample size of the study was 487 accidents. A systems analysis approach was applied using root cause analysis (RCA) and structural equation modeling (SEM). Tools for the data analysis included SPSS 23 and AMOS 23 software. Results: The mean number of lost working days (LWD) was 49.57. The final structural equation model showed that the latent variables of safety and health training (-0.33), risk assessment (-0.55) and risk control (-0.61), as direct causes, significantly affected lost working days (LWD) in the seismic industry (p < 0.05). Conclusion: The findings of the present study revealed that a combination of variables affected lost working days (LWD). Therefore, the role of these variables in accidents should be investigated and suitable programs should be considered for them.

  2. Incorporating Latent Variables into Discrete Choice Models - A Simultaneous Estimation Approach Using SEM Software

    Dirk Temme

    2008-12-01

    Full Text Available Integrated choice and latent variable (ICLV) models represent a promising new class of models which merge classic choice models with the structural equation approach (SEM) for latent variables. Despite their conceptual appeal, applications of ICLV models in marketing remain rare. We extend previous ICLV applications by first estimating a multinomial choice model and, second, by estimating hierarchical relations between latent variables. An empirical study on travel mode choice clearly demonstrates the value of ICLV models in enhancing the understanding of choice processes. In addition to the usually studied directly observable variables such as travel time, we show how abstract motivations such as power and hedonism, as well as attitudes such as a desire for flexibility, impact travel mode choice. Furthermore, we show that it is possible to estimate such a complex ICLV model with the widely available structural equation modeling package Mplus. This finding is likely to encourage more widespread application of this appealing model class in the marketing field.

  3. A regression modeling approach for studying carbonate system variability in the northern Gulf of Alaska

    Evans, Wiley; Mathis, Jeremy T.; Winsor, Peter; Statscewich, Hank; Whitledge, Terry E.

    2013-01-01

    The northern Gulf of Alaska (GOA) shelf experiences carbonate system variability on seasonal and annual time scales, but little information exists to resolve higher-frequency variability in this region. To resolve this variability using platforms-of-opportunity, we present multiple linear regression (MLR) models constructed from hydrographic data collected along the Northeast Pacific Global Ocean Ecosystems Dynamics (GLOBEC) Seward Line. The empirical algorithms predict dissolved inorganic carbon (DIC) and total alkalinity (TA) using observations of nitrate (NO3-), temperature, salinity and pressure from the surface to 500 m, with R2 values > 0.97 and RMSE values of 11 µmol kg-1 for DIC and 9 µmol kg-1 for TA. We applied these relationships to high-resolution NO3- data sets collected during a novel 20 h glider flight and a GLOBEC mesoscale SeaSoar survey. Results from the glider flight demonstrated time/space along-isopycnal variability of aragonite saturation states (Ωarag) associated with a dichothermal layer (a cold near-surface layer found in high-latitude oceans) that rivaled changes seen vertically through the thermocline. The SeaSoar survey captured an uplift of the aragonite saturation horizon (the depth where Ωarag = 1), which shoaled to a previously unseen depth in the northern GOA. This work is similar to recent studies aimed at predicting the carbonate system in continental margin settings, but demonstrates that a NO3--based approach can be applied to high-latitude data collected from platforms capable of high-frequency measurements.
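
    A minimal sketch of fitting such an MLR by ordinary least squares; the synthetic data stand in for Seward Line bottle data, and the generating coefficients are arbitrary, not those of the study.

        import numpy as np

        rng = np.random.default_rng(2)
        n = 300
        no3  = rng.uniform(0, 30, n)     # µmol kg-1
        temp = rng.uniform(3, 14, n)     # deg C
        sal  = rng.uniform(30, 34, n)
        pres = rng.uniform(0, 500, n)    # dbar

        # Synthetic DIC generated from an arbitrary linear rule plus noise.
        dic = (900 + 9.0 * no3 - 4.0 * temp + 30.0 * sal + 0.05 * pres
               + rng.normal(0, 11, n))

        X = np.column_stack([no3, temp, sal, pres, np.ones(n)])
        coef, *_ = np.linalg.lstsq(X, dic, rcond=None)
        pred = X @ coef
        rmse = np.sqrt(np.mean((dic - pred) ** 2))
        r2 = 1 - np.sum((dic - pred) ** 2) / np.sum((dic - dic.mean()) ** 2)
        print(f"RMSE = {rmse:.1f} µmol/kg, R2 = {r2:.3f}")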

  4. MEANINGFUL VARIABILITY: A SOCIOLINGUISTICALLY-GROUNDED APPROACH TO VARIATION IN OPTIMALITY THEORY

    Juan Antonio Cutillas Espinosa

    2004-12-01

    Full Text Available Most approaches to variability in Optimality Theory have attempted to make variation possible within the OT framework, i.e. to reformulate constraints and rankings to accommodate variable and gradient linguistic facts. Sociolinguists have attempted to apply these theoretical advances to the study of language variation, with an emphasis on language-internal variables (Auger 2001, Cardoso 2001). Little attention has been paid to the array of external factors that influence the patterning of variation. In this paper, we argue that some variation patterns, especially those that are socially meaningful, are actually the result of a three-grammar system. G1 is the standard grammar, which has to be available to the speaker to obtain these variation patterns. G2 is the vernacular grammar, which the speaker is likely to have acquired in his local community. Finally, G3 is an intergrammar, which is used by the speaker as his 'default' constraint set. G3 is a continuous ranking (Boersma & Hayes 2001) and domination relations are consciously altered by the speakers to shape the appropriate and variable linguistic output. We illustrate this model with analyses of English and Spanish.

  5. Sensitivity analysis on uncertainty variables affecting the NPP's LUEC with probabilistic approach

    Nuryanti; Akhmad Hidayatno; Erlinda Muslim

    2013-01-01

    One thing that is quite crucial to review prior to any investment decision on a nuclear power plant (NPP) project is the project economics, including the calculation of the Levelized Unit Electricity Cost (LUEC). Infrastructure projects such as NPP projects are vulnerable to a number of uncertainty variables. Information on the uncertainty variables to which the LUEC value is most sensitive is necessary so that cost overruns can be avoided. This study therefore aimed to perform a sensitivity analysis on the variables that affect LUEC using a probabilistic approach. The analysis was done using a Monte Carlo technique that simulates the relationship between the uncertainty variables and their impact on LUEC. The sensitivity analysis results show significant changes in the LUEC value of the AP1000 and OPR due to the sensitivity of investment cost and capacity factors, while LUEC changes due to the sensitivity of the U3O8 price appear less significant. (author)
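
    A minimal sketch of such a Monte Carlo sensitivity screen. The input distributions, capital recovery factor and stylized cost model below are hypothetical stand-ins, not the study's figures; the Spearman rank correlations indicate which uncertain inputs move LUEC most.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(3)
        n = 10_000
        invest = rng.triangular(3500, 4500, 6500, n)  # overnight cost, $/kW
        cap_f  = rng.uniform(0.75, 0.93, n)           # capacity factor
        fuel   = rng.normal(8.0, 2.0, n)              # fuel cost proxy, $/MWh

        crf = 0.08  # assumed capital recovery factor
        # Stylized LUEC: annualized capital per MWh + fuel + fixed O&M ($/MWh).
        luec = invest * crf * 1000 / (cap_f * 8760) + fuel + 12.0

        for name, x in [("investment", invest), ("capacity factor", cap_f),
                        ("fuel price", fuel)]:
            rho, _ = stats.spearmanr(x, luec)
            print(f"{name:16s} Spearman rho vs LUEC: {rho:+.2f}")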

  6. A variational conformational dynamics approach to the selection of collective variables in metadynamics

    McCarty, James; Parrinello, Michele

    2017-11-01

    In this paper, we combine two powerful computational techniques, well-tempered metadynamics and time-lagged independent component analysis. The aim is to develop a new tool for studying rare events and exploring complex free energy landscapes. Metadynamics is a well-established and widely used enhanced sampling method whose efficiency depends on an appropriate choice of collective variables. Often the initial choice is not optimal, leading to slow convergence. However, by analyzing the dynamics generated in one such run with time-lagged independent component analysis and the techniques recently developed in the area of conformational dynamics, we obtain much more efficient collective variables that are also better capable of illuminating the physics of the system. We demonstrate the power of this approach in two paradigmatic examples.
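
    A minimal sketch of the time-lagged independent component analysis step, assuming its standard formulation as a generalized eigenproblem on instantaneous and time-lagged covariances; the toy trajectory is synthetic, whereas in practice the rows would be candidate collective-variable time series from a metadynamics run.

        import numpy as np

        def tica(X, lag):
            """X: (T, d) trajectory of candidate collective variables.
            Solves C(tau) v = lambda C(0) v; slow modes have eigenvalues near 1."""
            X = X - X.mean(axis=0)
            A, B = X[:-lag], X[lag:]
            C0 = (X.T @ X) / len(X)
            Ctau = (A.T @ B + B.T @ A) / (2 * (len(X) - lag))  # symmetrized
            evals, evecs = np.linalg.eig(np.linalg.solve(C0, Ctau))
            order = np.argsort(-evals.real)
            return evals.real[order], evecs.real[:, order]

        # Toy usage: one slow degree of freedom hidden among fast noise.
        rng = np.random.default_rng(4)
        slow = np.cumsum(rng.normal(0, 0.1, 5000))
        X = np.column_stack([slow + rng.normal(0, 1.0, 5000),
                             rng.normal(0, 1.0, 5000)])
        lam, V = tica(X, lag=50)
        print(lam)  # the leading eigenvalue flags the slow linear combination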

  7. Children's Learning in Scientific Thinking: Instructional Approaches and Roles of Variable Identification and Executive Function

    Blums, Angela

    The present study examines instructional approaches and cognitive factors involved in elementary school children's thinking and learning of the Control of Variables Strategy (CVS), a critical aspect of scientific reasoning. Previous research has identified several features related to effective instruction of CVS, including using a guided learning approach, the use of self-reflective questions, and learning in individual and group contexts. The current study examined the roles of procedural and conceptual instruction in learning CVS and investigated the role of executive function in the learning process. Additionally, this study examined how learning to identify variables is a part of the CVS process. In two studies (individual and classroom experiments), 139 third, fourth, and fifth grade students participated in hands-on and paper-and-pencil CVS learning activities and, in each study, were assigned to either a procedural instruction, conceptual instruction, or control (no instruction) group. Participants also completed a series of executive function tasks. The study was carried out in two parts: Study 1 used an individual context and Study 2 was carried out in a group setting. Results indicated that procedural and conceptual instruction were more effective than no instruction, and the ability to identify variables was identified as a key piece of the CVS process. Executive function predicted the ability to identify variables and success on CVS tasks. Developmental differences were present, in that older children outperformed younger children on CVS tasks, and conceptual instruction was slightly more effective for older children. Some differences between individual and group instruction were found, with those in the individual context showing some advantage over those in the group setting in learning CVS concepts. Conceptual implications about scientific thinking and practical implications for science education are discussed.

  8. 78 FR 47047 - Proposed Policy for Discontinuance of Certain Instrument Approach Procedures

    2013-08-02

    ... the cancellation of certain Non-directional Beacon (NDB) and Very High Frequency (VHF) Omnidirectional... approach procedures. The FAA proposes specific criteria to guide the identification and selection of... selection of potential NDB and VOR procedures for cancellation. Once the criteria are established and the...

  9. Squeezing more information out of time variable gravity data with a temporal decomposition approach

    Barletta, Valentina Roberta; Bordoni, A.; Aoudia, A.

    2012-01-01

    A measure of the Earth's gravity contains contributions from the solid Earth as well as climate-related phenomena, which cannot be easily distinguished in either time or space. After more than 7 years, the GRACE gravity data available now support more elaborate analyses of the time series. We propose an explorative approach based on a suitable time series decomposition, which does not rely on predefined time signatures. The comparison and validation against the fitting approach commonly used in the GRACE literature shows a very good agreement for trends and periodic signals. The decomposition is used to assess the possibility of finding evidence of meaningful geophysical signals different from hydrology over Africa in GRACE data. In this case we conclude that hydrological phenomena are dominant, and so time variable gravity data in Africa can be directly used to calibrate hydrological models.

  10. An analytical approach to separate climate and human contributions to basin streamflow variability

    Li, Changbin; Wang, Liuming; Wanrui, Wang; Qi, Jiaguo; Linshan, Yang; Zhang, Yuan; Lei, Wu; Cui, Xia; Wang, Peng

    2018-04-01

    Climate variability and anthropogenic regulation are two interwoven factors in the ecohydrologic system across large basins. Understanding the roles that these two factors play under various hydrologic conditions is of great significance for basin hydrology and sustainable water utilization. In this study, we present an analytical approach based on coupling the water balance method with the Budyko hypothesis to derive effectiveness coefficients (ECs) of climate change, as a way to disentangle the contributions of climate change and human activities to the variability of river discharge under different hydro-transitional situations. The climate-dominated streamflow change (ΔQc) obtained by the EC approach was compared with those deduced by the elasticity method and the sensitivity index. The results suggest that the EC approach is valid and applicable for hydrologic studies at the large basin scale. Analyses of various scenarios revealed that the contributions of climate change and human activities to river discharge variation differed among the regions of the study area. Over the past several decades, climate change dominated hydro-transitions from dry to wet, while human activities played the key role in the reduction of streamflow during wet-to-dry periods. The remarkable decline of discharge upstream was mainly due to human interventions, although climate contributed more to increasing runoff during dry periods in the semi-arid downstream. The derived effectiveness coefficients indicated a contribution ratio of 49% for climate and 51% for human activities at the basin scale from 1956 to 2015. This simple, mathematically derived approach, together with the case example of temporal segmentation and spatial zoning, could help in understanding the variation of river discharge in more detail at the large basin scale against the background of climate change and human regulation.

  11. Fatigue Crack Propagation Under Variable Amplitude Loading Analyses Based on Plastic Energy Approach

    Sofiane Maachou

    2014-04-01

    Full Text Available Plasticity effects at the crack tip have been recognized as the "motor" of crack propagation: the growth of cracks is related to the existence of a crack tip plastic zone, whose formation and intensification is accompanied by energy dissipation. In the current state of knowledge, fatigue crack propagation is modeled using the crack closure concept. The fatigue crack growth behavior of the aluminum alloy 2024 T351 under constant amplitude and variable amplitude loading is analyzed in terms of energy parameters. In the case of VAL (variable amplitude loading) tests, the evolution of the hysteretic energy dissipated per block is shown to be similar to that observed under constant amplitude loading. A linear relationship between the crack growth rate and the hysteretic energy dissipated per block is obtained at high growth rates. For lower growth rates, the relationship between crack growth rate and hysteretic energy dissipated per block can be represented by a power law. In this paper, an analysis of fatigue crack propagation under variable amplitude loading based on an energetic approach is proposed.
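
    A minimal sketch of fitting the reported power-law relationship between crack growth rate and hysteretic energy per block by linear regression in log-log space; the data points are synthetic placeholders, not the alloy measurements.

        import numpy as np

        rng = np.random.default_rng(5)
        Q = np.logspace(-1, 1, 25)  # hysteretic energy per block (arbitrary units)
        # Synthetic crack growth rates following a power law with scatter.
        dadN = 1e-5 * Q ** 1.8 * np.exp(rng.normal(0, 0.05, Q.size))

        m, logC = np.polyfit(np.log(Q), np.log(dadN), 1)
        print(f"exponent m = {m:.2f}, coefficient C = {np.exp(logC):.2e}")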

  12. Assessment of variable fluorescence fluorometry as an approach for rapidly detecting living photoautotrophs in ballast water

    First, Matthew R.; Robbins-Wamsley, Stephanie H.; Riley, Scott C.; Drake, Lisa A.

    2018-03-01

    Variable fluorescence fluorometry, an analytical approach that estimates the fluorescence yield of chlorophyll a (F0, a proximal measure of algal concentration) and photochemical yield (FV/FM, an indicator of the physiological status of algae), was evaluated as a means to rapidly assess photoautotrophs. Specifically, it was used to gauge the efficacy of ballast water treatment designed to reduce the transport and delivery of potentially invasive organisms. A phytoflagellate, Tetraselmis spp. (10-12 μm), and mixed communities of ambient protists were examined in both laboratory experiments and large-scale field trials simulating 5-d hold times in mock ballast tanks. In laboratory incubations, ambient organisms held in the dark exhibited declining F0 and FV/FM measurements relative to organisms held under lighted conditions. In field experiments, increases and decreases in F0 and FV/FM over the tank hold time corresponded to those of microscope counts of organisms in two of three trials. In the third trial, concentrations of organisms ≥ 10 µm (protists) increased while F0 and FV/FM decreased. Rapid and sensitive, variable fluorescence fluorometry is appropriate for detecting changes in organism concentrations and physiological status in samples dominated by microalgae. Changes in the heterotrophic community, which may become more prevalent in light-limited ballast tanks, would not be detected via variable fluorescence fluorometry, however.

  13. Self-Consciousness and Assertiveness as Explanatory Variables of L2 Oral Ability: A Latent Variable Approach

    Ockey, Gary

    2011-01-01

    Drawing on current theories in personality, second-language (L2) oral ability, and psychometrics, this study investigates the extent to which self-consciousness and assertiveness are explanatory variables of L2 oral ability. Three hundred sixty first-year Japanese university students who were studying English as a foreign language participated in…

  14. A standardized approach to study human variability in isometric thermogenesis during low-intensity physical activity

    Delphine Sarafian

    2013-07-01

    Full Text Available Limitations of current methods: The assessment of human variability in various compartments of daily energy expenditure (EE) under standardized conditions is well defined at rest (as basal metabolic rate and thermic effect of feeding), and currently under validation for assessing the energy cost of low-intensity dynamic work. However, because physical activities of daily life consist of a combination of both dynamic and isometric work, there is also a need to develop standardized tests for assessing human variability in the energy cost of low-intensity isometric work. Experimental objectives: Development of an approach to study human variability in isometric thermogenesis by incorporating a protocol of intermittent leg press exercise of varying low-intensity isometric loads with measurements of EE by indirect calorimetry. Results: EE was measured in the seated position with the subject at rest or while intermittently pressing both legs against a press-platform at 5 low-intensity isometric loads (+5, +10, +15, +20 and +25 kg force), each consisting of a succession of 8 cycles of press (30 s) and rest (30 s). EE, integrated over each 8-min period of the intermittent leg press exercise, was found to increase linearly across the 5 isometric loads, with a correlation coefficient r > 0.9 for each individual. The slope of this EE-Load relationship, which provides the energy cost of this standardized isometric exercise expressed per kg force applied intermittently (30 s in every min), was found to show good repeatability when assessed in subjects who repeated the same experimental protocol on 3 separate days: its low intra-individual coefficient of variation (CV) of ~10% contrasted with its much higher inter-individual CV of 35%, the latter being mass-independent but partly explained by height. Conclusion: This standardized approach to study isometric thermogenesis opens up a new avenue for research in EE phenotyping and metabolic predisposition to obesity.

  15. P20 - the integrated system approach to nuclear control and instrumentation

    Jansen Van Rensburg, C.F.

    1990-01-01

    The P20 System is a data acquisition, control and monitoring system which has been jointly developed by Cegelec and Electricite de France. This system has been developed with the stringent requirements of a nuclear power generating facility in mind. The system has a hierarchical structure consisting of local area networks, supporting distributed plant interfaces, processing units and man-machine interfaces. The system offers exceptional availability and reliability through the implementation of extensive self-diagnostic techniques and multiple redundancy. This paper describes the system, the applied design techniques and the resulting benefits from using this design approach. 3 figs

  16. Unit-specific calibration of Actigraph accelerometers in a mechanical setup - is it worth the effort? The effect on random output variation caused by technical inter-instrument variability in the laboratory and in the field

    Moeller, Niels C; Korsholm, Lars; Kristensen, Peter L

    2008-01-01

    BACKGROUND: Potentially, unit-specific in-vitro calibration of accelerometers could increase field data quality and study power. However, reduced inter-unit variability would only be important if random instrument variability contributes considerably to the total variation in field data. Therefore...

  17. Variable system: An alternative approach for the analysis of mediated moderation.

    Kwan, Joyce Lok Yin; Chan, Wai

    2018-06-01

    Mediated moderation (meMO) occurs when the moderation effect of the moderator (W) on the relationship between the independent variable (X) and the dependent variable (Y) is transmitted through a mediator (M). To examine this process empirically, 2 different model specifications (Type I meMO and Type II meMO) have been proposed in the literature. However, both specifications are found to be problematic, either conceptually or statistically. For example, it can be shown that each type of meMO model is statistically equivalent to a particular form of moderated mediation (moME), another process that examines the condition when the indirect effect from X to Y through M varies as a function of W. Consequently, it is difficult for one to differentiate these 2 processes mathematically. This study therefore has 2 objectives. First, we attempt to differentiate moME and meMO by proposing an alternative specification for meMO. Conceptually, this alternative specification is intuitively meaningful and interpretable, and, statistically, it offers meMO a unique representation that is no longer identical to its moME counterpart. Second, using structural equation modeling, we propose an integrated approach for the analysis of meMO as well as for other general types of conditional path models. VS, a computer software program that implements the proposed approach, has been developed to facilitate the analysis of conditional path models for applied researchers. Real examples are considered to illustrate how the proposed approach works in practice and to compare its performance against the traditional methods. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  18. Approaches to and Treatment Strategies for Playing-Related Pain Problems Among Czech Instrumental Music Students: An Epidemiological Study.

    Ioannou, Christos I; Altenmüller, Eckart

    2015-09-01

    The current study examined the severity of playing-related pain (PRP) problems among music students at the Prague State Conservatoire, as well as the various treatment methods used by these students and how they approach and deal with these phenomena while studying. In total, 180 instrumental students participated and completed a paper questionnaire. Of these, 88.9% reported that they had experienced PRP at least once in their lives, with 12.6% experiencing pain every time they play. The onset of PRP seemed to coincide with the transition period on entry to the conservatoire and was associated with the increase in hours of practice. Specific body regions associated with playing each particular instrument were most frequently affected, with females being more susceptible than males to the development of PRP. An alarming 35% of the affected students tended not to seek help at all, whereas those who did tended to seek advice first from their instrument tutor and second from medical doctors. Most students who visited doctors reported that medical treatments only partially helped them to overcome PRP problems. The most frequent treatment methods used were resting, gel or creams, and physical exercises. Students believed that inappropriate posture played a key role in the development of their PRP problems. Finally, students indicated a willingness to be aware of and educated about PRP issues during their studies. Further exploration of PRP problems among student musicians is warranted. Better understanding of differing attitudes toward, use of, and efficiency of various treatment methods after the occurrence of PRPs will provide additional insight for prevention and treatment.

  19. The hydrodynamic basis of the vacuum cleaner effect in continuous-flow PCNL instruments: an empiric approach and mathematical model.

    Mager, R; Balzereit, C; Gust, K; Hüsch, T; Herrmann, T; Nagele, U; Haferkamp, A; Schilling, D

    2016-05-01

    Passive removal of stone fragments in the irrigation stream is one of the characteristics of continuous-flow PCNL instruments. So far, the physical principle of this so-called vacuum cleaner effect has not been fully understood. The aim of the study was to empirically prove the existence of the vacuum cleaner effect and to develop a physical hypothesis and generate a mathematical model for this phenomenon. In an empiric approach, common low-pressure PCNL instruments and conventional PCNL sheaths were tested using an in vitro model. Flow characteristics were visualized by coloring of the irrigation fluid. The influence of irrigation pressure, sheath diameter, sheath design, nephroscope design and position of the nephroscope was assessed. Experiments were digitally recorded for further slow-motion analysis to deduce a physical model. In each tested nephroscope design, we could observe the vacuum cleaner effect. An increase in irrigation pressure and a reduction in the cross section of the sheath sustained the effect. Slow-motion analysis of colored flow revealed a synergism of two effects causing suction and transportation of the stone. For the first time, our model showed a flow reversal in the sheath as an integral part of the origin of stone transportation during the vacuum cleaner effect. The application of Bernoulli's equation provided the explanation of these effects and confirmed our experimental results. We widen the understanding of PCNL with a conclusive physical model, which explains the fluid mechanics of the vacuum cleaner effect.

  20. Instrumenting an upland research catchment in Canterbury, New Zealand to study controls on variability of soil moisture, shallow groundwater and streamflow

    McMillan, Hilary; Srinivasan, Ms

    2015-04-01

    Hydrologists recognise the importance of vertical drainage and deep flow paths in runoff generation, even in headwater catchments. Both soil and groundwater stores are highly variable over multiple scales, and the distribution of water has a strong control on flow rates and timing. In this study, we instrumented an upland headwater catchment in New Zealand to measure the temporal and spatial variation in unsaturated and saturated-zone responses. In NZ, upland catchments are the source of much of the water used in lowland agriculture, but the hydrology of such catchments and their role in water partitioning, storage and transport is poorly understood. The study area is the Langs Gully catchment in the North Branch of the Waipara River, Canterbury: this catchment was chosen to be representative of the foothills environment, with lightly managed dryland pasture and native Matagouri shrub vegetation cover. Over a period of 16 months we measured continuous soil moisture at 32 locations and near-surface water table responses, contrasting near-stream versus hillslope locations and convergent versus divergent hillslopes. We found that temporal variability is strongly controlled by the climatic seasonal cycle, for both soil moisture and water table, and for both the mean and extremes of their distributions. Groundwater is a larger water storage component than soil moisture, and the difference increases with catchment wetness. The spatial standard deviation of both soil moisture and groundwater is larger in winter than in summer. It peaks during rainfall events due to partial saturation of the catchment, and also rises in spring as different locations dry out at different rates. The most important controls on spatial variability are aspect and distance from stream. South-facing and near-stream locations have higher water tables and more, larger soil moisture wetting events. Typical hydrological models do not explicitly account for aspect, but our results suggest that it is an important factor in hillslope

  1. An emotional processing writing intervention and heart rate variability: the role of emotional approach.

    Seeley, Saren H; Yanez, Betina; Stanton, Annette L; Hoyt, Michael A

    2017-08-01

    Expressing and understanding one's own emotional responses to negative events, particularly those that challenge the attainment of important life goals, is thought to confer physiological benefit. Individual preferences and/or abilities in approaching emotions might condition the efficacy of interventions designed to encourage written emotional processing (EP). This study examines the physiological impact (as indexed by heart rate variability (HRV)) of an emotional processing writing (EPW) task, as well as the moderating influence of a dispositional preference for coping through emotional approach (emotional processing (EP) and emotional expression (EE)), in response to a laboratory stress task designed to challenge an important life goal. Participants (n = 98) were randomly assigned to either EPW or fact control writing (FCW) following the stress task. Regression analyses revealed a significant dispositional EP by condition interaction, such that high EP participants in the EPW condition demonstrated higher HRV after writing compared to low EP participants. No significant main effects of condition or EE coping were observed. These findings suggest that EPW interventions may be best suited to those with a preference or ability to process emotions related to a stressor, or might require adaptation for those who less often cope through emotional approach.
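
    A minimal sketch of the moderation test described above, expressed as a regression with an interaction term; the variable names and simulated effect sizes are hypothetical, not the study's data.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(6)
        n = 98
        df = pd.DataFrame({
            "epw": rng.integers(0, 2, n),      # 1 = emotional processing writing
            "ep_coping": rng.normal(0, 1, n),  # standardized dispositional EP coping
        })
        # Simulated outcome with a condition-by-disposition interaction built in.
        df["hrv"] = (50 + 2 * df.epw + 1.5 * df.ep_coping
                     + 4 * df.epw * df.ep_coping + rng.normal(0, 5, n))

        model = smf.ols("hrv ~ epw * ep_coping", data=df).fit()
        print(model.summary().tables[1])  # the epw:ep_coping row tests moderation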

  2. Classic electrocardiogram-based and mobile technology derived approaches to heart rate variability are not equivalent.

    Guzik, Przemyslaw; Piekos, Caroline; Pierog, Olivia; Fenech, Naiman; Krauze, Tomasz; Piskorski, Jaroslaw; Wykretowicz, Andrzej

    2018-05-01

    We compared a classic ECG-derived approach with a mobile approach to heart rate variability (HRV) measurement. 29 young adult healthy volunteers underwent simultaneous recording of heart rate using an ECG and a chest heart rate monitor at supine rest, during mental stress and during active standing. The mean RR interval, the Standard Deviation of Normal-to-Normal (SDNN) RR intervals, and the Root Mean Square of Successive Differences (RMSSD) between RR intervals were computed in 168 pairs of 5-minute epochs, by in-house software on a PC (sinus beats only) and by the mobile application "ELITEHRV" on a smartphone (no beat type identification). The ECG analysis showed that 33.9% of the recordings contained at least one non-sinus beat or artefact; the mobile app did not report this. The mean RR intervals were significantly longer (p = 0.0378), while SDNN (p = 0.0001) and RMSSD (p = 0.0199) were smaller for the mobile approach. Measures of identical HRV parameters by ECG-based and mobile approaches are not equivalent. Copyright © 2018 Elsevier B.V. All rights reserved.
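
    For reference, the three time-domain statistics compared in the study can be computed from an RR-interval series as follows; the RR values are illustrative, and the sketch assumes the sinus-beat screening that the mobile app reportedly does not perform has already been done.

        import numpy as np

        def hrv_time_domain(rr_ms):
            """Time-domain HRV from RR intervals in milliseconds (sinus beats only)."""
            rr = np.asarray(rr_ms, dtype=float)
            diffs = np.diff(rr)
            return {
                "mean_rr": rr.mean(),
                "sdnn": rr.std(ddof=1),                 # SD of normal-to-normal intervals
                "rmssd": np.sqrt(np.mean(diffs ** 2)),  # RMS of successive differences
            }

        print(hrv_time_domain([812, 830, 795, 821, 840, 808]))  # illustrative series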

  3. Analysis of optically variable devices using a photometric light-field approach

    Soukup, Daniel; Štolc, Svorad; Huber-Mörk, Reinhold

    2015-03-01

    Diffractive Optically Variable Image Devices (DOVIDs), sometimes loosely referred to as holograms, are popular security features for protecting banknotes, ID cards, or other security documents. Inspection, authentication, as well as forensic analysis of these security features are still demanding tasks requiring special hardware tools and expert knowledge. Existing equipment for such analyses is based either on a microscopic analysis of the grating structure or a point-wise projection and recording of the diffraction patterns. We investigated approaches for an examination of DOVID security features based on sampling the Bidirectional Reflectance Distribution Function (BRDF) of DOVIDs using photometric stereo- and light-field-based methods. Our approach is demonstrated on the practical task of automated discrimination between genuine and counterfeited DOVIDs on banknotes. For this purpose, we propose a tailored feature descriptor which is robust against several expected sources of inaccuracy but still specific enough for the given task. The suggested approach is analyzed from both theoretical as well as practical viewpoints and w.r.t. analysis based on photometric stereo and light fields. We show that especially the photometric method provides a reliable and robust tool for revealing DOVID behavior and authenticity.

  4. Describing the interannual variability of precipitation with the derived distribution approach: effects of record length and resolution

    C. I. Meier

    2016-10-01

    , analyzing the ability of the DD to estimate the long-term standard deviation of annual rainfall, as compared to direct computation from the sample of annual totals. Our results show that, compared to the fitting of a normal or lognormal distribution (or, equivalently, direct estimation of the sample moments), the DD approach reduces the uncertainty in annual precipitation estimates (especially interannual variability) when only short records (below 6–8 years) are available. In such cases, it also reduces the bias in annual precipitation quantiles with high return periods. We demonstrate that using precipitation data aggregated every 24 h, as commonly available at most weather stations, introduces a noticeable bias in the DD. These results point to the tangible benefits of installing high-resolution (hourly, at least) precipitation gauges, next to the customary manual rain-measuring instrument, at previously ungauged locations. We propose that the DD approach is a suitable tool for the statistical description and study of annual rainfall, not only when short records are available, but also when dealing with nonstationary time series of precipitation. Finally, to avert any misinterpretation of the presented method, we should like to emphasize that it only applies to climatic analyses of annual precipitation totals; even though storm data are used, there is no relation to the study of extreme rainfall intensities needed for engineering design.

  5. DEVELOPING ONLINE CO-CREATION INSTRUMENTS BASED ON A FOCUS GROUP APPROACH: THE E-PICUS CASE

    ALEXA Lidia

    2016-09-01

    Full Text Available The current business environment is in constant change, characterized by increased competition; in order to remain relevant and to create products and services that respond better to customers' needs and expectations, companies need to become more innovative and proactive. To address these competitive challenges, more and more companies are using innovation co-creation, where all the relevant stakeholders participate across the value chain, from idea generation, selection and development, eventually even to marketing the new products or services. The paper presents the process of developing an online co-creation platform within the framework of a research project, underlining the importance of using a focus group approach for requirements elicitation in the development of IT instruments.

  6. Comprehensive spectral and instrumental approaches for the easy monitoring of features and purity of different carbon nanostructures for nanocomposite applications

    Boccaleri, Enrico; Arrais, Aldo; Frache, Alberto; Gianelli, Walter; Fino, Paolo; Camino, Giovanni

    2006-01-01

    A wide series of carbon nanostructures (ranging from fullerenes, through carbon nanotubes, up to carbon nanofibers) promise to change several fields of materials science, but real industrial implementation depends on their availability at reasonable prices with affordable and reproducible degrees of purity. In this study we propose simple instrumental approaches to efficiently characterize different commercial samples, particularly for the qualitative evaluation of impurities, the discrimination of their respective spectral features and, when possible, quantitative determination. We critically discuss the information that researchers in the field of nanocomposite technology can obtain to this end from spectral techniques such as Raman and FT-IR spectroscopy, thermogravimetric analysis, mass-spectrometry-hyphenated thermogravimetry, X-ray diffraction and energy dispersive spectroscopy. All of these can be helpful in applied materials science research for fast, reliable monitoring of the actual purity of carbon products, in both commercial and laboratory-produced samples as well as in composite materials.

  7. A Miniaturized Variable Pressure Scanning Electron Microscope (MVP-SEM) for the Surface of Mars: An Instrument for the Planetary Science Community

    Edmunson, J.; Gaskin, J. A.; Danilatos, G.; Doloboff, I. J.; Effinger, M. R.; Harvey, R. P.; Jerman, G. A.; Klein-Schoder, R.; Mackie, W.; Magera, B.; et al.

    2016-01-01

    The Miniaturized Variable Pressure Scanning Electron Microscope (MVP-SEM) project, funded by the NASA Planetary Instrument Concepts for the Advancement of Solar System Observations (PICASSO) Research Opportunities in Space and Earth Science (ROSES), will build upon previous miniaturized SEM designs for lunar and International Space Station (ISS) applications and recent advancements in variable pressure SEMs to design and build a SEM for analyzing samples on the surface of Mars using the atmosphere as an imaging medium. By the end of the PICASSO work, a prototype of the primary proof-of-concept components (i.e., the electron gun, focusing optics and scanning system) will be assembled, and preliminary testing in a Mars analog chamber at the Jet Propulsion Laboratory will be completed to partially fulfill Technology Readiness Level (TRL) 5 requirements for those components. The team plans to provide Secondary Electron Imaging (SEI), Backscattered Electron (BSE) detection, and Energy Dispersive Spectroscopy (EDS) capabilities through the MVP-SEM.

  8. A Comparison of Approaches for the Analysis of Interaction Effects between Latent Variables Using Partial Least Squares Path Modeling

    Henseler, Jorg; Chin, Wynne W.

    2010-01-01

    In social and business sciences, the importance of the analysis of interaction effects between manifest as well as latent variables steadily increases. Researchers using partial least squares (PLS) to analyze interaction effects between latent variables need an overview of the available approaches as well as their suitability. This article…

  9. Variable speed wind turbine control by discrete-time sliding mode approach.

    Torchani, Borhen; Sellami, Anis; Garcia, Germain

    2016-05-01

    The aim of this paper is to propose a new design for variable speed wind turbine control using a discrete-time sliding mode approach. The methodology is designed for linear saturated systems, with the saturation constraint imposed on the input vector. To this end, a backstepping design procedure is followed to construct a suitable sliding manifold that guarantees the attainment of a stabilization control objective. The drive-train mechanism is investigated under the commonly proposed assumptions for damping, shaft stiffness and gear inertia. The objectives are to synthesize robust controllers that maximize the energy extracted from the wind while reducing mechanical loads, combining rotor speed tracking with electromagnetic torque control. Simulation results of the proposed scheme are presented. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.

  10. Some issues in the loop variable approach to open strings and an extension to closed strings

    Sathiapalan, B.

    1994-01-01

    Some issues in the loop variable renormalization group approach to gauge-invariant equations for the free fields of the open string are discussed. It had been shown in an earlier paper that this leads to a simple form of the gauge transformation law. We discuss in some detail some of the curious features encountered there. The theory looks a little like a massless theory in one higher dimension that can be dimensionally reduced to give a massive theory. We discuss the origin of some constraints that are needed for gauge invariance and also for reducing the set of fields to that of standard string theory. The mechanism of gauge invariance and the connection with the Virasoro algebra is a little different from the usual story and is discussed. It is also shown that these results can be extended in a straightforward manner to closed strings. (orig.)

  11. A state variable approach to the BESSY II local beam-position-feedback system

    Gilpatrick, J.D.; Khan, S.; Kraemer, D.

    1996-01-01

    At the BESSY II facility, stability of the electron beam position and angle near insertion devices (IDs) is of utmost importance. Disturbances due to ground motion could result in unwanted broad-bandwidth beam jitter which decreases the electron (and resultant photon) beam's effective brightness. Therefore, feedback techniques must be used. Operating over a frequency range of 100 Hz, a local feedback system will correct these beam-trajectory errors using the four bumps around the IDs. This paper reviews how the state-variable feedback approach can be applied to real-time correction of these beam position and angle errors. A frequency-domain solution showing beam-jitter reduction is presented. Finally, this paper reports results of a beam-feedback test at BESSY I.
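
    As a rough illustration of the state-variable feedback idea, the sketch below places the closed-loop poles of a toy two-state (position, angle) model with a single corrector input; the matrices, pole targets, and the use of SciPy's place_poles are illustrative assumptions, not the BESSY II lattice model:

    ```python
    import numpy as np
    from scipy.signal import place_poles

    # Toy two-state model (position, angle) at an ID straight, driven by
    # one corrector bump; values are illustrative stand-ins only.
    A = np.array([[0.0, 1.0],
                  [0.0, -50.0]])
    B = np.array([[0.0],
                  [200.0]])

    # State-variable feedback u = -K x, with closed-loop poles placed so
    # the loop attenuates disturbances over the ~100 Hz correction band
    K = place_poles(A, B, [-2 * np.pi * 100.0, -2 * np.pi * 120.0]).gain_matrix
    A_cl = A - B @ K                      # closed-loop dynamics
    print(np.linalg.eigvals(A_cl))        # should recover the requested poles
    ```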

  12. Assessing multiscale complexity of short heart rate variability series through a model-based linear approach

    Porta, Alberto; Bari, Vlasta; Ranuzzi, Giovanni; De Maria, Beatrice; Baselli, Giuseppe

    2017-09-01

    We propose a multiscale complexity (MSC) method that assesses irregularity in assigned frequency bands and is appropriate for analyzing short time series. It is grounded on the identification of the coefficients of an autoregressive model, the computation of the mean position of the poles generating the components of the power spectral density in an assigned frequency band, and the assessment of its distance from the unit circle in the complex plane. The MSC method was tested on simulations and applied to the short heart period (HP) variability series recorded during graded head-up tilt in 17 subjects (age from 21 to 54 years, median = 28 years, 7 females) and during paced breathing protocols in 19 subjects (age from 27 to 35 years, median = 31 years, 11 females) to assess the contribution of time scales typical of cardiac autonomic control, namely the low frequency (LF, from 0.04 to 0.15 Hz) and high frequency (HF, from 0.15 to 0.5 Hz) bands, to the complexity of cardiac regulation. The proposed MSC technique was compared to a traditional model-free multiscale method grounded on information theory, i.e., multiscale entropy (MSE). The approach suggests that the reduction of HP variability complexity observed during graded head-up tilt is due to a regularization of the HP fluctuations in the LF band via a possible intervention of sympathetic control, and that the decrement of HP variability complexity observed during slow breathing is the result of the regularization of the HP variations in both LF and HF bands, thus implying the action of physiological mechanisms working at time scales even different from that of respiration. MSE did not distinguish experimental conditions at time scales larger than 1. Over a short time series, MSC allows a more insightful association between cardiac control complexity and the physiological mechanisms modulating cardiac rhythm compared to a more traditional tool such as MSE.
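
    A minimal sketch of the pole-based band index described in this abstract, under assumed choices of AR order and estimator (Yule-Walker), might look as follows:

    ```python
    import numpy as np

    def band_pole_index(x, fs, order=10, band=(0.04, 0.15)):
        """Sketch of a pole-based band complexity index.

        Fits an AR(order) model by Yule-Walker, keeps the poles whose
        frequency lies in `band` (Hz), and returns their mean distance
        from the unit circle (larger = more irregular dynamics in that
        band). AR order and estimator are assumptions of this sketch,
        not necessarily the authors' implementation.
        """
        x = np.asarray(x, float) - np.mean(x)
        # Biased autocorrelation estimates r[0..order]
        r = np.array([x[: len(x) - k] @ x[k:] for k in range(order + 1)]) / len(x)
        R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
        a = np.linalg.solve(R, r[1 : order + 1])   # AR coefficients
        poles = np.roots(np.r_[1.0, -a])           # roots of z^p - a1*z^(p-1) - ...
        f = np.angle(poles) * fs / (2.0 * np.pi)   # pole frequencies in Hz
        in_band = (f >= band[0]) & (f <= band[1])
        return 1.0 - np.abs(poles[in_band]).mean() if in_band.any() else np.nan
    ```

    For a beat-to-beat heart period series, fs would be the equivalent sampling rate, e.g., the reciprocal of the mean heart period.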

  13. Variable Pitch Approach for Performance Improving of Straight-Bladed VAWT at Rated Tip Speed Ratio

    Zhenzhou Zhao

    2018-06-01

    This paper presents a new variable pitch (VP) approach to increase the peak power coefficient of the straight-bladed vertical-axis wind turbine (VAWT), by widening the azimuthal angle band of the blade with the highest aerodynamic torque, instead of increasing the highest torque. The new VP approach provides a curve of pitch angle designed for the blade operating at the rated tip speed ratio (TSR) corresponding to the peak power coefficient of the fixed pitch (FP) VAWT. The effects of the new approach are explored by using the double multiple stream tubes (DMST) model and Prandtl's mathematics to evaluate the blade tip loss. The research describes the effects from six aspects, including the lift, drag, angle of attack (AoA), resultant velocity, torque, and power output, through a comparison between VP-VAWTs and FP-VAWTs working at four TSRs: 4, 4.5, 5, and 5.5. Compared with the FP blade, the VP blade has a wider azimuthal zone with the maximum AoA, lift, drag, and torque in the upwind half-cycle, and yields two new larger maximum values in the downwind half-cycle. The power distribution in the swept area of the turbine changes from the arched shape of the FP-VAWT into the rectangular shape of the VP-VAWT. The new VP approach markedly widens the highest-performance zone of the blade in a revolution, and ultimately achieves an 18.9% growth of the peak power coefficient of the VAWT at the optimum TSR. Besides achieving this growth, the new pitching method enhances performance at TSRs higher than the current optimal values, and an increase in torque is also generated.

  14. A hybrid approach to fault diagnosis of roller bearings under variable speed conditions

    Wang, Yanxue; Yang, Lin; Xiang, Jiawei; Yang, Jianwei; He, Shuilong

    2017-12-01

    Rolling element bearings are one of the main elements in rotating machines, whose failure may lead to a fatal breakdown and significant economic losses. Conventional vibration-based diagnostic methods are based on the stationarity assumption, and thus are not applicable to the diagnosis of bearings working under varying speeds. This constraint significantly limits the industrial application of bearing diagnosis. A hybrid approach to fault diagnosis of roller bearings under variable speed conditions is proposed in this work, based on computed order tracking (COT) and variational mode decomposition (VMD)-based time-frequency representation (VTFR). COT is utilized to resample the non-stationary vibration signal in the angular domain, while VMD is used to decompose the resampled signal into a number of band-limited intrinsic mode functions (BLIMFs). A VTFR is then constructed based on the estimated instantaneous frequency and instantaneous amplitude of each BLIMF. Moreover, the Gini index and time-frequency kurtosis are proposed to quantitatively measure the sparsity and concentration of the time-frequency representation, respectively. The effectiveness of the VTFR for extracting nonlinear components has been verified on a bat signal. Results of this numerical simulation also show that the sparsity and concentration of the VTFR are better than those of the short-time Fourier transform, continuous wavelet transform, Hilbert-Huang transform and Wigner-Ville distribution techniques. Several experimental results further demonstrate that the proposed method can reliably detect bearing faults under variable speed conditions.
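
    The computed order tracking step amounts to resampling the vibration signal at uniform shaft-angle increments. The sketch below conveys the idea; the names, trapezoidal speed integration, and linear interpolation are assumptions of this sketch, not necessarily the authors' implementation:

    ```python
    import numpy as np

    def order_track(t, vib, t_tacho, rpm, samples_per_rev=64):
        """Computed order tracking (COT) sketch: angle-domain resampling.

        t, vib: time stamps and vibration samples; t_tacho, rpm: time
        stamps and instantaneous shaft speed from a tachometer.
        """
        # Shaft angle in revolutions, by trapezoidal integration of speed
        rev = np.concatenate(([0.0], np.cumsum(
            np.diff(t_tacho) * (rpm[:-1] + rpm[1:]) / 2.0 / 60.0)))
        rev_at_vib = np.interp(t, t_tacho, rev)   # angle at each vibration sample
        # Uniform angular grid and angle-domain resampling of the signal
        rev_uniform = np.arange(0.0, rev_at_vib[-1], 1.0 / samples_per_rev)
        return rev_uniform, np.interp(rev_uniform, rev_at_vib, vib)
    ```

    The resampled (angle-domain) signal would then be passed to VMD to obtain the BLIMFs from which the VTFR is built.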

  15. A GIS Approach to Evaluate Infrastructure Variables Influencing the Occurrence of Traffic Accidents in Urban Roads

    Murat Selim Çepni

    2017-03-01

    Several studies worldwide have sought to explain the occurrence of traffic accidents from different perspectives. The analyses have addressed legal perspectives, technical attributes of vehicles and infrastructure, as well as the psychological, behavioral and socio-economic components of road system users. Recently, analysis techniques based on Geographic Information Systems (GIS) have been used, which allow the generation of spatial distribution maps, models and risk estimates from a spatial perspective. Sometimes analyses of traffic accidents are performed using quantitative statistical techniques, which place significant importance on the evolution of accidents. Previous studies have shown that conventional statistical models are sometimes inadequate for modeling the frequency of traffic accidents, as they may provide erroneous inferences. The GIS approach has been used to explore different spatial and temporal visualization technologies to reveal accident patterns and significant factors relating to vehicle crashes, or as a management system for accident analysis and the determination of hot spots. This paper examines the relationship between urban road accidents and variables related to road infrastructure, environment and traffic volumes. Some accident-prone sections in the city of Kocaeli are specifically identified by GIS tools. Urban road accidents in Kocaeli are a serious problem, and it is believed that accidents can be related to infrastructure characteristics. The study aimed to establish the relationship between urban road accidents and road infrastructure variables, and revealed some possible accident-prone locations in Kocaeli city for the period 2013-2015.

  16. A state-and-transition simulation modeling approach for estimating the historical range of variability

    Kori Blankenship

    2015-04-01

    Reference ecological conditions offer important context for land managers as they assess the condition of their landscapes and provide benchmarks for desired future conditions. State-and-transition simulation models (STSMs) are commonly used to estimate reference conditions that can be used to evaluate current ecosystem conditions and to guide land management decisions and activities. The LANDFIRE program created more than 1,000 STSMs and used them to assess departure from a mean reference value for ecosystems in the United States. While the mean provides a useful benchmark, land managers and researchers are often interested in the range of variability around the mean. This range, frequently referred to as the historical range of variability (HRV), offers model users improved understanding of ecosystem function, more information with which to evaluate ecosystem change and potentially greater flexibility in management options. We developed a method for using LANDFIRE STSMs to estimate the HRV around the mean reference condition for each model state in ecosystems by varying the fire probabilities. The approach is flexible and can be adapted for use in a variety of ecosystems. HRV analysis can be combined with other information to help guide complex land management decisions.

  17. A Synergetic Approach to Describe the Stability and Variability of Motor Behavior

    Witte, Kerstin; Bock, Holger; Storb, Ulrich; Blaser, Peter

    At the beginning of the 20th century, the Russian physiologist and biomechanist Bernstein developed his cyclograms, in which he showed the non-repetition of the same movement under constant conditions. We can also observe this phenomenon when we analyze several cyclic sports movements. For example, we investigated the trajectories of single joints and segments of the body in breaststroke, walking, and running. The problem of the stability and variability of movement, and the relation between the two, cannot be satisfactorily tackled by means of linear methods. Thus, several authors (Turvey, 1977; Kugler et al., 1980; Haken et al., 1985; Schöner et al., 1986; Mitra et al., 1997; Kay et al., 1991; Ganz et al., 1996; Schöllhorn, 1999) use nonlinear models to describe human movement. These models and approaches have shown that nonlinear theories of complex systems provide a new understanding of the stability and variability of motor control. The purpose of this chapter is a presentation of a common synergetic model of motor behavior and its application to foot tapping, walking, and running.

  18. Statistical Modeling Approach to Quantitative Analysis of Interobserver Variability in Breast Contouring

    Yang, Jinzhong, E-mail: jyang4@mdanderson.org [Department of Radiation Physics, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Woodward, Wendy A.; Reed, Valerie K.; Strom, Eric A.; Perkins, George H.; Tereffe, Welela; Buchholz, Thomas A. [Department of Radiation Oncology, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Zhang, Lifei; Balter, Peter; Court, Laurence E. [Department of Radiation Physics, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Li, X. Allen [Department of Radiation Oncology, Medical College of Wisconsin, Milwaukee, Wisconsin (United States); Dong, Lei [Department of Radiation Physics, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Scripps Proton Therapy Center, San Diego, California (United States)

    2014-05-01

    Purpose: To develop a new approach for interobserver variability analysis. Methods and Materials: Eight radiation oncologists specializing in breast cancer radiation therapy delineated a patient's left breast “from scratch” and from a template that was generated using deformable image registration. Three of the radiation oncologists had previously received training in the Radiation Therapy Oncology Group (RTOG) consensus contouring atlas for breast cancer. The simultaneous truth and performance level estimation algorithm was applied to the 8 contours delineated “from scratch” to produce a group consensus contour. Individual Jaccard scores were fitted to a beta distribution model. We also applied this analysis to 2 additional patients, who were contoured by 9 breast radiation oncologists from 8 institutions. Results: The beta distribution model had a mean of 86.2%, standard deviation (SD) of ±5.9%, a skewness of −0.7, and excess kurtosis of 0.55, exemplifying broad interobserver variability. The 3 RTOG-trained physicians had higher agreement scores than average, indicating that their contours were close to the group consensus contour. One physician had high sensitivity but lower specificity than the others, which implies that this physician tended to contour a structure larger than those of the others. Two other physicians had low sensitivity but specificity similar to the others, which implies that they tended to contour a structure smaller than the others. With this information, they could adjust their contouring practice to be more consistent with others if desired. When contouring from the template, the beta distribution model had a mean of 92.3%, SD of ±3.4%, skewness of −0.79, and excess kurtosis of 0.83, which indicated much better consistency among individual contours. Similar results were obtained for the analysis of the 2 additional patients. Conclusions: The proposed statistical approach was able to measure interobserver variability quantitatively.
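
    A minimal sketch of the beta-distribution fit to observer agreement scores, using hypothetical Jaccard values (not the study's data), could be:

    ```python
    import numpy as np
    from scipy import stats

    # Hypothetical Jaccard agreement scores for 8 observers (fractions in (0,1))
    jaccard = np.array([0.91, 0.88, 0.84, 0.79, 0.86, 0.90, 0.82, 0.87])

    # Fit a beta distribution with its support fixed to [0, 1]
    a, b, _, _ = stats.beta.fit(jaccard, floc=0.0, fscale=1.0)

    mean, sd = stats.beta.mean(a, b), stats.beta.std(a, b)
    skew, kurt = stats.beta.stats(a, b, moments='sk')  # kurt = excess kurtosis
    print(f"mean={mean:.3f}  sd={sd:.3f}  skew={skew:.2f}  ex.kurt={kurt:.2f}")
    ```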

  19. Supervised pre-processing approaches in multiple class variables classification for fish recruitment forecasting

    Fernandes, José Antonio

    2013-02-01

    A multi-species approach to fisheries management requires taking into account the interactions between species in order to improve recruitment forecasting of fish species. Recent advances in Bayesian networks enable the learning of models in which several interrelated variables are forecasted simultaneously. These models are known as multi-dimensional Bayesian network classifiers (MDBNs). Pre-processing steps are critical for the subsequent learning of the model in these kinds of domains. Therefore, in the present study, a set of 'state-of-the-art' uni-dimensional pre-processing methods, within the categories of missing data imputation, feature discretization and feature subset selection, are adapted to be used with MDBNs. A framework that includes the proposed multi-dimensional supervised pre-processing methods, coupled with an MDBN classifier, is tested with synthetic datasets and the real domain of fish recruitment forecasting. The rate of correctly forecasting three fish species (anchovy, sardine and hake) simultaneously is nearly doubled (from 17.3% to 29.5%) using the multi-dimensional approach in comparison to mono-species models. The probability assessments also show marked improvement, reducing the average error (estimated by means of the Brier score) from 0.35 to 0.27. Finally, these differences are superior to forecasting the species in pairs. © 2012 Elsevier Ltd.

  20. Consumer's risk in the EMA and FDA regulatory approaches for bioequivalence in highly variable drugs.

    Muñoz, Joel; Alcaide, Daniel; Ocaña, Jordi

    2016-05-30

    The 2010 US Food and Drug Administration and European Medicines Agency regulatory approaches to establish bioequivalence in highly variable drugs are both based on linearly scaling the bioequivalence limits; both take a 'scaled average bioequivalence' approach. The present paper corroborates previous work suggesting that neither of them adequately controls the type I error, or consumer's risk, so they result in invalid test procedures in the neighbourhood of a within-subject coefficient of variation of 30% for the reference (R) formulation. The problem is particularly serious in the US Food and Drug Administration regulation, but it is also appreciable in the European Medicines Agency one. For the partially replicated TRR/RTR/RRT and the replicated TRTR/RTRT crossover designs, we quantify these type I error problems by means of a simulation study, discuss their possible causes and propose straightforward improvements on both regulatory procedures that improve their type I error control while maintaining adequate power. Copyright © 2015 John Wiley & Sons, Ltd.
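
    The type I error assessment can be sketched as a Monte Carlo over trial-level estimates, placing the true geometric mean ratio exactly on the EMA expanded limit. The normal/chi-square shortcuts, sample size and degrees of freedom below are assumptions of this sketch, not the paper's full replicate-design simulation:

    ```python
    import numpy as np
    from scipy import stats

    def abel_type1(cv_wr=0.30, n=36, n_sim=100_000, seed=7):
        """Rough consumer's-risk estimate for a scaled-ABE (ABEL-like) rule."""
        rng = np.random.default_rng(seed)
        swr = np.sqrt(np.log(1.0 + cv_wr**2))     # within-subject SD of R
        swr30 = np.sqrt(np.log(1.0 + 0.30**2))    # CV 30% switching point
        swr50 = np.sqrt(np.log(1.0 + 0.50**2))    # CV 50% cap

        def log_upper(s):                         # expanded upper limit (log scale)
            return np.where(s <= swr30, np.log(1.25),
                            0.760 * np.minimum(s, swr50))

        df, se = n - 2, swr * np.sqrt(2.0 / n)    # simplified df and SE
        d_true = log_upper(swr)                   # true GMR placed on the limit
        dhat = rng.normal(d_true, se, n_sim)      # estimated log-GMR per trial
        s_hat = swr * np.sqrt(rng.chisquare(df, n_sim) / df)
        half = stats.t.ppf(0.95, df) * s_hat * np.sqrt(2.0 / n)
        upper = log_upper(s_hat)                  # limits from the *estimated* SD
        accept = ((dhat - half >= -upper) & (dhat + half <= upper)
                  & (np.abs(dhat) <= np.log(1.25)))  # point-estimate constraint
        return accept.mean()                      # empirical type I error
    ```

    Values of abel_type1(0.30) above the nominal 5% would reflect the inflation near a within-subject CV of 30% described above.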

  1. Improved radiological/nuclear source localization in variable NORM background: An MLEM approach with segmentation data

    Penny, Robert D., E-mail: robert.d.penny@leidos.com [Leidos Inc., 10260 Campus Point Road, San Diego, CA (United States); Crowley, Tanya M.; Gardner, Barbara M.; Mandell, Myron J.; Guo, Yanlin; Haas, Eric B.; Knize, Duane J.; Kuharski, Robert A.; Ranta, Dale; Shyffer, Ryan [Leidos Inc., 10260 Campus Point Road, San Diego, CA (United States); Labov, Simon; Nelson, Karl; Seilhan, Brandon [Lawrence Livermore National Laboratory, Livermore, CA (United States); Valentine, John D. [Lawrence Berkeley National Laboratory, Berkeley, CA (United States)

    2015-06-01

    A novel approach and algorithm have been developed to rapidly detect and localize both moving and static radiological/nuclear (R/N) sources from an airborne platform. Current aerial systems with radiological sensors are limited in their ability to compensate for variable naturally occurring radioactive material (NORM) background. The proposed approach suppresses the effects of NORM background by incorporating additional information to segment the survey area into regions over which the background is likely to be uniform. The method produces pixelated Source Activity Maps (SAMs) of both target and background radionuclide activity over the survey area. The task of producing the SAMs requires (1) the development of a forward model which describes the transformation of radionuclide activity to detector measurements and (2) the solution of the associated inverse problem. The inverse problem is ill-posed as there are typically fewer measurements than unknowns. In addition the measurements are subject to Poisson statistical noise. The Maximum-Likelihood Expectation-Maximization (MLEM) algorithm is used to solve the inverse problem as it is well suited for under-determined problems corrupted by Poisson noise. A priori terrain information is incorporated to segment the reconstruction space into regions within which we constrain NORM background activity to be uniform. Descriptions of the algorithm and examples of performance with and without segmentation on simulated data are presented.
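
    A minimal sketch of the MLEM update for this kind of Poisson inverse problem (without the paper's segmentation constraint, which would additionally tie groups of background pixels to a shared activity value) is:

    ```python
    import numpy as np

    def mlem(A, y, n_iter=200):
        """Minimal MLEM iteration for y ~ Poisson(A x).

        A: (m, n) forward model mapping pixel activities to expected
        detector counts (columns assumed to have nonzero sums);
        y: (m,) measured counts. Returns a nonnegative activity map.
        """
        x = np.full(A.shape[1], y.sum() / A.sum())   # flat positive start
        sens = A.sum(axis=0)                         # sensitivities A^T 1
        for _ in range(n_iter):
            proj = A @ x                             # expected counts
            ratio = np.divide(y, proj, out=np.zeros_like(proj), where=proj > 0)
            x = x * (A.T @ ratio) / sens             # multiplicative update
        return x
    ```

    The multiplicative form keeps the estimate nonnegative, which is one reason MLEM suits under-determined, Poisson-corrupted problems like this one.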

  2. Exploring venlafaxine pharmacokinetic variability with a phenotyping approach, a multicentric french-swiss study (MARVEL study).

    Lloret-Linares, Célia; Daali, Youssef; Chevret, Sylvie; Nieto, Isabelle; Molière, Fanny; Courtet, Philippe; Galtier, Florence; Richieri, Raphaëlle-Marie; Morange, Sophie; Llorca, Pierre-Michel; El-Hage, Wissam; Desmidt, Thomas; Haesebaert, Frédéric; Vignaud, Philippe; Holtzmann, Jerôme; Cracowski, Jean-Luc; Leboyer, Marion; Yrondi, Antoine; Calvas, Fabienne; Yon, Liova; Le Corvoisier, Philippe; Doumy, Olivier; Heron, Kyle; Montange, Damien; Davani, Siamak; Déglon, Julien; Besson, Marie; Desmeules, Jules; Haffen, Emmanuel; Bellivier, Frank

    2017-11-07

    It is well known that standard doses of a given drug may not have equivalent effects in all patients. To date, the management of depression remains mainly empirical and often poorly evaluated. The development of personalized medicine in psychiatry may reduce treatment failure, intolerance or resistance, and hence the burden and costs of mood depressive disorders. The Geneva Cocktail Phenotypic approach presents several advantages: the in vivo measurement of the activities of different cytochromes and of the transporter P-gp; their simultaneous determination in a single test, avoiding the influence of variability over time on phenotyping results; the administration of low-dose substrates; and a limited sampling strategy with an analytical method based on DBS analysis. The goal of this project is to explore the relationship between the activity of drug-metabolizing enzymes (DME), assessed by a phenotypic approach, the concentrations of Venlafaxine (VLX) + O-demethyl-venlafaxine (ODV), and the efficacy and tolerance of VLX. This study is a multicentre, prospective, non-randomized open trial. Eligible patients present a major depressive episode, a MADRS score of 20 or higher, and treatment with VLX, regardless of dose, for at least 4 weeks. The phenotyping visit includes VLX and ODV concentration measurement. Following the oral absorption of low doses of omeprazole, midazolam, dextromethorphan, and fexofenadine, drug-metabolizing enzyme activity is assessed by specific metabolite/probe concentration ratios from a sample taken 2 h after cocktail administration for CYP2C19, CYP3A4 and CYP2D6, and by determination of a limited area under the curve from capillary blood samples taken 2-3 and 6 h after cocktail administration for CYP2C19 and P-gp. Two follow-up visits will take place 25-40 days and 50-70 days after inclusion; they include assessment of efficacy, tolerance and adherence. Eleven French centres are involved in recruitment, expected to be

  3. Posterior instrumentation, anterior column reconstruction with single posterior approach for treatment of pyogenic osteomyelitis of thoracic and lumbar spine.

    Gorensek, M; Kosak, R; Travnik, L; Vengust, R

    2013-03-01

    Surgical treatment of thoracolumbar osteomyelitis consists of radical debridement and reconstruction of the anterior column, with or without posterior stabilization. The objective of the present study is to evaluate a case series of patients with osteomyelitis of the thoracic and lumbar spine treated by a single posterior approach with posterior instrumentation and anterior column reconstruction. Seventeen patients underwent clinical and radiological evaluation pre- and postoperatively, with the latest follow-up at 19 months (8-56 months) after surgery. Parameters assessed were site of infection, causative organism, angle of deformity, blood loss, duration of surgery, ICU stay, deformity correction, time to solid bony fusion, ambulatory status, neurologic status (ASIA impairment scale), and functional outcome (Kirkaldy-Willis criteria). Mean operating time was 207 min and average blood loss 1,150 ml. Patients spent 2 (1-4) days in the ICU and were able to walk unaided 1.6 (1-2) days after surgery. Infection receded in all 17 patients postoperatively. Solid bony fusion occurred in 15 of 17 patients (88%) on average 6.3 months after surgery. Functional outcome was assessed as excellent or good in 82% of cases. Average deformity correction was 8 (1-18) degrees, with a loss of correction of 4 (0-19) degrees at final follow-up. A single posterior approach addressing both columns poses a safe alternative in the treatment of pyogenic vertebral osteomyelitis of the thoracic and lumbar spine. It proved to be less invasive, resulting in faster postoperative recovery.

  4. A multivariate and stochastic approach to identify key variables to rank dairy farms on profitability.

    Atzori, A S; Tedeschi, L O; Cannas, A

    2013-05-01

    The economic efficiency of dairy farms is the main goal of farmers. The objective of this work was to use routinely available information at the dairy farm level to develop an index of profitability to rank dairy farms and to assist the decision-making process of farmers to increase the economic efficiency of the entire system. A stochastic modeling approach was used to study the relationships between inputs and profitability (i.e., income over feed cost; IOFC) of dairy cattle farms. The IOFC was calculated as: milk revenue + value of male calves + culling revenue - herd feed costs. Two databases were created. The first one was a development database, which was created from technical and economic variables collected in 135 dairy farms. The second one was a synthetic database (sDB) created from 5,000 synthetic dairy farms using the Monte Carlo technique and based on the characteristics of the development database data. The sDB was used to develop a ranking index as follows: (1) principal component analysis (PCA), excluding IOFC, was used to identify principal components (sPC); and (2) coefficient estimates of a multiple regression of the IOFC on the sPC were obtained. Then, the eigenvectors of the sPC were used to compute the principal component values for the original 135 dairy farms that were used with the multiple regression coefficient estimates to predict IOFC (dRI; ranking index from development database). The dRI was used to rank the original 135 dairy farms. The PCA explained 77.6% of the sDB variability and 4 sPC were selected. The sPC were associated with herd profile, milk quality and payment, poor management, and reproduction based on the significant variables of the sPC. The mean IOFC in the sDB was 0.1377 ± 0.0162 euros per liter of milk (€/L). The dRI explained 81% of the variability of the IOFC calculated for the 135 original farms. When the number of farms below and above 1 standard deviation (SD) of the dRI were calculated, we found that 21
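
    The two-step ranking-index construction can be sketched with synthetic stand-in data (all values below are made up for illustration; they are not the study's databases):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    # Stand-ins: 135 farms x 10 technical/economic variables, and an IOFC
    # vector loosely driven by the first few variables
    X = rng.normal(size=(135, 10))
    iofc = X[:, :4] @ np.array([0.30, -0.20, 0.10, 0.25]) + rng.normal(0, 0.1, 135)

    # (1) PCA on the standardized inputs, IOFC excluded
    Z = (X - X.mean(axis=0)) / X.std(axis=0)
    _, _, Vt = np.linalg.svd(Z, full_matrices=False)
    pcs = Z @ Vt[:4].T                    # scores on the first 4 components

    # (2) Regress IOFC on the component scores to build the ranking index
    D = np.column_stack([np.ones(len(pcs)), pcs])
    beta, *_ = np.linalg.lstsq(D, iofc, rcond=None)
    dri = D @ beta                        # predicted IOFC = ranking index
    ranking = np.argsort(-dri)            # farms from most to least profitable
    ```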

  5. An Instrumental Approach to Culture Study in General Music: Use a Musical Instrument from Another Culture to Introduce the Sounds, Stories, and History of that Culture

    Carolin, Michael

    2006-01-01

    Conflicting philosophies exist in education today between those who believe that the traditional approach to curriculum is the best and those who believe there are better ways. At the core of both approaches is the sincere desire to give children the best education in the world. In this article, the author uses integration to bring the study of…

  6. Associations of dragonflies (Odonata) to habitat variables within the Maltese Islands: a spatio-temporal approach.

    Balzan, Mario V

    2012-01-01

    Relatively little information is available on environmental associations and the conservation of Odonata in the Maltese Islands. Aquatic habitats are normally spatio-temporally restricted, often located within predominantly rural landscapes, and are thereby susceptible to farmland water management practices, which may create additional pressure on water resources. This study investigates how odonate assemblage structure and diversity are associated with habitat variables of local breeding habitats and the surrounding agricultural landscapes. Standardized survey methodology for adult Odonata involved periodical counts over selected water-bodies (valley systems, semi-natural ponds, constructed agricultural reservoirs). Habitat variables relating to the type of water body, the floristic and physiognomic characteristics of vegetation, and the composition of the surrounding landscape, were studied and analyzed through a multivariate approach. Overall, odonate diversity was associated with a range of factors across multiple spatial scales, and was found to vary with time. Lentic water-bodies are probably of high conservation value, given that larval stages were mainly associated with this habitat category, and that all species were recorded in the adult stage in this habitat type. Comparatively, lentic and lotic seminatural waterbodies were more diverse than agricultural reservoirs and brackish habitats. Overall, different odonate groups were associated with different vegetation life-forms and height categories. The presence of the great reed, Arundo donax L., an invasive alien species that forms dense stands along several water-bodies within the Islands, seems to influence the abundance and/or occurrence of a number of species. At the landscape scale, roads and other ecologically disturbed ground, surface water-bodies, and landscape diversity were associated with particular components of the odonate assemblages. Findings from this study have several implications for the

  7. Does social trust increase willingness to pay taxes to improve public healthcare? Cross-sectional cross-country instrumental variable analysis.

    Habibov, Nazim; Cheung, Alex; Auchynnikava, Alena

    2017-09-01

    The purpose of this paper is to investigate the effect of social trust on the willingness to pay more taxes to improve public healthcare in post-communist countries. The well-documented association between higher levels of social trust and better health has traditionally been assumed to reflect the notion that social trust is positively associated with support for public healthcare system through its encouragement of cooperative behaviour, social cohesion, social solidarity, and collective action. Hence, in this paper, we have explicitly tested the notion that social trust contributes to an increase in willingness to financially support public healthcare. We use micro data from the 2010 Life-in-Transition survey (N = 29,526). Classic binomial probit and instrumental variables ivprobit regressions are estimated to model the relationship between social trust and paying more taxes to improve public healthcare. We found that an increase in social trust is associated with a greater willingness to pay more taxes to improve public healthcare. From the perspective of policy-making, healthcare administrators, policy-makers, and international donors should be aware that social trust is an important factor in determining the willingness of the population to provide much-needed financial resources to supporting public healthcare. From a theoretical perspective, we found that estimating the effect of trust on support for healthcare without taking confounding and measurement error problems into consideration will likely lead to an underestimation of the true effect of trust. Copyright © 2017 Elsevier Ltd. All rights reserved.
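
    As a sketch of the instrumental-variable idea, a hand-rolled linear two-stage least squares (2SLS) estimator is shown below; it is a linear-probability stand-in for the ivprobit model used in the paper, and all variable names are illustrative:

    ```python
    import numpy as np

    def two_sls(y, x_endog, z, controls):
        """Hand-rolled 2SLS sketch.

        y: outcome (e.g., willingness to pay more taxes), x_endog: the
        endogenous regressor (social trust), z: instrument(s),
        controls: exogenous covariates.
        """
        n = len(y)
        W = np.column_stack([np.ones(n), controls])   # exogenous block
        Z = np.column_stack([z, W])                   # instruments + exogenous
        X = np.column_stack([x_endog, W])             # endogenous + exogenous
        # First stage: projection of X onto the instrument space
        P = Z @ np.linalg.solve(Z.T @ Z, Z.T)
        Xhat = P @ X
        # Second stage: regress y on the first-stage fitted values
        beta = np.linalg.solve(Xhat.T @ X, Xhat.T @ y)
        return beta                                   # beta[0]: effect of trust
    ```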

  8. Impact on mortality of prompt admission to critical care for deteriorating ward patients: an instrumental variable analysis using critical care bed strain.

    Harris, Steve; Singer, Mervyn; Sanderson, Colin; Grieve, Richard; Harrison, David; Rowan, Kathryn

    2018-05-07

    To estimate the effect of prompt admission to critical care on mortality for deteriorating ward patients, we performed a prospective cohort study of consecutive ward patients assessed for critical care. Prompt admissions (within 4 h of assessment) were compared to a 'watchful waiting' cohort. We used critical care strain (bed occupancy) as a natural randomisation event that would predict prompt transfer to critical care. Strain was classified as low, medium or high (2+, 1 or 0 empty beds, respectively). This instrumental variable (IV) analysis was repeated for the subgroup of referrals with a recommendation for critical care once assessed. Risk-adjusted 90-day survival models were also constructed. A total of 12,380 patients from 48 hospitals were available for analysis. There were 2411 (19%) prompt admissions (median delay 1 h, IQR 1-2) and 9969 (81%) controls; 1990 (20%) controls were admitted later (median delay 11 h, IQR 6-26). Prompt admissions became less frequent as critical care strain increased. In the risk-adjusted survival model, 90-day mortality was similar between groups. After allowing for unobserved prognostic differences between the groups, we find that prompt admission to critical care leads to lower 90-day mortality for patients assessed and recommended to critical care.

  9. Heart Rate Variability (HRV) biofeedback: A new training approach for operator's performance enhancement

    Auditya Purwandini Sutarto

    2010-06-01

    The widespread implementation of advanced and complex systems relies predominantly on operators' cognitive functions, with less emphasis on human manual control. On the other hand, most operators perform their cognitive functions below their peak cognitive capacity level due to fatigue, stress, and boredom. Thus, there is a need to improve their cognitive functions during work. The goal of this paper is to present a psychophysiological training approach derived from the cardiovascular response, named heart rate variability (HRV) biofeedback. Resonant frequency biofeedback - a specific HRV training protocol - is described, together with the research supporting its use for performance enhancement. HRV biofeedback training works by teaching people to recognize their involuntary HRV and to control patterns of this physiological response. The training is directed at increasing HRV amplitude, which promotes autonomic nervous system balance. This balance is associated with improved physiological functioning as well as psychological benefits. Most individuals can learn HRV biofeedback easily; training involves slowing the breathing rate (to around six breaths/min) to each individual's resonant frequency, at which the amplitude of HRV is maximized. Maximal control over HRV can be obtained in most people after approximately four sessions of training. Recent studies have demonstrated the effectiveness of HRV biofeedback for the improvement of some cognitive functions in both simulated and real industrial operators.

  10. Retention and Curve Number Variability in a Small Agricultural Catchment: The Probabilistic Approach

    Kazimierz Banasik

    2014-04-01

    The variability of the curve number (CN) and the retention parameter (S) of the Soil Conservation Service (SCS-CN) method in a small agricultural, lowland watershed (23.4 km2 to the gauging station) in central Poland has been assessed using the probabilistic approach: distribution fitting and confidence intervals (CIs). Empirical CNs and Ss were computed directly from recorded rainfall depths and direct runoff volumes. Two measures of the goodness of fit were used as selection criteria in the identification of the parent distribution function. The measures specified the generalized extreme value (GEV), normal and general logistic (GLO) distributions for 100-CN, and the GLO, lognormal and GEV distributions for S. The characteristics estimated from the theoretical distributions (median, quantiles) were compared to the tabulated CN and to the antecedent runoff conditions of Hawkins and Hjelmfelt. The distribution fitting for the whole sample revealed a good agreement between the tabulated CN and the median, and between the antecedent runoff conditions (ARCs) of Hawkins and Hjelmfelt, which certified a good calibration of the model. However, the division of the CN sample according to heavy and moderate rainfall depths revealed a serious inconsistency between the parameters mentioned. This analysis proves that the application of the SCS-CN method should rely on deep insight into the probabilistic properties of CN and S.
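
    The empirical CN computation and distribution fitting can be sketched as follows; the rainfall-runoff pairs are hypothetical, and a lognormal is substituted for the generalized logistic, which SciPy does not provide:

    ```python
    import numpy as np
    from scipy import stats

    # Hypothetical event rainfall P and direct runoff Q depths (mm)
    P = np.array([18.0, 25.0, 32.0, 41.0, 55.0, 72.0])
    Q = np.array([1.2, 2.9, 5.1, 8.8, 15.6, 24.0])

    # Empirical retention S (mm) from the SCS-CN runoff equation with the
    # standard initial abstraction Ia = 0.2*S, solved for S (Hawkins' form)
    S = 5.0 * (P + 2.0 * Q - np.sqrt(4.0 * Q**2 + 5.0 * P * Q))
    CN = 25400.0 / (254.0 + S)                  # empirical curve numbers

    # Distribution fitting in the spirit of the study: GEV for 100-CN,
    # and (as a stand-in for GLO) lognormal for S
    gev_shape, gev_loc, gev_scale = stats.genextreme.fit(100.0 - CN)
    ln_shape, ln_loc, ln_scale = stats.lognorm.fit(S, floc=0.0)
    print(stats.genextreme.median(gev_shape, gev_loc, gev_scale))  # median of 100-CN
    ```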

  11. A first approach to calculate BIOCLIM variables and climate zones for Antarctica

    Wagner, Monika; Trutschnig, Wolfgang; Bathke, Arne C.; Ruprecht, Ulrike

    2018-02-01

    For testing the hypothesis that macroclimatological factors determine the occurrence, biodiversity, and species specificity of both symbiotic partners of Antarctic lecideoid lichens, we present a first approach for the computation of the full set of 19 BIOCLIM variables, as available at http://www.worldclim.org/ for all regions of the world with the exception of Antarctica. Annual mean temperature (Bio 1) and annual precipitation (Bio 12) were chosen to define climate zones of the Antarctic continent and adjacent islands, as required for ecological niche modeling (ENM). The zones are based on data for the years 2009-2015, which were obtained from the Antarctic Mesoscale Prediction System (AMPS) database of the Ohio State University. For both temperature and precipitation, two separate zonings were specified; temperature values were divided into 12 zones (named 1 to 12) and precipitation values into five (named A to E). By combining these two partitions, we defined climate zonings where each geographical point can be uniquely assigned to exactly one zone, which allows an immediate explicit interpretation. The soundness of the newly calculated climate zones was tested by comparison with already published data, which used only three zones defined using climate information from the literature. The newly defined climate zones result in a more precise assignment of species distributions to individual habitats. This study provides the basis for a more detailed continent-wide ENM using a comprehensive dataset of lichen specimens located within 21 different climate regions.
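
    A minimal sketch of deriving Bio 1, Bio 12 and a combined zone label from monthly data follows; the bin edges, zone labels, and example values are illustrative assumptions, not the paper's AMPS-derived zoning:

    ```python
    import numpy as np

    def climate_zone(monthly_t, monthly_p, t_edges, p_edges):
        """Combined climate zone label from monthly means.

        Bio 1 (annual mean temperature) is the mean of the twelve monthly
        mean temperatures; Bio 12 (annual precipitation) is the sum of the
        monthly totals. Temperature bins map to zones 1-12, precipitation
        bins to zones A-E, as in the paper's two-part zoning.
        """
        bio1 = np.mean(monthly_t)                 # deg C
        bio12 = np.sum(monthly_p)                 # mm/year
        t_zone = int(np.digitize(bio1, t_edges)) + 1
        p_zone = "ABCDE"[int(np.digitize(bio12, p_edges))]
        return f"{t_zone}{p_zone}"

    # Example with made-up coastal Antarctic values
    zone = climate_zone(
        monthly_t=[-2, -5, -10, -15, -18, -20, -22, -21, -17, -12, -6, -3],
        monthly_p=[30, 25, 20, 15, 10, 8, 8, 10, 12, 18, 22, 28],
        t_edges=np.linspace(-40, 5, 11),          # 11 edges -> 12 zones
        p_edges=[50, 100, 200, 400],              # 4 edges -> 5 zones
    )
    ```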

  12. A combinatorial approach to detect coevolved amino acid networks in protein families of variable divergence.

    Julie Baussand

    2009-09-01

    Communication between distant sites often defines the biological role of a protein: amino acid long-range interactions are as important in binding specificity, allosteric regulation and conformational change as residues directly contacting the substrate. Maintaining the functional and structural coupling of long-range interacting residues requires coevolution of these residues. Networks of interaction between coevolved residues can be reconstructed, and from these networks one can derive insights into the functional mechanisms of the protein family. We propose a combinatorial method for mapping conserved networks of amino acid interactions in a protein, based on the analysis of a set of aligned sequences, the associated distance tree and the combinatorics of its subtrees. The degree of coevolution of all pairs of coevolved residues is identified numerically, and networks are reconstructed with a dedicated clustering algorithm. The method drops the constraint of high sequence divergence that limits the range of applicability of previously proposed statistical approaches. We apply the method to four protein families, where we show accurate detection of functional networks and the possibility of treating sets of protein sequences of variable divergence.

  13. On Thermally Interacting Multiple Boreholes with Variable Heating Strength: Comparison between Analytical and Numerical Approaches

    Marc A. Rosen

    2012-08-01

    The temperature response in the soil surrounding multiple boreholes is evaluated analytically and numerically. The assumption of constant heat flux along the borehole wall is examined by coupling the problem to the heat transfer problem inside the borehole and presenting a model with variable heat flux along the borehole length. In the analytical approach, a line source of heat with a finite length is used to model the conduction of heat in the soil surrounding the boreholes. In the numerical method, a finite volume method in a three-dimensional meshed domain is used. In order to determine the heat flux boundary condition, the analytical quasi-three-dimensional solution to the heat transfer problem of the U-tube configuration inside the borehole is used. This solution takes into account the variation in heating strength along the borehole length due to the temperature variation of the fluid running in the U-tube. Thus, critical depths at which thermal interaction occurs can be determined. Finally, in order to examine the validity of the numerical method, a comparison is made with the results of the line source method.
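
    The line-source idea can be illustrated with the classical infinite line source solution, ΔT = q/(4πk) · E1(r²/4αt); the finite-length source used in the paper integrates point sources along the borehole depth, but this simpler form conveys the same mechanism (all parameter values below are illustrative):

    ```python
    import numpy as np
    from scipy.special import exp1

    def line_source_temp_rise(r, t, q=30.0, k=2.0, alpha=1e-6):
        """Temperature rise (K) at radius r (m) and time t (s) around an
        infinite line source of strength q (W/m) in soil with thermal
        conductivity k (W/m.K) and diffusivity alpha (m^2/s)."""
        return q / (4.0 * np.pi * k) * exp1(r**2 / (4.0 * alpha * t))

    # Thermal interaction check: temperature rise midway between two
    # boreholes 6 m apart after one year of continuous operation
    print(line_source_temp_rise(r=3.0, t=3600 * 24 * 365))
    ```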

  14. Hybrid Vibration Control under Broadband Excitation and Variable Temperature Using Viscoelastic Neutralizer and Adaptive Feedforward Approach

    João C. O. Marra

    2016-01-01

    Vibratory phenomena have always surrounded human life. The need for more knowledge of, and control over, such phenomena keeps increasing, especially in modern society, where human-machine integration becomes closer day after day. In that context, this work deals with the development and practical implementation of a hybrid (passive-active/adaptive) vibration control system on a metallic beam excited by a broadband signal under variable temperature, between 5 and 35°C. Since temperature variations directly and considerably affect the performance of the passive control system, composed of a viscoelastic dynamic vibration neutralizer (also called a viscoelastic dynamic vibration absorber), the strategy of using an active-adaptive vibration control system (based on a feedforward approach with the FXLMS algorithm) working together with the passive one has shown to be a good option to compensate for the neutralizer's loss of performance and generally maintain the extended overall level of vibration control. As an additional gain, the association of both vibration control systems (passive and active-adaptive) improved the attenuation of vibration levels. Some key steps matured over years of research on this experimental setup are presented in this paper.
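
    A single-channel FXLMS update, the adaptive feedforward core named above, can be sketched as follows; filter length, step size, and signal names are assumptions of this sketch rather than the authors' implementation:

    ```python
    import numpy as np

    def fxlms(x, d, s_hat, L=64, mu=1e-3):
        """Single-channel FXLMS sketch.

        x: reference signal, d: disturbance measured at the error sensor,
        s_hat: FIR estimate of the secondary path (actuator -> sensor).
        Returns the residual error signal and the adapted filter.
        """
        w = np.zeros(L)                        # adaptive control filter
        xf = np.convolve(x, s_hat)[: len(x)]   # filtered-x reference
        xbuf, fbuf = np.zeros(L), np.zeros(L)
        ybuf = np.zeros(len(s_hat))
        e = np.zeros(len(x))
        for n in range(len(x)):
            xbuf = np.roll(xbuf, 1); xbuf[0] = x[n]
            fbuf = np.roll(fbuf, 1); fbuf[0] = xf[n]
            y = w @ xbuf                       # control output
            ybuf = np.roll(ybuf, 1); ybuf[0] = y
            e[n] = d[n] - s_hat @ ybuf         # residual vibration at sensor
            w += mu * e[n] * fbuf              # FXLMS weight update
        return e, w
    ```

    Filtering the reference through the secondary-path estimate before the weight update is what keeps the gradient aligned despite the actuator-to-sensor dynamics, the key difference from plain LMS.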

  15. Simple Approaches to Minimally-Instrumented, Microfluidic-Based Point-of-Care Nucleic Acid Amplification Tests

    Mauk, Michael G.; Song, Jinzhao; Liu, Changchun; Bau, Haim H.

    2018-01-01

    Designs and applications of microfluidics-based devices for molecular diagnostics (Nucleic Acid Amplification Tests, NAATs) in infectious disease testing are reviewed, with emphasis on minimally instrumented, point-of-care (POC) tests for resource-limited settings. Microfluidic cartridges (‘chips’) that combine solid-phase nucleic acid extraction; isothermal enzymatic nucleic acid amplification; pre-stored, paraffin-encapsulated lyophilized reagents; and real-time or endpoint optical detection are described. These chips can be used with a companion module for separating plasma from blood through a combined sedimentation-filtration effect. Three reporter types (fluorescence, colorimetric dyes, and bioluminescence) and a new paradigm for end-point detection based on a diffusion-reaction column are compared. Multiplexing (parallel amplification and detection of multiple targets) is demonstrated. Low-cost detection and added functionality (data analysis, control, communication) can be realized using a cellphone platform with the chip. Some related and similar-purposed approaches by others are surveyed. PMID:29495424

  16. Simple Approaches to Minimally-Instrumented, Microfluidic-Based Point-of-Care Nucleic Acid Amplification Tests

    Michael G. Mauk

    2018-02-01

    Designs and applications of microfluidics-based devices for molecular diagnostics (Nucleic Acid Amplification Tests, NAATs) in infectious disease testing are reviewed, with emphasis on minimally instrumented, point-of-care (POC) tests for resource-limited settings. Microfluidic cartridges (‘chips’) that combine solid-phase nucleic acid extraction; isothermal enzymatic nucleic acid amplification; pre-stored, paraffin-encapsulated lyophilized reagents; and real-time or endpoint optical detection are described. These chips can be used with a companion module for separating plasma from blood through a combined sedimentation-filtration effect. Three reporter types (fluorescence, colorimetric dyes, and bioluminescence) and a new paradigm for end-point detection based on a diffusion-reaction column are compared. Multiplexing (parallel amplification and detection of multiple targets) is demonstrated. Low-cost detection and added functionality (data analysis, control, communication) can be realized using a cellphone platform with the chip. Some related and similar-purposed approaches by others are surveyed.

  17. Increasing the effectiveness of instrumentation and control training programs using integrated training settings and a systematic approach to training

    McMahon, J.F.; Rakos, N.

    1992-01-01

    The performance of plant maintenance-related tasks assigned to instrumentation and control (I&C) technicians can be broken down into the physical skills required to do the task; resident knowledge of how to do the task; the effect of maintenance on plant operating conditions; interactions with other plant organizations such as operations, radiation protection, and quality control; and knowledge of the consequences of mis-action. A technician who has learned about the task in formal classroom presentations has not had the advantage of integrating that knowledge with the requisite physical and communication skills; hence, the first time these distinct and vital parts of the task equation are put together is on the job, during initial task performance. On-the-job training provides for the integration of skills and knowledge; however, this form of training is limited by plant conditions, availability of supporting players, and the training experience levels of the personnel conducting the exercise. For licensed operations personnel, most nuclear utilities use formal classroom instruction and a full-scope control room simulator to achieve the integration of skills and knowledge in a controlled training environment. TU Electric has taken that same approach into maintenance areas by including identical plant equipment in a laboratory setting for the large portion of training received by maintenance personnel at its Comanche Peak steam electric station. The policy of determining training needs and defining the scope of training by using the systematic approach to training has been highly effective and provided training at a reasonable cost (approximately $18.00/student contact hour).

  18. A new model of wheezing severity in young children using the validated ISAAC wheezing module: A latent variable approach with validation in independent cohorts.

    Brunwasser, Steven M; Gebretsadik, Tebeb; Gold, Diane R; Turi, Kedir N; Stone, Cosby A; Datta, Soma; Gern, James E; Hartert, Tina V

    2018-01-01

    The International Study of Asthma and Allergies in Children (ISAAC) Wheezing Module is commonly used to characterize pediatric asthma in epidemiological studies, including nearly all airway cohorts participating in the Environmental Influences on Child Health Outcomes (ECHO) consortium. However, there is no consensus model for operationalizing wheezing severity with this instrument in explanatory research studies. Severity is typically measured using coarsely-defined categorical variables, reducing power and potentially underestimating etiological associations. More precise measurement approaches could improve testing of etiological theories of wheezing illness. We evaluated a continuous latent variable model of pediatric wheezing severity based on four ISAAC Wheezing Module items. Analyses included subgroups of children from three independent cohorts whose parents reported past wheezing: infants ages 0-2 in the INSPIRE birth cohort study (Cohort 1; n = 657), 6-7-year-old North American children from Phase One of the ISAAC study (Cohort 2; n = 2,765), and 5-6-year-old children in the EHAAS birth cohort study (Cohort 3; n = 102). Models were estimated using structural equation modeling. In all cohorts, covariance patterns implied by the latent variable model were consistent with the observed data, as indicated by non-significant χ2 goodness of fit tests (no evidence of model misspecification). Cohort 1 analyses showed that the latent factor structure was stable across time points and child sexes. In both cohorts 1 and 3, the latent wheezing severity variable was prospectively associated with wheeze-related clinical outcomes, including physician asthma diagnosis, acute corticosteroid use, and wheeze-related outpatient medical visits when adjusting for confounders. We developed an easily applicable continuous latent variable model of pediatric wheezing severity based on items from the well-validated ISAAC Wheezing Module. This model prospectively associates with

  19. Probabilistic approaches to accounting for data variability in the practical application of bioavailability in predicting aquatic risks from metals.

    Ciffroy, Philippe; Charlatchka, Rayna; Ferreira, Daniel; Marang, Laura

    2013-07-01

    The biotic ligand model (BLM) theoretically enables the derivation of environmental quality standards that are based on true bioavailable fractions of metals. Several physicochemical variables (especially pH, major cations, dissolved organic carbon, and dissolved metal concentrations) must, however, be assigned to run the BLM, but they are highly variable in time and space in natural systems. This article describes probabilistic approaches for integrating such variability in the derivation of risk indexes. To describe each variable using a probability density function (PDF), several methods were combined to 1) treat censored data (i.e., data below the limit of detection), 2) incorporate the uncertainty of the solid-to-liquid partitioning of metals, and 3) detect outliers. From a probabilistic perspective, 2 alternative approaches based on log-normal and Γ distributions were tested to estimate the probability of the predicted environmental concentration (PEC) exceeding the predicted no-effect concentration (PNEC), i.e., p(PEC/PNEC>1). The probabilistic approach was tested on 4 real-case studies based on Cu-related data collected from stations on the Loire and Moselle rivers. The approach described in this article is based on BLM tools that are freely available to end-users (i.e., the Bio-Met software) and on accessible statistical data treatments. This approach could be used by stakeholders involved in risk assessments of metals to improve site-specific studies. Copyright © 2013 SETAC.
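
    Once PDFs are fitted, the risk index p(PEC/PNEC > 1) can be estimated by straightforward Monte Carlo; the lognormal parameters below are hypothetical, not values from the case studies:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000

    # Hypothetical lognormal PDFs fitted to site monitoring data:
    # bioavailable PEC and site-specific PNEC (both in ug/L)
    pec = rng.lognormal(mean=np.log(1.5), sigma=0.6, size=n)
    pnec = rng.lognormal(mean=np.log(4.0), sigma=0.4, size=n)

    # Probabilistic risk index: fraction of draws with PEC/PNEC above 1
    risk = np.mean(pec / pnec > 1.0)
    print(f"p(PEC/PNEC > 1) = {risk:.3f}")
    ```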

  20. Validating a Computer-Assisted Language Learning Attitude Instrument Used in Iranian EFL Context: An Evidence-Based Approach

    Aryadoust, Vahid; Mehran, Parisa; Alizadeh, Mehrasa

    2016-01-01

    A few computer-assisted language learning (CALL) instruments have been developed in Iran to measure EFL (English as a foreign language) learners' attitude toward CALL. However, these instruments have no solid validity argument and accordingly would be unable to provide a reliable measurement of attitude. The present study aimed to develop a CALL…

  1. Bayesian Population Physiologically-Based Pharmacokinetic (PBPK) Approach for a Physiologically Realistic Characterization of Interindividual Variability in Clinically Relevant Populations.

    Markus Krauss

    Interindividual variability in anatomical and physiological properties results in significant differences in drug pharmacokinetics. The consideration of such pharmacokinetic variability supports optimal drug efficacy and safety for each single individual, e.g. by identification of individual-specific dosings. One clear objective in clinical drug development is therefore a thorough characterization of the physiological sources of interindividual variability. In this work, we present a Bayesian population physiologically-based pharmacokinetic (PBPK) approach for the mechanistically and physiologically realistic identification of interindividual variability. The consideration of a generic and highly detailed mechanistic PBPK model structure enables the integration of large amounts of prior physiological knowledge, which is then updated with new experimental data in a Bayesian framework. A covariate model integrates known relationships of physiological parameters to age, gender and body height. We further provide a framework for estimation of the a posteriori parameter dependency structure at the population level. The approach is demonstrated considering a cohort of healthy individuals, with theophylline as an application example. The variability and co-variability of physiological parameters are specified within the population. Significant correlations are identified between population parameters and are applied for individual- and population-specific visual predictive checks of the pharmacokinetic behavior, which leads to improved results compared to present population approaches. In the future, the integration of a generic PBPK model into a hierarchical approach will allow for extrapolations to other populations or drugs, while the Bayesian paradigm allows for an iterative application of the approach and thereby a continuous updating of physiological knowledge with new data. This will facilitate decision making, e.g. from preclinical to

  2. Importance of the macroeconomic variables for variance prediction: A GARCH-MIDAS approach

    Asgharian, Hossein; Hou, Ai Jun; Javed, Farrukh

    2013-01-01

    This paper aims to examine the role of macroeconomic variables in forecasting the return volatility of the US stock market. We apply the GARCH-MIDAS (Mixed Data Sampling) model to examine whether information contained in macroeconomic variables can help to predict the short-term and long-term components of return volatility.
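
    The MIDAS ingredient is a parsimonious lag-weighting scheme for mixing low-frequency (e.g., monthly macro) data into the long-run variance component; a common beta-polynomial version is sketched below with illustrative parameters (a generic form, not necessarily the authors' exact specification):

    ```python
    import numpy as np

    def midas_beta_weights(K, theta1=1.0, theta2=5.0):
        """Beta-polynomial lag weights typical of (GARCH-)MIDAS components.

        Returns K positive weights summing to one; theta2 > 1 concentrates
        weight on recent lags.
        """
        k = np.arange(1.0, K + 1.0) / (K + 1.0)
        w = k**(theta1 - 1.0) * (1.0 - k)**(theta2 - 1.0)
        return w / w.sum()

    # Long-run component as a weighted sum of, e.g., 36 monthly macro or
    # realized-variance observations (most recent first) -- toy data only
    rv = np.abs(np.random.default_rng(1).normal(1.0, 0.3, 36))
    tau = midas_beta_weights(36) @ rv
    ```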

  3. Financial development and investment market integration: An approach of underlying financial variables & indicators for corporate governance growth empirical approach

    Vojinovič Borut

    2005-01-01

    Financial development is correlated with several underlying regulatory variables (such as indicators of investor protection, market transparency variables for corporate governance growth, and rules for capital market development), which are under the control of national legislators and EU directives. This paper provides estimates of the relationship between financial market development and corporate growth, and assesses the impact of financial market integration on this relationship with reference to European Union (EU) countries. The regression results obtained using this panel support the hypothesis that financial development promotes growth, particularly in industries that are more financially dependent on external finance. For policy purposes, analyzing changes in these regulatory variables may be a more interesting exercise than analyzing integration of the financial systems themselves. Since assuming that EU countries will raise their regulatory and legal standards to US standards appears unrealistic, we instead examine a scenario in which EU countries raise their standards to the highest current EU standard.

  4. Functionally relevant climate variables for arid lands: A climatic water deficit approach for modelling desert shrub distributions

    Thomas E. Dilts; Peter J. Weisberg; Camie M. Dencker; Jeanne C. Chambers

    2015-01-01

    We have three goals. (1) To develop a suite of functionally relevant climate variables for modelling vegetation distribution on arid and semi-arid landscapes of the Great Basin, USA. (2) To compare the predictive power of vegetation distribution models based on mechanistically proximate factors (water deficit variables) and factors that are more mechanistically removed...

  5. Evaluating two model reduction approaches for large scale hedonic models sensitive to omitted variables and multicollinearity

    Panduro, Toke Emil; Thorsen, Bo Jellesmark

    2014-01-01

    Hedonic models in environmental valuation studies have grown in terms of number of transactions and number of explanatory variables. We focus on the practical challenge of model reduction, when aiming for reliable parsimonious models, sensitive to omitted variable bias and multicollinearity. We...

  6. Measurement Uncertainty in Racial and Ethnic Identification among Adolescents of Mixed Ancestry: A Latent Variable Approach

    Tracy, Allison J.; Erkut, Sumru; Porche, Michelle V.; Kim, Jo; Charmaraman, Linda; Grossman, Jennifer M.; Ceder, Ineke; Garcia, Heidie Vazquez

    2010-01-01

    In this article, we operationalize identification of mixed racial and ethnic ancestry among adolescents as a latent variable to (a) account for measurement uncertainty, and (b) compare alternative wording formats for racial and ethnic self-categorization in surveys. Two latent variable models were fit to multiple mixed-ancestry indicator data from…

  7. Problems with radiological surveillance instrumentation

    Swinth, K.L.; Tanner, J.E.; Fleming, D.M.

    1984-09-01

    Many radiological surveillance instruments are in use at DOE facilities throughout the country. These instruments are an essential part of all health physics programs, and poor instrument performance can increase program costs or compromise program effectiveness. Generic data from simple tests on newly purchased instruments show that many instruments will not meet requirements due to manufacturing defects. In other cases, lack of consideration of instrument use has resulted in poor acceptance of instruments and poor reliability. Instrument performance is highly variable with respect to electronic and mechanical behavior, radiation response, susceptibility to interferences, and response to environmental factors. Poor performance in these areas can lead to errors or poor accuracy in measurements.

  8. Problems with radiological surveillance instrumentation

    Swinth, K.L.; Tanner, J.E.; Fleming, D.M.

    1985-01-01

    Many radiological surveillance instruments are in use at DOE facilities throughout the country. These instruments are an essential part of all health physics programs, and poor instrument performance can increase program costs or compromise program effectiveness. Generic data from simple tests on newly purchased instruments show that many instruments will not meet requirements due to manufacturing defects. In other cases, lack of consideration of instrument use has resulted in poor acceptance of instruments and poor reliability. Instrument performance is highly variable with respect to electronic and mechanical behavior, radiation response, susceptibility to interferences, and response to environmental factors. Poor performance in these areas can lead to errors or poor accuracy in measurements.

  9. A geostatistical approach to the change-of-support problem and variable-support data fusion in spatial analysis

    Wang, Jun; Wang, Yang; Zeng, Hui

    2016-01-01

    A key issue to address when synthesizing spatial data with variable support in spatial analysis and modeling is the change-of-support problem. We present an approach for solving the change-of-support and variable-support data fusion problems. This approach is based on geostatistical inverse modeling that explicitly accounts for differences in spatial support. The inverse model is applied here to produce both best predictions at a target support and their prediction uncertainties, based on one or more measurements, while honoring those measurements. Spatial data covering large geographic areas often exhibit spatial nonstationarity and can pose computational challenges due to the large data size. We developed a local-window geostatistical inverse modeling approach to accommodate spatial nonstationarity and alleviate the computational burden. We conducted experiments using synthetic and real-world raster data. Synthetic data were generated, aggregated to multiple supports, and downscaled back to the original support to analyze the accuracy of spatial predictions and the correctness of prediction uncertainties. Similar experiments were conducted for real-world raster data. Real-world data with variable support were statistically fused to produce single-support predictions and associated uncertainties. The modeling results demonstrate that geostatistical inverse modeling can produce accurate predictions and associated prediction uncertainties. The suggested local-window approach offers a practical way to solve the well-known change-of-support and variable-support data fusion problems in spatial analysis and modeling.
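    As a generic sketch of the machinery involved (the notation is illustrative, not taken from the paper), the change-of-support problem can be posed as a linear inverse problem in which each observation averages the fine-support field over its own support:

      \mathbf{z} = \mathbf{H}\mathbf{s} + \mathbf{v}, \qquad \mathbf{v} \sim N(\mathbf{0}, \mathbf{R}),

    where s is the unknown field at the target support, each row of H encodes the (possibly different) support of one measurement in z, and v is measurement error. With prior mean \mu and covariance Q for s, the best linear unbiased prediction follows from kriging-type equations,

      \hat{\mathbf{s}} = \boldsymbol{\mu} + \mathbf{Q}\mathbf{H}^{\top}(\mathbf{H}\mathbf{Q}\mathbf{H}^{\top} + \mathbf{R})^{-1}(\mathbf{z} - \mathbf{H}\boldsymbol{\mu}),

    and a local-window variant solves this system repeatedly within moving windows, which keeps the matrices small and accommodates nonstationarity.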

  10. Theoretical and numerical investigations of TAP experiments. New approaches for variable pressure conditions

    Senechal, U.; Breitkopf, C. [Technische Univ. Dresden (Germany). Inst. fuer Energietechnik

    2011-07-01

    Temporal analysis of products (TAP) is a valuable tool for the characterization of porous catalytic structures. Established TAP modeling requires a spatially constant diffusion coefficient and neglects convective flows, which is only valid in the Knudsen diffusion regime. In experiments, the number of molecules per pulse must therefore be chosen accordingly, and new approaches for variable process conditions are highly required. Thus, a new theoretical model is developed for estimating the number of molecules per pulse that meets these requirements under any conditions and at any time. The void volume is calculated as the biggest sphere fitting between three pellets. The total number of pulsed molecules is assumed to fill the first void volume at the inlet immediately. Molecule numbers from these calculations can be understood as the maximum possible number of molecules at any time in the reactor that keeps it in the Knudsen diffusion regime, i.e., above a Knudsen number of 2. Moreover, a new methodology for generating a full three-dimensional geometrical representation of packed beds is presented and used for numerical simulations to investigate spatial effects. Based on a freely available open-source game physics engine library (BULLET), beds of arbitrarily sized pellets can be generated and transformed into CFD-usable geometry. In the CFD software (ANSYS CFX), a transient diffusive transport equation with time-dependent inlet boundary conditions is solved. Three different pellet diameters were investigated with 1e18 molecules per pulse, which is higher than the limit from the theoretical calculation. Spatial and temporal distributions of the transported species show regions inside the reactor where non-Knudsen conditions exist. From these results, the distance from the inlet can be calculated at which the theoretical pressure limit (Knudsen number equal to 2) is reached, i.e., from this point to the end of the reactor the Knudsen regime can be assumed. Due to the linear dependency of pressure and concentration (assuming ideal

  11. State-Space Modeling and Performance Analysis of Variable-Speed Wind Turbine Based on a Model Predictive Control Approach

    H. Bassi

    2017-04-01

    Advancements in wind energy technologies have led wind turbines from fixed-speed to variable-speed operation. This paper introduces an innovative version of a variable-speed wind turbine based on a model predictive control (MPC) approach. The proposed approach provides maximum power point tracking (MPPT), whose main objective is to capture the maximum wind energy in spite of the variable nature of the wind speed. The proposed MPC approach also reduces the constraints of the two main functional parts of the wind turbine: the full-load and partial-load segments. The pitch angle for full load and the rotating force for partial load are fixed concurrently in order to balance power generation as well as to reduce pitch-angle operations. A mathematical analysis of the proposed system using a state-space approach is introduced. The simulation results using MATLAB/SIMULINK show that the performance of the wind turbine with the MPC approach is improved compared to the traditional PID controller at both low and high wind speeds.

  12. Variable-structure approaches analysis, simulation, robust control and estimation of uncertain dynamic processes

    Senkel, Luise

    2016-01-01

    This edited book aims at presenting current research activities in the field of robust variable-structure systems. The scope equally comprises highlighting novel methodological aspects as well as presenting the use of variable-structure techniques in industrial applications including their efficient implementation on hardware for real-time control. The target audience primarily comprises research experts in the field of control theory and nonlinear dynamics but the book may also be beneficial for graduate students.

  13. Environmental policy and economic efficiency: tradable permits versus regulatory instrument to control air pollution: a comparative approach USA/France

    Cros, Ch.

    1998-12-01

    The key issue of the thesis is the paradox of the weak implementation of economic instruments even though (1) they are considered efficient both theoretically and empirically; (2) the market imposes itself as the central reference of modern economies; and (3) economic efficiency is nowadays a legitimacy measure of public policies. Two different answers can be given: either theoretical analysis does not make it possible to explain the real economic efficiency of a policy instrument, or environmental policies do not have economic efficiency as their main objective. The analysis takes place in a context of limited rationality and inter-temporal consistency of public policies. The purpose is to understand the role of economic efficiency criteria during the adoption, design, and evolution of an environmental policy from an analytical point of view, not a normative one. The institutional analysis of the American and French air pollution control policies, representative of the implementation of a trading permit system for the former and of a regulatory instrument for the latter, shows that the theoretical analysis of an instrument cannot explain a real coordination, but only one organizational form among others. An institutional trajectory is the interpretation of policy instruments from five fundamental elements: the nature of the legitimacy of the policy; the nature of the regulator's hypothesis on information; the nature of the decision-making basis; and the nature of the collective action. A coordination changes when the occurrence of an event moves one of the fundamental elements and disorganizes the satisfying equilibrium of the agents. Then, economic efficiency becomes a negotiation point. A policy instrument is adopted for its own ability to solve a dysfunction without disrupting the coordination. (author)

  14. Identifying the most informative variables for decision-making problems – a survey of recent approaches and accompanying problems

    Pudil, Pavel; Somol, Petr

    2008-01-01

    Roč. 16, č. 4 (2008), s. 37-55 ISSN 0572-3043 R&D Projects: GA MŠk 1M0572 Grant - others: GA MŠk(CZ) 2C06019 Institutional research plan: CEZ:AV0Z10750506 Keywords: variable selection * decision making Subject RIV: BD - Theory of Information http://library.utia.cas.cz/separaty/2008/RO/pudil-identifying%20the%20most%20informative%20variables%20for%20decision-making%20problems%20a%20survey%20of%20recent%20approaches%20and%20accompanying%20problems.pdf

  15. A Hybrid ICA-SVM Approach for Determining the Quality Variables at Fault in a Multivariate Process

    Yuehjen E. Shao

    2012-01-01

    The monitoring of a multivariate process with the use of multivariate statistical process control (MSPC) charts has received considerable attention. In practice, however, the use of an MSPC chart typically encounters a difficulty: determining which quality variable, or which set of quality variables, is responsible for the generation of a signal. This study proposes a hybrid scheme composed of independent component analysis (ICA) and a support vector machine (SVM) to determine the quality variables at fault when a step-change disturbance exists in a multivariate process. The proposed hybrid ICA-SVM scheme first applies ICA to the Hotelling T2 MSPC chart to generate independent components (ICs). The hidden information about the faulty quality variables can be identified in these ICs. The ICs then serve as the input variables of the SVM classifier for performing the classification. The performance of various process designs is investigated and compared with a typical classification method. Using the proposed approach, the quality variables at fault in a multivariate process can be determined accurately and reliably.
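    A minimal sketch of the two-stage idea with scikit-learn, applying FastICA and then an SVM to synthetic multivariate process data; the simulated step-change fault and all parameter choices are illustrative assumptions, not the study's design:

      import numpy as np
      from sklearn.decomposition import FastICA
      from sklearn.svm import SVC
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(0)
      n, p = 500, 5
      X = rng.normal(size=(n, p))              # in-control multivariate process
      y = rng.integers(0, 2, size=n)           # 1 = fault present in variable 0
      X[y == 1, 0] += 2.0                      # hypothetical step-change (mean shift)

      ica = FastICA(n_components=p, max_iter=1000, random_state=0)
      S = ica.fit_transform(X)                 # independent components (ICs)

      S_tr, S_te, y_tr, y_te = train_test_split(S, y, random_state=0)
      clf = SVC(kernel="rbf").fit(S_tr, y_tr)  # SVM separates faulty vs. normal
      print("accuracy:", clf.score(S_te, y_te))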

  16. On-line scheme for parameter estimation of nonlinear lithium ion battery equivalent circuit models using the simplified refined instrumental variable method for a modified Wiener continuous-time model

    Allafi, Walid; Uddin, Kotub; Zhang, Cheng; Mazuir Raja Ahsan Sha, Raja; Marco, James

    2017-01-01

    Highlights: • Off-line estimation approach in the continuous-time domain for a non-invertible function. • Model reformulated as multi-input, single-output; nonlinearity described by a sigmoid. • Method directly estimates the parameters of the nonlinear ECM from measured data. • Iterative on-line technique leads to smoother convergence. • The model is validated off-line and on-line using an NCA battery. -- Abstract: The accuracy of identifying the parameters of models describing lithium ion batteries (LIBs) in typical battery management system (BMS) applications is critical to the estimation of key states such as the state of charge (SoC) and state of health (SoH). In applications such as electric vehicles (EVs), where LIBs are subjected to highly demanding cycles of operation and varying environmental conditions leading to non-trivial interactions of ageing stress factors, this identification is more challenging. This paper proposes an algorithm that directly estimates the parameters of a nonlinear battery model from measured input and output data in the continuous time domain. The simplified refined instrumental variable method is extended to estimate the parameters of a Wiener model where there is no requirement for the nonlinear function to be invertible. To account for nonlinear battery dynamics, the typical linear equivalent circuit model (ECM) is enhanced by a block-oriented Wiener configuration in which the nonlinear memoryless block following the typical ECM is defined to be a sigmoid static nonlinearity. The nonlinear Wiener model is reformulated in the form of a multi-input, single-output linear model. This linear form allows the parameters of the nonlinear model to be estimated using any linear estimator such as the well-established least squares (LS) algorithm. In this paper, the recursive least squares (RLS) method is adopted for online parameter estimation. The approach was validated on experimental data measured from an 18650-type Graphite
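    The online step relies on recursive least squares. A generic RLS update for a linear-in-parameters model y_k = phi_k' theta, written without any battery-specific detail (the forgetting factor and the demo system below are arbitrary illustrations):

      import numpy as np

      def rls_update(theta, P, phi, y, lam=0.99):
          """One recursive-least-squares step with forgetting factor lam."""
          phi = phi.reshape(-1, 1)
          K = P @ phi / (lam + phi.T @ P @ phi)   # gain vector
          err = y - (phi.T @ theta).item()        # prediction error
          theta = theta + K * err                 # parameter update
          P = (P - K @ phi.T @ P) / lam           # covariance update
          return theta, P

      # tiny demo: identify y = 2*u1 - 0.5*u2 from noisy samples
      rng = np.random.default_rng(1)
      theta, P = np.zeros((2, 1)), 1e3 * np.eye(2)
      for _ in range(200):
          u = rng.normal(size=2)
          y = 2.0 * u[0] - 0.5 * u[1] + 0.01 * rng.normal()
          theta, P = rls_update(theta, P, u, y)
      print(theta.ravel())  # converges to approximately [2.0, -0.5]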

  17. Assessment of economic instruments for countries with low municipal waste management performance: An approach based on the analytic hierarchy process.

    Kling, Maximilian; Seyring, Nicole; Tzanova, Polia

    2016-09-01

    Economic instruments provide significant potential for countries with low municipal waste management performance to decrease landfill rates and increase recycling rates for municipal waste. In this research, the strengths and weaknesses of landfill taxes, pay-as-you-throw charging systems, deposit-refund systems and extended producer responsibility schemes are compared, focusing on conditions in countries with low waste management performance. In order to prioritise instruments for implementation in these countries, the analytic hierarchy process is applied, using the results of a literature review as input for the comparison. The assessment reveals that pay-as-you-throw is the most preferable instrument when utility-related criteria are considered (wb = 0.35; analytic hierarchy process distributive mode; absolute comparison), mainly owing to its waste prevention effect, closely followed by landfill tax (wb = 0.32). Deposit-refund systems (wb = 0.17) and extended producer responsibility (wb = 0.16) rank third and fourth, with marginal differences owing to their similar nature. When cost-related criteria are additionally included in the comparison, landfill tax seems to provide the highest utility-cost ratio. Data from the literature concerning costs (in contrast to utility-related criteria) are currently not sufficiently available for a robust ranking according to the utility-cost ratio. In general, the analytic hierarchy process is seen as a suitable method for assessing economic instruments in waste management. Independent of the chosen analytic hierarchy process mode, the results provide valuable indications for policy-makers on the application of economic instruments, as well as on their specific strengths and weaknesses. Nevertheless, the instruments need to be put in the country-specific context, along with the results of this analytic hierarchy process application, before practical decisions are made. © The Author(s) 2016.
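    For readers unfamiliar with the method, the priority weights wb reported above are obtained from the principal eigenvector of a pairwise comparison matrix. A minimal Python sketch with numpy, using an invented 4x4 comparison of the four instruments (the judgment values are illustrative assumptions, not the study's):

      import numpy as np

      # hypothetical pairwise comparisons of four instruments (Saaty's 1-9 scale)
      A = np.array([[1,   1,   2,   2],
                    [1,   1,   2,   2],
                    [1/2, 1/2, 1,   1],
                    [1/2, 1/2, 1,   1]], dtype=float)

      eigvals, eigvecs = np.linalg.eig(A)
      k = np.argmax(eigvals.real)                      # principal eigenvalue
      w = np.abs(eigvecs[:, k].real)
      w /= w.sum()                                     # normalised priority weights
      CI = (eigvals.real[k] - len(A)) / (len(A) - 1)   # consistency index
      print("weights:", w.round(3), "CI:", round(CI, 4))

    For a perfectly consistent matrix like the one above, the principal eigenvalue equals the matrix order and CI is zero; in practice a consistency ratio below 0.1 is the usual acceptance threshold.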

  18. The Development of Verbal and Visual Working Memory Processes: A Latent Variable Approach

    Koppenol-Gonzalez, Gabriela V.; Bouwmeester, Samantha; Vermunt, Jeroen K.

    2012-01-01

    Working memory (WM) processing in children has been studied with different approaches, focusing on either the organizational structure of WM processing during development (factor analytic) or the influence of different task conditions on WM processing (experimental). The current study combined both approaches, aiming to distinguish verbal and…

  19. Variability in University Students' Use of Technology: An "Approaches to Learning" Perspective

    Mimirinis, Mike

    2016-01-01

    This study reports the results of a cross-case analysis of how students' approaches to learning are demonstrated in blended learning environments. It was initially proposed that approaches to learning, as key determinants of the quality of student learning outcomes, are demonstrated specifically in how students utilise technology in…

  20. Predicting suicidal ideation in primary care: An approach to identify easily assessable key variables.

    Jordan, Pascal; Shedden-Mora, Meike C; Löwe, Bernd

    To obtain predictors of suicidal ideation (SI) that can also be used for its indirect assessment; to create a classifier for SI based on variables of the Patient Health Questionnaire (PHQ) and sociodemographic variables; and to obtain an upper bound on the best possible performance of a predictor based on those variables. From a consecutive sample of 9025 primary care patients, 6805 eligible patients (60% female; mean age = 51.5 years) participated. Advanced machine learning methods were used to derive the prediction equation. Various classifiers were applied, and the area under the curve (AUC) was computed as a performance measure. Classifiers based on machine learning methods outperformed ordinary regression methods and achieved AUCs around 0.87. The key variables in the prediction equation comprised four items, namely feelings of depression/hopelessness, low self-esteem, worrying, and severe sleep disturbances. The generalized anxiety disorder scale (GAD-7) and the somatic symptom subscale (PHQ-15) did not enhance prediction substantially. In predicting suicidal ideation, researchers should refrain from using ordinary regression tools. The relevant information is primarily captured by the depression subscale and should be incorporated in a nonlinear model. For clinical practice, a classification tree using only four items of the whole PHQ may be advocated. Copyright © 2018 Elsevier Inc. All rights reserved.
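    The reported workflow, training classifiers on item scores and comparing them by AUC, can be illustrated generically with scikit-learn. The sketch below simulates four 0-3 item scores loosely mirroring the four key items; all names, coefficients and data are invented, not the study's:

      import numpy as np
      from sklearn.tree import DecisionTreeClassifier
      from sklearn.model_selection import train_test_split
      from sklearn.metrics import roc_auc_score

      rng = np.random.default_rng(42)
      n = 2000
      # hypothetical 0-3 item scores: depression, self-esteem, worry, sleep
      X = rng.integers(0, 4, size=(n, 4)).astype(float)
      logit = 1.2 * X[:, 0] + 0.8 * X[:, 1] + 0.5 * X[:, 2] + 0.6 * X[:, 3] - 6.0
      y = rng.random(n) < 1 / (1 + np.exp(-logit))   # simulated ideation labels

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
      tree = DecisionTreeClassifier(max_depth=3).fit(X_tr, y_tr)  # shallow tree
      auc = roc_auc_score(y_te, tree.predict_proba(X_te)[:, 1])
      print(f"AUC = {auc:.2f}")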

  1. Sums and Products of Jointly Distributed Random Variables: A Simplified Approach

    Stein, Sheldon H.

    2005-01-01

    Three basic theorems concerning expected values and variances of sums and products of random variables play an important role in mathematical statistics and its applications in education, business, the social sciences, and the natural sciences. A solid understanding of these theorems requires that students be familiar with the proofs of these…

  2. A Source Area Approach Demonstrates Moderate Predictive Ability but Pronounced Variability of Invasive Species Traits.

    Günther Klonner

    The search for traits that make alien species invasive has mostly concentrated on comparing successful invaders and different comparison groups with respect to average trait values. By contrast, little attention has been paid to trait variability among invaders. Here, we combine an analysis of trait differences between invasive and non-invasive species with a comparison of multidimensional trait variability within these two species groups. We collected data on biological and distributional traits for 1402 species of the native, non-woody vascular plant flora of Austria. We then compared the subsets of species recorded and not recorded as invasive aliens anywhere in the world, respectively, first, with respect to the sampled traits using univariate and multiple regression models; and, second, with respect to their multidimensional trait diversity by calculating functional richness and dispersion metrics. Attributes related to competitiveness (strategy type, nitrogen indicator value, habitat use (agricultural and ruderal habitats, occurrence under the montane belt, and propagule pressure (frequency were most closely associated with invasiveness. However, even the best multiple model, including interactions, only explained a moderate fraction of the differences in invasive success. In addition, multidimensional variability in trait space was even larger among invasive than among non-invasive species. This pronounced variability suggests that invasive success has a considerable idiosyncratic component and is probably highly context specific. We conclude that basing risk assessment protocols on species trait profiles will probably face hardly reducible uncertainties.

  3. Tracking climate variability in the western Mediterranean during the Late Holocene: A multiproxy approach

    Nieto-Moreno, V.; Martínez-Ruiz, F.; Giralt, S.; Jiménez-Espejo, F.; Gallego-Torres, D.; Rodrigo-Gámiz, M.; Garcia-Orellana, J.; Ortega-Huertas, M.; de Lange, G.J.

    2011-01-01

    Climate variability in the western Mediterranean is reconstructed for the last 4000 yr using marine sediments recovered in the west Algerian-Balearic basin, near the Alboran basin. Fluctuations in chemical and mineralogical sediment composition as well as grain size distribution are linked to

  4. Quantitative assessment of drivers of recent global temperature variability: an information theoretic approach

    Bhaskar, Ankush; Ramesh, Durbha Sai; Vichare, Geeta; Koganti, Triven; Gurubaran, S.

    2017-12-01

    Identification and quantification of possible drivers of recent global temperature variability remains a challenging task. This important issue is addressed by adopting a non-parametric information theory technique, the Transfer Entropy and its normalized variant. It distinctly quantifies the actual information exchanged, along with the directional flow of information, between any two variables, with no bearing on their common history or inputs, unlike correlation, mutual information, etc. Measurements of greenhouse gases (CO2, CH4 and N2O); volcanic aerosols; solar activity (UV radiation, total solar irradiance (TSI) and cosmic ray flux (CR)); El Niño Southern Oscillation (ENSO); and Global Mean Temperature Anomaly (GMTA) made during 1984-2005 are utilized to distinguish driving and responding signals of global temperature variability. Estimates of their relative contributions reveal that CO2 (~24%), CH4 (~19%) and volcanic aerosols (~23%) are the primary contributors to the observed variations in GMTA. UV (~9%) and ENSO (~12%) act as secondary drivers, while the remaining variables play a marginal role in the observed recent global temperature variability. Interestingly, ENSO and GMTA mutually drive each other at varied time lags. This study assists future modelling efforts in climate science.
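    For reference, the transfer entropy from a candidate driver X to the response Y, which the study uses to separate driving from responding signals, is conventionally defined (one-step lags shown for simplicity) as

      T_{X \to Y} = \sum_{y_{t+1},\, y_t,\, x_t} p(y_{t+1}, y_t, x_t)\, \log \frac{p(y_{t+1} \mid y_t, x_t)}{p(y_{t+1} \mid y_t)},

    i.e., the reduction in uncertainty about the next value of Y obtained from the past of X beyond what the past of Y already provides. One common normalization, presumably the basis of the variant mentioned here, divides this by the conditional entropy H(Y_{t+1} | Y_t) so that contributions from different drivers are comparable.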

  5. A Nonlinear Mixed Effects Approach for Modeling the Cell-To-Cell Variability of Mig1 Dynamics in Yeast.

    Joachim Almquist

    The last decade has seen a rapid development of experimental techniques that allow data collection from individual cells. These techniques have enabled the discovery and characterization of variability within a population of genetically identical cells. Nonlinear mixed effects (NLME) modeling is an established framework for studying variability between individuals in a population, frequently used in pharmacokinetics and pharmacodynamics, but its potential for studies of cell-to-cell variability in molecular cell biology is yet to be exploited. Here we take advantage of this novel application of NLME modeling to study cell-to-cell variability in the dynamic behavior of the yeast transcription repressor Mig1. In particular, we investigate a recently discovered phenomenon where Mig1, during a short and transient period, exits the nucleus when cells experience a shift from high to intermediate levels of extracellular glucose. A phenomenological model based on ordinary differential equations describing the transient dynamics of nuclear Mig1 is introduced, and according to the NLME methodology the parameters of this model are in turn modeled by a multivariate probability distribution. Using time-lapse microscopy data from nearly 200 cells, we estimate this parameter distribution by maximizing the population likelihood. Based on the estimated distribution, parameter values for individual cells are furthermore characterized and the resulting Mig1 dynamics are compared to the single-cell time-series data. The proposed NLME framework is also compared to the intuitive but limited standard two-stage (STS) approach. We demonstrate that the latter may overestimate variabilities by up to almost five fold. Finally, Monte Carlo simulations of the inferred population model are used to predict the distribution of key characteristics of the Mig1 transient response. We find that with decreasing levels of post-shift glucose, the transient

  6. Supervised pre-processing approaches in multiple class variables classification for fish recruitment forecasting

    Fernandes, José Antonio; Lozano, Jose A.; Inza, Iñaki; Irigoien, Xabier; Pérez, Aritz; Rodríguez, Juan Diego

    2013-01-01

    A multi-species approach to fisheries management requires taking into account the interactions between species in order to improve recruitment forecasting of the fish species. Recent advances in Bayesian networks direct the learning of models

  7. Designing Data-Driven Battery Prognostic Approaches for Variable Loading Profiles: Some Lessons Learned

    National Aeronautics and Space Administration — Among various approaches for implementing prognostic algorithms data-driven algorithms are popular in the industry due to their intuitive nature and relatively fast...

  8. Quality by Design approach for studying the impact of formulation and process variables on product quality of oral disintegrating films.

    Mazumder, Sonal; Pavurala, Naresh; Manda, Prashanth; Xu, Xiaoming; Cruz, Celia N; Krishnaiah, Yellela S R

    2017-07-15

    The present investigation was carried out to understand the impact of formulation and process variables on the quality of oral disintegrating films (ODF) using a Quality by Design (QbD) approach. Lamotrigine (LMT) was used as a model drug. The formulation variable was the plasticizer-to-film-former ratio; the process variables were drying temperature, air flow rate in the drying chamber, drying time, and wet coat thickness of the film. A Definitive Screening Design of Experiments (DoE) was used to identify and classify the critical formulation and process variables impacting critical quality attributes (CQA). A total of 14 laboratory-scale DoE formulations were prepared and evaluated for mechanical properties (% elongation at break, yield stress, Young's modulus, folding endurance) and other CQA (dry thickness, disintegration time, dissolution rate, moisture content, moisture uptake, drug assay and drug content uniformity). The main factors affecting mechanical properties were the plasticizer-to-film-former ratio and drying temperature. Dissolution rate was found to be sensitive to the air flow rate during drying and the plasticizer-to-film-former ratio. Data were analyzed to elucidate interactions between different variables, rank-order the critical material attributes (CMA) and critical process parameters (CPP), and provide a predictive model for the process. Results suggested that the plasticizer-to-film-former ratio and process controls on drying are critical to manufacturing LMT ODF with the desired CQA. Published by Elsevier B.V.

  9. Laser-induced breakdown spectroscopy (LIBS), part II: review of instrumental and methodological approaches to material analysis and applications to different fields.

    Hahn, David W; Omenetto, Nicoló

    2012-04-01

    The first part of this two-part review focused on the fundamental and diagnostic aspects of laser-induced plasmas, touching only briefly upon concepts such as sensitivity and detection limits and largely omitting any discussion of the vast panorama of practical applications of the technique. Clearly, a true LIBS community has emerged, which promises to quicken the pace of LIBS developments, applications, and implementations. This second part takes a more applied flavor; its intended goal is to summarize the current state of the art of analytical LIBS, provide a contemporary snapshot of LIBS applications, and highlight new directions in laser-induced breakdown spectroscopy, such as novel approaches, instrumental developments, and advanced use of chemometric tools. More specifically, we discuss instrumental and analytical approaches (e.g., double- and multi-pulse LIBS to improve sensitivity), calibration-free approaches, hyphenated approaches in which techniques such as Raman and fluorescence are coupled with LIBS to increase sensitivity and information power, resonantly enhanced LIBS approaches, signal processing and optimization (e.g., signal-to-noise analysis), and finally applications. An attempt is made to provide an updated view of the role played by LIBS in the various fields, with emphasis on applications considered to be unique. We finally try to assess where LIBS is going as an analytical field, where in our opinion it should go, and what should still be done to consolidate the technique as a mature method of chemical analysis. © 2012 Society for Applied Spectroscopy.

  10. A Parallel Approach in Computing Correlation Immunity up to Six Variables

    2015-07-24

    ... second step, we specify that a condition hold across all assignments of values to the variables chosen in the first step. For pedagogical reasons, we could ... table of the function whose correlation immunity is currently being computed. When this circuit is used in exhaustive enumeration, the Function

  11. Implementation of upper limit calculation for a poisson variable by bayesian approach

    Zhu Yongsheng

    2008-01-01

    The calculation of the Bayesian confidence upper limit for a Poisson variable including both signal and background, with and without systematic uncertainties, has been formulated. A Fortran 77 routine, BPULE, has been developed to implement the calculation. The routine can account for systematic uncertainties in the background expectation and signal efficiency. The systematic uncertainties may be separately parameterized by a Gaussian, log-Gaussian or flat probability density function (pdf). Some technical details of BPULE are discussed. (authors)
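    The core computation, inverting the posterior of a Poisson signal s over a known background b, can be sketched in a few lines of Python. The sketch assumes a flat prior on the signal (a common convention, assumed here rather than taken from the paper) and omits the systematic-uncertainty convolutions that BPULE handles:

      import numpy as np
      from scipy.integrate import quad
      from scipy.optimize import brentq

      def upper_limit(n_obs, b, cl=0.90):
          """Bayesian upper limit on a Poisson signal s at confidence level cl,
          flat prior, known background b, no systematics (small n_obs assumed)."""
          post = lambda s: np.exp(-(s + b)) * (s + b) ** n_obs  # unnormalised posterior
          norm = quad(post, 0, np.inf)[0]
          cdf = lambda s_up: quad(post, 0, s_up)[0] / norm - cl
          return brentq(cdf, 0, 100 + 10 * n_obs)               # root of CDF - cl

      print(upper_limit(n_obs=3, b=1.0))  # about 5.7 for 3 observed over b = 1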

  12. Higher Energy Intake Variability as Predisposition to Obesity: Novel Approach Using Interquartile Range.

    Forejt, Martin; Brázdová, Zuzana Derflerová; Novák, Jan; Zlámal, Filip; Forbelská, Marie; Bienert, Petr; Mořkovská, Petra; Zavřelová, Miroslava; Pohořalá, Aneta; Jurášková, Miluše; Salah, Nabil; Bienertová-Vašků, Julie

    2017-12-01

    It is known that total energy intake and its distribution during the day influence human anthropometric characteristics. However, a possible association between variability in total energy intake and obesity has thus far remained unexamined. This study was designed to establish the influence of the energy intake variability of each daily meal on the anthropometric characteristics of obesity. A total of 521 individuals of Czech Caucasian origin aged 16–73 years (390 women and 131 men) were included in the study; 7-day food records were completed by all study subjects and selected anthropometric characteristics were measured. The interquartile range (IQR) of energy intake was assessed individually for each meal of the day (as a marker of energy intake variability) and subsequently correlated with body mass index (BMI), body fat percentage (%BF), waist-hip ratio (WHR), and waist circumference (cW). Four distinct models were created using multiple logistic regression analysis and backward stepwise logistic regression. The most precise results, based on the area under the curve (AUC), were observed for the %BF model (AUC = 0.895) and the cW model (AUC = 0.839). According to the %BF model, age (p < 0.001) and IQR-lunch (p < 0.05) seem to play an important predictive role for obesity. Likewise, according to the cW model, age (p < 0.001), IQR-breakfast (p < 0.05) and IQR-dinner (p < 0.05) predispose patients to the development of obesity. The results of our study show that higher variability in the energy intake of key daily meals may increase the likelihood of obesity development. Based on the obtained results, it is necessary to emphasize regularity in meal intake for maintaining proper body composition. Copyright © by the National Institute of Public Health, Prague 2017.
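    The variability marker itself is straightforward to compute. A minimal sketch with numpy, assuming a 7-day per-meal intake array for one subject (all values invented):

      import numpy as np

      # hypothetical 7-day energy intake (kcal), one column per meal
      intake = np.array([[450, 700, 650],
                         [300, 900, 500],
                         [500, 650, 800],
                         [420, 720, 640],
                         [380, 880, 700],
                         [460, 600, 550],
                         [520, 760, 620]], dtype=float)

      q75, q25 = np.percentile(intake, [75, 25], axis=0)
      iqr = q75 - q25  # one IQR per meal: breakfast, lunch, dinner
      print(dict(zip(["IQR-breakfast", "IQR-lunch", "IQR-dinner"], iqr.round(1))))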

  13. Economies of scale in the Korean district heating system: A variable cost function approach

    Park, Sun-Young; Lee, Kyoung-Sil; Yoo, Seung-Hoon

    2016-01-01

    This paper aims to investigate the cost efficiency of South Korea’s district heating (DH) system by using a variable cost function and cost-share equation. We employ a seemingly unrelated regression model, with quarterly time-series data from the Korea District Heating Corporation (KDHC)—a public utility that covers about 59% of the DH system market in South Korea—over the 1987–2011 period. The explanatory variables are price of labor, price of material, capital cost, and production level. The results indicate that economies of scale are present and statistically significant. Thus, expansion of its DH business would allow KDHC to obtain substantial economies of scale. According to our forecasts vis-à-vis scale economies, the KDHC will enjoy cost efficiency for some time yet. To ensure a socially efficient supply of DH, it is recommended that the KDHC expand its business proactively. With regard to informing policy or regulations, our empirical results could play a significant role in decision-making processes. - Highlights: • We examine economies of scale in the South Korean district heating sector. • We focus on Korea District Heating Corporation (KDHC), a public utility. • We estimate a translog cost function, using a variable cost function. • We found economies of scale to be present and statistically significant. • KDHC will enjoy cost efficiency and expanding its supply is socially efficient.
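    As background on how scale economies are usually read off an estimated cost function (a generic convention, not necessarily the paper's exact measure): with a total cost function C(p, y) in input prices p and output y, one common index is

      SCE = 1 - \frac{\partial \ln C}{\partial \ln y},

    so that SCE > 0 (output elasticity of cost below one) indicates economies of scale, SCE = 0 constant returns, and SCE < 0 diseconomies. Variable-cost treatments such as the one described here adjust this elasticity for the quasi-fixed capital stock, and in a translog specification the elasticity is itself a linear function of log prices and log output, so it can vary with the scale of production.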

  14. Investigating the Constrained Action Hypothesis: A Movement Coordination and Coordination Variability Approach.

    Vidal, Anthony; Wu, Will; Nakajima, Mimi; Becker, James

    2017-09-19

    The purpose of this study was to examine the effects of focus-of-attention cues on movement coordination and coordination variability in the lower extremity. Twenty participants performed the standing long jump under both internal and external focus of attention conditions. A modified vector coding technique was used to evaluate the influence of attentional focus cues on lower extremity coordination patterns and coordination variability during the jumps. Participants jumped significantly further under the external focus of attention condition than under the internal focus of attention condition (p = .035, effect size = .29). Focus of attention also influenced coordination between the ankle and knee, F(6, 19) = 2.87, p = .012, effect size = .388, with participants primarily using their knees under the internal focus of attention and using both their ankles and knees under the external focus of attention. Attentional focus cues did not influence ankle-knee, F(1, 19) = 0.02, p = .98, effect size = .02, or hip-knee, F(1, 19) = 5.00, p = .49, effect size = .16, coordination variability. Results suggest that while attentional focus cues may not directly influence coordination variability, there is still a change in movement strategy resulting in greater jump distances following an external focus of attention.

  15. Psychological variables implied in the therapeutic effect of ayahuasca: A contextual approach.

    Franquesa, Alba; Sainz-Cort, Alberto; Gandy, Sam; Soler, Joaquim; Alcázar-Córcoles, Miguel Ángel; Bouso, José Carlos

    2018-04-04

    Ayahuasca is a psychedelic decoction originating from Amazonia. The ayahuasca-induced introspective experience has been shown to have potential benefits in the treatment of several pathologies, to protect mental health and to improve neuropsychological functions and creativity, and boost mindfulness. The underlying psychological processes related to the use of ayahuasca in a psychotherapeutic context are not yet well described in the scientific literature, but there is some evidence to suggest that psychological variables described in psychotherapies could be useful in explaining the therapeutic effects of the brew. In this study we explore the link between ayahuasca use and Decentering, Values and Self, comparing subjects without experience of ayahuasca (n = 41) with subjects with experience (n = 81). Results confirm that ayahuasca users scored higher than non-users in Decentering and Positive self, but not in Valued living, Life fulfillment, Self in social relations, Self in close relations and General self. Scores in Decentering were higher in the more experienced subjects (more than 15 occasions) than in those with less experience (less than 15 occasions). Our results show that psychological process variables may explain the outcomes in ayahuasca psychotherapy. The introduction of these variables is warranted in future ayahuasca therapeutic studies. Copyright © 2018 Elsevier B.V. All rights reserved.

  16. A New Statistical Approach to the Optical Spectral Variability in Blazars

    Jose A. Acosta-Pulido

    2016-12-01

    We present a spectral variability study of a sample of about 25 bright blazars, based on optical spectroscopy. Observations cover the period from the end of 2008 to mid-2015, with an approximately monthly cadence. Emission lines have been identified and measured in the spectra, which permits us to classify the sources as BL Lac-type or FSRQs, according to the commonly used EW limit. We have obtained synthetic photometry and produced colour-magnitude diagrams which show different trends associated with the object classes: generally, BL Lacs tend to become bluer when brighter and FSRQs become redder when brighter, although several objects exhibit both trends, depending on brightness. We have also applied a pattern recognition algorithm to obtain the minimum number of physical components which can explain the variability of the optical spectrum. We have used NMF (Non-Negative Matrix Factorization) instead of PCA (Principal Component Analysis) to avoid unrealistic negative components. For most targets we found that 2 or 3 meta-components are enough to explain the observed spectral variability.
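    A minimal sketch of the decomposition step using scikit-learn's NMF, factorizing a non-negative matrix of spectra (epochs x wavelength bins) into a small number of meta-components; the data below are random placeholders, not blazar spectra:

      import numpy as np
      from sklearn.decomposition import NMF

      rng = np.random.default_rng(3)
      # rows = observed spectra (epochs), columns = wavelength bins; must be >= 0
      spectra = rng.random((40, 300)) + 0.1

      nmf = NMF(n_components=3, init="nndsvda", max_iter=500, random_state=0)
      W = nmf.fit_transform(spectra)  # per-epoch weights of each meta-component
      H = nmf.components_             # the 3 non-negative spectral components
      print(W.shape, H.shape, f"reconstruction error = {nmf.reconstruction_err_:.3f}")

    Because both factors are constrained to be non-negative, each row of H can be read as a physical emission component, which is the advantage over PCA noted in the abstract.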

  17. The Propagation of Movement Variability in Time: A Methodological Approach for Discrete Movements with Multiple Degrees of Freedom

    Krüger, Melanie; Straube, Andreas; Eggert, Thomas

    2017-01-01

    In recent years, theory-building in motor neuroscience and our understanding of the synergistic control of the redundant human motor system has significantly profited from the emergence of a range of different mathematical approaches to analyze the structure of movement variability. Approaches such as the Uncontrolled Manifold method or the Noise-Tolerance-Covariance decomposition method allow one to detect and interpret changes in movement coordination due to, e.g., learning, external task constraints or disease, by analyzing the structure of within-subject, inter-trial movement variability. Whereas for cyclical movements (e.g., locomotion) mathematical approaches exist to investigate the propagation of movement variability in time (e.g., time series analysis), similar approaches are missing for discrete, goal-directed movements, such as reaching. Here, we propose canonical correlation analysis as a suitable method to analyze the propagation of within-subject variability across different time points during the execution of discrete movements. While similar analyses have already been applied to discrete movements with only one degree of freedom (DoF; e.g., Pearson's product-moment correlation), canonical correlation analysis allows one to evaluate the coupling of inter-trial variability across different time points along the movement trajectory for multiple-DoF effector systems, such as the arm. The theoretical analysis is illustrated by empirical data from a study on reaching movements under normal and disturbed proprioception. The results show increased movement duration, decreased movement amplitude, as well as altered movement coordination under ischemia, which results in a reduced complexity of movement control. Movement endpoint variability is not increased under ischemia. This suggests that healthy adults are able to immediately and efficiently adjust the control of complex reaching movements to compensate for the loss of proprioceptive information. Further, it is

  18. A spray flamelet/progress variable approach combined with a transported joint PDF model for turbulent spray flames

    Hu, Yong; Olguin, Hernan; Gutheil, Eva

    2017-05-01

    A spray flamelet/progress variable approach is developed for use in spray combustion with partly pre-vaporised liquid fuel, where a laminar spray flamelet library accounts for evaporation within the laminar flame structures. For this purpose, the standard spray flamelet formulation for pure evaporating liquid fuel and oxidiser is extended by a chemical reaction progress variable in both the turbulent spray flame model and the laminar spray flame structures, in order to account for the effect of pre-vaporised liquid fuel for instance through use of a pilot flame. This new approach is combined with a transported joint probability density function (PDF) method for the simulation of a turbulent piloted ethanol/air spray flame, and the extension requires the formulation of a joint three-variate PDF depending on the gas phase mixture fraction, the chemical reaction progress variable, and gas enthalpy. The molecular mixing is modelled with the extended interaction-by-exchange-with-the-mean (IEM) model, where source terms account for spray evaporation and heat exchange due to evaporation as well as the chemical reaction rate for the chemical reaction progress variable. This is the first formulation using a spray flamelet model considering both evaporation and partly pre-vaporised liquid fuel within the laminar spray flamelets. Results with this new formulation show good agreement with the experimental data provided by A.R. Masri, Sydney, Australia. The analysis of the Lagrangian statistics of the gas temperature and the OH mass fraction indicates that partially premixed combustion prevails near the nozzle exit of the spray, whereas further downstream, the non-premixed flame is promoted towards the inner rich-side of the spray jet since the pilot flame heats up the premixed inner spray zone. In summary, the simulation with the new formulation considering the reaction progress variable shows good performance, greatly improving the standard formulation, and it provides new

  19. A cross-scale approach to understand drought-induced variability of sagebrush ecosystem productivity

    Assal, T.; Anderson, P. J.

    2016-12-01

    Sagebrush (Artemisia spp.) mortality has recently been reported in the Upper Green River Basin (Wyoming, USA) of the sagebrush steppe of western North America. Numerous causes have been suggested, but recent drought (2012-13) is the likely mechanism of mortality in this water-limited ecosystem which provides critical habitat for many species of wildlife. An understanding of the variability in patterns of productivity with respect to climate is essential to exploit landscape scale remote sensing for detection of subtle changes associated with mortality in this sparse, uniformly vegetated ecosystem. We used the standardized precipitation index to characterize drought conditions and Moderate Resolution Imaging Spectroradiometer (MODIS) satellite imagery (250-m resolution) to characterize broad characteristics of growing season productivity. We calculated per-pixel growing season anomalies over a 16-year period (2000-2015) to identify the spatial and temporal variability in productivity. Metrics derived from Landsat satellite imagery (30-m resolution) were used to further investigate trends within anomalous areas at local scales. We found evidence to support an initial hypothesis that antecedent winter drought was most important in explaining reduced productivity. The results indicate drought effects were inconsistent over space and time. MODIS derived productivity deviated by more than four standard deviations in heavily impacted areas, but was well within the interannual variability in other areas. Growing season anomalies highlighted dramatic declines in productivity during the 2012 and 2013 growing seasons. However, large negative anomalies persisted in other areas during the 2014 growing season, indicating lag effects of drought. We are further investigating if the reduction in productivity is mediated by local biophysical properties. Our analysis identified spatially explicit patterns of ecosystem properties altered by severe drought which are consistent with

  20. Diet Composition and Variability of Wild Octopus vulgaris and Alloteuthis media (Cephalopoda) Paralarvae: a Metagenomic Approach

    Lorena Olmos-Pérez

    2017-05-01

    The high mortality of cephalopod early stages is the main bottleneck in growing them from paralarvae to adults in culture conditions, probably because of the inadequacy of the diet, which results in malnutrition. Since visual analysis of the digestive tract contents of paralarvae provides little evidence of diet composition, the use of molecular tools, particularly next generation sequencing (NGS) platforms, offers an alternative for understanding prey preferences and nutrient requirements of wild paralarvae. In this work, we aimed to determine the diet of paralarvae of the loliginid squid Alloteuthis media and to enhance the knowledge of the diet of recently hatched Octopus vulgaris paralarvae collected in different areas and seasons in an upwelling area (NW Spain). DNA from the dissected digestive glands of 32 A. media and 64 O. vulgaris paralarvae was amplified with universal primers for the mitochondrial gene COI, and with specific primers targeting the mitochondrial 16S gene of arthropods and the mitochondrial 16S gene of Chordata. Following high-throughput DNA sequencing on the MiSeq platform (Illumina), up to 4,124,464 reads were obtained, and 234,090 prey reads were successfully identified in 96.87% and 81.25% of octopus and squid paralarvae, respectively. Overall, we identified 122 Molecular Operational Taxonomic Units (MOTUs) belonging to several taxa of decapods, copepods, euphausiids, amphipods, echinoderms, molluscs, and hydroids. Redundancy analysis (RDA) showed seasonal and spatial variability in the diet of O. vulgaris and spatial variability in the A. media diet. Generalized Additive Models (GAM) of the most frequently detected prey families of O. vulgaris revealed seasonal variability in the presence of copepods (family Paracalanidae) and ophiuroids (family Euryalidae), spatial variability in the presence of crabs (family Pilumnidae), and a preference in small octopus paralarvae for cladocerans (family Sididae) and ophiuroids. No statistically significant variation in

  1. Systems approach to the design of the CCD sensors and camera electronics for the AIA and HMI instruments on solar dynamics observatory

    Waltham, N.; Beardsley, S.; Clapp, M.; Lang, J.; Jerram, P.; Pool, P.; Auker, G.; Morris, D.; Duncan, D.

    2017-11-01

    Solar Dynamics Observatory (SDO) is imaging the Sun in many wavelengths near-simultaneously and with a resolution ten times higher than average high-definition television. In this paper we describe our innovative systems approach to the design of the CCD cameras for two of SDO's remote sensing instruments, the Atmospheric Imaging Assembly (AIA) and the Helioseismic and Magnetic Imager (HMI). Both instruments share use of a custom-designed 16-million-pixel science-grade CCD and common camera readout electronics. A prime requirement was for the CCD to operate with significantly lower drive voltages than before, motivated by our wish to simplify the design of the camera readout electronics. Here, the challenge lies in the design of circuitry to drive the CCD's highly capacitive electrodes and to digitize its analogue video output signal with low noise and high precision. The challenge is greatly exacerbated when forced to work only with fully space-qualified, radiation-tolerant components. We describe our systems approach to the design of the AIA and HMI CCD and camera electronics, and the engineering solutions that enabled us to comply with both mission and instrument science requirements.

  2. Variability of orogenic magmatism during Mediterranean-style continental collisions : A numerical modelling approach

    Andrić, N.; Vogt, K.; Matenco, L.; Cvetković, V.; Cloetingh, S.; Gerya, T.

    The relationship between magma generation and the tectonic evolution of orogens during subduction and subsequent collision requires self-consistent numerical modelling approaches predicting volumes and compositions of the produced magmatic rocks. Here, we use a 2D magmatic-thermomechanical numerical

  3. A new Approach to Variable-Topology Shape Design Using a Constraint on the Perimeter

    Haber, R.B; Bendsøe, Martin P.; Jog, C.S.

    1996-01-01

    the number of holes in the optimal design and to establish their characteristic length scale. Finite element procedures based on this approach generate practical designs that are convergent with respect to grid refinement. Thus, an arbitrary level of geometric resolution can be achieved, so single...

  4. Upper limit for Poisson variable incorporating systematic uncertainties by Bayesian approach

    Zhu, Yongsheng

    2007-01-01

    To calculate the upper limit for a Poisson observable at a given confidence level, with inclusion of systematic uncertainties in the background expectation and signal efficiency, formulations have been established along the lines of the Bayesian approach. A FORTRAN program, BPULE, has been developed to implement the upper limit calculation.

  5. Straight line fitting and predictions: On a marginal likelihood approach to linear regression and errors-in-variables models

    Christiansen, Bo

    2015-04-01

    Linear regression methods are without doubt the most widely used approaches to describe and predict data in the physical sciences. They are often good first-order approximations and they are in general easier to apply and interpret than more advanced methods. However, even the properties of univariate regression can lead to debate over the appropriateness of various models, as witnessed by the recent discussion about climate reconstruction methods. Before linear regression is applied, important choices have to be made regarding the origins of the noise terms and regarding which of the two variables under consideration should be treated as the independent variable. These decisions are often not easy to make but they may have a considerable impact on the results. We seek to give a unified probabilistic (Bayesian with flat priors) treatment of univariate linear regression and prediction by taking, as a starting point, the general errors-in-variables model (Christiansen, J. Clim., 27, 2014-2031, 2014). Other versions of linear regression can be obtained as limits of this model. We derive the likelihood of the model parameters and predictands of the general errors-in-variables model by marginalizing over the nuisance parameters. The resulting likelihood is relatively simple and easy to analyze and calculate. The well-known unidentifiability of the errors-in-variables model is manifested as the absence of a well-defined maximum in the likelihood. However, this does not mean that probabilistic inference cannot be made; the marginal likelihoods of model parameters and the predictands have, in general, well-defined maxima. We also include a probabilistic version of classical calibration and show how it is related to the errors-in-variables model. The results are illustrated by an example from the coupling between the lower stratosphere and the troposphere in the Northern Hemisphere winter.
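    In its simplest univariate form, the errors-in-variables model referred to here can be written (with notation that is standard but not necessarily the paper's) as

      x_i = \xi_i + \delta_i, \qquad y_i = \alpha + \beta\,\xi_i + \varepsilon_i,
      \qquad \delta_i \sim N(0, \sigma_\delta^2), \quad \varepsilon_i \sim N(0, \sigma_\varepsilon^2),

    where the \xi_i are the unobserved true regressor values. Ordinary regression of y on x is recovered in the limit \sigma_\delta^2 \to 0, regression of x on y in the opposite limit, and intermediate noise-variance ratios \lambda = \sigma_\varepsilon^2 / \sigma_\delta^2 interpolate between them, which is what makes the choice of independent variable consequential.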

  6. An innovative statistical approach for analysing non-continuous variables in environmental monitoring: assessing temporal trends of TBT pollution.

    Santos, José António; Galante-Oliveira, Susana; Barroso, Carlos

    2011-03-01

    The current work presents an innovative statistical approach to model ordinal variables in environmental monitoring studies. An ordinal variable has values that can only be compared as "less", "equal" or "greater", and it is not possible to have information about the size of the difference between two particular values. The ordinal variable examined in this study is the vas deferens sequence (VDS) used in imposex (superimposition of male sexual characters onto prosobranch females) field assessment programmes for monitoring tributyltin (TBT) pollution. The statistical methodology presented here is the ordered logit regression model. It assumes that the VDS is an ordinal variable whose values match up with a process of imposex development that can be considered continuous in both biological and statistical senses and can be described by a latent, non-observable continuous variable. This model was applied to the case study of Nucella lapillus imposex monitoring surveys conducted on the Portuguese coast between 2003 and 2008 to evaluate the temporal evolution of TBT pollution in this country. In order to produce more reliable conclusions, the proposed model includes covariates that may influence the imposex response besides TBT (e.g. the shell size). The model also provides an analysis of the environmental risk associated with TBT pollution by estimating the probability of the occurrence of females with VDS ≥ 2 in each year, according to OSPAR criteria. We consider that the proposed application of this statistical methodology has great potential in environmental monitoring whenever there is the need to model variables that can only be assessed through an ordinal scale of values.
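    The ordered logit model links the observed VDS stage to the latent continuous imposex variable through an ordered set of cutpoints; in a common parameterisation (illustrative notation, with covariates such as shell size collected in x_i),

      P(VDS_i \le j \mid \mathbf{x}_i) = \frac{1}{1 + \exp\!\big(-(\alpha_j - \mathbf{x}_i^{\top}\boldsymbol{\beta})\big)}, \qquad j = 0, 1, \dots, J-1,

    with increasing cutpoints \alpha_0 < \alpha_1 < \dots < \alpha_{J-1}. The OSPAR-relevant risk measure then follows directly as P(VDS \ge 2) = 1 - P(VDS \le 1).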

  7. Late Holocene climate variability in the southwestern Mediterranean region: an integrated marine and terrestrial geochemical approach

    C. Martín-Puertas

    2010-12-01

    A combination of marine (Alboran Sea cores, ODP 976 and TTR 300 G) and terrestrial (Zoñar Lake, Andalucia, Spain) geochemical proxies provides a high-resolution reconstruction of climate variability and human influence in the southwestern Mediterranean region for the last 4000 years at inter-centennial resolution. Proxies respond to changes in precipitation rather than temperature alone. Our combined terrestrial and marine archive documents a succession of dry and wet periods coherent with the North Atlantic climate signal. A dry period occurred prior to 2.7 cal ka BP, synchronously with the global aridity crisis of the third millennium BC, and during the Medieval Climate Anomaly (1.4–0.7 cal ka BP). Wetter conditions prevailed from 2.7 to 1.4 cal ka BP. Hydrological signatures during the Little Ice Age are highly variable but consistent with more humidity than the Medieval Climate Anomaly. Additionally, Pb anomalies in sediments at the end of the Bronze Age suggest anthropogenic pollution earlier than the Roman Empire development in the Iberian Peninsula. The Late Holocene climate evolution of the study area confirms the see-saw pattern between the eastern and western Mediterranean regions and the higher influence of the North Atlantic dynamics in the western Mediterranean.

  8. What variables are important in predicting bovine viral diarrhea virus? A random forest approach.

    Machado, Gustavo; Mendoza, Mariana Recamonde; Corbellini, Luis Gustavo

    2015-07-24

    Bovine viral diarrhea virus (BVDV) causes one of the most economically important diseases in cattle, and the virus is found worldwide. A better understanding of the disease-associated factors is a crucial step towards the definition of strategies for control and eradication. In this study we trained a random forest (RF) prediction model and performed variable importance analysis to identify factors associated with BVDV occurrence. In addition, we assessed the influence of feature selection on RF performance and evaluated its predictive power relative to other popular classifiers and to logistic regression. We found that the RF classification model resulted in an average error rate of 32.03% for the negative class (negative for BVDV) and 36.78% for the positive class (positive for BVDV). The RF model presented an area under the ROC curve of 0.702. Variable importance analysis revealed that important predictors of BVDV occurrence were: a) who inseminates the animals, b) the number of neighboring farms that have cattle, and c) whether rectal palpation is performed routinely. Our results suggest that the use of machine learning algorithms, especially RF, is a promising methodology for the analysis of cross-sectional studies, presenting satisfactory predictive power and the ability to identify predictors that represent potential risk factors for BVDV investigation. We examined classical predictors and found some new and hard-to-control practices that may lead to the spread of this disease within and among farms, mainly regarding poor or neglected reproduction management, which should be considered for disease control and eradication.
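
    A minimal scikit-learn sketch of the workflow described above - train a random forest, report class-wise error rates and AUC, and rank variables by importance - using synthetic stand-in data (the study's herd-level variables are not reproduced here):

        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.metrics import confusion_matrix, roc_auc_score
        from sklearn.model_selection import train_test_split

        # Stand-in data: rows are farms, columns are management variables.
        X, y = make_classification(n_samples=500, n_features=20,
                                   n_informative=5, random_state=0)
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                                  random_state=0)

        rf = RandomForestClassifier(n_estimators=500, random_state=0)
        rf.fit(X_tr, y_tr)

        # Class-wise error rates (as reported in the abstract) and AUC.
        cm = confusion_matrix(y_te, rf.predict(X_te))
        err_neg = cm[0, 1] / cm[0].sum()   # error rate, negative class
        err_pos = cm[1, 0] / cm[1].sum()   # error rate, positive class
        auc = roc_auc_score(y_te, rf.predict_proba(X_te)[:, 1])
        print(f"err-: {err_neg:.2%}  err+: {err_pos:.2%}  AUC: {auc:.3f}")

        # Variable importance: rank predictors by mean decrease in impurity.
        for i in np.argsort(rf.feature_importances_)[::-1][:3]:
            print(f"feature {i}: importance {rf.feature_importances_[i]:.3f}")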

  9. Variable Structure Disturbance Rejection Control for Nonlinear Uncertain Systems with State and Control Delays via Optimal Sliding Mode Surface Approach

    Jing Lei

    2013-01-01

    The paper considers the problem of variable structure control for nonlinear systems with uncertainty and time delays under persistent disturbance by using the optimal sliding mode surface approach. Through functional transformation, the original time-delay system is transformed into a delay-free one. The approximating sequence method is applied to solve the nonlinear optimal sliding mode surface problem, which is reduced to a linear two-point boundary value problem of approximating sequences. The optimal sliding mode surface is obtained from the convergent solutions by solving a Riccati equation, a Sylvester equation, and the state and adjoint vector differential equations of approximating sequences. Then, the variable structure disturbance rejection control is presented by adopting an exponential reaching law, where the state and control memory terms are designed to compensate the state and control delays, a feedforward control term is designed to reject the disturbance, and an adjoint compensator is designed to compensate the effects generated by the nonlinearity and the uncertainty. Furthermore, an observer is constructed to make the feedforward term physically realizable, and thus the observer-based dynamical variable structure disturbance rejection control law is produced. Finally, simulations are demonstrated to verify the effectiveness of the presented controller and the simplicity of the proposed approach.

  10. Radioisotope instruments

    Cameron, J F; Silverleaf, D J

    1971-01-01

    International Series of Monographs in Nuclear Energy, Volume 107: Radioisotope Instruments, Part 1 focuses on the design and applications of instruments based on the radiation released by radioactive substances. The book first offers information on the physical basis of radioisotope instruments; technical and economic advantages of radioisotope instruments; and radiation hazard. The manuscript then discusses commercial radioisotope instruments, including radiation sources and detectors, computing and control units, and measuring heads. The text then describes the applications of radioisotope instruments.

  11. A cognitive approach to social assistance instruments: how public assistance recipients represent and interpret the anti-poverty program Familias en Acción

    Valeria Ayola Betancourt

    2016-06-01

    This article considers the contribution of the cognitive framework of public policy and the instrument approach to understanding poverty regulation in Colombia through its instruments. We analyze the relationship between the ideological frame of the Familias en Acción program and the recipients' construction of representations, meanings and interpretations about it. The article describes how beneficiaries interpret the State intervention, as well as some social effects arising from the implementation of such programs at the local level. For this we use qualitative research techniques, foremost among them the semi-structured interview. Our fieldwork is based on a comparison between the city of Cartagena and the rural zone of San Jacinto.

  12. Efficient Estimation of Spectral Moments and the Polarimetric Variables on Weather Radars, Sonars, Sodars, Acoustic Flow Meters, Lidars, and Similar Active Remote Sensing Instruments

    National Oceanic and Atmospheric Administration, Department of Commerce — A method for estimation of the Doppler spectrum, its moments, and polarimetric variables on pulsed weather radars which uses oversampled echo components at a rate...

  13. The effect of vocal and instrumental music on cardio respiratory variables, energy expenditure and exertion levels during sub maximal treadmill exercise.

    Savitha, D; Sejil, T V; Rao, Shwetha; Roshan, C J; Roshan, C J

    2013-01-01

    The purpose of the study was to investigate the effect of vocal and instrumental music on various physiological parameters during submaximal exercise. Each subject underwent three sessions of an exercise protocol: without music, with vocal music, and with an instrumental version of the same piece of music. The protocol consisted of 10 min of treadmill exercise at 70% HR(max) and 20 min of recovery. Minute-to-minute heart rate, breath-by-breath respiratory parameters, rate of energy expenditure and perceived exertion levels were measured. Music, irrespective of the presence or absence of lyrics, enabled the subjects to exercise at a significantly lower heart rate and oxygen consumption and reduced the metabolic cost and perceived exertion levels of exercise. Music's relaxant effect could probably have increased parasympathetic activation, leading to these effects.

  14. Tracking climate variability in the western Mediterranean during the Late Holocene: a multiproxy approach

    V. Nieto-Moreno

    2011-12-01

    Climate variability in the western Mediterranean is reconstructed for the last 4000 yr using marine sediments recovered in the west Algerian-Balearic Basin, near the Alboran Basin. Fluctuations in chemical and mineralogical sediment composition as well as grain size distribution are linked to fluvial-eolian oscillations, changes in redox conditions and paleocurrent intensity. Multivariate analyses allowed us to characterize three main groups of geochemical and mineralogical proxies determining the sedimentary record of this region. These three statistical groups were applied to reconstruct paleoclimate conditions at high resolution during the Late Holocene. An increase in riverine input (fluvial-derived elements: Rb/Al, Ba/Al, REE/Al, Si/Al, Ti/Al, Mg/Al and K/Al ratios) and a decrease in Saharan eolian input (Zr/Al ratio) depict the Roman Humid Period and the Little Ice Age, while drier environmental conditions are recognized during the Late Bronze Age-Iron Age, the Dark Ages and the Medieval Climate Anomaly. Additionally, faster bottom currents and more energetic hydrodynamic conditions for the former periods are evidenced by enhanced sortable silt (10–63 μm) and quartz content, and by better oxygenated bottom waters, as reflected by decreasing redox-sensitive elements (V/Al, Cr/Al, Ni/Al and Zn/Al ratios). In contrast, opposite paleoceanographic conditions are distinguished during the latter periods, i.e. the Late Bronze Age-Iron Age, the Dark Ages and the Medieval Climate Anomaly. Although no Ba excess was registered, other paleoproductivity indicators (total organic carbon content, Br/Al ratio, and organometallic ligands such as U and Cu) display the highest values during the Roman Humid Period, and together with increasing preservation of organic matter, this period exhibits by far the most intense productivity of the last 4000 yr. Fluctuations in detrital input into the basin as the main process managing deposition, reflected by the

  15. Variability and conservation of structural domains in divide-and-conquer approaches

    Wiegand, Thomas [ETH Zurich, Physical Chemistry (Switzerland); Gardiennet, Carole [CNRS, Université de Lorraine, CRM2, UMR 7036 (France); Cadalbert, Riccardo [ETH Zurich, Physical Chemistry (Switzerland); Lacabanne, Denis; Kunert, Britta; Terradot, Laurent, E-mail: laurent.terradot@ibcp.fr; Böckmann, Anja, E-mail: a.bockmann@ibcp.fr [Université de Lyon, Institut de Biologie et Chimie des Protéines, Bases Moléculaires et Structurales des Systèmes Infectieux, Labex Ecofect, UMR 5086 CNRS (France); Meier, Beat H., E-mail: beme@ethz.ch [ETH Zurich, Physical Chemistry (Switzerland)

    2016-06-15

    The use of protein building blocks for the structure determination of multidomain proteins and protein–protein complexes, also known as the “divide and conquer” approach, is an important strategy for obtaining protein structures. Atomic-resolution X-ray or NMR data of the individual domains are combined with lower-resolution electron microscopy maps or X-ray data of the full-length protein or the protein complex. In doing so, it is often assumed that the individual domain structures remain invariant in the context of the superstructure. In this work, we show the potential and limitations of NMR for validating this approach, using the example of the dodecameric DnaB helicase from Helicobacter pylori. We investigate how sequentially assigned spectra, as well as unassigned spectral fingerprints, can be used to indicate the conservation of individual domains and also to highlight conformational differences.

  16. Heart Rate Variability (HRV) biofeedback: A new training approach for operator’s performance enhancement

    Auditya Purwandini Sutarto; Muhammad Nubli Abdul Wahab; Nora Mat Zin

    2010-01-01

    The widespread implementation of advanced and complex systems relies predominantly on operators' cognitive functions and reduces the importance of human manual control. On the other hand, most operators perform their cognitive functions below their peak cognitive capacity level due to fatigue, stress, and boredom. Thus, there is a need to improve their cognitive functions during work. The goal of this paper is to present a psychophysiology training approach derived from cardiovascular response ...

  17. Active queue management controller design for TCP communication networks: Variable structure control approach

    Chen, C.-K.; Liao, T.-L.; Yan, J.-J.

    2009-01-01

    On the basis of variable structure control (VSC), an active queue management (AQM) controller is presented for a class of TCP communication networks. In the TCP/IP networks, the packet drop probability is limited between 0 and 1. Therefore, we modeled TCP/AQM as a rate-based non-linear system with a saturated input. The objective of the VSC-based AQM controller is to achieve the desired queue size and to guarantee the asymptotic stability of the closed-loop TCP non-linear system with saturated input. The performance and effectiveness of the proposed control law are then validated for different network scenarios through numerical simulations in both MATLAB and Network Simulator-2 (NS-2). Both sets of simulation results have confirmed that the proposed scheme outperforms other AQM schemes.
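
    To make the control idea concrete, here is a deliberately simplified sketch of a variable structure controller with a saturated drop probability on a toy rate-based queue (a stand-in, not the paper's full TCP/AQM fluid model; all constants - lam, C, q_ref, eps, eta - are invented):

        import numpy as np

        # Toy rate-based queue: qdot = (1 - p) * lam - C, with the drop
        # probability p saturated in [0, 1] as in TCP/AQM. A sliding surface
        # s = q - q_ref with an exponential reaching law
        # sdot = -eps*s - eta*sign(s) yields the control law below.
        lam, C, q_ref = 1200.0, 1000.0, 200.0  # arrival rate, capacity, target
        eps, eta, dt = 2.0, 5.0, 1e-3

        q = 50.0                               # initial queue length (packets)
        for step in range(int(5.0 / dt)):      # simulate 5 seconds
            s = q - q_ref
            # Solve (1 - p)*lam - C = -eps*s - eta*sign(s) for p, then saturate.
            p = 1.0 - (C - eps * s - eta * np.sign(s)) / lam
            p = min(max(p, 0.0), 1.0)
            q = max(q + ((1.0 - p) * lam - C) * dt, 0.0)

        print(f"final queue: {q:.1f} packets (target {q_ref})")  # near q_ref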

  19. A coupled approach for the three-dimensional simulation of pipe leakage in variably saturated soil

    Peche, Aaron; Graf, Thomas; Fuchs, Lothar; Neuweiler, Insa

    2017-12-01

    In urban water pipe networks, pipe leakage may lead to subsurface contamination or to reduced waste water treatment efficiency. The quantification of pipe leakage is challenging due to inaccessibility and unknown hydraulic properties of the soil. A novel physically-based model for three-dimensional numerical simulation of pipe leakage in variably saturated soil is presented. We describe the newly implemented coupling between the pipe flow simulator HYSTEM-EXTRAN and the groundwater flow simulator OpenGeoSys and its validation. We further describe a novel upscaling of leakage using transfer functions derived from numerical simulations. This upscaling enables the simulation of numerous pipe defects with the benefit of reduced computation times. Finally, we investigate the response of leakage to different time-dependent pipe flow events and conclude that larger pipe flow volume and duration lead to larger leakage while the peak position in time has a small effect on leakage.
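
    The upscaling idea can be sketched in a few lines: once a detailed 3-D simulation has characterized one pipe defect, its leakage response to a unit pipe-flow pulse can be stored as a transfer function h(t), and the leakage for any other pipe-flow event q(t) follows by convolution, with no new 3-D run. Both h and q below are hypothetical illustrations, not the paper's calibrated functions:

        import numpy as np

        dt = 60.0                                  # time step, s
        t = np.arange(0, 6 * 3600, dt)             # six hours
        h = 1e-4 * np.exp(-t / 1800.0)             # defect unit response (made up)
        q = np.where((t > 3600) & (t < 2 * 3600), 5.0, 0.5)  # pipe-flow event, L/s

        leak = np.convolve(q, h)[: t.size] * dt    # upscaled leakage rate, L/s
        print(f"total leaked volume: {np.trapz(leak, dx=dt):.1f} L")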

  20. Determination of acid ionization constants for weak acids by osmometry and the instrumental analysis self-evaluation feedback approach to student preparation of solutions

    Kakolesha, Nyanguila

    One focus of this work was to develop an alternative to conductometry for determining acid ionization constants. Computer-controlled osmometry is one of the emerging analytical tools in industrial research and clinical laboratories, and it is slowly finding its way into chemistry laboratories. The instrument's microprocessor control ensures shortened data collection time, repeatability, accuracy, and automatic calibration. The equilibrium constants of acetic acid, chloroacetic acid, bromoacetic acid, cyanoacetic acid, and iodoacetic acid were measured using osmometry, and their values were compared with the existing literature values obtained, usually, from conductometric measurements. Ionization constants determined by osmometry for the moderately strong weak acids were in reasonably good agreement with literature values. The results showed that two factors, the ionic strength and the osmotic coefficient, exert opposite effects in solutions of such weak acids. Another focus of the work was analytical chemistry students' solution-preparation skills. The prevailing teacher-structured experiments leave little room for students' ingenuity in quantitative volumetric analysis. The purpose of this part of the study was to improve students' skills in making solutions using instrument feedback in a constructivist learning model. After making some solutions by weighing and dissolving solutes or by serial dilution, students used the spectrophotometer and the osmometer to compare their solutions with standard solutions. Students perceived the instrument feedback as a nonthreatening approach to monitoring the development of their skill levels and liked to clarify their understanding through interacting with an instructor-observer. An assessment of the instrument feedback and the constructivist model indicated that students would assume responsibility for their own learning if given the opportunity. This study involved 167 students enrolled in Quantitative Chemical
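
    The core calculation is simple enough to show as a worked example. Assuming ideal behavior (i.e. ignoring the ionic-strength and osmotic-coefficient corrections the study discusses), the osmometer reading for a weak acid HA of nominal molality C is C(1 + alpha), from which Ka follows; all numbers below are hypothetical:

        # Recover Ka of a weak acid HA from an osmometer reading: each
        # ionized HA contributes two particles, so osmolality = C*(1 + alpha).
        C = 0.0100       # nominal acid molality, mol/kg
        osm = 0.0131     # measured osmolality, osmol/kg (hypothetical reading)

        i = osm / C      # van't Hoff factor
        alpha = i - 1.0  # degree of ionization
        Ka = C * alpha**2 / (1.0 - alpha)
        print(f"alpha = {alpha:.3f}, Ka = {Ka:.2e}")  # ~1.4e-3 in this example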

  1. Intraindividual variability in inhibitory function in adults with ADHD--an ex-Gaussian approach.

    Dennis Gmehlin

    OBJECTIVE: Attention deficit disorder (ADHD) is commonly associated with inhibitory dysfunction contributing to typical behavioral symptoms like impulsivity or hyperactivity. However, some studies analyzing intraindividual variability (IIV) of reaction times in children with ADHD (cADHD) question a predominance of inhibitory deficits. IIV is a measure of the stability of information processing and provides evidence that longer reaction times (RT) in inhibitory tasks in cADHD are due to only a few prolonged responses, which may indicate deficits in sustained attention rather than inhibitory dysfunction. We wanted to find out whether a slowing in inhibitory functioning in adults with ADHD (aADHD) is due to isolated slow responses. METHODS: Computing classical RT measures (mean RT, SD), ex-Gaussian parameters of IIV (which allow a better separation of reaction time (mu), variability (sigma), and abnormally slow responses (tau) than classical measures), as well as errors of omission and commission, we examined response inhibition in a well-established Go/Nogo task in a sample of unmedicated aADHD subjects and healthy controls matched for age, gender and education. RESULTS: We did not find higher numbers of commission errors in aADHD, while the number of omissions was significantly increased compared with controls. In contrast to increased mean RT, the distributional parameter mu did not document a significant slowing in aADHD. However, subjects with aADHD were characterized by increased IIV throughout the entire RT distribution, as indicated by the parameters sigma and tau as well as the SD of reaction time. Moreover, we found a significant correlation between tau and the number of omission errors. CONCLUSIONS: Our findings question a primacy of inhibitory deficits in aADHD and provide evidence for attentional dysfunction. The present findings may have theoretical implications for etiological models of ADHD as well as more practical implications for
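
    For readers unfamiliar with the ex-Gaussian decomposition, the sketch below simulates reaction times as a Gaussian component (mu, sigma) plus an exponential tail (tau) and recovers the three parameters with scipy's exponnorm distribution (simulated data only; parameter values are invented):

        import numpy as np
        from scipy.stats import exponnorm

        rng = np.random.default_rng(2)

        # Simulated reaction times (s): Gaussian body plus an exponential
        # tail that captures sporadic, abnormally slow responses.
        mu, sigma, tau = 0.40, 0.05, 0.15
        rt = rng.normal(mu, sigma, 1000) + rng.exponential(tau, 1000)

        # scipy parameterizes the ex-Gaussian as exponnorm(K, loc, scale)
        # with loc = mu, scale = sigma and K = tau / sigma.
        K_hat, mu_hat, sigma_hat = exponnorm.fit(rt)
        tau_hat = K_hat * sigma_hat
        print(f"mu={mu_hat:.3f} sigma={sigma_hat:.3f} tau={tau_hat:.3f}")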

  2. Correlation or Limits of Agreement? Applying the Bland-Altman Approach to the Comparison of Cognitive Screening Instruments.

    Larner, A J

    2016-01-01

    Calculation of correlation coefficients is often undertaken as a way of comparing different cognitive screening instruments (CSIs). However, test scores may correlate but not agree, and high correlation may mask lack of agreement between scores. The aim of this study was to use the methodology of Bland and Altman to calculate limits of agreement between the scores of selected CSIs and to contrast the findings with Pearson's product moment correlation coefficients between the test scores of the same instruments. Datasets from three pragmatic diagnostic accuracy studies which examined the Mini-Mental State Examination (MMSE) vs. the Montreal Cognitive Assessment (MoCA), the MMSE vs. the Mini-Addenbrooke's Cognitive Examination (M-ACE), and the M-ACE vs. the MoCA were analysed to calculate correlation coefficients and limits of agreement between test scores. Although test scores were highly correlated (all >0.8), the calculated limits of agreement were broad (all >10 points), and in one case (MMSE vs. M-ACE) exceeded 15 points. Correlation is not agreement. Highly correlated test scores may conceal broad limits of agreement, consistent with the different emphases of different tests with respect to the cognitive domains examined. Routine incorporation of limits of agreement into diagnostic accuracy studies which compare different tests merits consideration, to enable clinicians to judge whether or not their agreement is close. © 2016 S. Karger AG, Basel.
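
    The Bland-Altman calculation itself is a few lines. The sketch below, with invented paired scores, shows how two instruments can correlate highly yet have wide limits of agreement - the paper's central point:

        import numpy as np

        def limits_of_agreement(a, b):
            """Bland-Altman: bias and 95% limits of agreement of two tests."""
            d = np.asarray(a, float) - np.asarray(b, float)
            bias, sd = d.mean(), d.std(ddof=1)
            return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

        # Hypothetical paired scores on two cognitive screening instruments.
        rng = np.random.default_rng(3)
        true = rng.uniform(10, 30, 80)
        mmse = true + rng.normal(0, 2.5, 80)
        moca = true - 2 + rng.normal(0, 2.5, 80)   # MoCA systematically harder

        print("r =", np.corrcoef(mmse, moca)[0, 1])        # high correlation...
        bias, loa = limits_of_agreement(mmse, moca)
        print(f"bias {bias:.1f}, LoA {loa[0]:.1f} to {loa[1]:.1f}")  # ...broad LoA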

  3. On Darboux's approach to R-separability of variables. Classification of conformally flat 4-dimensional binary metrics

    Szereszewski, A; Sym, A

    2015-01-01

    The standard method of separation of variables in PDEs, called the Stäckel–Robertson–Eisenhart (SRE) approach, originated in the papers by Robertson (1928 Math. Ann. 98 749–52) and Eisenhart (1934 Ann. Math. 35 284–305) on separability of variables in the Schrödinger equation defined on a pseudo-Riemannian space equipped with orthogonal coordinates, which in turn were based on the purely classical mechanics results by Paul Stäckel (1891, Habilitation Thesis, Halle). These still fundamental results have been further extended in diverse directions by e.g. Havas (1975 J. Math. Phys. 16 1461–8; J. Math. Phys. 16 2476–89) or Koornwinder (1980 Lecture Notes in Mathematics 810 (Berlin: Springer) pp 240–63). The involved separability is always ordinary (factor R = 1) and regular (maximum number of independent parameters in separation equations). A different approach to separation of variables was initiated by Gaston Darboux (1878 Ann. Sci. E.N.S. 7 275–348), which has been almost completely forgotten in today's research on the subject. Darboux's paper was devoted to the so-called R-separability of variables in the standard Laplace equation. At the outset he did not make any specific assumption about the separation equations (this is in sharp contrast to the SRE approach). After impressive calculations Darboux obtained a complete solution of the problem. He found not only the eleven cases of ordinary separability (Eisenhart 1934 Ann. Math. 35 284–305) but also Darboux–Moutard–cyclidic metrics (Bôcher 1894 Ueber die Reihenentwickelungen der Potentialtheorie (Leipzig: Teubner)) and non-regularly separable Dupin-cyclidic metrics as well. In our previous paper, Darboux's approach was extended to the case of the stationary Schrödinger equation on Riemannian spaces admitting orthogonal coordinates. In particular, the class of isothermic metrics was defined (isothermicity of the metric is a necessary condition for its R-separability). An important sub
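
    For reference, the notion invoked above (the standard definition, added here for clarity): a solution is called R-separable when it factorizes as

        \[
        u(x^1,\dots,x^n) \;=\; R(x^1,\dots,x^n)\,\prod_{i=1}^{n}\varphi_i(x^i),
        \]

    where each factor \varphi_i depends on a single coordinate only; ordinary separability is the special case of a trivial modulation factor, R ≡ 1 (the "factor R = 1" mentioned above).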

  4. The long-term variability of cosmic ray protons in the heliosphere: A modeling approach

    M.S. Potgieter

    2013-05-01

    Galactic cosmic rays are charged particles created in our galaxy and beyond. They propagate through interstellar space to eventually reach the heliosphere and Earth. Their transport in the heliosphere is subjected to four modulation processes: diffusion, convection, adiabatic energy changes and particle drifts. Time-dependent changes, caused by solar activity which varies from minimum to maximum every ∼11 years, are reflected in cosmic ray observations at and near Earth and along spacecraft trajectories. Using a time-dependent compound numerical model, the time variation of cosmic ray protons in the heliosphere is studied. It is shown that the modeling approach is successful and can be used to study long-term modulation cycles.
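
    The four modulation processes listed above are the four terms of Parker's (1965) transport equation, which numerical modulation models of this kind typically solve (standard form, added here for reference):

        \[
        \frac{\partial f}{\partial t}
        \;=\; \nabla\cdot\big(\mathbf{K}_s\cdot\nabla f\big)
        \;-\; \big(\mathbf{V}_{sw}+\langle\mathbf{v}_d\rangle\big)\cdot\nabla f
        \;+\; \frac{1}{3}\big(\nabla\cdot\mathbf{V}_{sw}\big)\,
              \frac{\partial f}{\partial \ln p},
        \]

    where f is the omnidirectional cosmic-ray distribution function, p the momentum, K_s the diffusion tensor, V_sw the solar wind velocity and <v_d> the averaged drift velocity; the terms describe diffusion, convection together with particle drifts, and adiabatic energy changes, respectively.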

  5. Successful emotion regulation is predicted by amygdala activity and aspects of personality: A latent variable approach.

    Morawetz, Carmen; Alexandrowicz, Rainer W; Heekeren, Hauke R

    2017-04-01

    The experience of emotions and their cognitive control are based upon neural responses in prefrontal and subcortical regions and could be affected by personality and temperamental traits. Previous studies established an association between activity in reappraisal-related brain regions (e.g., inferior frontal gyrus and amygdala) and emotion regulation success. Given these relationships, we aimed to further elucidate how individual differences in emotion regulation skills relate to brain activity within the emotion regulation network on the one hand, and personality/temperamental traits on the other. We directly examined the relationship between personality and temperamental traits, emotion regulation success and its underlying neuronal network in a large sample (N = 82) using an explicit emotion regulation task and functional MRI (fMRI). We applied a multimethodological analysis approach, combining standard activation-based analyses with structural equation modeling. First, we found that successful downregulation is predicted by activity in key regions related to emotion processing. Second, the individual ability to successfully upregulate emotions is strongly associated with the ability to identify feelings, conscientiousness, and neuroticism. Third, the successful downregulation of emotion is modulated by openness to experience and habitual use of reappraisal. Fourth, the ability to regulate emotions is best predicted by a combination of brain activity and personality as well as temperamental traits. Using a multimethodological analysis approach, we provide a first step toward a causal model of individual differences in emotion regulation ability by linking biological systems underlying emotion regulation with descriptive constructs. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  6. An Object-Based Approach to Evaluation of Climate Variability Projections and Predictions

    Ammann, C. M.; Brown, B.; Kalb, C. P.; Bullock, R.

    2017-12-01

    Evaluations of the performance of earth system model predictions and projections are of critical importance to enhance usefulness of these products. Such evaluations need to address specific concerns depending on the system and decisions of interest; hence, evaluation tools must be tailored to inform about specific issues. Traditional approaches that summarize grid-based comparisons of analyses and models, or between current and future climate, often do not reveal important information about the models' performance (e.g., spatial or temporal displacements; the reason behind a poor score) and are unable to accommodate these specific information needs. For example, summary statistics such as the correlation coefficient or the mean-squared error provide minimal information to developers, users, and decision makers regarding what is "right" and "wrong" with a model. New spatial and temporal-spatial object-based tools from the field of weather forecast verification (where comparisons typically focus on much finer temporal and spatial scales) have been adapted to more completely answer some of the important earth system model evaluation questions. In particular, the Method for Object-based Diagnostic Evaluation (MODE) tool and its temporal (three-dimensional) extension (MODE-TD) have been adapted for these evaluations. More specifically, these tools can be used to address spatial and temporal displacements in projections of El Nino-related precipitation and/or temperature anomalies, ITCZ-associated precipitation areas, atmospheric rivers, seasonal sea-ice extent, and other features of interest. Examples of several applications of these tools in a climate context will be presented, using output of the CESM large ensemble. In general, these tools provide diagnostic information about model performance - accounting for spatial, temporal, and intensity differences - that cannot be achieved using traditional (scalar) model comparison approaches. Thus, they can provide more

  7. A simple approach to ignoring irrelevant variables by population decoding based on multisensory neurons

    Kim, HyungGoo R.; Pitkow, Xaq; Angelaki, Dora E.

    2016-01-01

    Sensory input reflects events that occur in the environment, but multiple events may be confounded in sensory signals. For example, under many natural viewing conditions, retinal image motion reflects some combination of self-motion and movement of objects in the world. To estimate one stimulus event and ignore others, the brain can perform marginalization operations, but the neural bases of these operations are poorly understood. Using computational modeling, we examine how multisensory signals may be processed to estimate the direction of self-motion (i.e., heading) and to marginalize out effects of object motion. Multisensory neurons represent heading based on both visual and vestibular inputs and come in two basic types: “congruent” and “opposite” cells. Congruent cells have matched heading tuning for visual and vestibular cues and have been linked to perceptual benefits of cue integration during heading discrimination. Opposite cells have mismatched visual and vestibular heading preferences and are ill-suited for cue integration. We show that decoding a mixed population of congruent and opposite cells substantially reduces errors in heading estimation caused by object motion. In addition, we present a general formulation of an optimal linear decoding scheme that approximates marginalization and can be implemented biologically by simple reinforcement learning mechanisms. We also show that neural response correlations induced by task-irrelevant variables may greatly exceed intrinsic noise correlations. Overall, our findings suggest a general computational strategy by which neurons with mismatched tuning for two different sensory cues may be decoded to perform marginalization operations that dissociate possible causes of sensory inputs. PMID:27334948

  8. Risk implications of renewable support instruments: Comparative analysis of feed-in tariffs and premiums using a mean–variance approach

    Kitzing, Lena

    2014-01-01

    Different support instruments for renewable energy expose investors differently to market risks. This has implications on the attractiveness of investment. We use mean–variance portfolio analysis to identify the risk implications of two support instruments: feed-in tariffs and feed-in premiums. Using cash flow analysis, Monte Carlo simulations and mean–variance analysis, we quantify risk-return relationships for an exemplary offshore wind park in a simplified setting. We show that feed-in tariffs systematically require lower direct support levels than feed-in premiums while providing the same attractiveness for investment, because they expose investors to less market risk. These risk implications should be considered when designing policy schemes. - Highlights: • Mean–variance portfolio approach to analyse risk implications of policy instruments. • We show that feed-in tariffs require lower support levels than feed-in premiums. • This systematic effect stems from the lower exposure of investors to market risk. • We created a stochastic model for an exemplary offshore wind park in West Denmark. • We quantify risk-return, Sharpe Ratios and differences in required support levels
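
    A minimal Monte Carlo illustration of the mechanism described above (all numbers are illustrative, not the paper's West Denmark case study): the tariff fixes the revenue stream, while the premium adds market price risk, so offering investors the same attractiveness requires a higher expected support level under the premium:

        import numpy as np

        rng = np.random.default_rng(4)

        n, energy = 10_000, 400_000        # simulations, MWh/year
        price = rng.normal(50, 15, n)      # uncertain market price, EUR/MWh
        tariff, premium = 70.0, 22.0       # support levels, EUR/MWh
        cost = 26_000_000                  # annualized cost proxy, EUR

        rev_fit = np.full(n, tariff * energy) - cost   # tariff: no price risk
        rev_fip = (price + premium) * energy - cost    # premium: price risk

        for name, r in [("FIT", rev_fit), ("FIP", rev_fip)]:
            sharpe = r.mean() / r.std() if r.std() > 0 else float("inf")
            print(f"{name}: mean {r.mean():.2e}  sd {r.std():.2e}"
                  f"  Sharpe {sharpe:.2f}")
        # With premium = tariff - E[price] the mean revenues match, but the
        # FIP variance is strictly larger; compensating for that risk is why
        # premiums systematically require higher direct support levels.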

  9. Instrumental interaction

    Luciani , Annie

    2007-01-01

    The expression instrumental interaction has been introduced by Claude Cadoz to identify a human-object interaction during which a human manipulates a physical object - an instrument - in order to perform a manual task. Classical examples of instrumental interaction are all the professional manual tasks: playing violin, cutting fabrics by hand, moulding a paste, etc.... Instrumental interaction differs from other types of interaction (called symbolic or iconic interactio...

  10. Gravity wave control on ESF day-to-day variability: An empirical approach

    Aswathy, R. P.; Manju, G.

    2017-06-01

    irregularities lie below and above the curve. The model is validated with data from the years 2001 (high solar activity), 2004 (moderate solar activity), and 1995 (low solar activity), which were not used in the model development. At present the model is developed for the autumnal equinox season, but model development for the other seasons will be undertaken in future work so that seasonal variability is also incorporated. The model thus holds the potential to be developed into a full-fledged model that can predict the occurrence of nocturnal ionospheric irregularities. Globally, concerted efforts are underway to predict these ionospheric irregularities; hence, this study is extremely important from the point of view of predicting communication and navigation outages.

  11. Combination of individual tree detection and area-based approach in imputation of forest variables using airborne laser data

    Vastaranta, Mikko; Kankare, Ville; Holopainen, Markus; Yu, Xiaowei; Hyyppä, Juha; Hyyppä, Hannu

    2012-01-01

    The two main approaches to deriving forest variables from laser-scanning data are the statistical area-based approach (ABA) and individual tree detection (ITD). With ITD it is feasible to acquire single-tree information, as in field measurements. Here, ITD was used for measuring training data for the ABA. In addition to automatic ITD (ITD-auto), we tested a combination of ITD-auto and visual interpretation (ITD-visual). ITD-visual had two stages: in the first, ITD-auto was carried out, and in the second, the results of ITD-auto were visually corrected by interpreting three-dimensional laser point clouds. The field data comprised 509 circular plots (r = 10 m) that were divided equally for testing and training. ITD-derived forest variables were used for training the ABA, and the accuracies of the k-most similar neighbor (k-MSN) imputations were evaluated and compared with the ABA trained with traditional measurements. The root-mean-squared error (RMSE) in the mean volume was 24.8%, 25.9%, and 27.2% with the ABA trained with field measurements, ITD-auto, and ITD-visual, respectively. When ITD methods were applied in acquiring training data, the mean volume, basal area, and basal area-weighted mean diameter were underestimated in the ABA by 2.7-9.2%. This project constituted a pilot study for using ITD measurements as training data for the ABA. Further studies are needed to reduce the bias and to determine the accuracy obtained in imputation of species-specific variables. The method could be applied in areas with sparse road networks or when the costs of fieldwork must be minimized.

  12. PG-Metrics: A chemometric-based approach for classifying bacterial peptidoglycan data sets and uncovering their subjacent chemical variability.

    Keshav Kumar

    Bacterial cells are protected from osmotic and environmental stresses by an exoskeleton-like polymeric structure called peptidoglycan (PG) or murein sacculus. This structure is fundamental for bacterial viability, and thus the mechanisms underlying cell wall assembly, and how it is modulated, serve as targets for many of our most successful antibiotics. Therefore, it is now more important than ever to understand the genetics and structural chemistry of bacterial cell walls in order to find new and effective methods of blocking them for the treatment of disease. In the last decades, liquid chromatography and mass spectrometry have been demonstrated to provide the resolution and sensitivity required to characterize the fine chemical structure of PG. However, the large volumes of data these instruments can produce today are difficult to handle without a proper data analysis workflow. Here, we present PG-metrics, a chemometric-based pipeline that allows fast and easy classification of bacteria according to their muropeptide chromatographic profiles and identification of the subjacent PG chemical variability between e.g. bacterial species, growth conditions and mutant libraries. The pipeline is successfully validated here using PG samples from different bacterial species and mutants in cell wall proteins. The obtained results clearly demonstrate that the PG-metrics pipeline is a valuable bioanalytical tool that can lead us to cell wall classification and biomarker discovery.

  13. Interpolation Approaches for Characterizing Spatial Variability of Soil Properties in Tuz Lake Basin of Turkey

    Gorji, Taha; Sertel, Elif; Tanik, Aysegul

    2017-12-01

    Soil management is an essential concern in protecting soil properties, enhancing appropriate soil quality for plant growth and agricultural productivity, and preventing soil erosion. Soil scientists and decision makers require accurate and well-distributed, spatially continuous soil data across a region for risk assessment and for effectively monitoring and managing soils. Recently, spatial interpolation approaches have been utilized in various disciplines, including soil science, for analysing, predicting and mapping the distribution and surface modelling of environmental factors such as soil properties. The study area selected in this research is the Tuz Lake Basin in Turkey, which bears ecological and economic importance. Fertile soil plays a significant role in agricultural activities, one of the main industries with great impact on the economy of the region. Loss of trees and bushes due to intense agricultural activities in some parts of the basin has led to soil erosion. Besides, soil salinization due to both human-induced activities and natural factors has worsened conditions for agricultural land development. This study aims to compare the capability of Local Polynomial Interpolation (LPI) and Radial Basis Functions (RBF) as two interpolation methods for mapping the spatial pattern of soil properties including organic matter, phosphorus, lime and boron. Both the LPI and RBF methods demonstrated promising results for predicting lime, organic matter, phosphorus and boron. Soil samples collected in the field were used for interpolation analysis, in which approximately 80% of the data was used for interpolation modelling and the remainder for validation of the predicted results. The relationship between validation points and their corresponding estimated values at the same locations is examined by conducting linear regression analysis. Eight prediction maps generated from two different interpolation methods for soil organic matter, phosphorus, lime and boron parameters
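
    The train/validate workflow described above can be sketched with scipy's RBF interpolator on synthetic stand-in samples (coordinates, property values and the kernel choice are all invented; the study's field data are not reproduced):

        import numpy as np
        from scipy.interpolate import RBFInterpolator
        from scipy.stats import linregress

        rng = np.random.default_rng(5)

        # Synthetic stand-in for field samples: coordinates (km) and a smooth
        # soil property (e.g. organic matter, %) plus sampling noise.
        xy = rng.uniform(0, 50, (200, 2))
        om = 2 + np.sin(xy[:, 0] / 8) + 0.05 * xy[:, 1] + rng.normal(0, 0.1, 200)

        # ~80% of samples for interpolation modelling, the rest for
        # validation, mirroring the split described in the abstract.
        idx = rng.permutation(200)
        train, test = idx[:160], idx[160:]

        rbf = RBFInterpolator(xy[train], om[train], kernel="thin_plate_spline")
        pred = rbf(xy[test])

        # Validate: regress observed values on predictions at held-out points.
        fit = linregress(pred, om[test])
        print(f"slope {fit.slope:.2f}, R^2 {fit.rvalue**2:.2f}")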

  14. Application of a computationally efficient geostatistical approach to characterizing variably spaced water-table data

    Quinn, J.J.

    1996-01-01

    Geostatistical analysis of hydraulic head data is useful in producing unbiased contour plots of head estimates and relative errors. However, at most sites being characterized, monitoring wells are generally present at different densities, with clusters of wells in some areas and few wells elsewhere. The problem that arises when kriging data at different densities is in achieving adequate resolution of the grid while maintaining computational efficiency and working within software limitations. For the site considered, 113 data points were available over a 14-mi² study area, including 57 monitoring wells within an area of concern of 1.5 mi². Variogram analyses of the data indicate a linear model with a negligible nugget effect. The geostatistical package used in the study allows a maximum grid of 100 by 100 cells. Two-dimensional kriging was performed for the entire study area with a 500-ft grid spacing, while the smaller zone was modeled separately with a 100-ft spacing. In this manner, grid cells for the dense area and the sparse area remained small relative to the well separation distances, and the maximum dimensions of the program were not exceeded. The spatial head results for the detailed zone were then nested into the regional output by use of a graphical, object-oriented database that performed the contouring of the geostatistical output. This study benefitted from the two-scale approach and from very fine geostatistical grid spacings relative to typical data separation distances. The combining of the sparse, regional results with those from the finer-resolution area of concern yielded contours that honored the actual data at every measurement location. The method applied in this study can also be used to generate reproducible, unbiased representations of other types of spatial data.
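
    A minimal numpy implementation of ordinary kriging with the linear, negligible-nugget variogram reported above, run on two nested grid resolutions in the spirit of the two-scale approach (well coordinates, heads, and the grid extents are illustrative, and this is not the study's geostatistical package):

        import numpy as np

        def ordinary_kriging(xy, z, grid, slope=1.0):
            """Ordinary kriging with a linear variogram gamma(h) = slope*h
            and zero nugget; weights sum to 1 via a Lagrange multiplier."""
            n = len(z)
            dists = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
            A = np.ones((n + 1, n + 1))
            A[:n, :n] = slope * dists
            A[n, n] = 0.0                      # Lagrange row/column
            est = np.empty(len(grid))
            for k, g in enumerate(grid):
                b = np.ones(n + 1)
                b[:n] = slope * np.linalg.norm(xy - g, axis=1)
                w = np.linalg.solve(A, b)
                est[k] = w[:n] @ z             # exact at the data points
            return est

        rng = np.random.default_rng(6)
        wells = rng.uniform(0, 20000, (113, 2))            # ft, sparse set
        heads = 600 - 0.002 * wells[:, 0] + rng.normal(0, 1, 113)

        # Two-scale approach: coarse 500-ft grid regionally, fine 100-ft grid
        # over the densely monitored area of concern.
        coarse = np.array([(x, y) for x in np.arange(0, 20000, 500)
                           for y in np.arange(0, 20000, 500)])
        fine = np.array([(x, y) for x in np.arange(8000, 10000, 100)
                         for y in np.arange(8000, 10000, 100)])
        z_coarse = ordinary_kriging(wells, heads, coarse)
        z_fine = ordinary_kriging(wells, heads, fine)  # nested into the regional map
        print(z_coarse.shape, z_fine.shape)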

  15. A volatolomic approach for studying plant variability: the case of selected Helichrysum species (Asteraceae).

    Giuliani, Claudia; Lazzaro, Lorenzo; Calamassi, Roberto; Calamai, Luca; Romoli, Riccardo; Fico, Gelsomina; Foggi, Bruno; Mariotti Lippi, Marta

    2016-10-01

    between this phytochemical approach and the traditional morphometrical analysis in studying the Helichrysum populations supports the validity of the VOC profile in solving taxonomic problems. Copyright © 2016 Elsevier Ltd. All rights reserved.

  16. A network-based approach for semi-quantitative knowledge mining and its application to yield variability

    Schauberger, Bernhard; Rolinski, Susanne; Müller, Christoph

    2016-12-01

    Variability of crop yields is detrimental for food security. Under climate change its amplitude is likely to increase; thus it is essential to understand the underlying causes and mechanisms. Crop models are the primary tool to project future changes in crop yields under climate change. A systematic overview of drivers and mechanisms of crop yield variability (YV) can thus inform crop model development and facilitate improved understanding of climate change impacts on crop yields. Yet there is a vast body of literature on crop physiology and YV, which makes prioritization of mechanisms for implementation in models challenging. Therefore this paper takes a novel approach to systematically mine and organize existing knowledge from the literature. The aim is to identify important mechanisms lacking in models, which can help to set priorities in model improvement. We structure knowledge from the literature in a semi-quantitative network. This network consists of complex interactions between growing conditions, plant physiology and crop yield. We utilize the resulting network structure to assign relative importance to causes of YV and related plant physiological processes. As expected, our findings confirm existing knowledge, in particular on the dominant role of temperature and precipitation, but also highlight other important drivers of YV. More importantly, our method allows for identifying the relevant physiological processes that transmit variability in growing conditions to variability in yield. We can identify explicit targets for the improvement of crop models. The network can additionally guide model development by outlining complex interactions between processes and by easily retrieving quantitative information for each of the 350 interactions. We show the validity of our network method as a structured, consistent and scalable dictionary of literature. The method can easily be applied to many other research fields.
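
    A toy illustration of such a semi-quantitative network with networkx (the nodes, edge weights and the path-based importance proxy below are invented for illustration and are not the paper's 350-interaction network or its scoring scheme):

        import networkx as nx

        # Hypothetical miniature: weighted edges encode the strength of
        # "growing condition -> physiological process -> yield" links.
        G = nx.DiGraph()
        G.add_weighted_edges_from([
            ("temperature", "photosynthesis", 0.9),
            ("temperature", "grain_abortion", 0.7),
            ("precipitation", "water_uptake", 0.8),
            ("water_uptake", "photosynthesis", 0.6),
            ("photosynthesis", "yield", 1.0),
            ("grain_abortion", "yield", 0.8),
        ])

        # Relative importance of a driver: total influence transmitted toward
        # "yield", scored here over all simple paths with the weakest edge of
        # each path as a crude bottleneck proxy.
        for driver in ("temperature", "precipitation"):
            score = sum(min(G[u][v]["weight"] for u, v in zip(p, p[1:]))
                        for p in nx.all_simple_paths(G, driver, "yield"))
            print(driver, round(score, 2))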

  17. The reproducibility and responsiveness of a patient-specific approach: a new instrument in evaluation of treatment of temporomandibular disorders

    Rollman, A.; Naeije, M.; Visscher, C.M.

    2010-01-01

    AIMS: To evaluate the choice of activities on the Patient Specific Approach (PSA) in a sample of temporomandibular disorder (TMD) patients and to determine the clinimetric properties of the visual analog scale (VAS) scores of the PSA, in terms of reproducibility and responsiveness. METHODS: At

  18. Economic Policy Instruments and Evaluation Methods in Dutch Water Management: An analysis of their contribution to an integrated approach

    S.P. Boot (Sander Paul)

    2007-01-01

    textabstractIn international water policy, a trend can be observed towards more attention for economic approaches in water management. In 1992, at the International Conference on Water and the Environment (ICWE) in Dublin, the Convention on the Protection and Use of Transboundary Water Courses and

  19. Sensitivity of Anopheles gambiae population dynamics to meteo-hydrological variability: a mechanistic approach

    Gilioli Gianni

    2011-10-01

    Background: Mechanistic models play an important role in many biological disciplines, and they can effectively contribute to the evaluation of the spatial-temporal evolution of mosquito populations, in the light of the increasing knowledge of the crucial driving role on vector dynamics played by meteo-climatic features as well as other physical-biological characteristics of the landscape. Methods: In malaria eco-epidemiology, landscape components (atmosphere, water bodies, land use) interact with the epidemiological system (interacting populations of vector, human, and parasite). Against the background of this eco-epidemiological approach, a mosquito population model is here proposed to evaluate the sensitivity of the An. gambiae s.s. population to some particular thermal-pluviometric scenarios. The scenarios are obtained by perturbing meteorological time series data referring to four Kenyan sites (Nairobi, Nyabondo, Kibwezi, and Malindi) representing four different eco-epidemiological settings. Results: Simulations highlight a strong dependence of mosquito population abundance on temperature variation, with well-defined site-specific patterns. The upper extreme of the thermal perturbation interval (+3°C) gives rise to an increase in adult population abundance at Nairobi (+111%) and Nyabondo (+61%), and a decrease at Kibwezi (-2%) and Malindi (-36%). At the lower extreme of the perturbation (-3°C), a reduction is observed in both immature and adult mosquito populations at three sites (Nairobi -74%, Nyabondo -66%, Kibwezi -39%) and an increase at Malindi (+11%). A coherent non-linear pattern of population variation emerges. The maximum rate of variation is +30% population abundance for +1°C of temperature change, but almost null and negative values are also obtained. Mosquitoes are less sensitive to rainfall, and both adult and immature populations display a positive quasi-linear response pattern to rainfall variation. Conclusions: The non-linear temperature-dependent response is in

  20. Can masses of non-experts train highly accurate image classifiers? A crowdsourcing approach to instrument segmentation in laparoscopic images.

    Maier-Hein, Lena; Mersmann, Sven; Kondermann, Daniel; Bodenstedt, Sebastian; Sanchez, Alexandro; Stock, Christian; Kenngott, Hannes Gotz; Eisenmann, Mathias; Speidel, Stefanie

    2014-01-01

    Machine learning algorithms are gaining increasing interest in the context of computer-assisted interventions. One of the bottlenecks so far, however, has been the availability of training data, typically generated by medical experts with very limited resources. Crowdsourcing is a new trend that is based on outsourcing cognitive tasks to many anonymous untrained individuals from an online community. In this work, we investigate the potential of crowdsourcing for segmenting medical instruments in endoscopic image data. Our study suggests that (1) segmentations computed from annotations of multiple anonymous non-experts are comparable to those made by medical experts and (2) training data generated by the crowd is of the same quality as that annotated by medical experts. Given the speed of annotation, scalability and low costs, this implies that the scientific community might no longer need to rely on experts to generate reference or training data for certain applications. To trigger further research in endoscopic image processing, the data used in this study will be made publicly available.

  1. Influence of plant productivity over variability of soil respiration: a multi-scale approach

    Curiel Yuste, J.

    2009-04-01

    general controlled by the seasonality of substrate supply by plants (via photosynthate translocation and/or litter) to soil. Although soil temperature and soil moisture exert a strong influence over the variation in SR, our results indicate that substrate supply by plant activity could play a more important role in the variability of soil respiration than previously expected.

  2. A new mathematical approach for the estimation of the AUC and its variability under different experimental designs in preclinical studies.

    Navarro-Fontestad, Carmen; González-Álvarez, Isabel; Fernández-Teruel, Carlos; Bermejo, Marival; Casabó, Vicente Germán

    2012-01-01

    The aim of the present work was to develop a new mathematical method for estimating the area under the curve (AUC) and its variability that could be applied in different preclinical experimental designs and that is amenable to implementation in standard calculation worksheets. In order to assess the usefulness of the new approach, different experimental scenarios were studied and the results were compared with those obtained with commonly used software: WinNonlin® and Phoenix WinNonlin®. The results do not show statistical differences between the AUC values obtained by the two procedures, but the new method appears to be a better estimator of the AUC standard error, measured as the coverage of the 95% confidence interval. In this way, the new proposed method proves to be as useful as the WinNonlin® software where the latter is applicable. Copyright © 2011 John Wiley & Sons, Ltd.
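
    For orientation, a worksheet-friendly moment-based AUC and standard error for a destructive-sampling preclinical design (one sample per animal per time point), in the spirit of Bailer (1988). This is a classical estimator, not necessarily the paper's new method, and the data are hypothetical:

        import numpy as np

        t = np.array([0.5, 1, 2, 4, 8])                  # sampling times, h
        mean_c = np.array([12.0, 18.5, 14.2, 7.1, 2.3])  # mean conc, ng/mL
        sd_c = np.array([2.1, 3.0, 2.6, 1.5, 0.6])       # sd per time point
        n_i = np.array([5, 5, 5, 5, 5])                  # animals per time

        # Trapezoidal weights: w_i = (t_{i+1} - t_{i-1}) / 2 at interior points.
        w = np.zeros_like(t)
        w[0] = (t[1] - t[0]) / 2
        w[-1] = (t[-1] - t[-2]) / 2
        w[1:-1] = (t[2:] - t[:-2]) / 2

        # AUC = sum w_i * mean_i; Var(AUC) = sum w_i^2 * sd_i^2 / n_i.
        auc = np.sum(w * mean_c)
        se = np.sqrt(np.sum(w**2 * sd_c**2 / n_i))
        print(f"AUC = {auc:.1f} ng*h/mL, SE = {se:.2f}")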

  3. Using latent variable approach to estimate China's economy-wide energy rebound effect over 1954–2010

    Shao, Shuai; Huang, Tao; Yang, Lili

    2014-01-01

    The energy rebound effect has been a significant issue in China, which is undergoing economic transition, since it reflects the effectiveness of energy-saving policy relying on improved energy efficiency. Based on the IPAT equation and Brookes' explanation of the rebound effect, this paper develops an alternative estimation model of the rebound effect. By using this estimation model and a latent variable approach, implemented through a time-varying coefficient state space model, we estimate China's economy-wide energy rebound effect over 1954–2010. The results show that the rebound effect evidently exists in China, with an annual average of 39.73% over 1954–2010. Before and after the implementation of China's reform and opening-up policy in 1978, the rebound effects are 47.24% and 37.32%, with a strong fluctuation and a circuitously downward trend, respectively, indicating that a stable political environment and the development of a market economy system facilitate the effectiveness of energy-saving policy. Although the energy-saving effect of improving energy efficiency has been partly realised, there remains a large energy-saving potential in China. - Highlights: • We present an improved estimation methodology of the economy-wide energy rebound effect. • We use the latent variable approach to estimate China's economy-wide rebound effect. • The rebound exists in China and varies before and after reform and opening-up. • After 1978, the average rebound is 37.32% with a circuitously downward trend. • The traditional Solow remainder method underestimates the rebound in most cases.

  4. The Use of a Novel Approach for the Instrumentation of a Cone-beam Computed Tomography-discernible Lateral Canal in an Unusual Maxillary Incisor: Case Report.

    Chaniotis, Antonis; Filippatos, Christos G

    2017-06-01

    Lateral and apical ramifications of the main root canal create potential pathways through which bacteria can spread and remain unaffected by treatment procedures. It is a challenge for the specialty to find techniques that can predictably reach, disinfect, and obturate these ramifications. Here, we report the use of a novel instrumentation approach to aid in the negotiation and management of a lateral canal discernible on cone-beam computed tomography (CBCT) in an unusual maxillary central incisor. A 23-year-old female patient was referred for evaluation and possible treatment of tooth 9. The periapical radiographic examination revealed pulp chamber obliteration, existence of a lateral lesion, and a possible complex internal root canal anatomy. The CBCT evaluation revealed the existence of a lateral lesion, a periapical lesion, an additional distopalatal canal, and a lateral canal exiting at the lateral lesion. The diagnosis of asymptomatic apical and lateral periodontitis of tooth 9 was reached. CBCT-aided access cavity preparation and scouting resulted in the successful negotiation of all canals, main and lateral. A novel instrumentation technique with precurved controlled memory files was used for the mechanical preparation of the lateral canal to a 25/04 enlargement. Obturation of the lateral canal was achieved with a single gutta-percha cone and AH Plus Root Canal Sealer. At the 2-year follow-up, the patient was asymptomatic, and the 2-dimensional radiographic examinations revealed resolution of both the periapical and the lateral lesions. This case report describes the application of a novel instrumentation technique for the mechanical debridement of an infected lateral canal discernible on CBCT and reinforces the importance of treating root canals as systems that possess anatomic intricacies that need to be addressed. Copyright © 2017 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  5. Instrumentation Facility

    Federal Laboratory Consortium — Provides instrumentation support for flight tests of prototype weapons systems using a vast array of airborne sensors, transducers, signal conditioning and encoding...

  6. Fracture in quasi-brittle materials: experimental and numerical approach for the determination of an incremental model with generalized variables

    Morice, Erwan

    2014-01-01

    Fracture in quasi-brittle materials, such as ceramics or concrete, can be represented schematically by series of events of nucleation and coalescence of micro-cracks. Modeling this process is an important challenge for the reliability and life prediction of concrete structures, in particular the prediction of the permeability of damaged structures. A multi-scale approach is proposed. The global behavior is modeled within the fracture mechanics framework, and the local behavior is modeled by the discrete element method. An approach was developed to condense the non-linear behavior of the mortar. A model reduction technique is used to extract the relevant information from the discrete element method. To do so, the velocity field is partitioned into mode I, mode II, linear and non-linear components, each component being characterized by an intensity factor and a fixed spatial distribution. The response of the material is hence condensed into the evolution of the intensity factors, used as non-local variables. A model was also proposed to predict the behavior of the crack for proportional and non-proportional mixed-mode I+II loadings. An experimental campaign was finally conducted to characterize the fatigue and fracture behavior of mortar. The results show that fatigue crack growth can be of significant importance. The experimental velocity fields determined in the crack tip region by DIC were analyzed using the same technique as that used for analyzing the fields obtained by the discrete element method, showing consistent results. (author)

  7. Propensity-score matching in economic analyses: comparison with regression models, instrumental variables, residual inclusion, differences-in-differences, and decomposition methods.

    Crown, William H

    2014-02-01

    This paper examines the use of propensity score matching in economic analyses of observational data. Several excellent papers have previously reviewed practical aspects of propensity score estimation and other aspects of the propensity score literature. The purpose of this paper is to compare the conceptual foundation of propensity score models with alternative estimators of treatment effects. References are provided to empirical comparisons among methods that have appeared in the literature. These comparisons are available for a subset of the methods considered in this paper. However, in some cases, no pairwise comparisons of particular methods are yet available, and there are no examples of comparisons across all of the methods surveyed here. Irrespective of the availability of empirical comparisons, the goal of this paper is to provide some intuition about the relative merits of alternative estimators in health economic evaluations where nonlinearity, sample size, availability of pre/post data, heterogeneity, and missing variables can have important implications for choice of methodology. Also considered is the potential combination of propensity score matching with alternative methods such as differences-in-differences and decomposition methods that have not yet appeared in the empirical literature.
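
    As a concrete reference point for the comparison above, a minimal propensity-score matching estimator can be sketched as follows; the simulated data, column names, and the 1:1 nearest-neighbour matching rule are illustrative assumptions, not the paper's application:

      import numpy as np
      import pandas as pd
      from sklearn.linear_model import LogisticRegression
      from sklearn.neighbors import NearestNeighbors

      # Toy observational data with confounding through x1.
      rng = np.random.default_rng(1)
      n = 1000
      df = pd.DataFrame({"x1": rng.normal(size=n), "x2": rng.normal(size=n)})
      df["treated"] = (rng.uniform(size=n) < 1 / (1 + np.exp(-df["x1"]))).astype(int)
      df["y"] = 2 * df["treated"] + df["x1"] + rng.normal(size=n)

      # Step 1: estimate propensity scores from the covariates.
      ps = LogisticRegression().fit(df[["x1", "x2"]], df["treated"])
      df["pscore"] = ps.predict_proba(df[["x1", "x2"]])[:, 1]

      # Step 2: 1:1 nearest-neighbour matching on the propensity score.
      treated, control = df[df["treated"] == 1], df[df["treated"] == 0]
      nn = NearestNeighbors(n_neighbors=1).fit(control[["pscore"]])
      _, idx = nn.kneighbors(treated[["pscore"]])
      matched = control.iloc[idx.ravel()]

      # Step 3: average treatment effect on the treated from matched pairs.
      att = treated["y"].mean() - matched["y"].mean()
      print(f"ATT estimate: {att:.2f}")  # close to the simulated effect of 2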

  8. A novel genetic score approach using instruments to investigate interactions between pathways and environment: application to air pollution.

    Marie-Abele Bind

    Air pollution has been associated with increased systemic inflammation markers. We developed a new pathway analysis approach to investigate whether gene variants within relevant pathways (oxidative stress, endothelial function, and metal processing) modified the association between particulate air pollution and fibrinogen, C-reactive protein (CRP), intercellular adhesion molecule-1 (ICAM-1), and vascular cell adhesion molecule-1 (VCAM-1). Our study population consisted of 822 elderly participants of the Normative Aging Study (1999-2011). To investigate the role of biological mechanisms and to reduce the number of comparisons in the analysis, we created pathway-specific scores using gene variants related to each pathway. To select the most appropriate gene variants, we used the least absolute shrinkage and selection operator (Lasso) to relate independent outcomes representative of each pathway (8-hydroxydeoxyguanosine for oxidative stress, augmentation index for endothelial function, and patella lead for metal processing) to gene variants. A high genetic score corresponds to a higher allelic risk profile. We fit mixed-effects models to examine modification by the genetic score of the weekly air pollution association with the outcome. Among participants with higher genetic scores within the oxidative stress pathway, we observed significant associations between particle number and fibrinogen, while we did not find any association among participants with lower scores (p(interaction) = 0.04). Compared to individuals with low genetic scores of metal processing gene variants, participants with higher scores had greater effects of particle number on fibrinogen (p(interaction) = 0.12), CRP (p(interaction) = 0.02), and ICAM-1 (p(interaction) = 0.08). This two-stage penalization method is easy to implement and can be used for large-scale genetic applications.
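
    The two-stage construction (Lasso selection against a pathway-specific intermediate outcome, then an allelic score over the selected variants) can be sketched as follows; the simulated variant matrix, marker, and scoring rule are illustrative assumptions, not the study's data:

      import numpy as np
      from sklearn.linear_model import LassoCV

      rng = np.random.default_rng(2)
      n, p = 800, 50                                      # subjects, variants
      G = rng.integers(0, 3, size=(n, p)).astype(float)   # allele counts 0/1/2

      # Pathway-specific intermediate outcome (e.g., an oxidative stress
      # marker), driven by a handful of variants -- simulated, not study data.
      beta = np.zeros(p)
      beta[:5] = 0.4
      marker = G @ beta + rng.normal(size=n)

      # Stage 1: Lasso selects variants related to the pathway outcome.
      lasso = LassoCV(cv=5).fit(G, marker)
      selected = np.flatnonzero(lasso.coef_ != 0)

      # Stage 2: genetic score = signed risk-allele count over selections.
      signs = np.sign(lasso.coef_[selected])
      score = (G[:, selected] * signs).sum(axis=1)
      print(selected.size, "variants selected; score range",
            score.min(), "to", score.max())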

  9. Liver-related mortality in countries of the developed world: an ecological study approach to explain the variability.

    von Wulffen, M; Clark, P J; Macdonald, G A; Raj, A S; Kendall, B J; Powell, E E; Jones, M P; Holtmann, G

    2016-07-01

    Liver-related mortality varies across developed nations. The aim was to assess the relative roles of various risk factors for liver-related mortality using an ecological study approach. Data for liver-related mortality, prevalence data for hepatitis B and C and human immunodeficiency virus (HIV), alcohol consumption per capita, Type 2 diabetes mellitus (T2DM), and overweight and obesity were extracted from peer-reviewed publications or WHO databases for different developed countries. As other potential risk-modifying factors, purchase power parity (PPP)-adjusted gross domestic product (GDP) per capita and health expenditure per capita were assessed. As an environmental 'hygiene factor', we also assessed the effect of the prevalence of Helicobacter pylori. Only countries with a PPP-adjusted GDP greater than $20 000 and valid information for at least 8 risk modifiers were included. Univariate and multivariate analyses were utilised to quantify the contribution to the variability in liver-related mortality. The proportion of chronic liver disease (CLD)-related mortality ranged from 0.73% to 2.40% [mean 1.56%, 95% CI (1.43-1.69)] of all deaths. Univariately, CLD-related mortality was significantly associated with hepatitis B prevalence, alcohol consumption, and PPP-adjusted GDP (all P < 0.05), and potentially with H. pylori prevalence (P = 0.055). Other investigated factors, including hepatitis C, did not reach significance. Backward elimination suggested hepatitis B, alcohol consumption and PPP-adjusted GDP as risk factors (explaining 66.3% of the variability). Hepatitis B infection, alcohol consumption and GDP, but not hepatitis C or other factors, explain most of the variance of liver-related mortality. © 2016 John Wiley & Sons Ltd.

  10. Instrumentation development

    Ubbes, W.F.; Yow, J.L. Jr.

    1988-01-01

    Instrumentation is developed for the Civilian Radioactive Waste Management Program to meet several different (and sometimes conflicting) objectives. This paper addresses instrumentation development for data needs that are related either directly or indirectly to a repository site, but does not touch on instrumentation for work with waste forms or other materials. Consequently, this implies a relatively large scale for the measurements, and an in situ setting for instrument performance. In this context, instruments are needed for site characterization to define phenomena, develop models, and obtain parameter values, and for later design and performance confirmation testing in the constructed repository. The former set of applications is more immediate, and is driven by the needs of program design and performance assessment activities. A host of general technical and nontechnical issues have arisen to challenge instrumentation development. Instruments can be classed into geomechanical, geohydrologic, or other specialty categories, but these issues cut across artificial classifications. These issues are outlined. Despite this imposing list of issues, several case histories are cited to evaluate progress in the area

  11. Medical instruments in museums

    Söderqvist, Thomas; Arnold, Ken

    2011-01-01

    This essay proposes that our understanding of medical instruments might benefit from adding a more forthright concern with their immediate presence to the current historical focus on simply decoding their meanings and context. This approach is applied to the intriguingly tricky question of what actually is meant by a "medical instrument." It is suggested that a pragmatic part of the answer might lie simply in reconsidering the holdings of medical museums, where the significance of the physical actuality of instruments comes readily to hand.

  12. Study protocol. IDUS - Instrumental delivery & ultrasound: a multi-centre randomised controlled trial of ultrasound assessment of the fetal head position versus standard care as an approach to prevent morbidity at instrumental delivery.

    Murphy, Deirdre J

    2012-01-01

    Instrumental deliveries are commonly performed in the United Kingdom and Ireland, with rates of 12 - 17% in most centres. Knowing the exact position of the fetal head is a pre-requisite for safe instrumental delivery. Traditionally, diagnosis of the fetal head position is made on transvaginal digital examination by delineating the suture lines of the fetal skull and the fontanelles. However, the accuracy of transvaginal digital examination can be unreliable and varies between 20% and 75%. Failure to identify the correct fetal head position increases the likelihood of failed instrumental delivery with the additional morbidity of sequential use of instruments or second stage caesarean section. The use of ultrasound in determining the position of the fetal head has been explored but is not part of routine clinical practice.

  13. Iwamoto-Harada coalescence/pickup model for cluster emission: state density approach including angular momentum variables

    Běták Emil

    2014-04-01

    For low-energy nuclear reactions well above the resonance region, but still below the pion threshold, statistical pre-equilibrium models (e.g., the exciton and the hybrid ones) are a frequent tool for the analysis of energy spectra and cross sections of cluster emission. For α's, two essentially distinct approaches are popular, namely the preformed one and different versions of coalescence approaches, whereas only the latter group of models can be used for other types of cluster ejectiles. The original Iwamoto-Harada model of pre-equilibrium cluster emission was formulated using the overlap of the cluster and its constituent nucleons in momentum space. Transforming it into level or state densities is not a straightforward task; however, physically the same model was presented at a conference on reaction models five years earlier. At that time, only densities without spin were used. The introduction of spin variables into the exciton model enabled detailed calculation of γ emission and its competition with nucleon channels, and, at the same time, it stimulated further developments of the model. However, to the best of our knowledge, no spin formulation had been presented for cluster emission until recently, when the first attempts were reported, but restricted to the first emission only. We have now updated this effort and are able to handle (using the same simplifications as in our previous work) pre-equilibrium cluster emission with spin, including all nuclei in the reaction chain.

  14. Study Protocol. IDUS -- Instrumental delivery & ultrasound. A multi-centre randomised controlled trial of ultrasound assessment of the fetal head position versus standard care as an approach to prevent morbidity at instrumental delivery

    Murphy, Deirdre J

    2012-09-13

    Background: Instrumental deliveries are commonly performed in the United Kingdom and Ireland, with rates of 12-17% in most centres. Knowing the exact position of the fetal head is a pre-requisite for safe instrumental delivery. Traditionally, diagnosis of the fetal head position is made on transvaginal digital examination by delineating the suture lines of the fetal skull and the fontanelles. However, the accuracy of transvaginal digital examination can be unreliable and varies between 20% and 75%. Failure to identify the correct fetal head position increases the likelihood of failed instrumental delivery with the additional morbidity of sequential use of instruments or second stage caesarean section. The use of ultrasound in determining the position of the fetal head has been explored but is not part of routine clinical practice. Methods/Design: A multi-centre randomised controlled trial is proposed. The study will take place in two large maternity units in Ireland with a combined annual birth rate of 13,500 deliveries. It will involve 450 nulliparous women undergoing instrumental delivery after 37 weeks gestation. The main outcome measure will be incorrect diagnosis of the fetal head position. A study involving 450 women will have 80% power to detect a 10% difference in the incidence of inaccurate diagnosis of the fetal head position with two-sided 5% alpha. Discussion: It is both important and timely to evaluate the use of ultrasound to diagnose the fetal head position prior to instrumental delivery before routine use can be advocated. The overall aim is to reduce the incidence of incorrect diagnosis of the fetal head position prior to instrumental delivery and improve the safety of instrumental deliveries. Trial registration: Current Controlled Trials ISRCTN72230496

  15. Study Protocol. IDUS – Instrumental delivery & ultrasound. A multi-centre randomised controlled trial of ultrasound assessment of the fetal head position versus standard care as an approach to prevent morbidity at instrumental delivery

    Murphy Deirdre J

    2012-09-01

    Background: Instrumental deliveries are commonly performed in the United Kingdom and Ireland, with rates of 12-17% in most centres. Knowing the exact position of the fetal head is a pre-requisite for safe instrumental delivery. Traditionally, diagnosis of the fetal head position is made on transvaginal digital examination by delineating the suture lines of the fetal skull and the fontanelles. However, the accuracy of transvaginal digital examination can be unreliable and varies between 20% and 75%. Failure to identify the correct fetal head position increases the likelihood of failed instrumental delivery with the additional morbidity of sequential use of instruments or second stage caesarean section. The use of ultrasound in determining the position of the fetal head has been explored but is not part of routine clinical practice. Methods/Design: A multi-centre randomised controlled trial is proposed. The study will take place in two large maternity units in Ireland with a combined annual birth rate of 13,500 deliveries. It will involve 450 nulliparous women undergoing instrumental delivery after 37 weeks gestation. The main outcome measure will be incorrect diagnosis of the fetal head position. A study involving 450 women will have 80% power to detect a 10% difference in the incidence of inaccurate diagnosis of the fetal head position with two-sided 5% alpha. Discussion: It is both important and timely to evaluate the use of ultrasound to diagnose the fetal head position prior to instrumental delivery before routine use can be advocated. The overall aim is to reduce the incidence of incorrect diagnosis of the fetal head position prior to instrumental delivery and improve the safety of instrumental deliveries. Trial registration: Current Controlled Trials ISRCTN72230496

  16. Instrumental analysis

    Kim, Seung Jae; Seo, Seong Gyu

    1995-03-15

    This textbook deals with instrumental analysis and consists of nine chapters. It covers an introduction to analytical chemistry (the process of analysis and the types and forms of analysis), electrochemistry (basic theory, potentiometry, and conductometry), electromagnetic radiation and optical components (introduction and applications), ultraviolet and visible spectrophotometry, and atomic absorption spectrophotometry (introduction, flame emission spectrometry, and plasma emission spectrometry). The remaining chapters cover infrared spectrophotometry, X-ray spectrophotometry and mass spectrometry, chromatography, and other instrumental methods such as radiochemistry.

  17. Instrumental analysis

    Kim, Seung Jae; Seo, Seong Gyu

    1995-03-01

    This textbook deals with instrumental analysis and consists of nine chapters. It covers an introduction to analytical chemistry (the process of analysis and the types and forms of analysis), electrochemistry (basic theory, potentiometry, and conductometry), electromagnetic radiation and optical components (introduction and applications), ultraviolet and visible spectrophotometry, and atomic absorption spectrophotometry (introduction, flame emission spectrometry, and plasma emission spectrometry). The remaining chapters cover infrared spectrophotometry, X-ray spectrophotometry and mass spectrometry, chromatography, and other instrumental methods such as radiochemistry.

  18. LOFT instrumentation

    Bixby, W.W.

    1979-01-01

    A description of instrumentation used in the Loss-of-Fluid Test (LOFT) large break Loss-of-Coolant Experiments is presented. Emphasis is placed on hydraulic and thermal measurements in the primary system piping and components, reactor vessel, and pressure suppression system. In addition, instrumentation which is being considered for measurement of phenomena during future small break testing is discussed. (orig.)

  19. A hybrid approach to estimating national scale spatiotemporal variability of PM2.5 in the contiguous United States.

    Beckerman, Bernardo S; Jerrett, Michael; Serre, Marc; Martin, Randall V; Lee, Seung-Jae; van Donkelaar, Aaron; Ross, Zev; Su, Jason; Burnett, Richard T

    2013-07-02

    Airborne fine particulate matter exhibits spatiotemporal variability at multiple scales, which presents challenges to estimating exposures for health effects assessment. Here we created a model to predict ambient particulate matter less than 2.5 μm in aerodynamic diameter (PM2.5) across the contiguous United States to be applied to health effects modeling. We developed a hybrid approach combining a land use regression model (LUR) selected with a machine learning method, and Bayesian Maximum Entropy (BME) interpolation of the LUR space-time residuals. The PM2.5 data set included 104,172 monthly observations at 1464 monitoring locations, with approximately 10% of locations reserved for cross-validation. LUR models were based on remote sensing estimates of PM2.5, land use and traffic indicators. Normalized cross-validated R² values for LUR were 0.63 and 0.11 with and without remote sensing, respectively, suggesting remote sensing is a strong predictor of ground-level concentrations. In the models including the BME interpolation of the residuals, the cross-validated R² was 0.79 for both configurations; the model without remotely sensed data described more fine-scale variation than the model including remote sensing. Our results suggest that our modeling framework can predict ground-level concentrations of PM2.5 at multiple scales over the contiguous U.S.
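
    The two-stage structure (regression on land-use covariates, then spatial interpolation of the residuals) can be illustrated compactly; the sketch below substitutes an off-the-shelf Gaussian-process interpolation for BME and uses simulated monitor data, so it shows the architecture rather than the paper's method:

      import numpy as np
      from sklearn.linear_model import LinearRegression
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF

      rng = np.random.default_rng(3)
      n = 300
      coords = rng.uniform(0, 100, size=(n, 2))   # monitor locations (km)
      X = rng.normal(size=(n, 3))                 # stand-ins for satellite
                                                  # PM2.5, land use, traffic
      pm25 = (10 + X @ np.array([3.0, 1.0, 0.5])
              + np.sin(coords[:, 0] / 15) + rng.normal(0, 0.3, n))

      # Stage 1: land use regression (LUR).
      lur = LinearRegression().fit(X, pm25)
      resid = pm25 - lur.predict(X)

      # Stage 2: interpolate the residuals (a GP stands in for BME here).
      gp = GaussianProcessRegressor(kernel=RBF(length_scale=20.0),
                                    alpha=0.1).fit(coords, resid)

      # Hybrid prediction at a new site = LUR part + interpolated residual.
      x_new, c_new = rng.normal(size=(1, 3)), np.array([[50.0, 50.0]])
      print(lur.predict(x_new) + gp.predict(c_new))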

  20. A comprehensive approach to handle the dynamics of customer’s needs in Quality Function Deployment based on linguistic variables

    Zohreh Bostaki

    2014-04-01

    In the context of a customer-driven product or service design process, a well-timed update of customer requirements not only serves as an indicator of how things change over time, but also gives firms a better basis for devising strategies to meet the future needs of their customers. This paper proposes a systematic methodology to deal with the dynamics of customer needs, in terms of their relative weights, in QFD. Compared with previous research, the contribution of this paper is fourfold. First, it applies linguistic variables to capture the preferences of customers and experts in determining the relative importance of customer requirements (CRs) and the relationships between customer requirements and engineering characteristics (ECs). Second, it proposes the implementation of a forecasting technique. Third, it describes more comprehensively how future uncertainty in the weights of customer needs can be estimated and transmitted into the design attributes. Fourth, it proposes a quantitative approach that takes into account the decision maker's attitude towards risk to optimize the QFD decision making analysis. Finally, a real-world application of QFD is provided to demonstrate the practical applicability of the proposed methodology.

  1. The mediation proportion: a structural equation approach for estimating the proportion of exposure effect on outcome explained by an intermediate variable

    Ditlevsen, Susanne; Christensen, Ulla; Lynch, John

    2005-01-01

    It is often of interest to assess how much of the effect of an exposure on a response is mediated through an intermediate variable. However, systematic approaches are lacking, other than assessment of a surrogate marker for the endpoint of a clinical trial. We review a measure of "proportion explained" and extend it to the case of several intermediate variables. Binary or categorical variables can be included directly through threshold models. We call this measure the mediation proportion, that is, the part of an exposure effect on outcome explained by a third, intermediate variable. Two examples illustrate the approach. The first example is a randomized clinical trial of the effects of interferon-alpha on visual acuity in patients with age-related macular degeneration. In this example, the exposure, mediator and response are all binary. The second example is a common problem in social epidemiology: to find the proportion...
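
    On the usual difference-of-coefficients logic, the mediation proportion is the share of the total exposure effect that vanishes once the mediator enters the model. A toy linear-model version with simulated data (the paper's structural equation machinery, including threshold models for binary variables, is not reproduced here):

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(4)
      n = 2000
      exposure = rng.binomial(1, 0.5, n).astype(float)
      mediator = 0.8 * exposure + rng.normal(size=n)   # intermediate variable
      outcome = 0.5 * exposure + mediator + rng.normal(size=n)

      # Total effect: outcome regressed on exposure alone.
      total = sm.OLS(outcome, sm.add_constant(exposure)).fit().params[1]

      # Direct effect: outcome on exposure, adjusting for the mediator.
      X = sm.add_constant(np.column_stack([exposure, mediator]))
      direct = sm.OLS(outcome, X).fit().params[1]

      # Mediation proportion: the part of the total effect via the mediator.
      print(f"mediation proportion = {1 - direct / total:.2f}")  # ~0.8/1.3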

  2. A Fiji multi-coral δ18O composite approach to obtaining a more accurate reconstruction of the last two-centuries of the ocean-climate variability in the South Pacific Convergence Zone region

    Dassié, Emilie P.; Linsley, Braddock K.; Corrège, Thierry; Wu, Henry C.; Lemley, Gavin M.; Howe, Steve; Cabioch, Guy

    2014-12-01

    The limited availability of oceanographic data in the tropical Pacific Ocean prior to the satellite era makes coral-based climate reconstructions a key tool for extending the instrumental record back in time, thereby providing a much needed test for climate models and projections. We have generated a unique regional network consisting of five Porites coral δ18O time series from different locations in the Fijian archipelago. Our results indicate that using a minimum of three Porites coral δ18O records from Fiji is statistically sufficient to obtain a reliable signal for climate reconstruction, and that application of an approach used in tree ring studies is a suitable tool to determine this number. The coral δ18O composite indicates that while sea surface temperature (SST) variability is the primary driver of seasonal δ18O variability in these Fiji corals, annual average coral δ18O is more closely correlated to sea surface salinity (SSS), as previously reported. Our results highlight the importance of water mass advection in controlling Fiji coral δ18O and salinity variability at interannual and decadal time scales, despite being located in the heavy rainfall region of the South Pacific Convergence Zone (SPCZ). The Fiji δ18O composite presents a secular freshening and warming trend since the 1850s coupled with changes in both interannual (IA) and decadal/interdecadal (D/I) variance. The changes in IA and D/I variance suggest a re-organization of climatic variability in the SPCZ region beginning in the late 1800s toward a period of more dominant interannual variability, which could correspond to a southeast expansion of the SPCZ.

  3. Investigation of the spatio-temporal variability of atmospheric boundary layer depths over mountainous terrain observed with a suite of ground-based and airborne instruments during the MATERHORN field experiment

    Pal, S.; De Wekker, S.; Emmitt, G. D.

    2013-12-01

    We present the first results on the spatio-temporal variability of atmospheric boundary layer depths obtained with a suite of ground-based and airborne instruments deployed during the first field phase of The Mountain Terrain Atmospheric Modeling and Observations (MATERHORN) Program (http://www3.nd.edu/~dynamics/materhorn/index.php) at Dugway Proving Ground (DPG, Utah, USA) in Fall 2012. We mainly use high-resolution data collected during selected intensive observation periods by Doppler lidars, a ceilometer, and in-situ measurements from an unmanned aerial vehicle to measure atmospheric boundary layer (ABL) depths. In particular, a Navy Twin Otter aircraft flew 6 missions of about 5 hours each during the daytime, collecting remotely sensed (Doppler lidar, TODWL) wind data in addition to in-situ turbulence measurements, which allowed a detailed investigation of the spatial heterogeneity of convective boundary layer turbulence features over a steep isolated mountain with horizontal and vertical scales of about 10 km and 1 km, respectively. Additionally, we use data collected by (1) radiosonde systems at two sites in the Granite Mountain area of DPG (Playa and Sagebrush), (2) sonic anemometers (CSAT-3D) for high-resolution turbulence flux measurements near the ground, (3) pyranometers for incoming solar radiation, and (4) standard meteorological (PTU) measurements obtained near the surface. In this contribution, we discuss and address (1) composites obtained with lidar, ceilometer, micro-meteorological measurements, and radiosonde observations to determine the quasi-continuous regime of ABL depths, growth rates, maximum convective boundary layer (CBL) depths, etc., and (2) the temporal variability in ABL depths during the entire diurnal cycle and the spatial heterogeneity in daytime ABL depths triggered by the underlying orography in the experimental area, to investigate the most plausible mechanisms (e.g. combined effect of diurnal cycle and orographic trigger

  4. Temporal relationships between awakening cortisol and psychosocial variables in inpatients with anorexia nervosa - A time series approach.

    Wild, Beate; Stadnitski, Tatjana; Wesche, Daniela; Stroe-Kunold, Esther; Schultz, Jobst-Hendrik; Rudofsky, Gottfried; Maser-Gluth, Christiane; Herzog, Wolfgang; Friederich, Hans-Christoph

    2016-04-01

    The aim of the study was to investigate the characteristics of awakening salivary cortisol in patients with anorexia nervosa (AN) using a time series design. We included ten AN inpatients, six with a very low BMI (high symptom severity, HSS group) and four patients with less severe symptoms (low symptom severity, LSS group). Patients collected salivary cortisol daily upon awakening. The number of collected saliva samples varied across patients between n=65 and n=229 (due to the different lengths of their inpatient stays). In addition, before retiring, the patients answered daily questions on a handheld device regarding disorder-related psychosocial variables. The analysis of cortisol and diary data was conducted using a time series approach. The time series showed that the awakening cortisol of the AN patients was elevated compared to a control group. Cortisol measurements of patients with LSS essentially fluctuated in a stationary manner around a constant mean. The series of patients with HSS were generally less stable; four HSS patients showed non-stationary cortisol awakening series. Antipsychotic medication did not change awakening cortisol in a specific way. The lagged dependencies between cortisol and depressive feelings became significant for four patients; here, higher cortisol values were temporally associated with higher values of depressive feelings. Upon awakening, the cortisol of all AN patients was in the standard range but elevated compared to healthy controls. Patients with HSS appeared to show less stable awakening cortisol time series compared to patients with LSS. Copyright © 2016 Elsevier B.V. All rights reserved.
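
    The stationary versus non-stationary distinction drawn above is the kind of property an augmented Dickey-Fuller test formalizes; a minimal sketch on simulated awakening-cortisol series (illustrative data, not the patients'):

      import numpy as np
      from statsmodels.tsa.stattools import adfuller

      rng = np.random.default_rng(5)
      # Stationary fluctuation around a constant mean (LSS-like pattern).
      stationary = 12 + rng.normal(0, 1.5, 120)
      # A drifting random walk as a non-stationary alternative (HSS-like).
      drifting = 12 + np.cumsum(rng.normal(0, 1.5, 120))

      for name, series in [("stationary", stationary), ("drifting", drifting)]:
          stat, pvalue, *_ = adfuller(series)
          print(f"{name}: ADF = {stat:.2f}, p = {pvalue:.3f}")
      # A small p-value rejects the unit root, i.e., supports stationarity.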

  5. Instrumental Capital

    Gabriel Valerio

    2007-07-01

    Throughout the history of humankind, since our first ancestors, tools have represented a means to reach objectives that might otherwise have seemed impossible. In the so-called New Economy, where tangible assets appear to be losing their role as the core element in producing value relative to knowledge, tools have remained at our side in daily work. In this article, the author's objective is to describe, in a simple manner, the importance of managing an organization's set of tools or instruments (Instrumental Capital). The characteristic conditions of this New Economy, the way Knowledge Management deals with these new conditions, and the sub-processes that support the management of Instrumental Capital are described.

  6. Innovative instrumentation

    Anon.

    1983-01-01

    At this year's particle physics conference at Brighton, a parallel session was given over to instrumentation and detector development. While this work is vital to the health of research and its continued progress, its share of prime international conference time is limited. Instrumentation can be innovative three times — first when a new idea is outlined, secondly when it is shown to be feasible, and finally when it becomes productive in a real experiment, amassing useful data rather than operational experience. Hyams' examples showed that it can take a long time for a new idea to filter through these successive stages, if it ever makes it at all

  7. Innovative instrumentation

    Anon.

    1983-11-15

    At this year's particle physics conference at Brighton, a parallel session was given over to instrumentation and detector development. While this work is vital to the health of research and its continued progress, its share of prime international conference time is limited. Instrumentation can be innovative three times — first when a new idea is outlined, secondly when it is shown to be feasible, and finally when it becomes productive in a real experiment, amassing useful data rather than operational experience. Hyams' examples showed that it can take a long time for a new idea to filter through these successive stages, if it ever makes it at all.

  8. Instrumental aspects

    Qureshi Navid

    2017-01-01

    Every neutron scattering experiment requires the choice of a suitable neutron diffractometer (or spectrometer in the case of inelastic scattering) with its optimal configuration in order to accomplish the experimental tasks in the most successful way. Most generally, the compromise between the incident neutron flux and the instrumental resolution has to be considered, which depends on the number of optical devices positioned in the neutron beam path. In this chapter the basic instrumental principles of neutron diffraction will be explained. Examples of different types of experiments and their respective expected results will be shown. Furthermore, the production and use of polarized neutrons will be stressed.

  9. An approach based on defense-in-depth and diversity (3D) for the reliability assessment of digital instrument and control systems of nuclear power plants

    Silva, Paulo Adriano da; Saldanha, Pedro L.C., E-mail: pasilva@cnen.gov.b, E-mail: Saldanha@cnen.gov.b [Comissao Nacional de Energia Nuclear (CNEN), Rio de Janeiro, RJ (Brazil). Coord. Geral de Reatores Nucleares; Melo, Paulo F. Frutuoso e, E-mail: frutuoso@nuclear.ufrj.b [Universidade Federal do Rio de Janeiro (PEN/COPPE/UFRJ), RJ (Brazil). Coordenacao dos Programas de Pos-Graduacao em Engenharia. Programa de Engenharia Nuclear; Araujo, Ademir L. de [Associacao Brasileira de Ensino Universitario (UNIABEU), Angra dos Reis, RJ (Brazil)

    2011-07-01

    The adoption of digital instrumentation and control (I and C) technology has been slow in nuclear power plants. The reason has been the difficulty of obtaining evidence to prove that digital I and C systems can be used in nuclear safety systems, for example the Reactor Protection System (RPS), while ensuring the proper operation of all their functions. This technology offers a potential improvement in safety and reliability. However, there is still no consensus about the model to be adopted for digital systems software to be used in reliability studies. This paper presents the 3D methodology approach to assess digital I and C reliability. It is based on the study of operational events occurring in NPPs. It makes it easy to identify, in general, the level of I and C system reliability, showing its key vulnerabilities and enabling regulatory actions to be traced to minimize or avoid them. This approach makes it possible to identify the main types of digital I and C system failure, with the potential for common cause failures, as well as to evaluate the dominant failure modes. The MAFIC-D software was developed to assist the implementation of the relationships between the reliability criteria, the analysis of relationships, and data collection. The results obtained through this tool proved satisfactory and support the process of regulatory decision-making for licensing digital I and C in NPPs; the tool can also be used to monitor the performance of digital I and C post-licensing, during the lifetime of the system, providing the basis for the elaboration of checklists for regulatory inspections. (author)

  10. An approach based on defense-in-depth and diversity (3D) for the reliability assessment of digital instrument and control systems of nuclear power plants

    Silva, Paulo Adriano da; Saldanha, Pedro L.C.

    2011-01-01

    The adoption of digital instrumentation and control (I and C) technology has been slow in nuclear power plants. The reason has been the difficulty of obtaining evidence to prove that digital I and C systems can be used in nuclear safety systems, for example the Reactor Protection System (RPS), while ensuring the proper operation of all their functions. This technology offers a potential improvement in safety and reliability. However, there is still no consensus about the model to be adopted for digital systems software to be used in reliability studies. This paper presents the 3D methodology approach to assess digital I and C reliability. It is based on the study of operational events occurring in NPPs. It makes it easy to identify, in general, the level of I and C system reliability, showing its key vulnerabilities and enabling regulatory actions to be traced to minimize or avoid them. This approach makes it possible to identify the main types of digital I and C system failure, with the potential for common cause failures, as well as to evaluate the dominant failure modes. The MAFIC-D software was developed to assist the implementation of the relationships between the reliability criteria, the analysis of relationships, and data collection. The results obtained through this tool proved satisfactory and support the process of regulatory decision-making for licensing digital I and C in NPPs; the tool can also be used to monitor the performance of digital I and C post-licensing, during the lifetime of the system, providing the basis for the elaboration of checklists for regulatory inspections. (author)

  11. The SPICE concept - An approach to providing geometric and other ancillary information needed for interpretation of data returned from space science instruments

    Acton, Charles H., Jr.

    1990-01-01

    The Navigation Ancillary Information Facility (NAIF), acting under the direction of NASA's Office of Space Science and Applications, and with substantial participation of the planetary science community, is designing and implementing an ancillary data system - called SPICE - to assist scientists in planning and interpreting scientific observations taken from spaceborne instruments. The principal objective of the implemented SPICE system is that it will hold the essential geometric and related ancillary information needed to recover the full value of science instrument data, and that it will facilitate correlations of individual instrument datasets with data obtained from other instruments on the same or other spacecraft.

  12. Fuzzy associative memories for instrument fault detection

    Heger, A.S.

    1996-01-01

    A fuzzy logic instrument fault detection scheme is developed for systems having two or three redundant sensors. In the fuzzy logic approach, the deviation between each signal pairing is computed and classified into three fuzzy sets. A rule base is created, allowing the human perception of the situation to be represented mathematically. Fuzzy associative memories are then applied. Finally, a defuzzification scheme is used to find the centroid location, and hence the signal status. Real-time analyses are carried out to evaluate the instantaneous signal status as well as the long-term results for the sensor set. Instantaneous signal validation results are used to compute a best estimate for the measured state variable. The long-term sensor validation method uses a frequency fuzzy variable to determine the signal condition over a specific period. To corroborate the methodology, synthetic data representing various anomalies are analyzed with both the fuzzy logic technique and the parity space approach. (Author)
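
    A minimal sketch of such a scheme for one signal pairing, extended to a three-sensor set, is given below; the triangular membership functions, breakpoints, and rule outputs are illustrative assumptions, not the paper's tuning:

      def tri(x, a, b, c):
          """Triangular membership function peaking at b."""
          return max(min((x - a) / (b - a), (c - x) / (c - b)), 0.0)

      def pair_status(si, sj, span=10.0):
          """Fuzzy status of one signal pairing from its normalized deviation.

          The deviation is classified into three fuzzy sets (small, medium,
          large); a three-rule base maps them to 0 (agree), 0.5 (suspect)
          and 1 (conflict), combined by centroid defuzzification.
          """
          d = abs(si - sj) / span
          mu = [tri(d, -0.2, 0.0, 0.2),                # small
                tri(d, 0.1, 0.3, 0.5),                 # medium
                max(min((d - 0.4) / 0.6, 1.0), 0.0)]   # large (open shoulder)
          out = [0.0, 0.5, 1.0]
          return sum(m * o for m, o in zip(mu, out)) / sum(mu)

      def sensor_status(s1, s2, s3):
          """A sensor is suspect to the degree that both its pairings deviate."""
          p12, p13 = pair_status(s1, s2), pair_status(s1, s3)
          p23 = pair_status(s2, s3)
          return {"s1": min(p12, p13), "s2": min(p12, p23), "s3": min(p13, p23)}

      print(sensor_status(100.0, 100.4, 99.8))  # all near 0: consistent set
      print(sensor_status(100.0, 100.4, 92.0))  # s3 flagged by both pairings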

  13. Surgical Instrument

    Dankelman, J.; Horeman, T.

    2009-01-01

    The present invention relates to a surgical instrument for minimally invasive surgery, comprising a handle, a shaft and an actuating part, characterised by a gastight cover surrounding the shaft, wherein the cover is provided with a coupler that has a feed-through opening with a lockable seal ...

  14. Weather Instruments.

    Brantley, L. Reed, Sr.; Demanche, Edna L.; Klemm, E. Barbara; Kyselka, Will; Phillips, Edwin A.; Pottenger, Francis M.; Yamamoto, Karen N.; Young, Donald B.

    This booklet presents some activities to measure various weather phenomena. Directions for constructing a weather station are included. Instruments including rain gauges, thermometers, wind vanes, wind speed devices, humidity devices, barometers, atmospheric observations, a dustfall jar, sticky-tape can, detection of gases in the air, and pH of…

  15. A comparison of RANZCR and Singapore-designed radiation oncology practice audit instruments: how does reproducibility affect future approaches to revalidation

    Lu, Jiade J.; Wynne, Christopher J.; Kumar, Mahesh B.; Shakespeare, Thomas P.; Mukherjee, Rahul; Back, Michael F.

    2004-01-01

    Physician competency assessment requires the use of validated methods and instruments. The Royal Australian and New Zealand College of Radiologists (RANZCR) developed a draft audit form to be evaluated as a competency assessment instrument for radiation oncologists (ROs) in Australasia. We evaluated the reliability of the RANZCR instrument, as well as of a separate instrument designed at The Cancer Institute (TCI), Singapore, by having two ROs perform an independent chart review of 80 randomly selected patients seen at TCI Singapore. Both the RANZCR and TCI Singapore instruments were used to score each chart. Inter- and intra-observer reliability for both audit instruments were compared using misclassification rates as the primary end-point. Overall, for inter-observer reproducibility, 2.3% of TCI Singapore items were misclassified compared to 22.3% of RANZCR items (P < 0.0001, 100.00% confidence that the TCI instrument has less inter-observer misclassification). For intra-observer reproducibility, 2.4% of TCI Singapore items were misclassified compared to 13.6% of RANZCR items (P < 0.0001, 100.00% confidence that the TCI instrument has less intra-observer misclassification). The proposed RANZCR RO revalidation audit instrument requires further refinement to improve validity. Several items require modification or removal because of lack of reliability, whereas other important and reproducible items can be incorporated, as demonstrated by the TCI Singapore instrument. The TCI Singapore instrument also has the advantage of incorporating a simple scoring system and criticality index to allow discrimination between ROs and comparisons against future College standards. Copyright (2004) Blackwell Publishing Asia Pty Ltd

  16. Advanced instrumentation and teleoperation

    Decreton, M.

    1998-01-01

    SCK-CEN's advanced instrumentation and teleoperation project aims at evaluating the potential of a telerobotic approach in a nuclear environment and, in particular, the use of remote-perception systems. Main achievements in 1997 in the areas of R and D on radiation tolerance for remote sensing, optical fibres and optical-fibre sensors, and computer-aided teleoperation are reported

  17. Using multiple biomarkers and determinants to obtain a better measurement of oxidative stress: a latent variable structural equation model approach.

    Eldridge, Ronald C; Flanders, W Dana; Bostick, Roberd M; Fedirko, Veronika; Gross, Myron; Thyagarajan, Bharat; Goodman, Michael

    2017-09-01

    Since oxidative stress involves a variety of cellular changes, no single biomarker can serve as a complete measure of this complex biological process. The analytic technique of structural equation modeling (SEM) provides a possible solution to this problem by modelling a latent (unobserved) variable constructed from the covariance of multiple biomarkers. Using three pooled datasets, we modelled a latent oxidative stress variable from five biomarkers related to oxidative stress: F2-isoprostanes (FIP), fluorescent oxidation products, mitochondrial DNA copy number, γ-tocopherol (Gtoc) and C-reactive protein (CRP, an inflammation marker closely linked to oxidative stress). We validated the latent variable by assessing its relation to pro- and anti-oxidant exposures. FIP, Gtoc and CRP characterized the latent oxidative stress variable. Obesity, smoking, aspirin use and β-carotene were statistically significantly associated with oxidative stress in the theorized directions; the same exposures were weakly and inconsistently associated with the individual biomarkers. Our results suggest that using SEM with latent variables decreases the biomarker-specific variability, and may produce a better measure of oxidative stress than do single variables. This methodology can be applied to similar areas of research in which a single biomarker is not sufficient to fully describe a complex biological phenomenon.
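
    Setting the full SEM machinery aside, the core idea of a single latent factor capturing the biomarkers' shared variance can be illustrated with a one-factor model; sklearn's FactorAnalysis stands in for the SEM measurement model, and the biomarker data are simulated:

      import numpy as np
      from sklearn.decomposition import FactorAnalysis
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(6)
      n = 1000
      latent = rng.normal(size=n)                     # unobserved construct
      loadings = np.array([0.8, 0.6, 0.1, 0.5, 0.7])  # per-biomarker loadings

      # Five biomarkers = loading * latent + marker-specific noise (simulated).
      X = latent[:, None] * loadings + rng.normal(size=(n, 5))

      fa = FactorAnalysis(n_components=1)
      scores = fa.fit_transform(StandardScaler().fit_transform(X)).ravel()

      # High-loading markers dominate the latent score, mirroring how FIP,
      # Gtoc and CRP characterized the latent variable in the study.
      print(np.round(fa.components_.ravel(), 2))
      print(f"corr(latent, score) = {abs(np.corrcoef(latent, scores)[0, 1]):.2f}")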

  18. Impact of marriage on HIV/AIDS risk behaviors among impoverished, at-risk couples: a multilevel latent variable approach.

    Stein, Judith A; Nyamathi, Adeline; Ullman, Jodie B; Bentler, Peter M

    2007-01-01

    Studies among normative samples generally demonstrate a positive impact of marriage on health behaviors and other related attitudes. In this study, we examine the impact of marriage on HIV/AIDS risk behaviors and attitudes among impoverished, highly stressed, homeless couples, many with severe substance abuse problems. A multilevel analysis of 368 high-risk sexually intimate married and unmarried heterosexual couples assessed individual and couple-level effects on social support, substance use problems, HIV/AIDS knowledge, perceived HIV/AIDS risk, needle-sharing, condom use, multiple sex partners, and HIV/AIDS testing. More variance was explained in the protective and risk variables by couple-level latent variable predictors than by individual latent variable predictors, although some gender effects were found (e.g., more alcohol problems among men). The couple-level variable of marriage predicted lower perceived risk, less deviant social support, and fewer sex partners but predicted more needle-sharing.

  19. Spatial variability in intertidal macroalgal assemblages on the North Portuguese coast: consistence between species and functional group approaches

    Veiga, P.; Rubal, M.; Vieira, R.; Arenas, F.; Sousa-Pinto, I.

    2013-03-01

    Natural assemblages are variable in space and time; therefore, quantification of their variability is imperative to identify relevant scales for investigating natural or anthropogenic processes shaping these assemblages. We studied the variability of intertidal macroalgal assemblages on the North Portuguese coast, considering three spatial scales (from metres to tens of kilometres) following a hierarchical design. We tested the hypotheses that (1) spatial patterns will be invariant at all the studied scales and (2) the spatial variability of macroalgal assemblages obtained using species will be consistent with that obtained using functional groups. This was done considering, as univariate variables, total biomass and number of taxa as well as the biomass of the most important species and functional groups, and, as multivariate variables, the structure of macroalgal assemblages, considering both species and functional groups. Most of the univariate results confirmed the first hypothesis, except for the total number of taxa and foliose macroalgae, which showed significant variability at the scale of site and area, respectively. In contrast, when multivariate patterns were examined, the first hypothesis was rejected except at the scale of tens of kilometres. Both uni- and multivariate results indicated that variation was larger at the smallest scale, and thus small-scale processes seem to have more effect on spatial variability patterns. Macroalgal assemblages, considering both species and functional groups as surrogates, showed consistent spatial patterns, and therefore the second hypothesis was confirmed. Consequently, functional groups may be considered a reliable biological surrogate to study changes in macroalgal assemblages, at least along the investigated Portuguese coastline.

  20. Nuclear instrumentation

    Weill, Jacky; Fabre, Rene.

    1981-01-01

    This article sums up the research and development effort at present being carried out in the five following fields of application: health physics and radioprospection, control of nuclear reactors, plant control (preparation and reprocessing of the fuel, testing of nuclear substances, etc.), research laboratory instrumentation, and detectors. It also situates French industrial activity by means of an estimate of the French market, production, and trade flows with other countries.

  1. Divided Instruments

    Chapman, A.; Murdin, P.

    2000-11-01

    Although the division of the zodiac into 360° probably derives from Egypt or Assyria around 2000 BC, there is no surviving evidence of Mesopotamian cultures embodying this division into a mathematical instrument. Almost certainly, however, it was from Babylonia that the Greek geometers learned of the 360° circle, and by c. 80 BC they had incorporated it into that remarkably elaborate device gener...

  2. Instrumentation development

    Anon.

    1976-01-01

    Areas being investigated for instrumentation improvement during low-level pollution monitoring include laser opto-acoustic spectroscopy, x-ray fluorescence spectroscopy, optical fluorescence spectroscopy, liquid crystal gas detectors, advanced forms of atomic absorption spectroscopy, electro-analytical chemistry, and mass spectroscopy. Emphasis is also directed toward development of physical methods, as opposed to conventional chemical analysis techniques for monitoring these trace amounts of pollution related to energy development and utilization

  3. Instrumentation maintenance

    Mack, D.A.

    1976-09-01

    It is essential to any research activity that accurate and efficient measurements be made for the experimental parameters under consideration for each individual experiment or test. Satisfactory measurements in turn depend upon having the necessary instruments and the capability of ensuring that they are performing within their intended specifications. This latter requirement can only be achieved by providing an adequate maintenance facility, staffed with personnel competent to understand the problems associated with instrument adjustment and repair. The Instrument Repair Shop at the Lawrence Berkeley Laboratory is designed to achieve this end. The organization, staffing and operation of this system is discussed. Maintenance policy should be based on studies of (1) preventive vs. catastrophic maintenance, (2) records indicating when equipment should be replaced rather than repaired and (3) priorities established to indicate the order in which equipment should be repaired. Upon establishing a workable maintenance policy, the staff should be instructed so that they may provide appropriate scheduled preventive maintenance, calibration and corrective procedures, and emergency repairs. The education, training and experience of the maintenance staff is discussed along with the organization for an efficient operation. The layout of the various repair shops is described in the light of laboratory space and financial constraints

  4. Industrial instrumentation principles and design

    Padmanabhan, Tattamangalam R

    2000-01-01

    Pneumatic, hydraulic and allied instrumentation schemes have given way to electronic schemes in recent years thanks to the rapid strides in electronics and allied areas. Principles, design and applications of such state-of-the-art instrumentation schemes form the subject matter of this book. Through representative examples, the basic building blocks of instrumentation schemes are identified and each of these building blocks discussed in terms of its design and interface characteristics. The common generic schemes synthesized with such building blocks are dealt with subsequently. This forms the scope of Part I. The focus in Part II is on application. Displacement and allied instrumentation, force and allied instrumentation and process instrumentation in terms of temperature, flow, pressure level and other common process variables are dealt with separately and exhaustively. Despite the diversity in the sensor principles and characteristics and the variety in the applications and their environments, it is possib...

  5. Action and familiarity effects on self and other expert musicians’ Laban effort-shape analyses of expressive bodily behaviors in instrumental music performance: a case study approach

    Broughton, Mary C.; Davidson, Jane W.

    2014-01-01

    Self-reflective performance review and expert evaluation are features of Western music performance practice. While music is usually the focus, visual information provided by performing musicians' expressive bodily behaviors communicates expressiveness to musically trained and untrained observers. Yet, within a seemingly homogenous group, such as one of musically trained individuals, diversity of experience exists. Individual differences potentially affect perception of the subtleties of expressive performance, and performers' effective communication of their expressive intentions. This study aimed to compare self- and other expert musicians' perception of expressive bodily behaviors observed in marimba performance. We hypothesized that analyses of expressive bodily behaviors differ between expert musicians according to their specialist motor expertise and familiarity with the music. Two professional percussionists and experienced marimba players, and one professional classical singer took part in the study. Participants independently conducted Laban effort-shape analysis – proposing that intentions manifest in bodily activity are understood through shared embodied processes – of a marimbist's expressive bodily behaviors in an audio-visual performance recording. For one percussionist, this was a self-reflective analysis. The work was unfamiliar to the other percussionist and singer. Perception of the performer's expressive bodily behaviors appeared to differ according to participants' individual instrumental or vocal motor expertise, and familiarity with the music. Furthermore, individual type of motor experience appeared to direct participants' attention in approaching the analyses. Findings support forward and inverse perception-action models, and embodied cognitive theory. Implications offer scientific rigor and artistic interest for how performance practitioners can reflectively analyze performance to improve expressive communication.

  6. Action and familiarity effects on self and other expert musicians’ Laban effort-shape analyses of expressive bodily behaviors in instrumental music performance: A case study approach

    Mary C Broughton

    2014-10-01

    Self-reflective performance review and expert evaluation are features of Western music performance practice. While music is usually the focus, visual information provided by performing musicians' expressive bodily behaviors communicates expressiveness to musically trained and untrained observers. Yet, within a seemingly homogenous group such as one of musically trained individuals, diversity of experience exists. Individual differences potentially affect perception of the subtleties of expressive performance, and performers' effective communication of their expressive intentions. This study aimed to compare self- and other expert musicians' perception of expressive bodily behaviors observed in marimba performance. We hypothesised that analyses of expressive bodily behaviors differ between expert musicians according to their specialist motor expertise and familiarity with the music. Two professional percussionists and experienced marimba players, and one professional classical singer took part in the study. Participants independently conducted Laban effort-shape analysis – proposing that intentions manifest in bodily activity are understood through shared embodied processes – of a marimbist's expressive bodily behaviors in an audio-visual performance recording. For one percussionist, this was a self-reflective analysis. The work was unfamiliar to the other percussionist and singer. Perception of the performer's expressive bodily behaviors differed according to participants' individual instrumental or vocal motor expertise, and familiarity with the music. Furthermore, individual type of motor experience appeared to direct participants' attention in approaching the analyses. Findings support forward and inverse perception-action models, and embodied cognitive theory. Implications offer scientific rigour and artistic interest for how performance practitioners can reflectively analyze performance to improve expressive ...

  7. The ConCom Safety Management Scale: developing and testing a measurement instrument for control-based and commitment-based safety management approaches in hospitals.

    Alingh, Carien W; Strating, Mathilde M H; van Wijngaarden, Jeroen D H; Paauwe, Jaap; Huijsman, Robbert

    2018-03-06

    Nursing management is considered important for patient safety. Prior research has predominantly focused on charismatic leadership styles, although it is questionable whether these best characterise the role of nurse managers. Managerial control is also relevant. Therefore, we aimed to develop and test a measurement instrument for control-based and commitment-based safety management of nurse managers in clinical hospital departments. A cross-sectional survey design was used to test the newly developed questionnaire in a sample of 2378 nurses working in clinical departments. The nurses were asked about their perceptions of the leadership behaviour and management practices of their direct supervisors. Psychometric properties were evaluated using confirmatory factor analysis and reliability estimates. The final 33-item questionnaire showed acceptable goodness-of-fit indices and internal consistency (Cronbach's α of the subscales range: 0.59-0.90). The factor structure revealed three subdimensions for control-based safety management: (1) stressing the importance of safety rules and regulations; (2) monitoring compliance; and (3) providing employees with feedback. Commitment-based management consisted of four subdimensions: (1) showing role modelling behaviour; (2) creating safety awareness; (3) showing safety commitment; and (4) encouraging participation. Construct validity of the scale was supported by high factor loadings and provided preliminary evidence that control-based and commitment-based safety management are two distinct yet related constructs. The findings were reconfirmed in a cross-validation procedure. The results provide initial support for the construct validity and reliability of our ConCom Safety Management Scale. Both management approaches were found to be relevant for managing patient safety in clinical hospital departments. The scale can be used to deepen our understanding of the influence of patient safety management on healthcare professionals
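
    The internal-consistency figures quoted above (Cronbach's α) reduce to a simple variance ratio; a self-contained helper, applied here to simulated Likert-style item responses rather than the survey data:

      import numpy as np

      def cronbach_alpha(items):
          """Cronbach's alpha for an (n_respondents, n_items) response matrix."""
          k = items.shape[1]
          item_vars = items.var(axis=0, ddof=1).sum()
          total_var = items.sum(axis=1).var(ddof=1)
          return k / (k - 1) * (1 - item_vars / total_var)

      rng = np.random.default_rng(7)
      n = 400
      trait = rng.normal(size=n)
      # Six 1-5 items driven by one underlying trait plus noise (simulated).
      items = np.clip(np.round(3 + trait[:, None] + rng.normal(0, 0.8, (n, 6))), 1, 5)
      print(f"alpha = {cronbach_alpha(items):.2f}")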

  8. Instrument Remote Control via the Astronomical Instrument Markup Language

    Sall, Ken; Ames, Troy; Warsaw, Craig; Koons, Lisa; Shafer, Richard

    1998-01-01

    The Instrument Remote Control (IRC) project ongoing at NASA's Goddard Space Flight Center's (GSFC) Information Systems Center (ISC) supports NASA's mission by defining an adaptive intranet-based framework that provides robust interactive and distributed control and monitoring of remote instruments. An astronomical IRC architecture that combines the platform-independent processing capabilities of Java with the power of Extensible Markup Language (XML) to express hierarchical data in an equally platform-independent, as well as human readable manner, has been developed. This architecture is implemented using a variety of XML support tools and Application Programming Interfaces (API) written in Java. IRC will enable trusted astronomers from around the world to easily access infrared instruments (e.g., telescopes, cameras, and spectrometers) located in remote, inhospitable environments, such as the South Pole, a high Chilean mountaintop, or an airborne observatory aboard a Boeing 747. Using IRC's frameworks, an astronomer or other scientist can easily define the type of onboard instrument, control the instrument remotely, and return monitoring data all through the intranet. The Astronomical Instrument Markup Language (AIML) is the first implementation of the more general Instrument Markup Language (IML). The key aspects of our approach to instrument description and control applies to many domains, from medical instruments to machine assembly lines. The concepts behind AIML apply equally well to the description and control of instruments in general. IRC enables us to apply our techniques to several instruments, preferably from different observatories.
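
    The descriptive core of this approach, an XML document that enumerates an instrument's commands and telemetry so generic software can drive it, is easy to picture; the element names below are invented for illustration and do not reproduce the actual AIML schema:

      import xml.etree.ElementTree as ET

      # A hypothetical AIML-style instrument description (invented schema).
      doc = """
      <Instrument name="FarIRSpectrometer">
        <Command name="setGrating">
          <Argument name="position" type="int" min="0" max="4095"/>
        </Command>
        <Telemetry name="detectorTemp" units="K"/>
      </Instrument>
      """

      root = ET.fromstring(doc)
      print("instrument:", root.get("name"))
      for cmd in root.iter("Command"):
          args = [(a.get("name"), a.get("type")) for a in cmd.iter("Argument")]
          print("command:", cmd.get("name"), "args:", args)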

  9. Time dependent analysis of assay comparability: a novel approach to understand intra- and inter-site variability over time

    Winiwarter, Susanne; Middleton, Brian; Jones, Barry; Courtney, Paul; Lindmark, Bo; Page, Ken M.; Clark, Alan; Landqvist, Claire

    2015-09-01

    We demonstrate here a novel use of statistical tools to study intra- and inter-site assay variability of five early drug metabolism and pharmacokinetics in vitro assays over time. Firstly, a tool for process control is presented. It shows the overall assay variability, allows changes due to assay adjustments to be followed, and can additionally highlight other, potentially unexpected variations. Secondly, we define the minimum discriminatory difference/ratio, which helps projects understand how experimental values measured at different sites at a given time can be compared. Such discriminatory values are calculated for 3-month periods and followed over time for each assay. Again, assay modifications, especially assay harmonization efforts, can be noted. Both the process control tool and the variability estimates are based on the results of control compounds tested every time an assay is run. Variability estimates for a limited set of project compounds were computed as well and found to be comparable. This analysis reinforces the need to consider assay variability in decision making, compound ranking and in silico modeling.
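
    A minimal sketch of the period-wise variability bookkeeping described above, assuming log-normally distributed assay values and a ~95% confidence level; the column names and the exact formula are illustrative assumptions, not the authors' published method:

        import numpy as np
        import pandas as pd

        def min_discriminatory_ratio(runs):
            """runs: DataFrame with 'date' and 'clearance' columns, one row per
            control-compound measurement. Returns the per-quarter SD of log10
            values and the smallest ratio distinguishable at ~95% confidence
            when comparing two single measurements (hence the sqrt(2))."""
            log_vals = np.log10(runs["clearance"])
            sd = log_vals.groupby(runs["date"].dt.to_period("Q")).std()
            mdr = 10 ** (1.96 * np.sqrt(2) * sd)   # ratio scale
            return pd.DataFrame({"sd_log10": sd, "mdr": mdr})

        rng = np.random.default_rng(0)
        runs = pd.DataFrame({
            "date": pd.date_range("2012-01-01", periods=200, freq="W"),
            "clearance": 10 ** rng.normal(1.0, 0.15, 200),  # synthetic controls
        })
        print(min_discriminatory_ratio(runs))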

  10. Drivers of Seasonal Variability in Marine Boundary Layer Aerosol Number Concentration Investigated Using a Steady State Approach

    Mohrmann, Johannes; Wood, Robert; McGibbon, Jeremy; Eastman, Ryan; Luke, Edward

    2018-01-01

    Marine boundary layer (MBL) aerosol particles affect the climate through their interaction with MBL clouds. Although both MBL clouds and aerosol particles have pronounced seasonal cycles, the factors controlling seasonal variability of MBL aerosol particle concentration are not well constrained. In this paper an aerosol budget is constructed representing the effects of wet deposition, free-tropospheric entrainment, primary surface sources, and advection on the MBL accumulation mode aerosol number concentration (Na). These terms are then parameterized, and by assuming that on seasonal time scales Na is in steady state, the budget equation is rearranged to form a diagnostic equation for Na based on observable variables. Using data primarily collected in the subtropical northeast Pacific during the MAGIC campaign (Marine ARM (Atmospheric Radiation Measurement) GPCI (GCSS Pacific Cross-Section Intercomparison) Investigation of Clouds), estimates of both mean summer and winter Na concentrations are made using the simplified steady state model and seasonal mean observed variables. These are found to match well with the observed Na. To attribute the modeled difference between summer and winter aerosol concentrations to individual observed variables (e.g., precipitation rate and free-tropospheric aerosol number concentration), a local sensitivity analysis is combined with the seasonal difference in observed variables. This analysis shows that despite wintertime precipitation frequency being lower than summer, the higher winter precipitation rate accounted for approximately 60% of the modeled seasonal difference in Na, which emphasizes the importance of marine stratocumulus precipitation in determining MBL aerosol concentrations on longer time scales.
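
    A minimal sketch of the steady-state budget idea (setting dNa/dt = 0 and solving for Na); the functional forms and all numbers below are invented placeholders, not the parameterizations of the paper:

        # Steady-state MBL aerosol number budget: sources are free-tropospheric
        # entrainment, surface production and advection; the sink is a
        # precipitation-driven scavenging term.
        # dNa/dt = w_e*(Na_ft - Na)/h + F_surf/h + adv - k*P*Na = 0
        def steady_state_na(w_e, na_ft, h, f_surf, adv, k_precip, precip_rate):
            """Rates per second; h = MBL depth (m); concentrations per cm^3."""
            source = w_e * na_ft / h + f_surf / h + adv
            sink_coeff = w_e / h + k_precip * precip_rate
            return source / sink_coeff

        # Seasonal-mean inputs (illustrative numbers only):
        na_summer = steady_state_na(w_e=4e-3, na_ft=150.0, h=1000.0,
                                    f_surf=0.05, adv=0.0,
                                    k_precip=2e-6, precip_rate=1.0)
        print(f"steady-state Na ~ {na_summer:.0f} cm^-3")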

  11. The Effect of Macroeconomic Variables on Value-Added Agriculture: Approach of the Bayesian Vector Autoregressive Model (BVAR)

    E. Pishbahar

    2015-05-01

    There are different ideas and opinions about the effects of macroeconomic variables on real and nominal variables. To answer the question of whether changes in macroeconomic variables are useful as a policy tool over the business cycle, it is important to understand the effect of macroeconomic variables on economic growth. In the present study, a Bayesian vector autoregressive (BVAR) model and seasonal data for the years 1991 to 2013 were used to determine the impact of monetary policy on value-added agriculture. Forecasts from standard vector autoregressive models often deteriorate because of the large number of parameters in the model; the Bayesian vector autoregressive model yields more reliable predictions by shrinking the number of effective parameters through prior information, so that, compared to the standard VAR model, the coefficients are estimated more accurately. Based on the RMSE results in this study, the Normal-Wishart prior was identified as a suitable prior distribution. According to the results of the impulse response function, the effects of sudden shocks in macroeconomic variables on agricultural value added and domestic venture capital are stable, while the effects of shocks to the exchange rate, tax revenues and the money supply are moderated after 7, 5 and 4 periods, respectively. Monetary policy shocks increased agricultural value added in the first half of the year, but depressed it in the second half.
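
    As a generic illustration of why Bayesian shrinkage stabilizes VAR estimates (a conjugate Normal-prior sketch with a ridge-like closed form, not the exact Normal-Wishart setup of the study):

        import numpy as np

        def bvar_posterior_mean(Y, X, B0, omega_inv):
            """Y: (T, k) dependents; X: (T, m) lagged regressors incl. constant;
            B0: (m, k) prior mean; omega_inv: (m, m) prior precision.
            Posterior mean: (X'X + Omega^-1)^-1 (X'Y + Omega^-1 B0)."""
            lhs = X.T @ X + omega_inv
            rhs = X.T @ Y + omega_inv @ B0
            return np.linalg.solve(lhs, rhs)

        # Tiny example: 2-variable VAR(1), shrinking toward a random-walk prior.
        rng = np.random.default_rng(1)
        T, k = 80, 2
        data = np.cumsum(rng.normal(size=(T + 1, k)), axis=0)
        Y, X = data[1:], np.hstack([data[:-1], np.ones((T, 1))])
        B0 = np.vstack([np.eye(k), np.zeros((1, k))])   # random walk: B = I
        omega_inv = np.diag([10.0, 10.0, 0.1])          # tighter on lags
        print(bvar_posterior_mean(Y, X, B0, omega_inv))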

  12. Annual dynamics of daylight variability and contrast a simulation-based approach to quantifying visual effects in architecture

    Rockcastle, Siobhan

    2013-01-01

    Daylight is a dynamic source of illumination in architectural space, creating diverse and ephemeral configurations of light and shadow within the built environment. Perceptual qualities of daylight, such as contrast and temporal variability, are essential to our understanding of both material and visual effects in architecture. Although spatial contrast and light variability are fundamental to the visual experience of architecture, architects still rely primarily on intuition to evaluate their designs because there are few metrics that address these factors. Through an analysis of contemporary

  13. Variability of the western Galician upwelling system (NW Spain) during an intensively sampled annual cycle. An EOF analysis approach

    Herrera, J. L.; Rosón, G.; Varela, R. A.; Piedracoba, S.

    2008-07-01

    The key features of the western Galician shelf hydrography and dynamics are analyzed on a solid statistical and experimental basis. The results allowed us to bring together information dispersed across previous oceanographic works on the region. Empirical orthogonal functions (EOF) analysis and a canonical correlation analysis were applied to a high-resolution dataset collected from 47 surveys done at weekly frequency from May 2001 to May 2002. The main results of these analyses are summarized below. Salinity, temperature and the meridional component of the residual current are correlated with the relevant local forcings (the meridional coastal wind component and the continental run-off) and with a remote forcing (the meridional temperature gradient at latitude 37°N). About 80% of the total salinity and temperature variability over the shelf, and 37% of the total residual meridional current variability, are explained by two EOFs for each variable. Up to 22% of the total temperature variability and 14% of the total residual meridional current variability are associated with the set-up of cross-shore gradients of the thermohaline properties caused by the wind-induced Ekman transport. Up to 11% and 10%, respectively, are related to the variability of the meridional temperature gradient at the Western Iberian Winter Front. About 30% of the total temperature variability can be explained by the development and erosion of the seasonal thermocline and by the seasonal variability of the thermohaline properties of the central waters. This thermocline presented unexpectedly low salinity values due to the trapping during spring and summer of the high continental inputs from the River Miño recorded in 2001. The low-salinity plumes can be traced on the Galician shelf during almost the entire annual cycle; they tend to extend throughout the entire water column under downwelling conditions and concentrate in the surface layer when upwelling-favourable winds blow. Our evidence points to the
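
    The EOF decomposition used above is, in most implementations, a singular value decomposition of the space-time anomaly matrix; a minimal sketch with illustrative dimensions:

        import numpy as np

        def eof_analysis(field):
            """field: (n_time, n_space) matrix, e.g. 47 weekly surveys as rows.
            Returns spatial EOF patterns, PC time series, and the fraction of
            total variance explained by each mode (the percentages quoted in
            the abstract correspond to these fractions)."""
            anomalies = field - field.mean(axis=0)
            u, s, vt = np.linalg.svd(anomalies, full_matrices=False)
            variance_explained = s**2 / np.sum(s**2)
            eofs = vt            # spatial patterns, one per row
            pcs = u * s          # principal-component time series
            return eofs, pcs, variance_explained

        rng = np.random.default_rng(2)
        field = rng.normal(size=(47, 120))      # 47 surveys x 120 grid points
        eofs, pcs, var = eof_analysis(field)
        print("variance explained by first two EOFs: %.1f%%" % (100 * var[:2].sum()))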

  14. Analysis of instrumentation technology for SMART

    Hur, Seop; Koo, I. S.; Park, H. Y.; Lee, C. K.; Kim, D. H.; Suh, Y. S.; Seong, S. H.; Jang, G. S.

    1998-03-01

    Development requirements, candidate techniques, and development tasks and approach must be established before the SMART instrumentation system can be developed, and clear development strategies are an important input to that work. To meet these needs, general industrial and nuclear instrumentation techniques were analyzed and reviewed. The industrial instrumentation techniques were examined according to a classification of instrumentation, and the resulting analysis of the inherent merits and demerits of each technique can serve as input for selecting the instruments for SMART. For instrumentation techniques in nuclear environments, the major techniques were reviewed and a candidate instrumentation system was established. Development approaches were then established based on the development requirements and on the analysis of research and development trends in industrial and nuclear instrumentation techniques. (author). 90 refs., 38 tabs., 33 figs

  15. Quantitative analysis of the Kawase versus the modified Dolenc-Kawase approach for middle cranial fossa lesions with variable anteroposterior extension.

    Tripathi, Manjul; Deo, Rama Chandra; Suri, Ashish; Srivastav, Vinkle; Baby, Britty; Kumar, Subodh; Kalra, Prem; Banerjee, Subhashis; Prasad, Sanjiva; Paul, Kolin; Roy, Tara Sankar; Lalwani, Sanjeev

    2015-07-01

    The surgical corridor to the upper third of the clivus and ventral brainstem is hindered by critical neurovascular structures, such as the cavernous sinus, petrous apex, and tentorium. The traditional Kawase approach provides a 10 × 5-mm fenestration at the petrous apex of the temporal bone between the 5th cranial nerve and internal auditory canal. Due to interindividual variability, this area sometimes proves to be insufficient as a corridor to the posterior cranial fossa. The authors describe a modification to the technique of the extradural anterior petrosectomy consisting of additional transcavernous exploration and medial mobilization of the cisternal component of the trigeminal nerve. This approach is termed the modified Dolenc-Kawase (MDK) approach. The authors describe a volumetric analysis of temporal bones with 3D laser scanning of dry and drilled bones for the respective triangles and rhomboid areas, and they compare the difference in exposure between the traditional and modified approaches in cadaver dissections. Twelve dry temporal bones were laser scanned, and mesh-based volumetric analysis was done, followed by drilling of the Kawase triangle and MDK rhomboid. Five cadaveric heads were drilled on alternate sides with both approaches for evaluation of the area exposed, surgical freedom, and angle of approach. The MDK approach provides an approximately 1.5 times larger area and 2.0 times greater volume of bone at the anterior petrous apex compared with the Kawase approach. Cadaver dissection confirmed the technical feasibility of the MDK approach, providing a nearly 1.5-2 times larger fenestration with improved view and angulation to the posterior cranial fossa. Practical application in 6 patients with different lesions demonstrated the clinical applicability of the MDK approach. The larger fenestration at the petrous apex achieved with the MDK approach provides greater surgical freedom at the Dorello canal, gasserian ganglion, and prepontine area and better

  16. Variability of worked examples and transfer of geometrical problem-solving skills : a cognitive-load approach

    Paas, Fred G.W.C.; van Merrienboer, Jeroen J.G.

    1994-01-01

    Four computer-based training strategies for geometrical problem solving in the domain of computer numerically controlled machinery programming were studied with regard to their effects on training performance, transfer performance, and cognitive load. A low- and a high-variability conventional

  17. Estimation of exhaust gas aerodynamic force on the variable geometry turbocharger actuator: 1D flow model approach

    Ahmed, Fayez Shakil; Laghrouche, Salah; Mehmood, Adeel; El Bagdouri, Mohammed

    2014-01-01

    Highlights: (1) estimation of the aerodynamic force on variable turbine geometry vanes and actuator; (2) method based on exhaust gas flow modeling; (3) simulation tool for integrating the aerodynamic force into automotive simulation software. Abstract: This paper provides a reliable tool for simulating the effects of exhaust gas flow through the variable turbine geometry section of a variable geometry turbocharger (VGT) on the flow control mechanism. The main objective is to estimate the resistive aerodynamic force exerted by the flow upon the variable geometry vanes and the controlling actuator, in order to improve the control of vane angles. To achieve this, a 1D model of the exhaust flow is developed using the Navier–Stokes equations. As the flow characteristics depend upon the volute geometry, impeller blade force and the existing viscous friction, the related source terms (losses) are also included in the model. In order to guarantee stability, an implicit numerical solver has been developed for the resolution of the Navier–Stokes problem. The resulting simulation tool has been validated through comparison with experimentally obtained values of turbine inlet pressure and of the aerodynamic force as measured at the actuator shaft. The simulator shows good agreement with experimental results.
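
    As a minimal illustration of why an implicit solver aids stability (this is not the authors' Navier–Stokes scheme), one backward-Euler upwind step for linear advection remains stable for any time step because each update solves a banded linear system:

        import numpy as np
        from scipy.linalg import solve_banded

        def implicit_upwind_step(u, a, dt, dx):
            """One implicit step of u_t + a u_x = 0 (a > 0):
            (1 + lam) u_i^{n+1} - lam u_{i-1}^{n+1} = u_i^n, lam = a dt / dx."""
            n, lam = u.size, a * dt / dx
            ab = np.zeros((2, n))        # banded storage: row 0 main, row 1 lower
            ab[0, :] = 1.0 + lam
            ab[1, :-1] = -lam            # sub-diagonal entries
            return solve_banded((1, 0), ab, u)

        x = np.linspace(0.0, 1.0, 200)
        u = np.exp(-200 * (x - 0.3) ** 2)   # initial pulse
        for _ in range(50):                 # Courant number ~2: still stable
            u = implicit_upwind_step(u, a=1.0, dt=0.01, dx=x[1] - x[0])
        print("pulse peak now near x =", x[np.argmax(u)])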

  18. The identification of high potential archers based on relative psychological coping skills variables: A Support Vector Machine approach

    Taha, Zahari; Muazu Musa, Rabiu; Majeed, A. P. P. Abdul; Razali Abdullah, Mohamad; Aizzat Zakaria, Muhammad; Muaz Alim, Muhammad; Arif Mat Jizat, Jessnor; Fauzi Ibrahim, Mohamad

    2018-03-01

    Support Vector Machine (SVM) has been revealed to be a powerful learning algorithm for classification and prediction. However, the use of SVM for prediction and classification in sport is at its inception. The present study classified and predicted high and low potential archers from a collection of psychological coping skills variables trained on different SVMs. Fifty youth archers (mean age 17.0 ± 0.6 years) gathered from various archery programmes completed a one-end shooting score test. A psychological coping skills inventory, which evaluates the archers' level of related coping skills, was filled out by the archers prior to their shooting tests. k-means cluster analysis was applied to cluster the archers based on their scores on the variables assessed. SVM models, i.e. linear and fine radial basis function (RBF) kernel functions, were trained on the psychological variables. The k-means analysis clustered the archers into high psychologically prepared archers (HPPA) and low psychologically prepared archers (LPPA), respectively. It was demonstrated that the linear SVM exhibited good accuracy and precision throughout the exercise, with an accuracy of 92% and a considerably lower error rate for the prediction of the HPPA and the LPPA as compared to the fine RBF SVM. The findings of this investigation can be valuable to coaches and sports managers in recognising high potential athletes from the selected psychological coping skills variables examined, which would consequently save time and energy during talent identification and development programmes.
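
    A minimal sketch of this cluster-then-classify pipeline using scikit-learn as a stand-in toolchain; the synthetic features are placeholders for the coping-skills inventory scores:

        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.model_selection import cross_val_score
        from sklearn.svm import SVC

        rng = np.random.default_rng(3)
        X = rng.normal(size=(50, 7))          # 50 archers x 7 coping-skill scores

        # Step 1: unsupervised split into two preparedness groups (HPPA/LPPA).
        labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

        # Step 2: compare linear and RBF SVMs on the derived labels.
        for name, model in [("linear", SVC(kernel="linear")),
                            ("rbf", SVC(kernel="rbf", gamma="scale"))]:
            acc = cross_val_score(model, X, labels, cv=5).mean()
            print(f"{name} SVM cross-validated accuracy: {acc:.2f}")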

  19. Detecting relationships between the interannual variability in climate records and ecological time series using a multivariate statistical approach - four case studies for the North Sea region

    Heyen, H. [GKSS-Forschungszentrum Geesthacht GmbH (Germany). Inst. fuer Gewaesserphysik

    1998-12-31

    A multivariate statistical approach is presented that allows a systematic search for relationships between the interannual variability in climate records and ecological time series. Statistical models are built between climatological predictor fields and the variables of interest. Relationships are sought on different temporal scales and for different seasons and time lags. The possibilities and limitations of this approach are discussed in four case studies dealing with salinity in the German Bight, abundance of zooplankton at Helgoland Roads, macrofauna communities off Norderney and the arrival of migratory birds on Helgoland. (orig.)

  20. A Poisson regression approach to model monthly hail occurrence in Northern Switzerland using large-scale environmental variables

    Madonna, Erica; Ginsbourger, David; Martius, Olivia

    2018-05-01

    In Switzerland, hail regularly causes substantial damage to agriculture, cars and infrastructure; however, little is known about its long-term variability. To study the variability, the monthly number of days with hail in northern Switzerland is modeled in a regression framework using large-scale predictors derived from the ERA-Interim reanalysis. The model is developed and verified using radar-based hail observations for the extended summer season (April-September) in the period 2002-2014. The seasonality of hail is explicitly modeled with a categorical predictor (month), and monthly anomalies of several large-scale predictors are used to capture the year-to-year variability. Several regression models are applied and their performance tested with respect to standard scores and cross-validation. The chosen model includes four predictors: the monthly anomaly of the two-meter temperature, the monthly anomaly of the logarithm of the convective available potential energy (CAPE), the monthly anomaly of the wind shear, and the month. This model captures the intra-annual variability well but slightly underestimates the inter-annual variability. The regression model is applied to the reanalysis data back to 1980. The resulting hail day time series shows an increase in the number of hail days per month, which is (in the model) related to an increase in temperature and CAPE. The trend corresponds to approximately 0.5 days per month per decade. The results of the regression model have been compared to two independent data sets. All data sets agree on the sign of the trend, but the trend is weaker in the other data sets.
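
    A minimal sketch of the regression setup with statsmodels, using synthetic stand-ins for the radar-based counts and the ERA-Interim anomaly predictors:

        import numpy as np
        import pandas as pd
        import statsmodels.api as sm
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(4)
        n = 13 * 6                                # 2002-2014, April-September
        df = pd.DataFrame({
            "month": pd.Categorical(np.tile(np.arange(4, 10), 13)),
            "t2m_anom": rng.normal(size=n),
            "log_cape_anom": rng.normal(size=n),
            "shear_anom": rng.normal(size=n),
        })
        # Synthetic hail-day counts generated from a known Poisson model:
        lam = np.exp(0.8 + 0.4 * df.t2m_anom + 0.5 * df.log_cape_anom
                     - 0.2 * df.shear_anom)
        df["hail_days"] = rng.poisson(lam)

        model = smf.glm("hail_days ~ month + t2m_anom + log_cape_anom + shear_anom",
                        data=df, family=sm.families.Poisson()).fit()
        print(model.summary())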

  1. ReOBS: a new approach to synthesize long-term multi-variable dataset and application to the SIRTA supersite

    Chiriaco, Marjolaine; Dupont, Jean-Charles; Bastin, Sophie; Badosa, Jordi; Lopez, Julio; Haeffelin, Martial; Chepfer, Helene; Guzman, Rodrigo

    2018-05-01

    A scientific approach is presented to aggregate and harmonize a set of 60 geophysical variables at an hourly timescale over a decade, and to allow multiannual and multi-variable studies combining atmospheric dynamics and thermodynamics, radiation, clouds and aerosols from ground-based observations. Many datasets from ground-based observations are currently in use worldwide. They are very valuable because they contain complete and precise information due to their spatio-temporal co-localization over more than a decade. These datasets, in particular the synergy between different types of observations, are under-used because of their complexity and diversity due to calibration, quality control, treatment, format, temporal averaging, metadata, etc. Two main results are presented in this article: (1) a set of methods available for the community to robustly and reliably process ground-based data at an hourly timescale over a decade is described, and (2) a single netCDF file is provided based on the SIRTA supersite observations. This file contains approximately 60 geophysical variables (atmospheric and in-ground) hourly averaged over a decade for the longest-running variables. The netCDF file is available and easy to use for the community. In this article, observations are re-analyzed. The prefix "re" refers to six main steps: calibration, quality control, treatment, hourly averaging, homogenization of the formats and associated metadata, as well as expertise on more than a decade of observations. In contrast, previous studies (i) took only some of these six steps into account for each variable, (ii) did not aggregate all variables together in a single file and (iii) did not offer an hourly resolution for about 60 variables over a decade (for the longest-running variables). The approach described in this article can be applied to different supersites and to additional variables. The main implication of this work is that complex atmospheric observations are made readily available for scientists.
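
    A minimal sketch of the quality-control/hourly-averaging/netCDF step with xarray; variable names, thresholds and attributes are invented placeholders, not the SIRTA conventions:

        import numpy as np
        import pandas as pd
        import xarray as xr

        time = pd.date_range("2003-01-01", "2003-01-02", freq="1min")
        raw = xr.Dataset(
            {"temperature": ("time", 15 + np.random.randn(time.size)),
             "shortwave_flux": ("time",
                                np.clip(np.random.randn(time.size) * 100, 0, None))},
            coords={"time": time},
        )

        # Crude range-check QC, then hourly averaging of all variables at once.
        qc = raw.where((raw.temperature > -40) & (raw.temperature < 50))
        hourly = qc.resample(time="1h").mean()
        hourly.attrs["history"] = "calibrated, quality-controlled, hourly averaged"
        hourly.to_netcdf("reobs_sample.nc")
        print(hourly)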

  2. Dealing with randomness and vagueness in business and management sciences: the fuzzy-probabilistic approach as a tool for the study of statistical relationships between imprecise variables

    Fabrizio Maturo

    2016-06-01

    In practical applications relating to business and management sciences, there are many variables that, by their very nature, are better described by a pair of ordered values (e.g., financial data). By summarizing such a measurement with a single value, there is a loss of information; thus, in these situations, data are better described by interval values rather than by single values. Interval arithmetic studies and analyzes this type of imprecision; however, if the intervals have no sharp boundaries, fuzzy set theory is the most suitable instrument. Moreover, fuzzy regression models are able to overcome some typical limitations of classical regression because they do not require the same strong assumptions. In this paper, we present a review of the main methods introduced in the literature on this topic and introduce some recent developments regarding the concept of randomness in fuzzy regression.

  3. On Solutions for Linear and Nonlinear Schrödinger Equations with Variable Coefficients: A Computational Approach

    Gabriel Amador

    2016-05-01

    In this work, after reviewing two different ways to solve Riccati systems, we are able to present an extensive list of families of integrable nonlinear Schrödinger (NLS) equations with variable coefficients. Using Riccati equations and similarity transformations, we are able to reduce them to the standard NLS models. Consequently, we can construct bright-, dark- and Peregrine-type soliton solutions for NLS with variable coefficients. As an important application of solutions for the Riccati equation with parameters, by means of computer algebra systems, it is shown that the parameters change the dynamics of the solutions. Finally, we test numerical approximations for the inhomogeneous paraxial wave equation by the Crank-Nicolson scheme with analytical solutions found using Riccati systems. These solutions include oscillating laser beams and Laguerre and Gaussian beams.
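
    As a minimal illustration of the Crank-Nicolson scheme mentioned above, here is a constant-coefficient step for the free 1D Schrödinger equation i ψ_t = -(1/2) ψ_xx (the paper's equations have variable coefficients; this only shows the scheme and its norm preservation):

        import numpy as np

        def crank_nicolson(psi, dt, dx, steps):
            """CN step: (I + i dt H/2) psi^{n+1} = (I - i dt H/2) psi^n,
            with H = -lap/2 on a Dirichlet grid."""
            n = psi.size
            lap = (np.diag(np.full(n - 1, 1.0), -1) - 2 * np.eye(n)
                   + np.diag(np.full(n - 1, 1.0), 1)) / dx**2
            a = np.eye(n) - 0.25j * dt * lap     # (I + i dt H/2)
            b = np.eye(n) + 0.25j * dt * lap     # (I - i dt H/2)
            for _ in range(steps):
                psi = np.linalg.solve(a, b @ psi)
            return psi

        x = np.linspace(-10, 10, 400)
        psi0 = np.exp(-x**2) * np.exp(2j * x)    # Gaussian beam with momentum
        psi = crank_nicolson(psi0.astype(complex), dt=0.005,
                             dx=x[1] - x[0], steps=200)
        print("norm preserved:", np.allclose(np.sum(np.abs(psi0)**2),
                                             np.sum(np.abs(psi)**2)))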

  4. Structural Variability within Frontoparietal Networks and Individual Differences in Attentional Functions: An Approach Using the Theory of Visual Attention.

    Chechlacz, Magdalena; Gillebert, Celine R; Vangkilde, Signe A; Petersen, Anders; Humphreys, Glyn W

    2015-07-29

    Visuospatial attention allows us to select and act upon a subset of behaviorally relevant visual stimuli while ignoring distraction. Bundesen's theory of visual attention (TVA) (Bundesen, 1990) offers a quantitative analysis of the different facets of attention within a unitary model and provides a powerful analytic framework for understanding individual differences in attentional functions. Visuospatial attention is contingent upon large networks, distributed across both hemispheres, consisting of several cortical areas interconnected by long-association frontoparietal pathways, including three branches of the superior longitudinal fasciculus (SLF I-III) and the inferior fronto-occipital fasciculus (IFOF). Here we examine whether structural variability within human frontoparietal networks mediates differences in attention abilities as assessed by the TVA. Structural measures were based on spherical deconvolution and tractography-derived indices of tract volume and hindrance-modulated orientational anisotropy (HMOA). Individual differences in visual short-term memory (VSTM) were linked to variability in the microstructure (HMOA) of SLF II, SLF III, and IFOF within the right hemisphere. Moreover, VSTM and speed of information processing were linked to hemispheric lateralization within the IFOF. Differences in spatial bias were mediated by both variability in microstructure and volume of the right SLF II. Our data indicate that the microstructural and macrostructural organization of white matter pathways differentially contributes to both the anatomical lateralization of frontoparietal attentional networks and to individual differences in attentional functions. We conclude that individual differences in VSTM capacity, processing speed, and spatial bias, as assessed by TVA, link to variability in structural organization within frontoparietal pathways. Copyright © 2015 Chechlacz et al.

  5. The identification of high potential archers based on fitness and motor ability variables: A Support Vector Machine approach.

    Taha, Zahari; Musa, Rabiu Muazu; P P Abdul Majeed, Anwar; Alim, Muhammad Muaz; Abdullah, Mohamad Razali

    2018-02-01

    Support Vector Machine (SVM) has been shown to be an effective learning algorithm for classification and prediction. However, the application of SVM for prediction and classification in specific sports has rarely been used to quantify or discriminate low- and high-performance athletes. The present study classified and predicted high and low-potential archers from a set of fitness and motor ability variables trained on different SVM kernel algorithms. Fifty youth archers (mean age 17.0 ± 0.6 years) drawn from various archery programmes completed a six-arrows shooting score test. Standard fitness and ability measurements, namely hand grip, vertical jump, standing broad jump, static balance, upper muscle strength and core muscle strength, were also recorded. Hierarchical agglomerative cluster analysis (HACA) was used to cluster the archers based on the performance variables tested. SVM models with linear, quadratic, cubic, fine RBF, medium RBF, as well as coarse RBF kernel functions, were trained based on the measured performance variables. The HACA clustered the archers into high-potential archers (HPA) and low-potential archers (LPA), respectively. The linear, quadratic, cubic and medium RBF kernel models demonstrated excellent classification accuracy of 97.5%, with a 2.5% error rate, for the prediction of the HPA and the LPA. The findings of this investigation can be valuable to coaches and sports managers in recognising high potential athletes from a combination of the selected few measured fitness and motor ability performance variables examined, which would consequently save cost, time and effort during a talent identification programme. Copyright © 2017 Elsevier B.V. All rights reserved.

  6. Gaussian anamorphosis in the analysis step of the EnKF: a joint state-variable/observation approach

    Javier Amezcua

    2014-09-01

    The analysis step of the (ensemble) Kalman filter is optimal when (1) the distribution of the background is Gaussian, (2) state variables and observations are related via a linear operator, and (3) the observational error is of additive nature and has a Gaussian distribution. When these conditions are largely violated, a pre-processing step known as Gaussian anamorphosis (GA) can be applied. The objective of this procedure is to obtain state variables and observations that better fulfil the Gaussianity conditions in some sense. In this work we analyse GA from a joint perspective, paying attention to the effects of transformations in the joint state-variable/observation space. First, we study transformations for state variables and observations that are independent from each other. Then, we introduce a targeted joint transformation with the objective of obtaining joint Gaussianity in the transformed space. We focus primarily on the univariate case, and briefly comment on the multivariate one. A key point of this paper is that, when (1)-(3) are violated, using the analysis step of the EnKF will not recover the exact posterior density in spite of any transformations one may perform. These transformations, however, provide approximations of different quality to the Bayesian solution of the problem. Using an example in which the Bayesian posterior can be analytically computed, we assess the quality of the analysis distributions generated after applying the EnKF analysis step in conjunction with different GA options. The value of the targeted joint transformation is particularly clear for the case when the prior is Gaussian, the marginal density for the observations is close to Gaussian, and the likelihood is a Gaussian mixture.
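
    A minimal sketch of a univariate Gaussian anamorphosis via the empirical CDF and the standard-normal quantile function; this is the generic rank-based construction, not the paper's targeted joint transformation:

        import numpy as np
        from scipy import stats

        def gaussian_anamorphosis(x):
            """Return z with an approximately standard-normal marginal and the
            same rank order as x."""
            ranks = stats.rankdata(x)              # 1..n
            u = ranks / (len(x) + 1.0)             # keep u strictly in (0, 1)
            return stats.norm.ppf(u)

        rng = np.random.default_rng(5)
        x = rng.lognormal(mean=0.0, sigma=1.0, size=1000)  # strongly non-Gaussian
        z = gaussian_anamorphosis(x)
        print("skewness before/after: %.2f / %.2f" % (stats.skew(x), stats.skew(z)))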

  7. Variable Stars in the Field of V729 Aql

    Cagaš, P.

    2017-04-01

    Wide field instruments can be used to acquire light curves of tens or even hundreds of variable stars per night, which increases the probability of new discoveries of interesting variable stars and generally increases the efficiency of observations. At the same time, wide field instruments produce a large amount of data, which must be processed using advanced software. The traditional approach, typically used by amateur astronomers, requires an unacceptable amount of time to process each data set. New functionality, built into the SIPS software package, can shorten the time needed to obtain light curves by several orders of magnitude. Also, the newly introduced SILICUPS software is intended for post-processing of stored light curves. It can be used to visualize observations from many nights, to find variable star periods, evaluate types of variability, etc. This work provides an overview of tools used to process data from the large field of view around the variable star V729 Aql and demonstrates the results.

  8. Environmental instrumentation

    Fritschen, L.J.; Gay, L.W.

    1979-01-01

    This book is designed to be used as a text for advanced students and a guide or manual for researchers in the field. The purpose is to present the basic theory of environmental variables and transducers, to report experiences with methodology and use, and to provide certain essential tables. Attention is given to measurements of temperature, soil heat flux, radiation, humidity and moisture, wind speed and direction, and pressure. Data acquisition concepts are summarized

  9. Transgressive or Instrumental?

    Chemi, Tatiana

    2018-01-01

    Contemporary practices that connect the arts with learning are widespread at all levels of educational systems and in organisations, but they include very diverse approaches, multiple methods and background values. Regardless of explicit learning benefits, the arts/learning partnerships bring about...... creativity and the other on practices of arts-integration. My final point rests on the belief that the opposition of transgression and instrumentality is a deceiving perspective on the arts, against the background of the aesthetic plurality and hybridity....

  10. Multi-scale approach to the environmental factors effects on spatio-temporal variability of Chironomus salinarius (Diptera: Chironomidae) in a French coastal lagoon

    Cartier, V.; Claret, C.; Garnier, R.; Fayolle, S.; Franquet, E.

    2010-03-01

    The complexity of the relationships between environmental factors and organisms can be revealed by sampling designs which consider the contribution to variability of different temporal and spatial scales, compared to total variability. From a management perspective, a multi-scale approach can lead to time-saving. Identifying environmental patterns that help maintain patchy distribution is fundamental in studying coastal lagoons, transition zones between continental and marine waters characterised by great environmental variability on spatial and temporal scales. They often present organic enrichment inducing decreased species richness and increased densities of opportunist species like Chironomus salinarius, a common species that tends to swarm and thus constitutes a nuisance for human populations. This species is dominant in the Bolmon lagoon, a French Mediterranean coastal lagoon under eutrophication. Our objective was to quantify variability due to both spatial and temporal scales and identify the contribution of different environmental factors to this variability. The population of C. salinarius was sampled from June 2007 to June 2008 every two months at 12 sites located in two areas of the Bolmon lagoon, at two different depths, with three sites per area-depth combination. Environmental factors (temperature, dissolved oxygen both in sediment and under water surface, sediment organic matter content and grain size) and microbial activities (i.e. hydrolase activities) were also considered as explanatory factors of chironomid densities and distribution. ANOVA analysis reveals significant spatial differences regarding the distribution of chironomid larvae for the area and the depth scales and their interaction. The spatial effect is also revealed for dissolved oxygen (water), salinity and fine particles (area scale), and for water column depth. All factors but water column depth show a temporal effect. Spearman's correlations highlight the seasonal effect

  11. A Bayesian approach to study the risk variables for tuberculosis occurrence in domestic and wild ungulates in South Central Spain

    Rodríguez-Prieto Víctor

    2012-08-01

    Background: Bovine tuberculosis (bTB) is a chronic infectious disease mainly caused by Mycobacterium bovis. Although eradication is a priority for the European authorities, bTB remains active or even increasing in many countries, causing significant economic losses. The integral consideration of epidemiological factors is crucial to more cost-effectively allocate control measures. The aim of this study was to identify the nature and extent of the association between TB distribution and a list of potential risk factors regarding cattle, wild ungulates and environmental aspects in Ciudad Real, a Spanish province with one of the highest TB herd prevalences. Results: We used a Bayesian mixed effects multivariable logistic regression model to predict TB occurrence in either domestic or wild mammals per municipality in 2007 by using information from the previous year. The municipal TB distribution and endemicity was clustered in the western part of the region and clearly overlapped with the explanatory variables identified in the final model: (1) incident cattle farms, (2) number of years of veterinary inspection of big game hunting events, (3) prevalence in wild boar, (4) number of sampled cattle, (5) persistent bTB-infected cattle farms, (6) prevalence in red deer, (7) proportion of beef farms, and (8) farms devoted to bullfighting cattle. Conclusions: The combination of these eight variables in the final model highlights the importance of the persistence of the infection in the hosts, surveillance efforts and some cattle management choices in the circulation of M. bovis in the region. The spatial distribution of these variables, together with particular Mediterranean features that favour the wildlife-livestock interface, may explain the M. bovis persistence in this region. Sanitary authorities should allocate efforts towards specific areas and epidemiological situations where the wildlife-livestock interface seems to critically hamper the definitive bTB eradication success.

  12. Linking mean body size of pelagic Cladocera to environmental variables in Precambrian Shield lakes: A paleolimnological approach

    John P. SMOL

    2008-02-01

    Daphnia and Bosmina fragments were identified and measured in the surface sediments of 42 lakes in the Muskoka-Haliburton region of Ontario, Canada, in an attempt to identify environmental factors that may influence cladoceran body size. Specifically, pecten length on Daphnia post-abdominal claws, antennule length on Bosmina headshields, and carapace and mucro lengths of Bosmina carapaces were measured. These measurements were then compared to limnological variables previously identified as possibly influencing cladoceran size, including dissolved organic carbon (DOC), total phosphorus (TP), pH, calcium (Ca), Chaoborus density, and fish presence/absence. Cladoceran size displayed a linear relationship to TP, with larger Bosmina and Daphnia present in lakes with lower nutrient levels. We suspect that, as larger individuals are more efficient grazers, they may competitively exclude smaller individuals when nutrients are limiting in these lakes. Bosmina mucro length and cladoceran community size structure displayed a step response to DOC, with mean size significantly smaller when DOC concentrations were higher than 5.89 mg L-1. Daphnia pecten length displayed a negative linear relationship to DOC above a concentration of 4.90 mg L-1. Reduced predation pressure from gape-limited macroinvertebrate predators, such as Chaoborus, may have influenced these relationships. DOC was also highly correlated to TP in these lakes, and size trends might be responding to the TP gradient rather than the DOC gradient. Mean cladoceran body size differed between acidic lakes (pH < 6.0) and lakes of higher pH. There was no relationship between size structure and Ca concentrations, attributed to a narrow Ca gradient in these lakes. Predation effects were examined using limited Chaoborus density and fish presence/absence data. Although there were no significant relationships between cladoceran size and Chaoborus density, some significant relationships between size variables and fish predation were identified. The

  13. A Bayesian approach to study the risk variables for tuberculosis occurrence in domestic and wild ungulates in South Central Spain.

    Rodríguez-Prieto, Víctor; Martínez-López, Beatriz; Barasona, José Angel; Acevedo, Pelayo; Romero, Beatriz; Rodriguez-Campos, Sabrina; Gortázar, Christian; Sánchez-Vizcaíno, José Manuel; Vicente, Joaquín

    2012-08-30

    Bovine tuberculosis (bTB) is a chronic infectious disease mainly caused by Mycobacterium bovis. Although eradication is a priority for the European authorities, bTB remains active or even increasing in many countries, causing significant economic losses. The integral consideration of epidemiological factors is crucial to more cost-effectively allocate control measures. The aim of this study was to identify the nature and extent of the association between TB distribution and a list of potential risk factors regarding cattle, wild ungulates and environmental aspects in Ciudad Real, a Spanish province with one of the highest TB herd prevalences. We used a Bayesian mixed effects multivariable logistic regression model to predict TB occurrence in either domestic or wild mammals per municipality in 2007 by using information from the previous year. The municipal TB distribution and endemicity was clustered in the western part of the region and clearly overlapped with the explanatory variables identified in the final model: (1) incident cattle farms, (2) number of years of veterinary inspection of big game hunting events, (3) prevalence in wild boar, (4) number of sampled cattle, (5) persistent bTB-infected cattle farms, (6) prevalence in red deer, (7) proportion of beef farms, and (8) farms devoted to bullfighting cattle. The combination of these eight variables in the final model highlights the importance of the persistence of the infection in the hosts, surveillance efforts and some cattle management choices in the circulation of M. bovis in the region. The spatial distribution of these variables, together with particular Mediterranean features that favour the wildlife-livestock interface may explain the M. bovis persistence in this region. Sanitary authorities should allocate efforts towards specific areas and epidemiological situations where the wildlife-livestock interface seems to critically hamper the definitive bTB eradication success.
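
    As a generic stand-in for the (more elaborate) Bayesian mixed-effects model used above, a minimal random-walk Metropolis sampler for Bayesian logistic regression on invented municipality-level data:

        import numpy as np

        def log_post(beta, X, y, prior_sd=2.0):
            """Log-posterior: Bernoulli-logit likelihood + N(0, prior_sd^2) prior."""
            eta = X @ beta
            loglik = np.sum(y * eta - np.logaddexp(0.0, eta))
            logprior = -0.5 * np.sum((beta / prior_sd) ** 2)
            return loglik + logprior

        rng = np.random.default_rng(6)
        n, p = 100, 3                          # e.g. 100 municipalities, 3 risks
        X = np.hstack([np.ones((n, 1)), rng.normal(size=(n, p))])
        true_beta = np.array([-1.0, 0.8, 0.5, 0.0])
        y = rng.binomial(1, 1 / (1 + np.exp(-X @ true_beta)))

        beta = np.zeros(p + 1)
        samples, lp = [], log_post(beta, X, y)
        for _ in range(20000):                 # random-walk Metropolis
            prop = beta + 0.1 * rng.normal(size=p + 1)
            lp_prop = log_post(prop, X, y)
            if np.log(rng.uniform()) < lp_prop - lp:
                beta, lp = prop, lp_prop
            samples.append(beta)
        print("posterior means:", np.mean(samples[5000:], axis=0).round(2))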

  14. Understanding north-western Mediterranean climate variability: a multi-proxy and multi-sequence approach based on wavelet analysis.

    Azuara, Julien; Lebreton, Vincent; Jalali, Bassem; Sicre, Marie-Alexandrine; Sabatier, Pierre; Dezileau, Laurent; Peyron, Odile; Frigola, Jaime; Combourieu-Nebout, Nathalie

    2017-04-01

    Forcings and physical mechanisms underlying Holocene climate variability still remain poorly understood. Comparison of different paleoclimatic reconstructions using spectral analysis makes it possible to investigate their common periodicities and helps to understand the causes of past climate changes. Wavelet analysis applied to several proxy time series from the Atlantic domain has already revealed the first key issues regarding the origin of Holocene climate variability. However, differences in duration, resolution and variance between the time series are important issues when comparing paleoclimatic sequences in the frequency domain. This work compiles 7 paleoclimatic proxy records from 4 time series from the north-western Mediterranean, all ranging from 7000 to 1000 yrs cal BP: pollen and clay mineral contents from the lagoonal sediment core PB06 recovered in southern France; Sea Surface Temperatures (SST) derived from alkenones, concentration of terrestrial alkanes and their average chain length (ACL) from core KSGC-31_GolHo-1B recovered in the Gulf of Lion inner shelf; the δ18O record from speleothems recovered in the Asiul Cave in north-western Spain; and the grain size record from the deep basin sediment drift core MD99-2343 north of Minorca island. A comparison of their frequency content is proposed using wavelet analysis and cluster analysis of wavelet power spectra. Common cyclicities are assessed using cross-wavelet analysis. In addition, a new algorithm is used in order to propagate the age-model errors within wavelet power spectra. The results are consistent with non-stationary Holocene climate variability. The Hallstatt cycles (2000-2500 years) depicted in many proxies (ACL, terrestrial alkanes and SSTs) demonstrate the influence of solar activity on the north-western Mediterranean climate. Cluster analysis shows that pollen and ACL proxies, both indicating changes in aridity, are clearly distinct from other proxies and share significant common periodicities around 1000 and 600 years
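
    A minimal sketch of a continuous wavelet power spectrum with PyWavelets on a synthetic proxy series; the paper's handling of uneven sampling, age-model errors and cross-wavelets is far more involved than this:

        import numpy as np
        import pywt

        dt = 10.0                                 # years per sample (illustrative)
        t = np.arange(0, 6000, dt)
        proxy = (np.sin(2 * np.pi * t / 2300)     # Hallstatt-like cycle
                 + 0.5 * np.sin(2 * np.pi * t / 1000)
                 + 0.3 * np.random.default_rng(7).normal(size=t.size))

        scales = np.arange(4, 256)
        coeffs, freqs = pywt.cwt(proxy, scales, "morl", sampling_period=dt)
        power = np.abs(coeffs) ** 2               # (n_scales, n_times)
        periods = 1.0 / freqs
        print("peak power at period ~ %.0f years"
              % periods[np.argmax(power.mean(axis=1))])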

  15. Superfocusing modes of surface plasmon polaritons in conical geometry based on the quasi-separation of variables approach

    Kurihara, Kazuyoshi; Otomo, Akira; Syouji, Atsushi; Takahara, Junichi; Suzuki, Koji; Yokoyama, Shiyoshi

    2007-01-01

    Analytic solutions to the superfocusing modes of surface plasmon polaritons in a conical geometry are theoretically studied using an ingenious method called the quasi-separation of variables. This method can be used to look for fundamental solutions to the wave equation for a field that must satisfy boundary conditions at all points on the continuous surface of tapered geometries. The set of differential equations exclusively separated from the wave equation can be consistently solved in combination with perturbation methods. This paper presents the zeroth-order perturbation solution of conical superfocusing modes with azimuthal symmetry and graphically represents them in electric field-line patterns

  16. Developing an Instrument to Characterise Peer-Led Groups in Collaborative Learning Environments: Assessing Problem-Solving Approach and Group Interaction

    Pazos, Pilar; Micari, Marina; Light, Gregory

    2010-01-01

    Collaborative learning is being used extensively by educators at all levels. Peer-led team learning is a version of collaborative learning that has shown consistent success in science, technology, engineering and mathematics disciplines. Using a multi-phase research study we describe the development of an observation instrument that can be used to…

  17. Innovative approaches in European sustainable consumption policies: assessing the potential of various instruments for sustainable consumption practises and greening of the market (ASCEE)

    Rubik, F.; Scholl, G.; Biedenkopf, K.; Kalimo, H.; Mohaupt, F.; Söebech, Ó.; Stø, E.; Strandbakken, P.; Turnheim, B.

    2009-01-01

    The report summarises the outcomes of the project "Assessing the potential of various instruments for sustainable consumption practices and greening of the market" (ASCEE). The scope of the ASCEE project was to consider the latest trends in policies supporting sustainable consumption and production

  18. A Systematic Protein Refolding Screen Method using the DGR Approach Reveals that Time and Secondary TSA are Essential Variables.

    Wang, Yuanze; van Oosterwijk, Niels; Ali, Ameena M; Adawy, Alaa; Anindya, Atsarina L; Dömling, Alexander S S; Groves, Matthew R

    2017-08-24

    Refolding of proteins derived from inclusion bodies is very promising as it can provide a reliable source of target proteins of high purity. However, inclusion body-based protein production is often limited by the lack of techniques for the detection of correctly refolded protein. Thus, the selection of the refolding conditions is mostly achieved using trial and error approaches and is thus a time-consuming process. In this study, we use the latest developments in the differential scanning fluorimetry guided refolding approach as an analytical method to detect correctly refolded protein. We describe a systematic buffer screen that contains a 96-well primary pH-refolding screen in conjunction with a secondary additive screen. Our research demonstrates that this approach could be applied for determining refolding conditions for several proteins. In addition, it revealed which "helper" molecules, such as arginine and additives are essential. Four different proteins: HA-RBD, MDM2, IL-17A and PD-L1 were used to validate our refolding approach. Our systematic protocol evaluates the impact of the "helper" molecules, the pH, buffer system and time on the protein refolding process in a high-throughput fashion. Finally, we demonstrate that refolding time and a secondary thermal shift assay buffer screen are critical factors for improving refolding efficiency.
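
    A minimal sketch of how such a 96-well pH x additive screen can be laid out programmatically; the buffers and additives listed are common choices used for illustration, not the published screen composition:

        import itertools
        import pandas as pd

        phs = [4.0, 5.0, 6.0, 7.0, 7.5, 8.0, 8.5, 9.0]          # 8 rows (A-H)
        additives = ["none", "0.4 M arginine", "10% glycerol", "0.5 M NaCl",
                     "1 mM DTT", "5 mM EDTA", "0.1% Tween-20", "0.5 M urea",
                     "50 mM KCl", "10 mM MgCl2", "1 M TMAO", "0.2 M proline"]

        # Cross rows (pH) with columns (additive) into a 96-well plate map.
        wells = [(f"{row}{col}", ph, add)
                 for (row, ph), (col, add) in itertools.product(
                     zip("ABCDEFGH", phs), enumerate(additives, start=1))]
        plate = pd.DataFrame(wells, columns=["well", "pH", "additive"])
        print(plate.head(13))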

  19. Designing Online Software for Teaching the Concept of Variable That Facilitates Mental Interaction with the Material: Systemic Approach

    Koehler, Natalya A.; Thompson, Ann D.; Correia, Ana-Paula; Hagedorn, Linda Serra

    2015-01-01

    Our case study is a response to the need for research and reporting on specific strategies employed by software designers to produce effective multimedia instructional solutions. A systemic approach for identifying appropriate software features and conducting a formative evaluation that evaluates both the overall effectiveness of the multimedia…

  20. Beam Instrumentation and Diagnostics

    Strehl, Peter

    2006-01-01

    This treatise covers all aspects of the design and the daily operations of a beam diagnostic system for a large particle accelerator. Beam diagnostics is a highly interdisciplinary field, involving contributions from physicists, electrical and mechanical engineers and computer experts alike, so as to satisfy the ever-increasing demands for beam parameter variability for a vast range of operation modes and particles. The author draws upon 40 years of research and work, most of them spent as the head of the beam diagnostics group at GSI. He has illustrated the more theoretical aspects with many real-life examples that will provide beam instrumentation designers with ideas and tools for their work.

  1. Seismic instrumentation

    1984-06-01

    RFS or Regles Fondamentales de Surete (Basic Safety Rules) applicable to certain types of nuclear facilities lay down requirements with which compliance, for the type of facilities and within the scope of application covered by the RFS, is considered to be equivalent to compliance with technical French regulatory practice. The object of the RFS is to take advantage of standardization in the field of safety, while allowing for technical progress in that field. They are designed to enable the operating utility and contractors to know the rules pertaining to various subjects which are considered to be acceptable by the Service Central de Surete des Installations Nucleaires, or the SCSIN (Central Department for the Safety of Nuclear Facilities). These RFS should make safety analysis easier and lead to better understanding between experts and individuals concerned with the problems of nuclear safety. The SCSIN reserves the right to modify, when considered necessary, any RFS and specify, if need be, the terms under which a modification is deemed retroactive. The aim of this RFS is to define the type, location and operating conditions for seismic instrumentation needed to promptly determine the seismic response of nuclear power plant features important to safety, to permit comparison of such response with that used as the design basis.

  2. Meteorological instrumentation

    1982-06-01

    RFS or "Regles Fondamentales de Surete" (Basic Safety Rules) applicable to certain types of nuclear facilities lay down requirements with which compliance, for the type of facilities and within the scope of application covered by the RFS, is considered to be equivalent to compliance with technical French regulatory practice. The object of the RFS is to take advantage of standardization in the field of safety, while allowing for technical progress in that field. They are designed to enable the operating utility and contractors to know the rules pertaining to various subjects which are considered to be acceptable by the "Service Central de Surete des Installations Nucleaires" or the SCSIN (Central Department for the Safety of Nuclear Facilities). These RFS should make safety analysis easier and lead to better understanding between experts and individuals concerned with the problems of nuclear safety. The SCSIN reserves the right to modify, when considered necessary, any RFS and specify, if need be, the terms under which a modification is deemed retroactive. The purpose of this RFS is to specify the meteorological instrumentation required at the site of each nuclear power plant equipped with at least one pressurized water reactor.

  3. Instrumentation Cables Test Plan

    Muna, Alice Baca [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); LaFleur, Chris Bensdotter [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-10-01

    A fire at a nuclear power plant (NPP) has the potential to damage structures, systems, and components important to safety, if not promptly detected and suppressed. At Browns Ferry Nuclear Power Plant on March 22, 1975, a fire in the reactor building damaged electrical power and control systems. Damage to instrumentation cables impeded the function of both normal and standby reactor coolant systems, and degraded the operators' plant monitoring capability. This event resulted in additional NRC involvement with utilities to ensure that NPPs are properly protected from fire as intended by the NRC principal design criteria (i.e., General Design Criterion 3, Fire Protection). Current guidance and methods for both deterministic and performance-based approaches typically make conservative (bounding) assumptions regarding the fire-induced failure modes of instrumentation cables and those failure modes' effects on component and system response. Numerous fire testing programs have been conducted in the past to evaluate the failure modes and effects of electrical cables exposed to severe thermal conditions. However, that testing has primarily focused on control circuits, with only a limited number of tests performed on instrumentation circuits. In 2001, the Nuclear Energy Institute (NEI) and the Electric Power Research Institute (EPRI) conducted a series of cable fire tests designed to address specific aspects of the cable failure and circuit fault issues of concern. The NRC was invited to observe and participate in that program. The NRC sponsored Sandia National Laboratories to support this participation, who, among other things, added a 4-20 mA instrumentation circuit and instrumentation cabling to six of the tests. Although limited, one insight drawn from those instrumentation circuit tests was that the failure characteristics appeared to depend on the cable insulation material. The results showed that for thermoset insulated cables, the instrument reading tended to drift

  4. Solving the Omitted Variables Problem of Regression Analysis Using the Relative Vertical Position of Observations

    Jonathan E. Leightner

    2012-01-01

    The omitted variables problem is one of regression analysis' most serious problems. The standard approach to the omitted variables problem is to find instruments, or proxies, for the omitted variables, but this approach makes strong assumptions that are rarely met in practice. This paper introduces best projection reiterative truncated projected least squares (BP-RTPLS), the third generation of a technique that solves the omitted variables problem without using proxies or instruments. This paper presents a theoretical argument that BP-RTPLS produces unbiased reduced form estimates when there are omitted variables. This paper also provides simulation evidence that shows OLS produces between 250% and 2450% more errors than BP-RTPLS when there are omitted variables and when measurement and round-off error is 1 percent or less. In an example, the government spending multiplier is estimated using annual data for the USA between 1929 and 2010.
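
    A minimal simulation of the omitted-variables problem itself (BP-RTPLS is specific to the paper; this only reproduces the bias that motivates it, using numpy OLS):

        import numpy as np

        rng = np.random.default_rng(8)
        n = 100_000
        q = rng.normal(size=n)                # omitted variable
        x = 0.8 * q + rng.normal(size=n)      # observed regressor, correlated with q
        y = 1.0 * x + 2.0 * q + rng.normal(size=n)

        # OLS of y on x alone is biased no matter how large the sample is,
        # because x absorbs part of the effect of the omitted q.
        X = np.column_stack([np.ones(n), x])
        beta_hat = np.linalg.lstsq(X, y, rcond=None)[0]
        print("true effect of x: 1.0, OLS estimate: %.2f" % beta_hat[1])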

  5. Intelligent approach to maximum power point tracking control strategy for variable-speed wind turbine generation system

    Lin, Whei-Min; Hong, Chih-Ming [Department of Electrical Engineering, National Sun Yat-Sen University, Kaohsiung 80424 (China)

    2010-06-15

    To achieve maximum power point tracking (MPPT) for wind power generation systems, the rotational speed of wind turbines should be adjusted in real time according to wind speed. In this paper, a Wilcoxon radial basis function network (WRBFN) with hill-climb searching (HCS) MPPT strategy is proposed for a permanent magnet synchronous generator (PMSG) with a variable-speed wind turbine. A high-performance online-training WRBFN using a back-propagation learning algorithm with a modified particle swarm optimization (MPSO) regulating controller is designed for the PMSG. The MPSO is adopted in this study to adapt the learning rates in the back-propagation process of the WRBFN to improve the learning capability. The MPPT strategy locates the system operation points along the maximum power curves based on the dc-link voltage of the inverter, thus avoiding the generator speed detection. (author)
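
    A minimal sketch of the hill-climb searching idea (perturb the speed reference, keep the direction if power rose, reverse it otherwise); the toy power curve below stands in for the turbine, and the WRBFN controller is not reproduced:

        def hill_climb_mppt(power_at, omega0=1.0, step=0.05, iters=60):
            """Perturb-and-observe search for the rotor speed maximizing power."""
            omega, direction = omega0, +1
            p_prev = power_at(omega)
            for _ in range(iters):
                omega += direction * step
                p = power_at(omega)
                if p < p_prev:            # power fell: reverse the perturbation
                    direction = -direction
                p_prev = p
            return omega

        # Toy power curve with a maximum at omega = 2.0 (arbitrary units):
        power_curve = lambda w: -(w - 2.0) ** 2 + 4.0
        print("converged rotor speed ~", round(hill_climb_mppt(power_curve), 2))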

  6. Recurrent neural network approach to quantum signal: coherent state restoration for continuous-variable quantum key distribution

    Lu, Weizhao; Huang, Chunhui; Hou, Kun; Shi, Liting; Zhao, Huihui; Li, Zhengmei; Qiu, Jianfeng

    2018-05-01

    In continuous-variable quantum key distribution (CV-QKD), a weak signal carrying information is transmitted from Alice to Bob; during this process it is easily influenced by unknown noise, which reduces the signal-to-noise ratio and strongly impacts the reliability and stability of the communication. The recurrent quantum neural network (RQNN) is an artificial neural network model which can perform stochastic filtering without any prior knowledge of the signal and noise. In this paper, a modified RQNN algorithm with an expectation-maximization algorithm is proposed to process the signal in CV-QKD, which follows the basic rules of quantum mechanics. After RQNN, the noise power decreases by about 15 dB, the coherent signal recognition rate of RQNN is 96%, the quantum bit error rate (QBER) drops to 4%, which is 6.9% lower than the original QBER, and the channel capacity is notably enlarged.

  7. Advanced embedded nonlinear observer design and HIL validation using a Takagi-Sugeno approach with unmeasurable premise variables

    Olteanu, S C; Belkoura, L; Aitouche, A

    2014-01-01

The article's goals are, first, to illustrate the feasibility of implementing a Takagi-Sugeno state observer on an embedded microcontroller-based platform and, secondly, to present a methodology for validating a physical embedded system using a Hardware-In-the-Loop architecture, where simulation software replaces the process. As an application, a three-water-tank system was chosen. For the validation part, LMS AMESim software is employed to reproduce the process behaviour. The interface to the embedded platform is assured by Simulink on a Windows operating system, chosen as it is the most commonly used operating system. The lack of real-time behaviour of the operating system is compensated by a real-time kernel that manages to offer deterministic response times. The Takagi-Sugeno observer in the case of this process has the complex form that considers the premise variables to be unmeasurable. The embedded system consists of two Arduino boards connected in parallel, thus offering distributed resources.

  8. Small-scale spatial variability of phenoxy acid mineralization potentials in transition zones with a multidisciplinary approach

    Pazarbasi, Meric Batioglu

The phenoxy acid group of herbicides is widely used to control broadleaf weeds, and it contaminates groundwater and surface water by leaching from agricultural soil or landfills. Due to the distinct vertical and horizontal gradients in nutrients and hydrologic exchange in transition zones… in two transition zones, (1) the interfaces of unsaturated and saturated zones and (2) groundwater and surface water. Small-scale spatial variability of phenoxy acids was previously shown in topsoil; however, such small-scale studies are scarce in subsurface environments. We therefore studied the factors… classes in the different mineralization potentials of discharge zones. Understanding the natural attenuation potential of groundwater-surface water transition zones is important for stream water protection. In the landfill-impacted groundwater-surface water interface, we further analyzed bacterial…

  10. Assessing potential impacts of climate change and variability on the Great Lakes-St. Lawrence Basin: A binational approach

    Quinn, F.H.; Mortsch, L.D.

    1997-01-01

The potential impacts of climate change and variability on the Great Lakes environment are serious and complex. The Great Lakes-St. Lawrence Basin is home to 42.5 million US and Canadian citizens and is the industrial and commercial heartland of both nations. The region is rich in human and natural resources, with diverse economic activities and substantial infrastructure which would be affected by major shifts in climate. For example, water level changes could affect wetland distribution and functioning; reductions in streamflow would alter assimilative capacities, while warmer water temperatures would influence spring and fall turnover and the incidence of anoxia. A binational program has been initiated to conduct interdisciplinary, integrated impact assessments for the Great Lakes-St. Lawrence River Basin. The goal of this program is to undertake interdisciplinary, integrated studies to improve the understanding of the complex interactions between climate, the environment, and socioeconomic systems in order to develop informed regional adaptation responses.

  11. Investigating the spatio-temporal variability in groundwater and surface water interactions: a multi-technical approach

    Unland, N. P.; Cartwright, I.; Andersen, M. S.; Rau, G. C.; Reed, J.; Gilfedder, B. S.; Atkinson, A. P.; Hofmann, H.

    2013-03-01

The interaction between groundwater and surface water along the Tambo and Nicholson Rivers, southeast Australia, was investigated using ²²²Rn, Cl, differential flow gauging, head gradients, electrical conductivity (EC) and temperature profiling. Head gradients, temperature profiles, Cl concentrations and ²²²Rn activities all indicate higher groundwater fluxes to the Tambo River in areas of increased topographic variation, where the potential to form large groundwater-surface water gradients is greater. Groundwater discharge to the Tambo River calculated by Cl mass balance was significantly lower (1.48 × 10⁴ to 1.41 × 10³ m³ day⁻¹) than discharge estimated by ²²²Rn mass balance (5.35 × 10⁵ to 9.56 × 10³ m³ day⁻¹) and differential flow gauging (5.41 × 10⁵ to 6.30 × 10³ m³ day⁻¹). While groundwater sampling from the bank of the Tambo River was intended to account for the variability in groundwater chemistry associated with river-bank interaction, the spatial variability under which these interactions occur remained unaccounted for, limiting the use of Cl as an effective tracer. Groundwater discharge to both the Tambo and Nicholson Rivers was highest under high flow conditions in the days to weeks following significant rainfall, indicating that the rivers are well connected to a groundwater system that is responsive to rainfall. Groundwater constituted the lowest proportion of river discharge during times of increased rainfall that followed dry periods, and the highest proportion under baseflow conditions (21.4% of the Tambo in April 2010 and 18.9% of the Nicholson in September 2010).

  12. The Effects of Climate Variability on Phytoplankton Composition in the Equatorial Pacific Ocean using a Model and a Satellite-Derived Approach

    Rousseaux, C. S.; Gregg, W. W.

    2012-01-01

The interannual variation in diatoms, cyanobacteria, coccolithophores and chlorophytes from the NASA Ocean Biogeochemical Model (NOBM) was compared with that derived from satellite data (Hirata et al. 2011) between 1998 and 2006 in the Equatorial Pacific. Using NOBM, La Niña events were characterized by an increase in diatoms (correlation with MEI, r = -0.81), i.e. a shift of the phytoplankton community in response to climate variability. However, satellite-derived phytoplankton groups were all negatively correlated with climate variability (r ranged from -0.39 for diatoms to -0.64 for coccolithophores), with larger changes for all phytoplankton groups except diatoms than in NOBM. However, the different responses of phytoplankton to intense interannual events in the Equatorial Pacific raise questions about the representation of phytoplankton dynamics in models and algorithms: is there a phytoplankton community shift, as in the model, or an across-the-board change in the abundances of all phytoplankton, as in the satellite-derived approach?

  13. The interprocess NIR sampling as an alternative approach to multivariate statistical process control for identifying sources of product-quality variability.

    Marković, Snežana; Kerč, Janez; Horvat, Matej

    2017-03-01

We present a new approach to identifying sources of variability within a manufacturing process by NIR measurements of samples of intermediate material after each consecutive unit operation (interprocess NIR sampling technique). In addition, we summarize the development of a multivariate statistical process control (MSPC) model for the production of an enteric-coated pellet product of the proton-pump inhibitor class. By developing provisional NIR calibration models, the identification of critical process points yields results comparable to the established MSPC modeling procedure. Both approaches are shown to lead to the same conclusion, identifying parameters of extrusion/spheronization and characteristics of lactose that have the greatest influence on the end product's enteric-coating performance. The proposed approach enables quicker and easier identification of variability sources during the manufacturing process, especially in cases when historical process data are not straightforwardly available. In the presented case, changes in lactose characteristics influenced the performance of the extrusion/spheronization process step. The pellet cores produced using one (considered less suitable) lactose source were on average larger and more fragile, leading to consequent breakage of the cores during subsequent fluid-bed operations. These results were confirmed by additional experimental analyses illuminating the underlying mechanism of fracture of oblong pellets during the pellet coating process leading to compromised film coating.

  14. Radiological instrument

    Kronenberg, S.; McLaughlin, W.L.; Seibentritt, C.R. Jr.

    1986-01-01

An instrument is described for measuring radiation, particularly nuclear radiation, comprising: a radiation sensitive structure pivoted toward one end and including a pair of elongated solid members contiguously joined together along their length dimensions and having a common planar interface therebetween. One of the pair of members is comprised of radiochromic material whose index of refraction changes due to anomalous dispersion as a result of being exposed to nuclear radiation. The pair of members further has mutually different indices of refraction, with the member having the larger index of refraction further being transparent for the passage of light and of energy therethrough; means located toward the other end of the structure for varying the angle of longitudinal elevation of the pair of members; means for generating and projecting a beam of light into one end of the member having the larger index of refraction. The beam of light is projected toward the planar interface, where it is reflected out of the other end of the same member as a first output beam; means projecting a portion of the beam of light into one end of the member having the larger index of refraction, where it traverses therethrough without reflection and out of the other end of the same member as a second output beam; and means adjacent the structure for receiving the first and second output beams, whereby a calibrated change in the angle of elevation of the structure between positions of equal intensity of the first and second output beams prior to and following exposure provides a measure of the radiation sensed due to a change of refraction of the radiochromic material.

  15. Confirming theoretical pay constructs of a variable pay scheme

    Sibangilizwe Ncube

    2013-05-01

Orientation: Return on the investment in variable pay programmes remains controversial because their cost versus contribution cannot be empirically justified. Research purpose: This study validates the findings of the model developed by De Swardt on the factors related to successful variable pay programmes. Motivation for the study: Many organisations blindly implement variable pay programmes without any means to assess the impact these programmes have on the company's performance. This study was necessary to validate the findings of an existing instrument that assesses the contribution of variable pay schemes. Research design, approach and method: The study was conducted using quantitative research. A total of 300 completed questionnaires from a non-purposive sample of 3000 participants in schemes across all South African industries were returned and analysed. Main findings: Using exploratory and confirmatory factor analysis, it was found that the validation instrument developed by De Swardt is still largely valid in evaluating variable pay schemes. The differences between the study and the model are reported. Practical/managerial implications: The study confirmed the robustness of an existing model that enables practitioners to empirically validate the use of variable pay plans. This model assists in the design and implementation of variable pay programmes that meet critical success factors. Contribution/value-add: The study contributed to the development of a measurement instrument that will assess whether a variable pay plan contributes to an organisation's success.

  16. A preliminary approach to characterizing variability and uncertainty in the mammalian PCDD/F and PCB TEFs

    Haws, L. [Exponent, Austin, TX (United States); Harris, M.; Santamaria, A. [Exponent, Houston, TX (United States); Su, S. [Exponent, New York, NY (United States); Walker, N. [National Institute of Environmental Health Sciences, Research Triangle Park, NC (United States); Birnbaum, L.; DeVito, M. [U.S. Environmental Protection Agency, Research Triangle Park, NC (United States); Farland, W. [U.S. Environmental Protection Agency, Washington, DC (United States); Connor, K. [Exponent, Natick, MA (United States); Finley, B. [Exponent, Santa Rosa, CA (United States)

    2004-09-15

The current toxic equivalency factors (TEFs) for PCDD/Fs and "dioxin-like" PCBs represent consensus-based values that were recommended by an international panel of experts convened by the World Health Organization (WHO) in June of 1997. As a part of the development of the mammalian TEFs, the WHO expert panel considered an extensive body of in vivo and in vitro studies compiled into a database of relative potency (REP) values by scientists at the Karolinska Institute in Stockholm, Sweden (hereafter referred to as the Karolinska database). The final TEFs recommended by the WHO expert panel were determined based on scientific judgment and represent order-of-magnitude estimates of potency for each of the congeners relative to 2,3,7,8-tetrachlorodibenzo-p-dioxin (TCDD). As has been indicated by a number of investigators, the REP values for many congeners are derived from a highly heterogeneous data set, and for most TEFs, the range of underlying REP values often spans several orders of magnitude. However, the degree to which the current "point estimate" TEFs introduce variability and uncertainty into the health risk assessment process cannot be characterized in a quantitative fashion. Such characterizations may be important in settings where numerous PCDD/F and PCB congeners contribute to potential health risk. We believe that the use of REP distributions, as a supplement to or in place of "point estimate" TEFs, would facilitate such characterizations. Specifically, use of a range of REP values, perhaps with a clearly identified "central tendency" (e.g., 50th percentile) and/or "upper bound" (e.g., 90th or 95th percentile), would permit more informed discussions regarding the degree to which the TEFs contribute to variability and uncertainty in health risk estimates. This is important given the widespread use of the TEFs by numerous governmental…

  17. Exploring the effects of climatic variables on monthly precipitation variation using a continuous wavelet-based multiscale entropy approach.

    Roushangar, Kiyoumars; Alizadeh, Farhad; Adamowski, Jan

    2018-08-01

Understanding precipitation on a regional basis is an important component of water resources planning and management. The present study outlines a methodology based on continuous wavelet transform (CWT) and multiscale entropy (CWME), combined with self-organizing map (SOM) and k-means clustering techniques, to measure and analyze the complexity of precipitation. Historical monthly precipitation data from 1960 to 2010 at 31 rain gauges across Iran were preprocessed by CWT. The multi-resolution CWT approach segregated the major features of the original precipitation series by unfolding the structure of the time series, which was often ambiguous. The entropy concept was then applied to the components obtained from CWT to measure dispersion, uncertainty, disorder, and diversification of subcomponents. Based on different validity indices, k-means clustering captured homogeneous areas more accurately, and additional analysis was performed based on the outcome of this approach. The 31 rain gauges in this study were clustered into 6 groups, each one having a unique CWME pattern across different time scales. The results of clustering showed that hydrologic similarity (multiscale variation of precipitation) was not based on geographic contiguity. According to the pattern of entropy across the scales, each cluster was assigned an entropy signature that provided an estimation of the entropy pattern of precipitation data in each cluster. Based on the pattern of mean CWME for each cluster, a characteristic signature was assigned, which provided an estimation of the CWME of a cluster across scales of 1-2, 3-8, and 9-13 months relative to other stations. The validity of the homogeneous clusters demonstrated the usefulness of the proposed approach to regionalize precipitation. Further analysis based on wavelet coherence (WTC) was performed by selecting central rain gauges in each cluster and analyzing them against temperature, wind, the Multivariate ENSO Index (MEI), and the East Atlantic (EA) and…
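
A compact sketch of a CWT-plus-entropy pipeline of this general kind, under stated assumptions: PyWavelets for the CWT (the Morlet wavelet and the 1-13 scale range are our choices, not necessarily the study's), Shannon entropy of the normalized coefficient energies at each scale, and scikit-learn k-means to group stations by their entropy signatures:

```python
# Sketch of a CWT-based multiscale entropy (CWME) pipeline. Wavelet, scales,
# toy series and cluster count are illustrative assumptions, not the study's.
import numpy as np
import pywt
from sklearn.cluster import KMeans

def cwme_signature(series, scales=np.arange(1, 14)):
    coeffs, _ = pywt.cwt(series, scales, "morl")     # shape: (n_scales, n_time)
    energy = coeffs ** 2
    p = energy / energy.sum(axis=1, keepdims=True)   # distribution per scale
    return -(p * np.log(p + 1e-12)).sum(axis=1)      # Shannon entropy per scale

rng = np.random.default_rng(1)
stations = [rng.normal(size=600) + amp * np.sin(np.arange(600) / per)
            for amp, per in [(0.0, 1.0), (2.0, 6.0), (2.0, 6.0), (4.0, 12.0)]]
signatures = np.vstack([cwme_signature(s) for s in stations])

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(signatures)
print(labels)   # stations with similar multiscale entropy fall in one cluster
```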

  18. Estimation of the impacts of different homogenization approaches on the variability of temperature series in Catalonia (North Eastern-Spain), Andorra and South Eastern - France. An experiment under the umbrella of the HOME-COST action.

    Aguilar, E.; Prohom, M.; Mestre, O.; Esteban, P.; Kuglitsch, F. G.; Gruber, C.; Herrero, M.

    2008-12-01

The almost unanimously accepted fact of climate change has brought many scientists to investigate the seasonal and interannual variability and change in instrumental climatic records. Unfortunately, these records are nearly always affected by homogeneity problems caused by changes in the station or its environment. The European Cooperation in the Field of Scientific and Technical Research (COST) is sponsoring the action COST-ES0601: Advances in homogenisation methods of climate series: an integrated approach (HOME), which aims amongst others to investigate the impacts of different homogenisation approaches on the observed data series. In this work, we apply different detection/correction methods (SNHT, RhTest, Caussinus-Mestre, Vincent Interpolation Method, HOM Method) to annual, seasonal, monthly and daily data of a multi-country quality-controlled dataset (17 stations in Catalonia (NE Spain); 3 stations in Andorra and 11 stations in SE France). The different outputs are analysed and the differences in the final series studied. After this experiment, we can state that - although all the applied methods improve the homogeneity of the original series - the conclusions extracted from the analysis of the homogenised annual, seasonal, monthly data and extreme indices derived from daily data demonstrate important differences. As an example, some methods (SNHT) tend to detect fewer breakpoints than others (Caussinus-Mestre). Even if metadata or a pre-identified list of breakpoints is available, the correction factors calculated by the different approaches differ at annual, seasonal, monthly and daily scales. In the latter case, some methods like HOM - based on the modelling of a candidate series against a reference series - provide a richer solution than others based on the mere interpolation of monthly factors (Vincent Method), although the former are not always applicable due to lack of good reference stations. In order to identify the best performing…

  19. A theoretical approach to the deposition and clearance of fibers with variable size in the human respiratory tract

    Sturm, Robert; Hofmann, Werner

    2009-01-01

In the study presented here, a mathematical approach for the deposition and clearance of rigid and chemically stable fibers in the human respiratory tract (HRT) is described in detail. For the simulation of fiber transport and deposition in lung airways, an advanced concept of the aerodynamic diameter is applied to a stochastic lung model with individual particle trajectories computed according to a random walk algorithm. Interception of fibrous material at airway bifurcations is considered by the implementation of correction factors obtained from previously published numerical approaches to fiber deposition in short bronchial sequences. Fiber clearance is simulated on the basis of a multicompartment model, within which separate clearance scenarios are assumed for the alveolar, bronchiolar, and bronchial lung regions, and evacuation of fibrous material commonly takes place via the airway and extrathoracic path to the gastrointestinal tract (GIT) or via the transepithelial path to the lymph nodes and blood vessels. Deposition of fibrous particles in the HRT is controlled by the fiber aspect ratio β, inasmuch as particles with diameters <0.1 μm deposit less effectively with increasing β, while larger particles exhibit a positive correlation between their deposition efficiencies and β. A change from sitting to light-work breathing conditions causes only insignificant modifications of total fiber deposition in the HRT, whereas alveolar and, above all, tubular deposition of fibrous particles with a diameter ≥0.1 μm are affected remarkably. For these particles, enhancement of the inhalative flow rate results in an increase of the extrathoracic and bronchial deposition fractions. Concerning the clearance of fibers from the HRT, 24-h retention is noticeably influenced by β and, not less importantly, by the preferential deposition sites of the simulated particles. The significance of β with respect to particle size may be regarded as similar to that determined for the…

  20. Assessing the Impact of Forest Change and Climate Variability on Dry Season Runoff by an Improved Single Watershed Approach: A Comparative Study in Two Large Watersheds, China

    Yiping Hou

    2018-01-01

Extensive studies on hydrological responses to forest change have been published for centuries, yet partitioning the hydrological effects of forest change, climate variability and other factors in a large watershed remains a challenge. In this study, we developed a single-watershed approach combining the modified double mass curve (MDMC) and the time-series autoregressive integrated moving average model with exogenous variables (ARIMAX) to separate the impact of forest change, climate variability and other factors on dry season runoff variation in two large watersheds in China. The Zagunao watershed was examined for the deforestation effect, while the Meijiang watershed was examined to study the hydrological impact of reforestation. The key findings are: (1) both deforestation and reforestation led to significant reductions in dry season runoff, while climate variability yielded positive effects in the studied watersheds; (2) the hydrological response to forest change varied over time due to changes in soil infiltration and evapotranspiration after vegetation regeneration; (3) changes of subalpine natural forests produced a greater impact on dry season runoff than alteration of planted forests. These findings are beneficial to water resource and forest management under climate change and highlight better planning of forest operations and management incorporating trade-offs between carbon and water in different forests.
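
The ARIMAX step can be approximated with the SARIMAX class in statsmodels, which accepts exogenous regressors; the model order and the synthetic climate/runoff series below are placeholders, not the study's settings:

```python
# ARIMAX-style fit via statsmodels SARIMAX; order and data are illustrative.
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(2)
n = 240                                                  # 20 years, monthly
precip = rng.gamma(shape=2.0, scale=30.0, size=n)        # exogenous climate driver
noise = np.convolve(rng.normal(size=n), [1.0, 0.5])[:n]  # autocorrelated errors
runoff = 10.0 + 0.3 * precip + noise                     # toy dry-season runoff

model = SARIMAX(pd.Series(runoff), exog=pd.Series(precip), order=(1, 0, 1))
result = model.fit(disp=False)
print(result.params)    # includes the coefficient on the exogenous driver
```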

  1. Analysis on inter-annual variability of CO2 exchange in Arctic tundra: a model-data approach

    López Blanco, E.; Lund, M.; Christensen, T. R.; Smallman, T. L.; Slevin, D.; Westergaard-Nielsen, A.; Tamstorf, M. P.; Williams, M.

    2017-12-01

Arctic ecosystems are exposed to rapid changes triggered by climate variability, so there is growing concern about how the carbon (C) exchange balance will respond to climate change. There is a lack of knowledge about the mechanisms that drive the interactions of photosynthesis and ecosystem respiration with changes in C stocks in the Arctic tundra across full annual cycles. The reduction of uncertainties can be addressed through process-based modelling efforts. Here, we report the independent predictions of net ecosystem exchange (NEE), gross primary production (GPP) and ecosystem respiration (Reco) calculated from the soil-plant-atmosphere (SPA) model across eight years. The model products are validated with observational data obtained from the Greenland Ecosystem Monitoring (GEM) program in West Greenland tundra (64° N). Overall, the model results explain 71%, 73% and 51% of the variance in NEE, GPP and Reco, respectively, using data on meteorology and local vegetation and soil structure. The estimated leaf area index (LAI) is able to explain 80% of the plant greenness variation, which was used as a plant phenology proxy. The full annual cumulated NEE during the 2008-2015 period was -0.13 g C m⁻² on average (range -30.6 to 34.1 g C m⁻²), while GPP was -214.6 g C m⁻² (-126.2 to -332.8 g C m⁻²) and Reco was 214.4 g C m⁻² (213.9 to 302.2 g C m⁻²). We found that the model supports the main finding from our previous analysis of flux responses to meteorological variations and biological disturbance. Here, large inter-annual variations in GPP and Reco are also compensatory, and so NEE remains stable across climatically diverse snow-free seasons. Further, we note evidence that leaf maintenance and root growth respiration are highly correlated with GPP (R² = 0.92 and 0.83, p < 0.001), concluding that these relations likely drive the insensitivity of NEE. Interestingly, the model quantifies the contribution of the larval outbreak that occurred in 2011 in about 27…

  2. New gentle-wing high-shear granulator: impact of processing variables on granules and tablets characteristics of high-drug loading formulation using design of experiment approach.

    Fayed, Mohamed H; Abdel-Rahman, Sayed I; Alanazi, Fars K; Ahmed, Mahrous O; Tawfeek, Hesham M; Al-Shdefat, Ramadan I

    2017-10-01

The aim of this work was to study the application of a design of experiment (DoE) approach in defining the design space for granulation and tableting processes using a novel gentle-wing high-shear granulator. According to the quality-by-design (QbD) perspective, critical attributes of granules and tablets should be ensured by manufacturing process design. A face-centered central composite design was employed to investigate the effect of water amount (X1), impeller speed (X2), wet massing time (X3), and water addition rate (X4) as independent process variables on granule and tablet characteristics. Acetaminophen was used as a model drug, and granulation experiments were carried out using dry addition of povidone K30. The dried granules were analyzed for their size distribution, density, and flow pattern. Additionally, the produced tablets were investigated for weight uniformity, breaking force, friability and percent capping, disintegration time, and drug dissolution. Results of regression analysis showed that water amount, impeller speed and wet massing time have significant effects on granule and tablet characteristics; the water amount had the most pronounced effect, as indicated by its higher parameter estimate. On the other hand, water addition rate showed a minimal impact on granule and tablet properties. In conclusion, water amount, impeller speed, and wet massing time could be considered critical process variables. Understanding the relationship between these variables and the quality attributes of granules and corresponding tablets provides the basis for adjusting granulation variables in order to optimize product performance.
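
A face-centered central composite design for four coded factors can be generated with plain itertools, and a quadratic response surface fitted by least squares; the toy response below is an invented stand-in for a measured tablet attribute:

```python
# Face-centered central composite design (CCD) for 4 coded factors plus a
# quadratic response-surface fit; the response function is an invented stand-in.
import itertools
import numpy as np

k = 4
corners = np.array(list(itertools.product([-1, 1], repeat=k)))            # 2^4 runs
axial = np.vstack([v * np.eye(k)[i] for i in range(k) for v in (-1, 1)])  # on faces
center = np.zeros((3, k))                                                 # replicates
design = np.vstack([corners, axial, center])                              # coded units

def toy_response(x):            # stand-in for a measured granule/tablet attribute
    return 50.0 + 8.0 * x[0] + 3.0 * x[1] + 2.0 * x[2] - 4.0 * x[0] ** 2

y = np.array([toy_response(row) for row in design])

# Quadratic model: intercept + linear + pure quadratic terms
X = np.hstack([np.ones((len(design), 1)), design, design ** 2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.round(coef, 2))        # recovers [50, 8, 3, 2, 0, -4, 0, 0, 0]
```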

  3. Quantitative comparisons of three modeling approaches for characterizing drought response of a highly variable, widely grown crop species

    Pleban, J. R.; Mackay, D. S.; Aston, T.; Ewers, B. E.; Wienig, C.

    2013-12-01

Quantifying the drought tolerance of crop species and genotypes is essential in order to predict how water stress may impact agricultural productivity. As climate models predict an increase in both the frequency and severity of drought, corresponding plant hydraulic and biochemical models are needed to accurately predict crop drought tolerance. Drought can result in cavitation of xylem conduits and a related loss of plant hydraulic conductivity. This study tested the hypothesis that a model incorporating a plant's vulnerability to cavitation would best assess drought tolerance in Brassica rapa. Four Brassica genotypes were subjected to drought conditions at a field site in Laramie, WY. Concurrent leaf gas exchange, volumetric soil moisture content and xylem pressure measurements were made during the drought period. Three models were used to assess genotype-specific drought tolerance. All three models rely on the Farquhar biochemical/biophysical model of leaf-level photosynthesis, which is integrated into the Terrestrial Regional Ecosystem Exchange Simulator (TREES). The models differ in how TREES applies the environmental driving data and plant physiological mechanisms; specifically, how water availability at the site of photosynthesis is derived. Model 1 established leaf water availability from a modeled soil moisture content; Model 2 input soil moisture measurements directly to establish leaf water availability; Model 3 incorporated the Sperry soil-plant transport model, which calculates flows and pressures along the soil-plant water transport pathway to establish leaf water availability. This third model incorporated measured xylem pressures, thus constraining leaf water availability via genotype-specific vulnerability curves. A multi-model intercomparison was made using a Bayesian approach, which assessed the interaction between uncertainty in model results and data. The three models were further evaluated by assessing model accuracy and complexity via the deviance information criterion.

  4. The Biographical Personality Interview (BPI)--a new approach to the assessment of premorbid personality in psychiatric research. Part I: Development of the instrument.

    von Zerssen, D; Pössl, J; Hecht, H; Black, C; Garczynski, E; Barthelmes, H

    1998-01-01

The Biographical Personality Interview (BPI) is a research instrument for the retrospective assessment of premorbid personality traits of psychiatric patients. Its construction is based on the results of a series of investigations in which biographical data from psychiatric case notes were analysed with respect to premorbid personality traits. In order to avoid the methodological shortcomings of utilising clinical records, an interview technique was developed. It is applied by two independent, specially trained investigators who are kept "blind" to any clinical data of the subject under study. One of them has to conduct the interview of a clinically remitted patient and provide an interview protocol; the other has to rate personality traits from that protocol along a large series of purely descriptive items. Sum scores for six personality structures ("types") are calculated, and the case is then assigned to the intra-individually dominating personality type according to the highest of these scores.

  5. An alternative approach to exact wave functions for time-dependent coupled oscillator model of charged particle in variable magnetic field

    Menouar, Salah; Maamache, Mustapha; Choi, Jeong Ryeol

    2010-01-01

The quantum states of a time-dependent coupled oscillator model for charged particles subjected to a variable magnetic field are investigated using the invariant operator methods. To do this, we have taken advantage of an alternative method, the so-called unitary transformation approach, available in the framework of quantum mechanics, as well as a generalized canonical transformation method in the classical regime. The transformed quantum Hamiltonian is obtained using suitable unitary operators and is represented in terms of two independent harmonic oscillators which have the same frequencies as those of the classically transformed system. Starting from the wave functions in the transformed system, we have derived the full wave functions in the original system with the help of the unitary operators. One can easily obtain a complete description of how the charged particle behaves under the given Hamiltonian by taking advantage of these analytical wave functions.
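
For reference, the generic identity behind such unitary-transformation treatments (a standard quantum-mechanics relation, not a formula quoted from this record), together with a schematic rendering of the decoupled two-oscillator form the record describes:

```latex
% Standard identity for a time-dependent unitary transformation U(t):
H' = U H U^{\dagger} + i\hbar\,\frac{\partial U}{\partial t}\,U^{\dagger},
\qquad \psi' = U\psi .
% Schematic decoupled form of the transformed Hamiltonian (our rendering):
H' = \sum_{j=1}^{2}\left(\frac{p_j^{2}}{2m_j}
     + \frac{1}{2}\,m_j\,\omega_j^{2}\,q_j^{2}\right).
```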

  6. Evaluating musical instruments

    Campbell, D. Murray

    2014-01-01

Scientific measurements of sound generation and radiation by musical instruments are surprisingly hard to correlate with the subtle and complex judgments of instrumental quality made by expert musicians.

  7. SU-E-I-51: Quantitative Assessment of X-Ray Imaging Detector Performance in a Clinical Setting - a Simple Approach Using a Commercial Instrument

    Sjoeberg, J; Bujila, R; Omar, A; Nowik, P; Mobini-Kesheh, S; Lindstroem, J [Karolinska University Hospital, Solna (Sweden)

    2015-06-15

Purpose: To measure and compare the performance of X-ray imaging detectors in a clinical setting using a dedicated instrument for the quantitative determination of detector performance. Methods: The DQEPro (DQE Instruments Inc., London, Ontario, Canada) was used to determine the MTF, NPS and DQE using an IEC-compliant methodology for three different imaging modalities: conventional radiography (CsI-based detector), general-purpose radioscopy (CsI-based detector), and mammography (a-Se-based detector). The radiation qualities (IEC) RQA-5 and RQA-M-2 were used for the CsI-based and a-Se-based detectors, respectively. The DQEPro alleviates some of the difficulties associated with DQE measurements by automatically positioning test devices over the detector, guiding the user through the image acquisition process, and providing software for calculations. Results: A comparison of the NPS showed that the image noise of the a-Se detector was less correlated than that of the CsI detectors. A consistently higher performance was observed for the a-Se detector at all spatial frequencies (MTF: 0.97@0.25 cy/mm, DQE: 0.72@0.25 cy/mm), and its DQE drops off more slowly than that of the CsI detectors. The CsI detector used for conventional radiography displayed a higher performance at low spatial frequencies compared to the CsI detector used for radioscopy (DQE: 0.65 vs 0.60@0.25 cy/mm). However, at spatial frequencies above 1.3 cy/mm, the radioscopy detector displayed better performance than the conventional radiography detector (DQE: 0.35 vs 0.24@2.00 cy/mm). Conclusion: The differences in the MTF, NPS and DQE observed for the two CsI detectors and the a-Se detector reflect the imaging tasks for which the different detector types are intended. The DQEPro has made the determination and calculation of quantitative metrics of X-ray imaging detector performance substantially more convenient and accessible to undertake in a clinical setting.
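
The reported quantities are tied together by the standard definition used in IEC-style DQE measurements, DQE(f) = MTF²(f) / (q · NNPS(f)), with q the photon fluence per unit area; a minimal numpy rendering with placeholder curves:

```python
# DQE from MTF and normalized NPS (NNPS): DQE(f) = MTF(f)^2 / (q * NNPS(f)).
# All numeric values below are placeholders, not the measured curves.
import numpy as np

f = np.linspace(0.25, 2.0, 8)          # spatial frequency, cycles/mm
mtf = np.exp(-0.7 * f)                 # placeholder MTF curve
nnps = 4e-6 * (1.0 + 0.1 * f)          # placeholder NNPS, mm^2
q = 2.5e5                              # placeholder photon fluence, photons/mm^2

dqe = mtf ** 2 / (q * nnps)
for fi, di in zip(f, dqe):
    print(f"{fi:4.2f} cy/mm  DQE = {di:.2f}")
```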

  8. Economic instruments for environmental mitigation

    Wilkinson, A.

    1995-01-01

    A joint International Chamber of Commerce (ICC)/World Energy Council (WEC) Working Group has been studying a range of policy instruments which are being used or considered for use to address the question of ever increasing energy demand versus environmental protection, and pollution reduction. Economic instruments for such environmental protection include direct regulation, market-based instruments, and voluntary approaches. No single policy or device was likely to suffice in addressing the diversity of environmental problems currently faced. Altering energy prices must be seen in a social context, but some direct regulation may also be inevitable. Generally economic instruments of change were preferred as these were viewed as more flexible and cost-effective. (UK)

  9. Technical Training seminar: Texas Instruments

    2006-01-01

    Monday 6 November TECHNICAL TRAINING SEMINAR 14:00 to 17:30 - Training Centre Auditorium (bldg. 593) Texas Instruments Technical Seminar Michael Scholtholt, Field Application Engineer / TEXAS INSTRUMENTS (US, D, CH) POWER - A short approach to Texas Instruments power products Voltage mode vs. current mode control Differentiating DC/DC converters by analyzing control and compensation schemes: line / load regulation, transient response, BOM, board space, ease-of-use Introduction to the SWIFT software FPGA + CPLD power solutions WIRELESS / CHIPCON Decision criteria when choosing a RF platform Introduction to Texas Instruments wireless products: standardized platforms proprietary platforms ( 2.4 GHz / sub 1 GHz) development tools Antenna design: example for 2.4 GHz questions, discussion Industrial partners: Robert Medioni, François Caloz / Spoerle Electronic, CH-1440 Montagny (VD), Switzerland Phone: +41 24 447 0137, email: RMedioni@spoerle.com, http://www.spoerle.com Language: English. Free s...

  10. Spatial variability of excess mortality during prolonged dust events in a high-density city: a time-stratified spatial regression approach.

    Wong, Man Sing; Ho, Hung Chak; Yang, Lin; Shi, Wenzhong; Yang, Jinxin; Chan, Ta-Chien

    2017-07-24

Dust events have long been recognized to be associated with a higher mortality risk. However, no study has investigated how prolonged dust events affect the spatial variability of mortality across districts in a downwind city. In this study, we applied a spatial regression approach to estimate district-level mortality during two extreme dust events in Hong Kong. We compared spatial and non-spatial models to evaluate the ability of each regression to estimate mortality. We also compared prolonged dust events with non-dust events to determine the influences of community factors on mortality across the city. The density of the built environment (estimated by the sky view factor) had a positive association with excess mortality in each district, while socioeconomic deprivation, driven by lower income and lower education, induced a higher mortality impact in each territory planning unit during a prolonged dust event. Based on the model comparison, spatial error modelling with first-order queen contiguity consistently outperformed the other models. The high-risk areas with the greatest increase in mortality were located in urban high-density environments with higher socioeconomic deprivation. Our model design shows the ability to predict spatial variability of mortality risk during an extreme weather event that cannot be estimated by traditional time-series analysis or ecological studies. Our spatial protocol can be used for public health surveillance, sustainable planning and disaster preparation when relevant data are available.
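
A spatial error model with queen contiguity can be fitted with the PySAL stack (libpysal + spreg); the shapefile and column names below are hypothetical placeholders, not the study's data:

```python
# Spatial error regression with queen contiguity (PySAL stack). The shapefile
# and column names are hypothetical placeholders.
import geopandas as gpd
from libpysal.weights import Queen
from spreg import ML_Error

districts = gpd.read_file("districts.shp")            # hypothetical polygons
w = Queen.from_dataframe(districts)                    # 1st-order queen contiguity
w.transform = "r"                                      # row-standardize weights

y = districts[["excess_mortality"]].values             # hypothetical outcome
X = districts[["sky_view_factor", "low_income", "low_education"]].values

model = ML_Error(y, X, w=w, name_y="excess_mortality",
                 name_x=["sky_view_factor", "low_income", "low_education"])
print(model.summary)
```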

  11. Characterising an intense PM pollution episode in March 2015 in France from multi-site approach and near real time data: Climatology, variabilities, geographical origins and model evaluation

    Petit, J.-E.; Amodeo, T.; Meleux, F.; Bessagnet, B.; Menut, L.; Grenier, D.; Pellan, Y.; Ockler, A.; Rocq, B.; Gros, V.; Sciare, J.; Favez, O.

    2017-04-01

During March 2015, a severe and large-scale particulate matter (PM) pollution episode occurred in France. Near-real-time measurements of the major chemical components at four different urban background sites across the country (Paris, Creil, Metz and Lyon) allowed the investigation of spatiotemporal variabilities during this episode. A climatology approach showed that all sites experienced a clear, unusual rain shortage, a pattern that is also found on a longer timescale, highlighting the role of synoptic conditions over Western Europe. This episode is characterized by a strong predominance of secondary pollution, and more particularly of ammonium nitrate, which accounted for more than 50% of submicron aerosols at all sites during the most intense period of the episode. Pollution advection is illustrated by similar variabilities in Paris and Creil (around 100 km apart), as well as by trajectory analyses applied to nitrate and sulphate. Local sources, especially wood burning, are however found to contribute to local/regional sub-episodes, notably in Metz. Finally, simulated concentrations from the chemistry-transport model CHIMERE were compared to observed ones. Results highlighted different patterns depending on the chemical components and the measuring site, reinforcing the need for such exercises over other pollution episodes and sites.

  12. An integrated quality by design and mixture-process variable approach in the development of a capillary electrophoresis method for the analysis of almotriptan and its impurities.

    Orlandini, S; Pasquini, B; Stocchero, M; Pinzauti, S; Furlanetto, S

    2014-04-25

The development of a capillary electrophoresis (CE) method for the assay of almotriptan (ALM) and its main impurities using an integrated Quality by Design and mixture-process variable (MPV) approach is described. A scouting phase was initially carried out by evaluating different CE operative modes, including the addition of pseudostationary phases and additives to the background electrolyte, in order to approach the analytical target profile. This step made it possible to select normal-polarity microemulsion electrokinetic chromatography (MEEKC) as the operative mode, which allowed a good selectivity to be achieved in a short analysis time. On the basis of a general Ishikawa diagram for MEEKC methods, a screening asymmetric matrix was applied in order to screen the effects of the process variables (PVs) voltage, temperature, buffer concentration and buffer pH on critical quality attributes (CQAs), represented by critical separation values and analysis time. A response surface study was then carried out considering all the critical process parameters, including both the PVs and the mixture components (MCs) of the microemulsion (borate buffer, n-heptane as oil, sodium dodecyl sulphate/n-butanol as surfactant/cosurfactant). The values of PVs and MCs were simultaneously changed in an MPV study, making it possible to find significant interaction effects. The design space (DS) was defined as the multidimensional combination of PVs and MCs where the probability for the different considered CQAs to be acceptable was higher than a quality level π = 90%. The DS was identified by risk-of-failure maps, which were drawn on the basis of Monte Carlo simulations, and verification points spanning the design space were tested. Robustness testing of the method, performed by a D-optimal design, and system suitability criteria allowed a control strategy to be designed. The optimized method was validated following ICH Guideline Q2(R1) and was applied to a real sample of ALM coated tablets.

  13. Evaluating the role of soil variability on groundwater pollution and recharge at regional scale by integrating a process-based vadose zone model in a stochastic approach

    Coppola, Antonio; Comegna, Alessandro; Dragonetti, Giovanna; Lamaddalena, Nicola; Zdruli, Pandi

    2013-04-01

…the lack of information on vertical variability of soil properties. It is our opinion that, with sufficient information on soil horizonation and with an appropriate horizontal resolution, it may be demonstrated that model outputs can be largely sensitive to the vertical variability of stream tubes, even at applicative scales. Horizon differentiation is one of the main observations made by pedologists while describing soils, and most analytical data are given according to soil horizons. Over the last decades, soil horizonation has been subjected to regular monitoring for mapping soil variation at regional scales. Accordingly, this study mainly aims at developing a regional-scale simulation approach for vadose zone flow and transport that uses real soil-profile data based on information on the vertical variability of soils. As to the methodology, the parallel-column concept was applied to account for the effect of vertical heterogeneity on the variability of water flow and solute transport in the vadose zone. Even if the stream-tube approach was mainly introduced for (unrealistic) vertically homogeneous soils, we extended its use to real vertically variable soils. The approach relies on available datasets coming from different sources and offers quantitative answers on soil and groundwater vulnerability to non-point sources of chemicals and pathogens at regional scale within a defined confidence interval. This result will be pursued through the design and building of a spatial database containing (1) detailed pedological information, (2) hydrological properties mainly measured in the investigated area in different soil horizons, (3) water table depth, (4) spatially distributed climatic time series, and (5) land use. The area of interest for the study is located in the sub-basin of the Metaponto agricultural site, in the southern Basilicata Region of Italy, covering approximately 11,698 hectares, crossed by two main rivers, Sinni and Agri, and by many secondary water…

  14. A GC Instrument Simulator

    Armitage, D. Bruce

    1999-02-01

…the difference between the boiling point of the component and the temperature of the column. The polarity difference between the column packing and the component is also used to modify the retention time. The retention time decreases as the difference between the boiling point of the component and the temperature of the column increases, and retention time increases as the polarity of the component approaches the polarity of the column. If the temperature of the column is too low, a warning message is given and the chromatogram does not show that component. There is no "carry-over" to the next chromatogram, as might be the case for an actual instrument. Carrier-gas flow rate is fixed and is not part of the retention-time calculation. Because of this latter condition and the method used to determine retention time, this simulator is not useful for gas chromatography method development and is not intended for such use. The purpose of the simulator is to give a beginning student experience in what happens as column temperature is varied, why one might need temperature programming, why an autosampler might be useful, and the pitfalls of "smart" integrators. When students make mistakes in instrument setup with the simulator, the consequences are not damaging to the simulator but might cause serious problems with a real instrument. Hardware and Software Requirements: hardware and software requirements for A GC Instrument Simulator are shown in Table 1. (Figure: the main instrument control window and the manual injection window from A GC Instrument Simulator, shown right to left.)
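
The retention-time rules described above can be rendered as a toy function, under the assumed reading that the column normally runs hot relative to the component's boiling point, so a larger temperature excess means faster elution; all constants and functional forms are invented:

```python
# Toy retention-time rule following the simulator's description; constants and
# functional forms are invented. Assumes the column runs at or above the
# component's boiling point, so a larger temperature excess elutes faster.
def retention_time(bp_c, col_temp_c, comp_polarity, col_polarity):
    delta = col_temp_c - bp_c            # temperature excess over boiling point
    if delta < -50.0:
        return None                      # column too cold: no peak, warn instead
    volatility_term = 5.0 / (1.0 + max(delta, 0.0) / 20.0)  # falls as delta grows
    polarity_term = 3.0 * (1.0 - abs(comp_polarity - col_polarity))  # 0..1 scales
    return 0.5 + volatility_term + max(polarity_term, 0.0)

# Example: toluene-like solute on a polar column at 100 C
print(retention_time(bp_c=110.6, col_temp_c=100.0,
                     comp_polarity=0.2, col_polarity=0.9))
```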

  15. A Combined Syntactical and Statistical Approach for R Peak Detection in Real-Time Long-Term Heart Rate Variability Analysis

    David Pang

    2018-06-01

Long-term heart rate variability (HRV) analysis is useful as a noninvasive technique for autonomic nervous system activity assessment. It provides a method for assessing many physiological and pathological factors that modulate the normal heartbeat. The performance of HRV analysis systems depends heavily on a reliable and accurate detection of the R peak of the QRS complex. Ectopic beats caused by misdetection or arrhythmic events can introduce bias into HRV results, resulting in significant problems in their interpretation. This study presents a novel method for long-term detection of normal R peaks (which represent the normal heartbeat) in electrocardiographic signals, intended specifically for HRV analysis. The very low computational complexity of the proposed method, which combines and exploits the advantages of syntactical and statistical approaches, enables real-time applications. The approach was validated using the Massachusetts Institute of Technology-Beth Israel Hospital (MIT-BIH) Normal Sinus Rhythm and the Fantasia databases, and has a sensitivity, positive predictivity, detection error rate, and accuracy of 99.998%, 99.999%, 0.003%, and 99.996%, respectively.
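
For contrast with the record's combined syntactical/statistical detector (whose details are not reproduced here), a generic baseline R-peak pipeline looks like this: band-pass the ECG around QRS energy, then peak-pick with an amplitude threshold and a refractory period:

```python
# Generic baseline R-peak detector (band-pass + thresholded peak picking).
# This is a sketch, not the syntactical/statistical method of the record.
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

def detect_r_peaks(ecg, fs):
    b, a = butter(2, [5.0 / (fs / 2), 15.0 / (fs / 2)], btype="band")  # QRS band
    filtered = filtfilt(b, a, ecg)
    threshold = 0.5 * np.max(np.abs(filtered))        # crude global threshold
    peaks, _ = find_peaks(filtered, height=threshold,
                          distance=int(0.25 * fs))    # 250 ms refractory period
    return peaks

fs = 250.0
t = np.arange(0.0, 10.0, 1.0 / fs)
ecg = np.sin(2 * np.pi * 1.2 * t) ** 64               # toy train of sharp beats
print(len(detect_r_peaks(ecg, fs)), "beats detected")
```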

  16. Component- and system-level degradation modeling of digital Instrumentation and Control systems based on a Multi-State Physics Modeling Approach

    Wang, Wei; Di Maio, Francesco; Zio, Enrico

    2016-01-01

Highlights: • A Multi-State Physics Modeling (MSPM) framework for reliability assessment is proposed. • Monte Carlo (MC) simulation is utilized to estimate the degradation state probabilities. • Due account is given to stochastic uncertainty and deterministic degradation progression. • The MSPM framework is applied to the reliability assessment of a digital I&C system. • Results are compared with those obtained with a Markov Chain Model (MCM). - Abstract: A system-level degradation model is proposed for the reliability assessment of digital Instrumentation and Control (I&C) systems in Nuclear Power Plants (NPPs). At the component level, we focus on the reliability assessment of a Resistance Temperature Detector (RTD), an important digital I&C component used to guarantee the safe operation of NPPs. A Multi-State Physics Model (MSPM) is built to describe this component's degradation progression towards failure, and Monte Carlo (MC) simulation is used to estimate the probability of sojourn in any of the previously defined degradation states, accounting for both the stochastic and the deterministic processes that affect the degradation progression. The MC simulation relies on an integrated modeling of stochastic processes with deterministic aging of components, which proves fundamental for estimating the joint cumulative probability distribution of finding the component in any of the possible degradation states. The results of the application of the proposed degradation model to a digital I&C system from the literature are compared with the results obtained by a Markov Chain Model (MCM). The integrated stochastic-deterministic process proposed here to drive the MC simulation is viable for integrating component-level models into a system-level model that would consider inter-system and/or inter-component dependencies and uncertainties.
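
The Monte Carlo estimation of state-occupation probabilities can be illustrated with a toy three-state degradation chain whose first transition rate grows with time to mimic deterministic aging; states, rates, and mission time are invented:

```python
# Toy Monte Carlo estimate of degradation-state probabilities for a three-state
# chain (0 = nominal, 1 = degraded, 2 = failed). Rates, step and mission time
# are invented; the 0->1 rate grows with time to mimic deterministic aging.
import numpy as np

rng = np.random.default_rng(3)
T_MISSION, DT, N_RUNS = 1_000.0, 1.0, 5_000   # hours, step (h), histories

def rate_01(t):                               # aging transition rate, per hour
    return 1e-3 * (1.0 + t / 500.0)

RATE_12 = 2e-3                                # degraded -> failed, per hour

counts = np.zeros(3)
for _ in range(N_RUNS):
    state, t = 0, 0.0
    while t < T_MISSION and state < 2:
        rate = rate_01(t) if state == 0 else RATE_12
        if rng.random() < rate * DT:          # transition within this step
            state += 1
        t += DT
    counts[state] += 1

print("State probabilities at mission end:", counts / N_RUNS)
```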

  17. IOT Overview: IR Instruments

    Mason, E.

In this instrument review chapter, the calibration plans of ESO IR instruments are presented and briefly reviewed, focusing in particular on the case of ISAAC, which was the first IR instrument at the VLT and whose calibration plan served as a prototype for the instruments that followed.

  18. Health physics instrument manual

    Gupton, E.D.

    1978-08-01

The purpose of this manual is to provide apprentice health physics surveyors and other operating groups not directly concerned with radiation detection instruments with a working knowledge of the radiation detection and measuring instruments in use at the Laboratory. The characteristics and applications of the instruments are given. Portable instruments, stationary instruments, personnel monitoring instruments, sample counters, and miscellaneous instruments are described. Also included are information sheets on calibration sources, procedures, and devices. Gamma sources, beta sources, alpha sources, neutron sources, special sources, a gamma calibration device for badge dosimeters, and a calibration device for ionization chambers are described.

  19. Astronomical Instruments in India

    Sarma, Sreeramula Rajeswara

    The earliest astronomical instruments used in India were the gnomon and the water clock. In the early seventh century, Brahmagupta described ten types of instruments, which were adopted by all subsequent writers with minor modifications. Contact with Islamic astronomy in the second millennium AD led to a radical change. Sanskrit texts began to lay emphasis on the importance of observational instruments. Exclusive texts on instruments were composed. Islamic instruments like the astrolabe were adopted and some new types of instruments were developed. Production and use of these traditional instruments continued, along with the cultivation of traditional astronomy, up to the end of the nineteenth century.

  20. A novel approach to the quantitative detection of anabolic steroids in bovine muscle tissue by means of a hybrid quadrupole time-of-flight-mass spectrometry instrument.

    Bussche, Julie Vanden; Decloedt, Anneleen; Van Meulebroek, Lieven; De Clercq, Nathalie; Lock, Stephen; Stahl-Zeng, Jianru; Vanhaecke, Lynn

    2014-09-19

In recent years, the analysis of veterinary drugs and growth-promoting agents has shifted from target-oriented procedures, mainly based on liquid chromatography coupled to triple-quadrupole mass spectrometry (LC-QqQ-MS), towards accurate-mass full-scan MS (such as Time-of-Flight (ToF) and Fourier Transform (FT)-MS). In this study, the performance of a hybrid analysis instrument (UHPLC-Quadrupole-Time-of-Flight-MS (QqToF-MS)), able to exploit both full-scan HR and MS/MS capabilities within a single analytical platform, was evaluated for confirmatory analysis of anabolic steroids (gestagens, estrogens including stilbenes, and androgens) in meat. The validation data were compared to previously obtained results (CD 2002/657/EC) for QqQ-MS and single-stage Orbitrap-MS. Additionally, a fractional factorial design was used to shorten and optimize the sample extraction. Validation according to CD 2002/657/EC demonstrated that steroid analysis using QqToF is more competitive with QqQ-MS in terms of selectivity/specificity than single-stage Orbitrap-MS. While providing excellent linearity, based on lack-of-fit calculations (F-test, α = 0.05 for all steroids except 17β-ethinylestradiol: α = 0.01), QqToF-MS proved more sensitive than QqQ-MS and Orbitrap-MS for 61.8% and 85.3% of the compounds, respectively. Indeed, the CCα values obtained upon ToF-MS/MS detection ranged from 0.02 to 1.74 μg kg⁻¹ for the 34 anabolic steroids, while for QqQ-MS and Orbitrap-MS values ranged from 0.04 to 0.88 μg kg⁻¹ and from 0.07 to 2.50 μg kg⁻¹, respectively. Using QqToF-MS and QqQ-MS, adequate precision was obtained, as relative standard deviations for repeatability and within-laboratory reproducibility were below 20%. In the case of Orbitrap-MS, some compounds (i.e. some estrogens) displayed poor precision, possibly caused by some lack of sensitivity at lower concentrations and the absence of MRM-like experiments. Overall, it can be…

  1. Troubleshooting in nuclear instruments

    1987-06-01

This report on troubleshooting of nuclear instruments is the product of several scientists and engineers who are closely associated with nuclear instrumentation and with the IAEA activities in the field. The text covers the following topics: preamplifiers, amplifiers, scalers, timers, ratemeters, multichannel analyzers, dedicated instruments, tools, instruments, accessories, components, skills, interfaces, power supplies, preventive maintenance, troubleshooting in systems, and radiation detectors. The troubleshooting and repair of instruments is illustrated by some real examples.

  2. The Development of the Alpha-Omega Completed Sentence Form (AOCSF): An Instrument to Aid in the Measurement, Identification, and Assessment of an Individual's "Adaptational Approach(es)" to the Stressful Event of Death and Related Issues.

    Klein, Ronald; And Others

    The Alpha Omega Completed Sentence Form (AOCSF) was developed to identify and measure a person's adaptational approaches to information concerning their own death or the possible death of a significant other. In contrast to the Kubler-Ross stage theory, the adaptational approach recognizes a person's capacity to assimilate new information which…

  3. Interrelationships in the Variability of Root Canal Anatomy among the Permanent Teeth: A Full-Mouth Approach by Cone-Beam CT.

    Monsarrat, Paul; Arcaute, Bertrand; Peters, Ove A; Maury, Elisabeth; Telmon, Norbert; Georgelin-Gurgel, Marie; Maret, Delphine

    2016-01-01

In endodontic practice, clinicians should be aware of possible root canal anatomic variations. The aim of this study was to assess, using CBCT acquisitions, whether the root canal anatomy of one tooth is associated with a specific anatomy of another tooth. A total of 106 CBCT acquisitions were obtained using a CBCT scanner with 200 μm voxel size. The numbers of roots and canals of the entire dentition were described. Bivariate analyses and logistic regressions were conducted to explore the root canal anatomy of one tooth according to age, gender, jaw, side and the other teeth. Multiple correspondence analysis (MCA) was performed to correlate the different numbers-of-canals profiles. A total of 2424 teeth were analyzed. Independently of the other variables, the presence of an additional root canal on a mandibular incisor increases the risk of having an additional root canal on a mandibular premolar (OR [95%] 3.7 [1.0;13.2]). The mandibular molar variability increases in women compared to men (OR [95%] 0.4 [0.1;0.9]). MCA showed correspondence between 2-canal maxillary incisors and canines and 5-canal maxillary molars, and some correlation between additional canals on maxillary and mandibular premolars. Although CBCT examinations are conducted in the first intention of making a diagnosis or prognostic evaluation, medium-FOV acquisitions could be used as an initial database, furnishing preliminary evaluations and information. The visualization of all canals is considered essential in endodontic therapy. The use of multiple correspondence analysis for statistics in endodontic research is a new approach as a prognostic tool.

  4. State-space approach to evaluate spatial variability of field measured soil water status along a line transect in a volcanic-vesuvian soil

    A. Comegna

    2010-12-01

    Full Text Available Unsaturated hydraulic properties and their spatial variability are analyzed today in order to properly use mathematical models developed to simulate water flow and solute movement in field-scale soils. Many studies have shown that observations of soil hydraulic properties should not be considered purely random, given that they possess a structure which may be described by means of stochastic processes. The techniques used for analyzing such structure have essentially been based either on the theory of regionalized variables or, to a lesser extent, on the analysis of time series. This work applies the time-series approach to a study of the pressure head h and water content θ, which characterize soil water status, in the space-time domain. The data were recorded in the open field during a controlled drainage process, evaporation being prevented, along a 50 m transect in a volcanic Vesuvian soil. The isotropy hypothesis is empirically tested, and the autocorrelation (ACF) and partial autocorrelation (PACF) functions are then used to identify and estimate an ARMA(1,1) statistical model for the analyzed series and an AR(1) model for the extracted signal. Relations with a state-space model are investigated, and a bivariate AR(1) model fitted. The simultaneous relations between θ and h are considered and estimated. The results are of value for sampling strategies, and they should encourage a wider use of time- and space-series analysis.
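
    The identification/estimation workflow sketched in the abstract (inspect the ACF and PACF, then fit an ARMA(1,1) model) can be illustrated with statsmodels; the series below is simulated with arbitrary coefficients and merely stands in for the transect data.

```python
# Minimal sketch of the identification/estimation step described above:
# inspect ACF/PACF, then fit an ARMA(1,1) model. A simulated series stands
# in for the pressure-head transect data, which is not reproduced here.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.tsa.arima_process import ArmaProcess
from statsmodels.tsa.stattools import acf, pacf

# Simulate an ARMA(1,1) "transect": phi = 0.7, theta = 0.4 (arbitrary).
process = ArmaProcess(ar=[1, -0.7], ma=[1, 0.4])
series = process.generate_sample(nsample=200, scale=1.0)

# Identification: sample ACF and PACF.
print("ACF :", np.round(acf(series, nlags=5), 2))
print("PACF:", np.round(pacf(series, nlags=5), 2))

# Estimation: ARMA(1,1) is ARIMA with order (p=1, d=0, q=1).
fit = ARIMA(series, order=(1, 0, 1)).fit()
print(fit.summary().tables[1])
```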

  5. Interrelationships in the Variability of Root Canal Anatomy among the Permanent Teeth: A Full-Mouth Approach by Cone-Beam CT.

    Paul Monsarrat

    Full Text Available In endodontic practice, clinicians should be aware of possible root canal anatomic variations. The aim of this study was to assess, using CBCT acquisitions, whether the root canal anatomy of one tooth is associated with a specific anatomy of another tooth. A total of 106 CBCT acquisitions were obtained using a CBCT scanner with 200 μm voxel size. Numbers of roots and canals of the entire dentition were described. Bivariate analyses and logistic regressions were conducted to explore root canal anatomy on one tooth according to age, gender, jaw, side and the other teeth. Multiple correspondence analysis (MCA) was performed to correlate the profiles of canal numbers. A total of 2424 teeth were analyzed. Independently of the other variables, the presence of an additional root canal on a mandibular incisor increases the risk of having an additional root canal on a mandibular premolar (OR [95% CI] 3.7 [1.0; 13.2]). The mandibular molar variability increases in women compared to men (OR [95% CI] 0.4 [0.1; 0.9]). MCA showed correspondence between 2-canal maxillary incisors and canines and 5-canal maxillary molars, and some correlation between additional canals on maxillary and mandibular premolars. Although CBCT examinations are conducted primarily for diagnosis or prognostic evaluation, medium-FOV acquisitions could serve as an initial database, furnishing preliminary evaluations and information. The visualization of all canals is considered essential in endodontic therapy. The use of multiple correspondence analysis in endodontic research is a new approach as a prognostic tool.

  6. Evaluation of Validity and Reliability for Hierarchical Scales Using Latent Variable Modeling

    Raykov, Tenko; Marcoulides, George A.

    2012-01-01

    A latent variable modeling method is outlined, which accomplishes estimation of criterion validity and reliability for a multicomponent measuring instrument with hierarchical structure. The approach provides point and interval estimates for the scale criterion validity and reliability coefficients, and can also be used for testing composite or…
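
    For orientation, a common latent-variable expression for composite (scale) reliability under a one-factor model is McDonald's omega; it is given here as standard background, not necessarily the exact estimand of the cited method.

```latex
% Composite reliability (McDonald's omega) for a one-factor model with
% loadings lambda_i and error variances theta_i -- a standard latent
% variable expression, given as background rather than as the cited
% paper's exact estimand.
\[
\omega \;=\;
\frac{\left(\sum_{i=1}^{k}\lambda_i\right)^{2}}
     {\left(\sum_{i=1}^{k}\lambda_i\right)^{2} \;+\; \sum_{i=1}^{k}\theta_i}
\]
```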

  7. Instrumented Pipeline Initiative

    Thomas Piro; Michael Ream

    2010-07-31

    This report summarizes technical progress achieved during the cooperative agreement between Concurrent Technologies Corporation (CTC) and the U.S. Department of Energy to address the need for a low-cost monitoring and inspection sensor system, as identified in the Department of Energy (DOE) Natural Gas Infrastructure Research & Development (R&D) Delivery Reliability Program Roadmap. The Instrumented Pipeline Initiative (IPI) achieved this objective by researching technologies for monitoring pipeline delivery integrity through a ubiquitous network of sensors and controllers to detect and diagnose incipient defects, leaks, and failures. This report is organized by tasks as detailed in the Statement of Project Objectives (SOPO). Each section states the objective and approach before detailing the results of the work.

  8. Performing the Super Instrument

    Kallionpaa, Maria

    2016-01-01

    The genre of contemporary classical music has seen significant innovation and research related to new super, hyper, and hybrid instruments, which opens up a vast palette of expressive potential. An increasing number of composers, performers, instrument designers, engineers, and computer programmers have become interested in different ways of "supersizing" acoustic instruments in order to open up previously-unheard instrumental sounds. Super instruments vary a great deal, but each has a transformative effect on the identity and performance practice of the performing musician. Furthermore, composers can empower performers by producing super instrument works that allow the concert instrument to become an ensemble controlled by a single player. The existing instrumental skills of the performer can be multiplied and the qualities of regular acoustic instruments extended or modified. Such a situation…

  9. Advances in control and instrumentation

    Surendar, Ch.

    1994-01-01

    Control and instrumentation systems have seen significant changes from pneumatic to electronic with the advent of transistors and integrated circuits, and miniaturization was realised. With the introduction of microprocessors there has been a revolutionary change in the approach to instrumentation and control systems in the areas of sensors, data acquisition/transmission, processing for control, and presentation of information to the operator. An effort is made to give some insight into these areas, with some idea of the advantages these systems offer in the uses to which they are being put in nuclear facilities, particularly nuclear power reactors. (author)

  10. Interplanetary variability recorded by the sled instrument aboard the Phobos spacecraft during that period of solar cycle 22 characterized by a transition from solar minimum- to solar maximum-dominated conditions

    McKenna-Lawlor, S.M.P. (Saint Patrick's Coll., Maynooth (Ireland)); Afonin, V.V.; Gringauz, K.I. (AN SSSR, Moscow (USSR). Space Research Inst.) (and others)

    Twin telescope particle detector systems SLED-1 and SLED-2, with the capability of monitoring electron and ion fluxes within an energy range spanning approximately 30 keV to a few megaelectron volts, were individually launched on the two spacecraft (Phobos-2 and Phobos-1, respectively) of the Soviet Phobos Mission to Mars and its moons in July 1988. A short description of the SLED instrument and a preliminary account of representative solar-related particle enhancements recorded by SLED-1 and SLED-2 during the Cruise Phase, and by SLED-1 in the near Martian environment (within the interval 25 July 1988-26 March 1989) are presented. These observations were made while the interplanetary medium was in the course of changing over from solar minimum- to solar maximum-dominated conditions and examples are presented of events associated with each of these phenomenological states. (author).

  11. Combining data visualization and statistical approaches for interpreting measurements and meta-data: Integrating heatmaps, variable clustering, and mixed regression models

    The advent of new higher throughput analytical instrumentation has put a strain on interpreting and explaining the results from complex studies. Contemporary human, environmental, and biomonitoring data sets are comprised of tens or hundreds of analytes, multiple repeat measures...
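
    A minimal sketch of the heatmap-plus-variable-clustering idea, using seaborn's clustermap on a random stand-in matrix; the analyte names and data are invented.

```python
# Minimal sketch of a clustered heatmap for a multi-analyte data set
# (random synthetic matrix; illustrative only, not the study's data).
import numpy as np
import pandas as pd
import seaborn as sns

rng = np.random.default_rng(0)
# Hypothetical biomonitoring matrix: 30 samples x 12 analytes.
data = pd.DataFrame(
    rng.normal(size=(30, 12)),
    columns=[f"analyte_{i}" for i in range(12)],
)

# Hierarchical clustering of rows (samples) and columns (analytes),
# rendered as a heatmap with dendrograms on both axes.
g = sns.clustermap(data, method="average", metric="euclidean", cmap="vlag")
g.savefig("clustermap.png")
```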

  12. On Representative Spaceflight Instrument and Associated Instrument Sensor Web Framework

    Kizhner, Semion; Patel, Umeshkumar; Vootukuru, Meg

    2007-01-01

    Sensor Web-based adaptation and sharing of space flight mission resources, including those of the Space-Ground and Control-User communication segment, could greatly benefit from utilization of heritage Internet Protocols and devices applied for Spaceflight (SpaceIP). This has been successfully demonstrated by a few recent spaceflight experiments. However, while terrestrial applications of Internet protocols are well developed and understood (mostly due to billions of dollars in investments by the military and industry), the spaceflight application of Internet protocols is still in its infancy. Progress in the development of SpaceIP-enabled instrument components will largely determine the SpaceIP utilization of those investments and acceptance in years to come. Like SpaceIP, commercial real-time and instrument-colocated computational resources, data compression, and storage can be enabled on board a spacecraft and, in turn, support a powerful application to Sensor Web-based design of a spaceflight instrument. Sensor Web-enabled reconfiguration and adaptation of structures for hardware resources and information systems will put Field Programmable Gate Arrays (FPGAs) and other aerospace programmable logic devices to the use for which this technology was intended. These are a few obvious potential benefits of Sensor Web technologies for spaceflight applications. However, they are still waiting to be explored, because a new approach to spaceflight instrumentation is needed to make these mature sensor web technologies applicable for spaceflight. In this paper we present an approach to developing related and enabling spaceflight instrument-level technologies based on the new concept of a representative spaceflight Instrument Sensor Web (ISW).

  13. Vapor-liquid phase behavior of a size-asymmetric model of ionic fluids confined in a disordered matrix: The collective-variables-based approach

    Patsahan, O. V.; Patsahan, T. M.; Holovko, M. F.

    2018-02-01

    We develop a theory based on the method of collective variables to study the vapor-liquid equilibrium of asymmetric ionic fluids confined in a disordered porous matrix. The approach allows us to formulate the perturbation theory using an extension of the scaled particle theory for a description of a reference system presented as a two-component hard-sphere fluid confined in a hard-sphere matrix. Treating an ionic fluid as a size- and charge-asymmetric primitive model (PM), we derive an explicit expression for the relevant chemical potential of a confined ionic system which takes into account the third-order correlations between ions. Using this expression, the phase diagrams for a size-asymmetric PM are calculated for different matrix porosities as well as for different sizes of matrix and fluid particles. It is observed that the general trends of the coexistence curves with matrix porosity are similar to those of simple fluids under disordered confinement, i.e., the coexistence region gets narrower with a decrease of porosity and, simultaneously, the reduced critical temperature Tc* and the critical density ρi,c* become lower. At the same time, our results suggest that an increase in the size asymmetry of oppositely charged ions considerably affects the vapor-liquid diagrams, leading to a faster decrease of Tc* and ρi,c* and even to a disappearance of the phase transition, especially for the case of small matrix particles.

  14. Patient perspective: choosing or developing instruments.

    Kirwan, John R; Fries, James F; Hewlett, Sarah; Osborne, Richard H

    2011-08-01

    Previous Outcome Measures in Rheumatology (OMERACT) meetings recognized that patients view outcomes of intervention from a different perspective. This preconference position paper briefly sets out 2 patient-reported outcome (PRO) instrument approaches, the PROMIS computer adaptive testing (CAT) system and the development of a rheumatoid arthritis-specific questionnaire to measure fatigue; a tentative proposal for a PRO instrument development pathway is also made.

  15. Effect of geotropism on instrument readings

    Rolph, James T.

    2006-01-01

    A review of gravity's effect on instrument readings, also referred to as geotropism. In this essay, meter movement construction and the effect of gravity are reviewed as they apply to portable radiation instruments. The three relevant ANSI standards and their requirements are reviewed. An alternate approach to testing for the effects is offered.

  16. Minimally invasive scoliosis surgery assisted by O-arm navigation for Lenke Type 5C adolescent idiopathic scoliosis: a comparison with standard open approach spinal instrumentation.

    Zhu, Weiguo; Sun, Weixiang; Xu, Leilei; Sun, Xu; Liu, Zhen; Qiu, Yong; Zhu, Zezhang

    2017-04-01

    Minimally invasive scoliosis surgery (MISS) is an alternative to open surgery for patients with Lenke Type 5C AIS. Compared with results of the open approach, the outcomes of MISS are promising, with reduced morbidity. Before the routine use of MISS, however, long-term data are needed.

  17. Instrument Modeling and Synthesis

    Horner, Andrew B.; Beauchamp, James W.

    During the 1970s and 1980s, before synthesizers based on direct sampling of musical sounds became popular, replicating musical instruments using frequency modulation (FM) or wavetable synthesis was one of the “holy grails” of music synthesis. Synthesizers such as the Yamaha DX7 allowed users great flexibility in mixing and matching sounds, but were notoriously difficult to coerce into producing sounds like those of a given instrument. Instrument design wizards practiced the mysteries of FM instrument design.
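
    The core of two-operator FM synthesis is the relation y(t) = A(t)·sin(2π·fc·t + I(t)·sin(2π·fm·t)), where the modulation index I controls brightness. The following is a minimal sketch of this technique; the parameter values are illustrative, not a recipe for any particular instrument.

```python
# Minimal two-operator FM synthesis sketch (Chowning-style):
# y(t) = A(t) * sin(2*pi*fc*t + I(t) * sin(2*pi*fm*t)).
# Parameter values are illustrative, not a recipe for a real instrument.
import wave
import numpy as np

sr = 44100                      # sample rate (Hz)
t = np.arange(int(sr * 1.0)) / sr

fc, fm = 440.0, 220.0           # carrier and modulator frequencies
# Decaying modulation index: brighter attack, purer decay (a classic trick).
index = 5.0 * np.exp(-3.0 * t)
envelope = np.exp(-2.0 * t)

y = envelope * np.sin(2 * np.pi * fc * t + index * np.sin(2 * np.pi * fm * t))

# Write a 16-bit mono WAV file using only the standard library.
with wave.open("fm_tone.wav", "wb") as f:
    f.setnchannels(1)
    f.setsampwidth(2)
    f.setframerate(sr)
    f.writeframes((y * 32767).astype(np.int16).tobytes())
```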

  18. Nuclear reactor instrumentation

    Duncombe, E.; McGonigal, G.

    1975-01-01

    A liquid metal cooled nuclear reactor is described which has an equal number of fuel sub-assemblies and sensing instruments. Each instrument senses the temperature and flow rate of coolant derived from a group of three sub-assemblies, so that an abnormal value for one sub-assembly will be indicated on three instruments, thereby providing redundancy against failure of up to two of the three instruments. The abnormal value may be a precursor to unstable boiling of the coolant.
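
    A toy illustration of this coverage scheme (my own cyclic construction, not necessarily the patent's actual mapping):

```python
# Toy illustration of the coverage scheme described above: N instruments,
# N sub-assemblies, each instrument reading a group of three, so each
# sub-assembly is seen by three instruments and up to two instrument
# failures still leave an indication. The cyclic mapping is my assumption.
N = 9
covers = {i: {(i + k) % N for k in range(3)} for i in range(N)}  # instrument -> sub-assemblies

def indicating_instruments(sub, failed=frozenset()):
    """Instruments that would flag an abnormal value on sub-assembly `sub`."""
    return [i for i, group in covers.items() if sub in group and i not in failed]

print(indicating_instruments(4))                 # three instruments: [2, 3, 4]
print(indicating_instruments(4, failed={2, 3}))  # two failed, one still indicates: [4]
```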

  19. Aeroacoustics of Musical Instruments

    Fabre, B.; Gilbert, J.; Hirschberg, Abraham; Pelorson, X.

    2012-01-01

    We are interested in the quality of sound produced by musical instruments and their playability. In wind instruments, a hydrodynamic source of sound is coupled to an acoustic resonator. Linear acoustics can predict the pitch of an instrument. This can significantly reduce the trial-and-error process

  20. Variability Bugs:

    Melo, Jean

    Although many researchers suggest that preprocessor-based variability amplifies maintenance problems, there is little to no hard evidence on how variability actually affects programs and programmers. Specifically, how does variability affect programmers during maintenance tasks (bug finding in particular)? How much harder is it to debug a program as variability increases? How do developers debug programs with variability? In what ways does variability affect bugs? In this Ph.D. thesis, I set off to address such issues through different perspectives, using empirical research (based on controlled experiments) in order to understand quantitatively and qualitatively the impact of variability on programmers at bug finding and on buggy programs. From the program (and bug) perspective, the results show that variability is ubiquitous. There appears to be no specific nature of variability bugs that could…

  1. Musical Sound, Instruments, and Equipment

    Photinos, Panos

    2017-12-01

    'Musical Sound, Instruments, and Equipment' offers a basic understanding of sound, musical instruments and music equipment, geared towards a general audience and non-science majors. The book begins with an introduction of the fundamental properties of sound waves, and the perception of the characteristics of sound. The relation between intensity and loudness, and the relation between frequency and pitch are discussed. The basics of propagation of sound waves, and the interaction of sound waves with objects and structures of various sizes are introduced. Standing waves, harmonics and resonance are explained in simple terms, using graphics that provide a visual understanding. The development is focused on musical instruments and acoustics. The construction of musical scales and the frequency relations are reviewed and applied in the description of musical instruments. The frequency spectrum of selected instruments is explored using freely available sound analysis software. Sound amplification and sound recording, including analog and digital approaches, are discussed in two separate chapters. The book concludes with a chapter on acoustics, the physical factors that affect the quality of the music experience, and practical ways to improve the acoustics at home or small recording studios. A brief technical section is provided at the end of each chapter, where the interested reader can find the relevant physics and sample calculations. These quantitative sections can be skipped without affecting the comprehension of the basic material. Questions are provided to test the reader's understanding of the material. Answers are given in the appendix.
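
    For example, the 12-tone equal-temperament scale construction the book reviews rests on a single relation: each semitone multiplies frequency by 2^(1/12). A minimal sketch follows; the A4 = 440 Hz reference is the usual convention, not taken from the book.

```python
# The 12-tone equal-temperament relation underlying scale construction:
# each semitone step multiplies frequency by 2**(1/12), so an octave
# (12 semitones) exactly doubles it. A4 = 440 Hz is the usual convention.
A4 = 440.0
names = ["A", "A#", "B", "C", "C#", "D", "D#", "E", "F", "F#", "G", "G#"]
for n, name in enumerate(names):
    print(f"{name:<2} {A4 * 2 ** (n / 12):7.2f} Hz")
print(f"A5 {A4 * 2:7.2f} Hz  (the octave doubles the frequency)")
```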

  2. Status of safeguards instrumentation

    Higinbotham, W.A.

    The International Atomic Energy Agency is performing safeguards at some nuclear power reactors, 50 bulk processing facilities, and 170 research facilities. Its verification activities require the use of instruments to measure nuclear materials and of surveillance instruments to maintain continuity of knowledge of the locations of nuclear materials. Instruments that are in use and under development to measure weight, volume, concentration, and isotopic composition of nuclear materials, and the major surveillance instruments, are described in connection with their uses at representative nuclear facilities. The current status of safeguards instrumentation and the needs for future development are discussed

  3. Early modern mathematical instruments.

    Bennett, Jim

    2011-12-01

    In considering the appropriate use of the terms "science" and "scientific instrument," tracing the history of "mathematical instruments" in the early modern period is offered as an illuminating alternative to the historian's natural instinct to follow the guiding lights of originality and innovation, even if the trail transgresses contemporary boundaries. The mathematical instrument was a well-defined category, shared across the academic, artisanal, and commercial aspects of instrumentation, and its narrative from the sixteenth to the eighteenth century was largely independent from other classes of device, in a period when a "scientific" instrument was unheard of.

  4. Communication: Wigner functions in action-angle variables, Bohr-Sommerfeld quantization, the Heisenberg correspondence principle, and a symmetrical quasi-classical approach to the full electronic density matrix

    Miller, William H.; Cotton, Stephen J.

    2016-01-01

    It is pointed out that the classical phase space distribution in action-angle (a-a) variables obtained from a Wigner function depends on how the calculation is carried out: if one computes the standard Wigner function in Cartesian variables (p, x), and then replaces p and x by their expressions in terms of a-a variables, one obtains a different result than if the Wigner function is computed directly in terms of the a-a variables. Furthermore, the latter procedure gives a result more consistent with classical and semiclassical theory—e.g., by incorporating the Bohr-Sommerfeld quantization condition (quantum states defined by integer values of the action variable) as well as the Heisenberg correspondence principle for matrix elements of an operator between such states—and has also been shown to be more accurate when applied to electronically non-adiabatic applications as implemented within the recently developed symmetrical quasi-classical (SQC) Meyer-Miller (MM) approach. Moreover, use of the Wigner function (obtained directly) in a-a variables shows how our standard SQC/MM approach can be used to obtain off-diagonal elements of the electronic density matrix by processing in a different way the same set of trajectories already used (in the SQC/MM methodology) to obtain the diagonal elements.
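
    For orientation, the standard Cartesian Wigner function (textbook convention) and the integer-action Bohr-Sommerfeld condition referred to above are:

```latex
% Standard Cartesian Wigner function (textbook convention), included for
% orientation; the paper's point is that transforming (p, x) into
% action-angle variables in this expression differs from computing the
% distribution directly in action-angle variables. The second relation is
% the Bohr-Sommerfeld condition with integer action, as the abstract states.
\[
W(x,p) \;=\; \frac{1}{2\pi\hbar}\int_{-\infty}^{\infty}
\psi^{*}\!\left(x+\tfrac{y}{2}\right)\,
\psi\!\left(x-\tfrac{y}{2}\right)\, e^{\,i p y/\hbar}\, dy ,
\qquad
N \;\equiv\; \frac{1}{2\pi\hbar}\oint p\,dx \;=\; n \in \mathbb{Z}.
\]
```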

  5. Communication: Wigner functions in action-angle variables, Bohr-Sommerfeld quantization, the Heisenberg correspondence principle, and a symmetrical quasi-classical approach to the full electronic density matrix.

    Miller, William H; Cotton, Stephen J

    2016-08-28

    It is pointed out that the classical phase space distribution in action-angle (a-a) variables obtained from a Wigner function depends on how the calculation is carried out: if one computes the standard Wigner function in Cartesian variables (p, x), and then replaces p and x by their expressions in terms of a-a variables, one obtains a different result than if the Wigner function is computed directly in terms of the a-a variables. Furthermore, the latter procedure gives a result more consistent with classical and semiclassical theory, e.g., by incorporating the Bohr-Sommerfeld quantization condition (quantum states defined by integer values of the action variable) as well as the Heisenberg correspondence principle for matrix elements of an operator between such states, and has also been shown to be more accurate when applied to electronically non-adiabatic applications as implemented within the recently developed symmetrical quasi-classical (SQC) Meyer-Miller (MM) approach. Moreover, use of the Wigner function (obtained directly) in a-a variables shows how our standard SQC/MM approach can be used to obtain off-diagonal elements of the electronic density matrix by processing in a different way the same set of trajectories already used (in the SQC/MM methodology) to obtain the diagonal elements.

  6. Communication: Wigner functions in action-angle variables, Bohr-Sommerfeld quantization, the Heisenberg correspondence principle, and a symmetrical quasi-classical approach to the full electronic density matrix

    Miller, William H., E-mail: millerwh@berkeley.edu; Cotton, Stephen J., E-mail: StephenJCotton47@gmail.com [Department of Chemistry and Kenneth S. Pitzer Center for Theoretical Chemistry, University of California, and Chemical Sciences Division, Lawrence Berkeley National Laboratory, Berkeley, California 94720 (United States)

    2016-08-28

    It is pointed out that the classical phase space distribution in action-angle (a-a) variables obtained from a Wigner function depends on how the calculation is carried out: if one computes the standard Wigner function in Cartesian variables (p, x), and then replaces p and x by their expressions in terms of a-a variables, one obtains a different result than if the Wigner function is computed directly in terms of the a-a variables. Furthermore, the latter procedure gives a result more consistent with classical and semiclassical theory—e.g., by incorporating the Bohr-Sommerfeld quantization condition (quantum states defined by integer values of the action variable) as well as the Heisenberg correspondence principle for matrix elements of an operator between such states—and has also been shown to be more accurate when applied to electronically non-adiabatic applications as implemented within the recently developed symmetrical quasi-classical (SQC) Meyer-Miller (MM) approach. Moreover, use of the Wigner function (obtained directly) in a-a variables shows how our standard SQC/MM approach can be used to obtain off-diagonal elements of the electronic density matrix by processing in a different way the same set of trajectories already used (in the SQC/MM methodology) to obtain the diagonal elements.

  7. Instrumentation a reader

    Pope, P

    1990-01-01

    This book contains a selection of papers and articles in instrumentation previously published in technical periodicals and journals of learned societies. Our selection has been made to illustrate aspects of current practice and applications of instrumentation. The book does not attempt to be encyclopaedic in its coverage of the subject, but to provide some examples of general transduction techniques, of the sensing of particular measurands, of components of instrumentation systems and of instrumentation practice in two very different environments, the food industry and the nuclear power industry. We have made the selection particularly to provide papers appropriate to the study of the Open University course T292 Instrumentation. The papers have been chosen so that the book covers a wide spectrum of instrumentation techniques. Because of this, the book should be of value not only to students of instrumentation, but also to practising engineers and scientists wishing to glean ideas from areas of instrumentation…

  8. Instrumentation for Nuclear Applications

    1998-01-01

    The objective of this project was to develop and coordinate nuclear instrumentation standards with resulting economies for the nuclear and radiation fields. There was particular emphasis on coordination and management of the Nuclear Instrument Module (NIM) System, U.S. activity involving the CAMAC international standard dataway system, the FASTBUS modular high-speed data acquisition and control system and processing and management of national nuclear instrumentation and detector standards, as well as a modest amount of assistance and consultation services to the Pollutant Characterization and Safety Research Division of the Office of Health and Environmental Research. The principal accomplishments were the development and maintenance of the NIM instrumentation system that is the predominant instrumentation system in the nuclear and radiation fields worldwide, the CAMAC digital interface system in coordination with the ESONE Committee of European Laboratories, the FASTBUS high-speed system and numerous national and international nuclear instrumentation standards

  9. VIRUS instrument enclosures

    Prochaska, T.; Allen, R.; Mondrik, N.; Rheault, J. P.; Sauseda, M.; Boster, E.; James, M.; Rodriguez-Patino, M.; Torres, G.; Ham, J.; Cook, E.; Baker, D.; DePoy, Darren L.; Marshall, Jennifer L.; Hill, G. J.; Perry, D.; Savage, R. D.; Good, J. M.; Vattiat, Brian L.

    2014-08-01

    The Visible Integral-Field Replicable Unit Spectrograph (VIRUS) instrument will be installed at the Hobby-Eberly Telescope in the near future. The instrument will be housed in two enclosures that are mounted adjacent to the telescope, via the VIRUS Support Structure (VSS). We have designed the enclosures to support and protect the instrument, to enable servicing of the instrument, and to cool the instrument appropriately while not adversely affecting the dome environment. The system uses simple HVAC air handling techniques in conjunction with thermoelectric and standard glycol heat exchangers to provide efficient heat removal. The enclosures also provide power and data transfer to and from each VIRUS unit, liquid nitrogen cooling to the detectors, and environmental monitoring of the instrument and dome environments. In this paper, we describe the design and fabrication of the VIRUS enclosures and their subsystems.

  10. Work-nonwork interference: Preliminary results on the psychometric properties of a new instrument

    Eileen Koekemoer

    2010-11-01

    Research purpose: The objectives of this study were to investigate the internal validity (construct, discriminant and convergent validity), reliability and external validity (relationships with theoretically relevant variables, including job characteristics, home characteristics, burnout, ill health and life satisfaction) of the instrument. Motivation for the study: Work-family interaction is a key topic receiving significant research attention. In order to facilitate comparison across work-family studies, the use of psychometrically sound instruments is of great importance. Research design, approach and method: A cross-sectional survey design was used for the target population of married employees with children working at a tertiary institution in the North West province (n = 366). In addition to the new instrument, job characteristics, home characteristics, burnout, ill health and life satisfaction were measured. Main findings: The results provided evidence for construct, discriminant and convergent validity, reliability and significant relations with external variables. Practical/managerial implications: The new instrument can be used by researchers and managers, as a test under development, to investigate the interference between work and different nonwork roles (i.e. parental role, spousal role, work role, domestic role) and specific relations with antecedents (e.g. job/home characteristics) and well-being (e.g. burnout, ill health and life satisfaction). Contribution/value-add: This study provides preliminary information on the psychometric properties of a new instrument that measures the interference between work and nonwork.

  11. Radiation protection instrument 1993

    1993-04-01

    The Radiation Protection Instrument, 1993 (Legislative Instrument 1559) prescribes the powers and functions of the Radiation Protection Board established under the Ghana Atomic Energy Commission by the Atomic Energy Commission (Amendment) Law, 1993 (P.N.D.C. Law 308). Also included in the Legislative Instrument are schedules on control and use of ionising radiation and radiation sources as well as procedures for notification, licensing and inspection of ionising radiation facilities. (EAA)

  12. Networked Instrumentation Element

    National Aeronautics and Space Administration — Armstrong researchers have developed a networked instrumentation system that connects modern experimental payloads to existing analog and digital communications...

  13. Instrument validation project

    Reynolds, B.A.; Daymo, E.A.; Geeting, J.G.H.; Zhang, J.

    1996-06-01

    Westinghouse Hanford Company Project W-211 is responsible for providing the system capabilities to remove radioactive waste from ten double-shell tanks used to store radioactive wastes on the Hanford Site in Richland, Washington. The project is also responsible for measuring tank waste slurry properties prior to injection into pipeline systems, including the Replacement of Cross-Site Transfer System. This report summarizes studies of the appropriateness of the instrumentation specified for use in Project W-211. The instruments were evaluated in a test loop with simulated slurries that covered the range of properties specified in the functional design criteria. The results of the study indicate that the compact nature of the baseline Project W-211 loop does not reduce instrument accuracy through poor flow profile development. Of the baseline instrumentation, the Micromotion densimeter, the Moore Industries thermocouple, the Fischer and Porter magnetic flow meter, and the Red Valve pressure transducer meet the desired instrument accuracy. An alternate magnetic flow meter (Yokogawa) gave nearly identical results to the baseline Fischer and Porter. The Micromotion flow meter did not meet the desired instrument accuracy but could potentially be calibrated so that it would meet the criteria. The Nametre on-line viscometer did not meet the desired instrument accuracy and is not recommended as a quantitative instrument, although it does provide qualitative information. The recommended minimum set of instrumentation necessary to ensure the slurry meets the Project W-058 acceptance criteria is the Micromotion mass flow meter and delta pressure cells.

  14. Instrument performance evaluation

    Swinth, K.L.

    1993-03-01

    Deficiencies exist in both the performance and the quality of health physics instruments. Recognizing the implications of such deficiencies for the protection of workers and the public, in the early 1980s the DOE and the NRC encouraged the development of a performance standard and established a program to test a series of instruments against criteria in the standard. The purpose of the testing was to establish the practicality of the criteria in the standard, to determine the performance of a cross section of available instruments, and to establish a testing capability. Over 100 instruments were tested, resulting in a practical standard and an understanding of the deficiencies in available instruments. In parallel with the instrument testing, a value-impact study clearly established the benefits of implementing a formal testing program. An ad hoc committee also met several times to establish recommendations for the voluntary implementation of a testing program based on the studies and the performance standard. For several reasons, a formal program did not materialize. Ongoing tests and studies have supported the development of specific instruments and have helped specific clients understand the performance of their instruments. The purpose of this presentation is to trace the history of instrument testing to date and suggest the benefits of a centralized formal program

  15. [Controlling instruments in radiology].

    Maurer, M

    2013-10-01

    Due to the rising costs and competitive pressures radiological clinics and practices are now facing, controlling instruments are gaining importance in the optimization of structures and processes of the various diagnostic examinations and interventional procedures. It will be shown how the use of selected controlling instruments can secure and improve the performance of radiological facilities. A definition of the concept of controlling will be provided. It will be shown which controlling instruments can be applied in radiological departments and practices. As an example, two of the controlling instruments, material cost analysis and benchmarking, will be illustrated.

  16. Ocean Optics Instrumentation Systems

    Federal Laboratory Consortium — FUNCTION: Provides instrumentation suites for a wide variety of measurements to characterize the ocean’s optical environment. These packages have been developed to...

  17. Contribution to the discussion of P.M. Fayers and David J. Hand: Causal variables, indicator variables and measurement scales: an example from quality of life

    Keiding, Niels

    2002-01-01

    Causal variables; Clinimetric scales; Composite scales; Construct validity; Measurement scales; Multi-item scales; Quality-of-life instruments

  18. Pulsating variables

    1989-01-01

    The study of stellar pulsations is a major route to the understanding of stellar structure and evolution. At the South African Astronomical Observatory (SAAO) the following stellar pulsation studies were undertaken: rapidly oscillating Ap stars; solar-like oscillations in stars; δ Scuti-type variability in a classical Am star; Beta Cephei variables; a pulsating white dwarf and its companion; RR Lyrae variables and galactic Cepheids. 4 figs

  19. Transforming Image-Objects into Multiscale Fields: A GEOBIA Approach to Mitigate Urban Microclimatic Variability within H-Res Thermal Infrared Airborne Flight-Lines

    Mir Mustafizur Rahman

    2014-10-01

    Full Text Available In an effort to minimize complex urban microclimatic variability within high-resolution (H-Res) airborne thermal infrared (TIR) flight-lines, we describe the Thermal Urban Road Normalization (TURN) algorithm, which is based on the idea of pseudo-invariant features. By assuming a homogeneous road temperature within a TIR scene, we hypothesize that any variation observed in road temperature is the effect of local microclimatic variability. To model microclimatic variability, we define a road-object class (Road), compute the within-Road temperature variability, sample it at different spatial intervals (i.e., 10, 20, 50, and 100 m), then interpolate the samples over each flight-line to create an object-weighted variable temperature field (a TURN-surface). The optimal TURN-surface is then subtracted from the original TIR image, essentially creating a microclimate-free scene. Results at different sampling intervals are assessed based on their (i) ability to visually and statistically reduce overall scene variability and (ii) computation speed. TURN is evaluated on three non-adjacent TABI-1800 flight-lines (~182 km²) that were acquired in 2012 at night over The City of Calgary, Alberta, Canada. TURN also meets a recent GEOBIA (Geospatial Object-Based Image Analysis) challenge by incorporating existing GIS vector objects within the GEOBIA workflow, rather than relying exclusively on segmentation methods.
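
    A minimal sketch of the TURN workflow on synthetic data (my reconstruction, not the authors' code): sample road-pixel temperatures at a fixed interval, interpolate a TURN-surface over the scene grid, and subtract it.

```python
# Minimal sketch of the TURN idea (a reconstruction, not the authors' code):
# sample within-road temperature at fixed intervals, interpolate a smooth
# microclimate surface over the scene, and subtract it from the TIR image.
import numpy as np
from scipy.interpolate import griddata

rng = np.random.default_rng(3)
tir = rng.normal(15.0, 1.0, size=(200, 300))            # synthetic TIR scene (deg C)
road_mask = np.zeros_like(tir, dtype=bool)
road_mask[95:105, :] = True                              # a hypothetical road object

# Sample road-pixel temperatures every `step` pixels (stand-in for 10-100 m).
step = 20
rows, cols = np.nonzero(road_mask)
samples = np.arange(0, rows.size, step)
pts = np.column_stack([rows[samples], cols[samples]])
vals = tir[rows[samples], cols[samples]]

# Interpolate the TURN-surface over the whole flight-line grid; fill pixels
# outside the interpolation hull with the mean sampled road temperature.
gy, gx = np.mgrid[0:tir.shape[0], 0:tir.shape[1]]
turn_surface = griddata(pts, vals, (gy, gx), method="linear")
turn_surface = np.where(np.isnan(turn_surface), vals.mean(), turn_surface)

corrected = tir - turn_surface                           # microclimate-reduced scene
print("scene std before/after:", tir.std().round(3), corrected.std().round(3))
```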

  20. Overview of LOFT instrumentation

    Bixby, W.W.

    1979-01-01

    A description of instrumentation used in the Loss-of-Fluid Test (LOFT) large break Loss-of-Coolant Experiments is presented. Emphasis is placed on hydraulic and thermal measurements in the primary system piping and components, reactor vessel, and pressure suppression system. In addition, instrumentation which is being considered for measurement of phenomena during future small break testing is discussed