WorldWideScience

Sample records for variable importance measures

  1. Uncertainty importance measure for models with correlated normal variables

    International Nuclear Information System (INIS)

    Hao, Wenrui; Lu, Zhenzhou; Wei, Pengfei

    2013-01-01

To explore the contributions of correlated input variables to the variance of the model output, the contribution decomposition of correlated inputs based on Mara's definition is investigated in detail. Taking a quadratic polynomial output without cross terms as an illustration, the solution of the contribution decomposition is derived analytically using statistical inference theory. After the correctness of the analytical solutions is validated on numerical examples, they are applied to two engineering examples to demonstrate their wide applicability. The derived analytical solutions can be used directly to identify the contributions of correlated inputs for quadratic or linear polynomial outputs without cross terms, and the analytical inference method can be extended to higher-order polynomial outputs. Additionally, the origins of the interaction contribution of the correlated inputs are analyzed, and the existing contribution indices are compared, so that the engineer can select suitable indices to obtain the necessary information. Finally, the degeneration of correlated inputs to uncorrelated ones and some computational issues are discussed conceptually
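A minimal Monte Carlo sketch of the decomposition this record describes, using a linear (rather than quadratic) output and a hypothetical two-input correlated normal model: each input's full contribution to Var(Y), including the share it carries through correlation, is a_i Cov(X_i, Y), and these contributions sum exactly to the output variance. The coefficients and correlation below are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(6)

# Hypothetical linear model y = a1*x1 + a2*x2 with correlated normal inputs
# (a stand-in for the paper's quadratic-without-cross-term case).
a = np.array([2.0, 1.0])
rho = 0.6
cov = np.array([[1.0, rho], [rho, 1.0]])
X = rng.multivariate_normal([0.0, 0.0], cov, size=500_000)
y = X @ a

# Each input's total contribution to Var(Y), including the part it
# carries through correlation: C_i = a_i * Cov(X_i, Y).
contrib = a * np.array([np.cov(X[:, i], y)[0, 1] for i in range(2)])
total = contrib.sum()
```

Analytically, Var(Y) = a'Σa = 7.4 here, with contributions 5.2 and 2.2; the Monte Carlo estimates recover both, confirming that the contributions sum to the variance even under correlation.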

  2. Bias in random forest variable importance measures: Illustrations, sources and a solution

    Directory of Open Access Journals (Sweden)

    Hothorn Torsten

    2007-01-01

Background: Variable importance measures for random forests have been receiving increased attention as a means of variable selection in many classification tasks in bioinformatics and related scientific fields, for instance to select a subset of genetic markers relevant for the prediction of a certain disease. We show that random forest variable importance measures are a sensible means for variable selection in many applications, but are not reliable in situations where potential predictor variables vary in their scale of measurement or their number of categories. This is particularly important in genomics and computational biology, where predictors often include variables of different types, for example when predictors include both sequence data and continuous variables such as folding energy, or when amino acid sequence data show different numbers of categories. Results: Simulation studies are presented illustrating that, when random forest variable importance measures are used with data of varying types, the results are misleading because suboptimal predictor variables may be artificially preferred in variable selection. The two mechanisms underlying this deficiency are biased variable selection in the individual classification trees used to build the random forest on the one hand, and effects induced by bootstrap sampling with replacement on the other. Conclusion: We propose to employ an alternative implementation of random forests that provides unbiased variable selection in the individual classification trees. When this method is applied using subsampling without replacement, the resulting variable importance measures can be used reliably for variable selection even in situations where the potential predictor variables vary in their scale of measurement or their number of categories. The usage of both random forest algorithms and their variable importance measures in the R system for statistical computing is illustrated and
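A stripped-down sketch of a permutation variable importance measure of the kind discussed in this record, with a fixed stand-in rule in place of a fitted random forest (the data and "model" are illustrative only): the importance of a predictor is the accuracy drop after permuting its column.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: y depends on column 0 only; column 1 is pure noise.
X = rng.normal(size=(500, 2))
y = (X[:, 0] > 0).astype(int)

# Stand-in "model": a fixed threshold rule on column 0
# (in practice this would be a fitted random forest).
def predict(X):
    return (X[:, 0] > 0).astype(int)

def permutation_importance(predict, X, y, col, rng):
    # Importance = accuracy drop after permuting one predictor column.
    base_acc = np.mean(predict(X) == y)
    Xp = X.copy()
    Xp[:, col] = rng.permutation(Xp[:, col])
    return base_acc - np.mean(predict(Xp) == y)

imp0 = permutation_importance(predict, X, y, 0, rng)  # informative column
imp1 = permutation_importance(predict, X, y, 1, rng)  # noise column
```

The informative column loses roughly half its accuracy when permuted, while the noise column's importance is exactly zero because the rule never consults it.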

  3. Importance measures

    International Nuclear Information System (INIS)

    Gomez Cobo, A.

    1997-01-01

The presentation discusses the following: general concepts of importance measures; an example fault tree used to illustrate importance measures; Birnbaum's structural importance; criticality importance; Fussell-Vesely importance; the upgrading function; risk achievement worth; risk reduction worth
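The listed measures can be illustrated on a hypothetical two-cut-set fault tree, TOP = A OR (B AND C); the formulas below are the standard textbook definitions, not material from the presentation itself, and the probabilities are made up.

```python
# Hypothetical fault tree: TOP = A OR (B AND C), independent basic events.
def p_top(pA, pB, pC):
    # Exact top-event probability by inclusion-exclusion over
    # the minimal cut sets {A} and {B, C}.
    return pA + pB * pC - pA * pB * pC

pA, pB, pC = 0.01, 0.05, 0.10
base = p_top(pA, pB, pC)

# Birnbaum's (marginal) importance of A: dP/dpA.
birnbaum_A = p_top(1.0, pB, pC) - p_top(0.0, pB, pC)
# Criticality importance: Birnbaum scaled by pA / P(TOP).
criticality_A = birnbaum_A * pA / base
# Fussell-Vesely: share of risk from cut sets containing A
# ({A} is the only such cut set here).
fv_A = pA / base
# Risk achievement worth (A assumed failed) and
# risk reduction worth (A assumed perfect).
raw_A = p_top(1.0, pB, pC) / base
rrw_A = base / p_top(0.0, pB, pC)
```

Because {A} is itself a minimal cut set, setting A failed drives the top-event probability to 1, so its risk achievement worth (1/base) is very large, which is the kind of asymmetry these measures are designed to expose.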

  4. The behaviour of random forest permutation-based variable importance measures under predictor correlation.

    Science.gov (United States)

    Nicodemus, Kristin K; Malley, James D; Strobl, Carolin; Ziegler, Andreas

    2010-02-27

Random forests (RF) have been increasingly used in applications such as genome-wide association and microarray studies where predictor correlation is frequently observed. Recent works on permutation-based variable importance measures (VIMs) used in RF have come to apparently contradictory conclusions. We present an extended simulation study to synthesize results. In the case when both predictor correlation was present and predictors were associated with the outcome (HA), the unconditional RF VIM attributed a higher share of importance to correlated predictors, while under the null hypothesis that no predictors are associated with the outcome (H0) the unconditional RF VIM was unbiased. Conditional VIMs showed a decrease in VIM values for correlated predictors versus the unconditional VIMs under HA and were unbiased under H0. Scaled VIMs were clearly biased under both HA and H0. Unconditional unscaled VIMs are a computationally tractable choice for large datasets and are unbiased under the null hypothesis. Whether the observed increased VIMs for correlated predictors should be considered a "bias" (because they do not directly reflect the coefficients in the generating model) or a beneficial attribute of these VIMs depends on the application. For example, in genetic association studies, where correlation between markers may help to localize the functionally relevant variant, the increased importance of correlated predictors may be an advantage. On the other hand, we show examples where this increased importance may result in spurious signals.

  5. An AUC-based permutation variable importance measure for random forests.

    Science.gov (United States)

    Janitza, Silke; Strobl, Carolin; Boulesteix, Anne-Laure

    2013-04-05

The random forest (RF) method is a commonly used tool for classification with high-dimensional data as well as for ranking candidate predictors based on the so-called random forest variable importance measures (VIMs). However, the classification performance of RF is known to be suboptimal in case of strongly unbalanced data, i.e. data where response class sizes differ considerably. Suggestions have been made to obtain better classification performance based either on sampling procedures or on cost sensitivity analyses. However, to our knowledge the performance of the VIMs has not yet been examined in the case of unbalanced response classes. In this paper we explore the performance of the permutation VIM for unbalanced data settings and introduce an alternative permutation VIM based on the area under the curve (AUC) that is expected to be more robust towards class imbalance. We investigated the performance of the standard permutation VIM and of our novel AUC-based permutation VIM for different class imbalance levels using simulated data and real data. The results suggest that the new AUC-based permutation VIM outperforms the standard permutation VIM for unbalanced data settings, while both permutation VIMs have equal performance for balanced data settings. The standard permutation VIM loses its ability to discriminate between associated predictors and predictors not associated with the response as class imbalance increases. It is outperformed by our new AUC-based permutation VIM for unbalanced data settings, while the performance of both VIMs is very similar in the case of balanced classes. The new AUC-based VIM is implemented in the R package party for the unbiased RF variant based on conditional inference trees. The codes implementing our study are available from the companion website: http://www.ibe.med.uni-muenchen.de/organisation/mitarbeiter/070_drittmittel/janitza/index.html.
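A rough sketch of the AUC-based permutation VIM idea on unbalanced toy data, with a plain continuous score standing in for a fitted random forest (all names and data here are illustrative, not the authors' implementation): the importance of a predictor is the AUC drop after permuting it, which keeps the minority class influential even at strong imbalance.

```python
import numpy as np

def auc(scores, y):
    # Mann-Whitney estimate of the area under the ROC curve.
    pos = scores[y == 1]
    neg = scores[y == 0]
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (len(pos) * len(neg))

rng = np.random.default_rng(1)
# Unbalanced toy data: roughly 5% positives, signal in column 0 only.
n = 1000
y = (rng.random(n) < 0.05).astype(int)
X = rng.normal(size=(n, 2))
X[:, 0] += 2.0 * y                 # informative predictor
score = lambda X: X[:, 0]          # stand-in continuous "model score"

def auc_permutation_vim(score, X, y, col, rng):
    # Importance = AUC drop after permuting one predictor column.
    base = auc(score(X), y)
    Xp = X.copy()
    Xp[:, col] = rng.permutation(Xp[:, col])
    return base - auc(score(Xp), y)

vim0 = auc_permutation_vim(score, X, y, 0, rng)  # informative column
vim1 = auc_permutation_vim(score, X, y, 1, rng)  # noise column
```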

  6. Measuring lip force by oral screens. Part 1: Importance of screen size and individual variability.

    Science.gov (United States)

    Wertsén, Madeleine; Stenberg, Manne

    2017-06-01

To reduce drooling and facilitate food transport in rehabilitation of patients with oral motor dysfunction, lip force can be trained using an oral screen. Longitudinal studies evaluating the effect of training require objective methods. The aim of this study was to evaluate a method for measuring lip strength, to investigate normal values and fluctuation of lip force in healthy adults on a single occasion and over time, to study how the size of the screen affects the measured force, to evaluate the most appropriate measure of reliability, and to examine lip force in relation to gender. Three different sizes of oral screens were used to measure the lip force of 24 healthy adults on 3 different occasions over a period of 6 months, using an apparatus based on a strain gauge. The maximum lip force as evaluated with this method depends on the projected area of the screen. By calculating the projected area of the screen, the lip force could be normalized to an oral screen pressure expressed in kPa, which can be used for comparing measurements from screens of different sizes. Both the mean value and standard deviation were shown to vary between individuals. The study showed no differences regarding gender and only small variation with age. Normal variation over time (months) may be up to 3 times greater than the standard error of measurement on a given occasion. The lip force increases in relation to the projected area of the screen. No general standard deviation can be assigned to the method, and all measurements should be analyzed individually based on oral screen pressure to compensate for different screen sizes.
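The normalization described here amounts to dividing the measured force by the projected screen area to obtain a pressure in kPa; a sketch with made-up numbers (the specific force and area are hypothetical, not values from the study):

```python
# Hypothetical reading: 12 N maximum pull on a screen whose
# projected (frontal) area is 15 cm^2.
force_n = 12.0
projected_area_cm2 = 15.0

# Pressure = force / projected area, converted to kPa so that
# measurements from different screen sizes are comparable.
area_m2 = projected_area_cm2 * 1e-4
pressure_kpa = force_n / area_m2 / 1000.0   # Pa -> kPa
```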

  7. Permutation importance: a corrected feature importance measure.

    Science.gov (United States)

    Altmann, André; Toloşi, Laura; Sander, Oliver; Lengauer, Thomas

    2010-05-15

In life sciences, interpretability of machine learning models is as important as their prediction accuracy. Linear models are probably the most frequently used methods for assessing feature relevance, despite their relative inflexibility. However, in recent years effective estimators of feature relevance have been derived for highly complex or non-parametric models such as support vector machines and RandomForest (RF) models. Recently, it has been observed that RF models are biased in such a way that categorical variables with a large number of categories are preferred. In this work, we introduce a heuristic for normalizing feature importance measures that can correct the feature importance bias. The method is based on repeated permutations of the outcome vector for estimating the distribution of measured importance for each variable in a non-informative setting. The P-value of the observed importance provides a corrected measure of feature importance. We apply our method to simulated data and demonstrate that (i) non-informative predictors do not receive significant P-values, (ii) informative variables can successfully be recovered among non-informative variables and (iii) P-values computed with permutation importance (PIMP) are very helpful for deciding the significance of variables, and therefore improve model interpretability. Furthermore, PIMP was used to correct RF-based importance measures for two real-world case studies. We propose an improved RF model that uses the significant variables with respect to the PIMP measure and show that its prediction accuracy is superior to that of other existing models. R code for the method presented in this article is available at http://www.mpi-inf.mpg.de/~altmann/download/PIMP.R. Contact: altmann@mpi-inf.mpg.de, laura.tolosi@mpi-inf.mpg.de. Supplementary data are available at Bioinformatics online.
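A compact sketch of the PIMP recipe, with a simple correlation-based importance standing in for a random forest importance (the permutation procedure, not the stand-in importance, is what the paper proposes): permute the outcome vector repeatedly to build a null distribution of importances, then report a P-value per variable.

```python
import numpy as np

rng = np.random.default_rng(2)

# Stand-in importance function: absolute correlation of each column
# with y (a real application would use an RF variable importance).
def importance(X, y):
    Xc = X - X.mean(0)
    yc = y - y.mean()
    return np.abs(Xc.T @ yc) / (np.linalg.norm(Xc, axis=0) * np.linalg.norm(yc))

X = rng.normal(size=(200, 3))
y = X[:, 0] + 0.1 * rng.normal(size=200)   # only column 0 is informative

obs = importance(X, y)

# PIMP: permute the outcome to estimate the null distribution of the
# importance for each variable, then report a permutation P-value.
B = 200
null = np.array([importance(X, rng.permutation(y)) for _ in range(B)])
pvals = (1 + (null >= obs).sum(axis=0)) / (B + 1)
```

The informative column's observed importance far exceeds every null draw, so it receives the smallest attainable P-value, while the noise columns get unremarkable ones.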

  8. Variable importance in latent variable regression models

    NARCIS (Netherlands)

    Kvalheim, O.M.; Arneberg, R.; Bleie, O.; Rajalahti, T.; Smilde, A.K.; Westerhuis, J.A.

    2014-01-01

    The quality and practical usefulness of a regression model are a function of both interpretability and prediction performance. This work presents some new graphical tools for improved interpretation of latent variable regression models that can also assist in improved algorithms for variable

  9. Observing Two Important Teaching Variables.

    Science.gov (United States)

    Gustafson, John A.

    1986-01-01

    Two behaviors essential to good teaching, teacher expectation and teacher flexibility, have been incorporated into the observation system used in the student teacher program at the University of New Mexico. The importance of these behaviors in teaching and in evaluating student teachers is discussed. (MT)

  10. Importance measures and resource allocation

    International Nuclear Information System (INIS)

    Guey, C.N.; Morgan, T.; Hughes, E.A.

    1987-01-01

This paper discusses various importance measures and their practical relevance to allocating resources. The characteristics of importance measures are illustrated through simple examples. Important factors associated with effectively allocating resources to improve plant system performance or to prevent system degradation are discussed. It is concluded that importance measures are only indicative of, and not equal to, the risk significance of a component, system, or event. A decision framework is suggested to provide a comprehensive basis for resource allocation

  11. Uses of risk importance measures

    International Nuclear Information System (INIS)

    Mankamo, T.; Poern, K.; Holmberg, J.

    1991-05-01

Risk importance measures provide an understandable and practical way of presenting probabilistic safety analysis results, which too often remain abstract numbers offering little real insight. The report clarifies the definitions, relationships and interpretations of the three most basic measures: risk increase factor, risk decrease factor, and fractional contribution. These three measures already cover the main types of risk importance measures; many other importance measures presented in the literature are close variants of them. In many cases they are related such that, for a given technical system, the other two measures can be derived from the one calculated first. However, their practical interpretations differ, and hence each of the three measures has its own uses and justification. The fundamental aspect of importance measures is that they express some specific influence of a basic event on the total risk. The basic failure or error events are the elements from which the reliability and risk models are constituted. The importance measures are relative, which is an advantage over absolute risk numbers because of their insensitivity to quantification uncertainties. They are therefore particularly suited to giving first-hand guidance on where to focus attention from the system's risk and reliability point of view, and on where to continue the analysis with more sophisticated methods requiring more effort

  12. Infrared Measurement Variability Analysis.

    Science.gov (United States)

    1980-09-01

[The abstract of this record survives only as garbled equation fragments. Recoverable content: the report develops expressions for the blackbody irradiance W(λ, T + ΔT) at the collecting optics of the measurement system, integrated over the 3.5 μm to 4.0 μm band, and examines the potential for noise reduction by identifying and reducing contributing system effects on the measurement variance of an infinite population of possible measurements.]

  13. A Framework for Categorizing Important Project Variables

    Science.gov (United States)

    Parsons, Vickie S.

    2003-01-01

    While substantial research has led to theories concerning the variables that affect project success, no universal set of such variables has been acknowledged as the standard. The identification of a specific set of controllable variables is needed to minimize project failure. Much has been hypothesized about the need to match project controls and management processes to individual projects in order to increase the chance for success. However, an accepted taxonomy for facilitating this matching process does not exist. This paper surveyed existing literature on classification of project variables. After an analysis of those proposals, a simplified categorization is offered to encourage further research.

  14. A new uncertainty importance measure

    International Nuclear Information System (INIS)

    Borgonovo, E.

    2007-01-01

Uncertainty in parameters is present in many risk assessment problems and leads to uncertainty in model predictions. In this work, we introduce a global sensitivity indicator which looks at the influence of input uncertainty on the entire output distribution without reference to a specific moment of the output (moment independence) and which can also be defined in the presence of correlations among the parameters. We discuss its mathematical properties and highlight the differences between the present indicator, variance-based uncertainty importance measures and a moment-independent sensitivity indicator previously introduced in the literature. Numerical results are discussed with application to the probabilistic risk assessment model on which Iman [A matrix-based approach to uncertainty and sensitivity analysis for fault trees. Risk Anal 1987;7(1):22-33] first introduced uncertainty importance measures
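A histogram-based sketch of a moment-independent importance estimate in the spirit of this record (an illustrative estimator, not the author's algorithm): compare the unconditional output density with densities conditional on slices of one input, and average half the L1 distance between them.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy model Y = X1 + X2 with independent standard normal inputs,
# plus an inert input X3 that does not enter the model at all.
n = 200_000
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
x3 = rng.normal(size=n)
y = x1 + x2

def delta(x, y, n_cond=20, n_bins=60):
    """Histogram estimate of a moment-independent importance of x:
    half the average L1 distance between the unconditional density
    of Y and its density conditional on quantile slices of x."""
    edges = np.linspace(y.min(), y.max(), n_bins + 1)
    width = edges[1] - edges[0]
    f_y, _ = np.histogram(y, bins=edges, density=True)
    q = np.quantile(x, np.linspace(0.0, 1.0, n_cond + 1))
    total = 0.0
    for lo, hi in zip(q[:-1], q[1:]):
        sel = y[(x >= lo) & (x <= hi)]
        f_cond, _ = np.histogram(sel, bins=edges, density=True)
        total += np.abs(f_cond - f_y).sum() * width
    return 0.5 * total / n_cond

d1 = delta(x1, y)   # influential input: shifts the conditional density
d3 = delta(x3, y)   # inert input: conditional densities match the marginal
```

Knowing X1 visibly shifts the output distribution, so d1 is substantially larger than d3, which reflects only histogram estimation noise.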

  15. Increasing importance of precipitation variability on global livestock grazing lands

    Science.gov (United States)

    Sloat, Lindsey L.; Gerber, James S.; Samberg, Leah H.; Smith, William K.; Herrero, Mario; Ferreira, Laerte G.; Godde, Cécile M.; West, Paul C.

    2018-03-01

Pastures and rangelands underpin global meat and milk production and are a critical resource for millions of people dependent on livestock for food security [1,2]. Forage growth, which is highly climate dependent [3,4], is potentially vulnerable to climate change, although precisely where and to what extent remains relatively unexplored. In this study, we assess climate-based threats to global pastures, with a specific focus on changes in within- and between-year precipitation variability (precipitation concentration index (PCI) and coefficient of variation of precipitation (CVP), respectively). Relating global satellite measures of vegetation greenness (such as the Normalized Difference Vegetation Index; NDVI) to key climatic factors reveals that CVP is a significant, yet often overlooked, constraint on vegetation productivity across global pastures. Using independent stocking data, we found that areas with high CVP support lower livestock densities than less-variable regions. Globally, pastures experience about a 25% greater year-to-year precipitation variation (CVP = 0.27) than the average global land surface area (0.21). Over the past century, CVP has generally increased across pasture areas, although both positive (49% of pasture area) and negative (31% of pasture area) trends exist. We identify regions in which livestock grazing is important for local food access and economies, and discuss the potential for pasture intensification in the context of long-term regional trends in precipitation variability.
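The two variability metrics named here are straightforward to compute; a sketch with hypothetical precipitation data (the PCI formula follows Oliver's concentration index as commonly stated in the literature, which is an assumption here, not something the abstract spells out):

```python
import numpy as np

# Hypothetical annual precipitation totals (mm) for one grid cell.
annual = np.array([610.0, 480.0, 720.0, 390.0, 550.0, 830.0, 460.0, 700.0])

# Between-year variability: coefficient of variation of precipitation (CVP).
cvp = annual.std(ddof=1) / annual.mean()

# Within-year variability: precipitation concentration index (PCI) from
# twelve monthly totals; perfectly even rainfall gives 100/12 (about 8.3).
monthly = np.full(12, 50.0)
pci = 100.0 * (monthly ** 2).sum() / monthly.sum() ** 2
```

The made-up series above gives a CVP near the 0.27 the abstract reports as the pasture-area average, and the uniform monthly series hits the PCI lower bound, the "no concentration" reference point.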

  16. Importance analysis for models with correlated variables and its sparse grid solution

    International Nuclear Information System (INIS)

    Li, Luyi; Lu, Zhenzhou

    2013-01-01

    For structural models involving correlated input variables, a novel interpretation for variance-based importance measures is proposed based on the contribution of the correlated input variables to the variance of the model output. After the novel interpretation of the variance-based importance measures is compared with the existing ones, two solutions of the variance-based importance measures of the correlated input variables are built on the sparse grid numerical integration (SGI): double-loop nested sparse grid integration (DSGI) method and single loop sparse grid integration (SSGI) method. The DSGI method solves the importance measure by decreasing the dimensionality of the input variables procedurally, while SSGI method performs importance analysis through extending the dimensionality of the inputs. Both of them can make full use of the advantages of the SGI, and are well tailored for different situations. By analyzing the results of several numerical and engineering examples, it is found that the novel proposed interpretation about the importance measures of the correlated input variables is reasonable, and the proposed methods for solving importance measures are efficient and accurate. -- Highlights: •The contribution of correlated variables to the variance of the output is analyzed. •A novel interpretation for variance-based indices of correlated variables is proposed. •Two solutions for variance-based importance measures of correlated variables are built

  17. A comparison of RAM importance measures

    International Nuclear Information System (INIS)

    Atwood, C.L.; Wolford, A.J.; Wright, R.E.

    1989-01-01

In this paper, measures of importance of components and cut sets of a system are reviewed. The measures considered are based on reliability, availability, and maintainability, the three elements of the acronym RAM. They follow the approaches of Fussell and Vesely and of Birnbaum. A new Birnbaum-type unmaintainability importance measure is proposed. The measures are compared in a simple example, and the appropriate use of unmaintainability importance is discussed

  18. Reliability importance measures and their calculation

    International Nuclear Information System (INIS)

    Andsten, R.; Vaurio, J.K.

    1989-01-01

The importance of a component to the system reliability or availability, and to the system failure rate, can be quantified by a number of importance measures. Such measures can be used to guide system design improvements as well as diagnostic and repair actions. This report develops relationships between several importance measures, illustrates their meaning with interpretations and applications, and describes the computer program IMPO, which calculates importance measures when the system minimal cut sets and component parameters are given. A user's manual with illustrative examples is included

  19. Interpreting Multiple Linear Regression: A Guidebook of Variable Importance

    Science.gov (United States)

    Nathans, Laura L.; Oswald, Frederick L.; Nimon, Kim

    2012-01-01

Multiple regression (MR) analyses are commonly employed in social science fields. Interpretation of results, however, typically reflects an overreliance on beta weights, often resulting in very limited interpretations of variable importance. It appears that few researchers employ other methods to obtain a fuller understanding of what…

  20. Measures of risk importance and their applications

    International Nuclear Information System (INIS)

    Vesely, W.E.; Davis, T.C.; Denning, R.S.; Saltos, N.

    1983-07-01

This work is part of a project being conducted for the Division of Risk Analysis (DRA) of the Nuclear Regulatory Commission (NRC). The objectives of the project are to evaluate the importances of containment, the different safety functions, and other various contributors as assessed in probabilistic risk analyses, and to identify generic conclusions regarding the importances. Effective display of the importances is an important part of these objectives. To address these objectives, measures of risk importance must first be identified and then evaluated for the different risk analyses that have been performed. This report describes the risk importance measures that were defined and applied to the risk analyses performed as part of the Reactor Safety Study Methodology Applications Program (RSSMAP). The risk importance measures defined in this report measure the importance of features not only with regard to risk reduction but also with regard to reliability assurance, or risk maintenance. The goal of this report is not to identify new mathematical formulas for risk importance but to show how importance measures can be interpreted and applied

  1. Development and application of group importance measures

    International Nuclear Information System (INIS)

    Haskin, F.E.; Huang, Min; Sasser, M.K.; Stack, D.W.

    1992-01-01

As part of a complete Level I probabilistic safety analysis of the K Production Reactor, three traditional importance measures (risk reduction, partial derivative, and variance reduction) have been extended to permit analyses of the relative importance of groups of basic and initiating events. None of the group importance measures require Monte Carlo sampling for their quantification. The group importance measures are quantified for the overall fuel damage equation and for dominant accident sequences using the following event groups: initiating events, electrical failures, instrumentation failures, common-cause failures, human errors, and nonrecovery events. Additional analyses are presented using other event groups. Collectively, these applications indicate both the utility and the versatility of the group importance measures

  2. A new importance measure for sensitivity analysis

    International Nuclear Information System (INIS)

    Liu, Qiao; Homma, Toshimitsu

    2010-01-01

Uncertainty is an integral part of risk assessment of complex engineering systems, such as nuclear power plants and spacecraft. The aim of sensitivity analysis is to identify the contribution of the uncertainty in model inputs to the uncertainty in the model output. In this study, a new importance measure that characterizes the influence of the entire input distribution on the entire output distribution was proposed. It represents the expected deviation of the cumulative distribution function (CDF) of the model output that would be obtained if one input parameter of interest were known. The applicability of this importance measure was tested with two models, a nonlinear nonmonotonic mathematical model and a risk model. In addition, a comparison of this new importance measure with several other importance measures was carried out and the differences between these measures were explained. (author)

  3. Variable screening and ranking using sampling-based sensitivity measures

    International Nuclear Information System (INIS)

    Wu, Y-T.; Mohanty, Sitakanta

    2006-01-01

This paper presents a methodology for screening out insignificant random variables and ranking the significant ones using sensitivity measures, including two cumulative distribution function (CDF)-based and two mean-response-based measures. The methodology features (1) using random samples to compute sensitivities and (2) using acceptance limits, derived from hypothesis testing, to classify random variables as significant or insignificant. Because no approximation is needed in either the form of the performance functions or the type of continuous distribution functions representing input variables, the sampling-based approach can handle highly nonlinear functions with non-normal variables. The main characteristics and effectiveness of the sampling-based sensitivity measures are investigated using both simple and complex examples. Because the number of samples needed does not depend on the number of variables, the methodology appears to be particularly suitable for problems with large, complex models that have many random variables but relatively few significant ones
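One common CDF-based screening measure of this general kind can be sketched as a two-sample Kolmogorov-Smirnov comparison between an input's values in the "failure" subset of samples and the rest (an illustrative variant under that assumption, not necessarily the paper's exact measure): a large CDF gap flags a significant variable, a small one screens it out.

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy performance function with one influential and one inert input.
n = 4000
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
g = 3.0 * x1 + 0.1 * x2

def ks_stat(a, b):
    # Two-sample Kolmogorov-Smirnov statistic: maximum gap between
    # the two empirical CDFs, evaluated on the pooled sample.
    grid = np.sort(np.concatenate([a, b]))
    cdf_a = np.searchsorted(np.sort(a), grid, side="right") / len(a)
    cdf_b = np.searchsorted(np.sort(b), grid, side="right") / len(b)
    return np.abs(cdf_a - cdf_b).max()

# Compare each input's distribution inside vs outside the top decile
# of the performance function (the "failure" region).
fail = g > np.quantile(g, 0.9)
d1 = ks_stat(x1[fail], x1[~fail])   # influential: large CDF gap
d2 = ks_stat(x2[fail], x2[~fail])   # near-inert: small gap, screened out
```

An acceptance limit between the two statistics would keep x1 and screen out x2, mirroring the classify-then-rank workflow the abstract describes.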

  4. SALTSTONE VARIABILITY STUDY - MEASUREMENT OF POROSITY

    International Nuclear Information System (INIS)

Harbour, J.; Williams, V.; Edwards, T.; Eibling, R.; Schumacher, R.

    2007-01-01

One of the goals of the Saltstone Variability Study is to identify the operational and compositional variables that control or influence the important processing and performance properties of Saltstone mixes. One of the key performance properties is porosity, which is a measure of the volume percent of a cured grout that is occupied by salt solution (for the saturated case). This report presents (1) the results of efforts to develop a method for the measurement of porosity of grout samples and (2) initial results of porosity values for samples that were previously produced as part of the Saltstone Variability Study. A cost-effective measurement method for porosity was developed that provides reproducible results, is relatively fast (30 to 60 minutes per sample) and uses a Mettler Toledo HR83 Moisture Analyzer that is already operational and routinely calibrated at Aiken County Technology Laboratory. The method involves heating the sample at 105 °C until no further mass loss is observed. This mass loss value, which is due to water evaporation, is then used to calculate the volume percent porosity of the mix. The results of mass loss for mixes at 105 °C were equivalent to the results obtained using thermogravimetric analysis. The method was validated by comparing measurements of mass loss at 105 °C for cured portland cement/water mixes to values presented in the literature for this system. A stereopycnometer from Quantachrome Instruments was selected to measure the cured grout bulk densities; density is required to calculate the porosities. The stereopycnometer was already operational at Aiken County Technology Laboratory, has been calibrated using a solid stainless steel sphere of known volume, and is cost effective and fast (∼15 minutes per sample). Cured grout densities are important in their own right because they can be used to project the volume of waste form produced from a given amount of salt feed of known composition. For mixes
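The porosity calculation described (mass loss on drying converted to a volume fraction via the bulk density) can be sketched with made-up numbers; referencing the bulk density to the wet sample mass is an assumption of this sketch, not a detail stated in the abstract.

```python
# Hypothetical measurement for one cured saltstone sample.
wet_mass_g = 25.0          # sample mass before drying
dry_mass_g = 20.0          # mass after drying to constant weight at 105 C
bulk_density_g_cm3 = 1.8   # from the pycnometer measurement (assumed wet basis)
water_density_g_cm3 = 1.0

# Mass loss on drying is evaporated pore water; convert it to a
# volume fraction of the bulk sample volume.
water_volume_cm3 = (wet_mass_g - dry_mass_g) / water_density_g_cm3
bulk_volume_cm3 = wet_mass_g / bulk_density_g_cm3
porosity_vol_pct = 100.0 * water_volume_cm3 / bulk_volume_cm3
```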

  5. Measure of uncertainty in regional grade variability

    NARCIS (Netherlands)

    Tutmez, B.; Kaymak, U.; Melin, P.; Castillo, O.; Gomez Ramirez, E.; Kacprzyk, J.; Pedrycz, W.

    2007-01-01

Because geological events are neither homogeneous nor isotropic, geological investigations are characterized by particularly high uncertainties. This paper presents a hybrid methodology for measuring uncertainty in regional grade variability. In order to evaluate the fuzziness in grade

  6. Additive measures of travel time variability

    DEFF Research Database (Denmark)

    Engelson, Leonid; Fosgerau, Mogens

    2011-01-01

This paper derives a measure of travel time variability for travellers equipped with scheduling preferences defined in terms of time-varying utility rates, and who choose departure time optimally. The corresponding value of travel time variability is a constant that depends only on preference parameters. The measure is unique in being additive with respect to independent parts of a trip. It has the variance of travel time as a special case. An extension is provided to the case of travellers who use a scheduled service with fixed headway.

  7. The Importance of Landfill Gas Policy Measures

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2009-07-01

The purpose of this document is to identify and examine global policies, measures, and incentives that appear to be stimulating landfill gas (LFG) use. As certain countries have made great advances in landfill gas energy (LFGE) development through effective policies, the intention of this report is to use information from the IEA's Global Renewable Energy and Energy Efficiency Measures and Policies Databases to identify and discuss policies. By consolidating this information and categorising it according to policy type, the attributes that are most appealing or applicable to the circumstances of a particular country or area (technology demonstration, financial incentives, awareness campaigns, etc.) are more easily identified. The report begins with background information on LFG and sanitary landfill practices, including a discussion of regional disparities, followed by a description of LFG mitigation technologies. Barriers to LFGE projects are then outlined. An explanation of the importance and effectiveness of policy measures leads into a discussion of types and examples of measures that are being used to overcome these barriers and encourage LFGE development. The report concludes with lessons learned, recommendations for further study, and resources where more information can be found.

  8. Measuring Variability in the Presence of Noise

    Science.gov (United States)

    Welsh, W. F.

    Quantitative measurement of a variable signal in the presence of noise requires very careful attention to subtle effects which can easily bias the measurements. This is not limited to the low-count-rate regime, nor is the bias error necessarily small. In this talk I will mention some of the dangers in applying standard techniques which are appropriate for high signal-to-noise data but fail in cases where the S/N is low. I will discuss methods for correcting the bias in these cases, both for periodic and non-periodic variability, and will introduce the concept of the ``filtered de-biased RMS''. I will also illustrate some common abuses of power spectrum interpretation. All of these points will be illustrated with examples from recent work on CV and AGN variability.
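
    The bias the abstract warns about can be sketched for the simplest case, where per-point measurement errors are known: the raw RMS of a light curve overestimates the intrinsic variability because measurement noise adds to the sample variance. The following is a minimal, hypothetical illustration (the light-curve values, error sizes and the plain variance subtraction are illustrative assumptions, not the talk's actual "filtered de-biased RMS" algorithm):

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical light curve: a constant source with true intrinsic scatter
# plus known per-point measurement noise (values are illustrative only).
n = 1000
sigma_intrinsic = 2.0                      # true RMS of the variable signal
sigma_noise = 1.5                          # per-point measurement error
flux = 100.0 + rng.normal(0.0, sigma_intrinsic, n) + rng.normal(0.0, sigma_noise, n)
errors = np.full(n, sigma_noise)

def debiased_rms(values, errs):
    """Noise-corrected RMS: subtract the mean measurement variance from the
    sample variance, clipping at zero when noise dominates the signal."""
    excess = np.var(values, ddof=1) - np.mean(errs ** 2)
    return float(np.sqrt(max(excess, 0.0)))

raw_rms = float(np.std(flux, ddof=1))      # biased high by measurement noise
corrected_rms = debiased_rms(flux, errors)
```

    The clipping at zero matters in the low-S/N regime the talk addresses: there the naive subtraction can go negative purely through sampling noise.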

  9. Bell inequalities for continuous-variable measurements

    International Nuclear Information System (INIS)

    He, Q. Y.; Reid, M. D.; Drummond, P. D.; Cavalcanti, E. G.

    2010-01-01

    Tests of local hidden-variable theories using measurements with continuous-variable (CV) outcomes are developed, and a comparison of different methods is presented. As examples, we focus on multipartite entangled Greenberger-Horne-Zeilinger and cluster states. We suggest a physical process that produces the states proposed here, and investigate experiments both with and without binning of the continuous variable. In the former case, the Mermin-Klyshko inequalities can be used directly. For unbinned outcomes, the moment-based Cavalcanti-Foster-Reid-Drummond inequalities are extended to functional inequalities by consideration of arbitrary functions of the measurements at each site. By optimizing these functions, we obtain more robust violations of local hidden-variable theories than with either binning or moments. Recent inequalities based on the algebra of quaternions and octonions are compared with these methods. Since the prime advantage of CV experiments is to provide a route to highly efficient detection via homodyne measurements, we analyze the effect of noise and detection losses in both binned and unbinned cases. The CV moment inequalities with an optimal function have greater robustness to both loss and noise. This could permit a loophole-free test of Bell inequalities.

  10. Variable importance and prediction methods for longitudinal problems with missing variables.

    Directory of Open Access Journals (Sweden)

    Iván Díaz

    Full Text Available We present prediction and variable importance (VIM) methods for longitudinal data sets containing continuous and binary exposures subject to missingness. We demonstrate the use of these methods for prognosis of medical outcomes of severe trauma patients, a field in which current medical practice involves rules of thumb and scoring methods that only use a few variables and ignore the dynamic and high-dimensional nature of trauma recovery. Well-principled prediction and VIM methods can provide a tool to make care decisions informed by the patient's high-dimensional physiological and clinical history. Our VIM parameters are analogous to slope coefficients in adjusted regressions, but are not dependent on a specific statistical model, nor do they require a certain functional form of the prediction regression to be estimated. In addition, they can be causally interpreted under causal and statistical assumptions as the expected outcome under time-specific clinical interventions, related to changes in the mean of the outcome if each individual experiences a specified change in the variable (keeping other variables in the model fixed). Better yet, the targeted MLE used is doubly robust and locally efficient. Because the proposed VIM does not constrain the prediction model fit, we use a very flexible ensemble learner (the SuperLearner), which returns a linear combination of a list of user-given algorithms. Not only is such a prediction algorithm intuitively appealing, it has theoretical justification as being asymptotically equivalent to the oracle selector. The results of the analysis show effects whose size and significance would not have been found using a parametric approach (such as stepwise regression or LASSO). In addition, the procedure is even more compelling as the predictor on which it is based showed significant improvements in cross-validated fit, for instance area under the curve (AUC) for a receiver-operator characteristic (ROC) curve. Thus, given that (1) our VIM
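
    To make the general idea of a model-agnostic variable importance measure concrete (this is a generic permutation-based VIM on synthetic data, not the paper's targeted-MLE estimator or the SuperLearner), one can shuffle one predictor at a time and record how much the prediction error grows. The data and the ordinary-least-squares stand-in learner below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: x0 is strongly predictive, x1 weakly, x2 is pure noise.
n = 500
X = rng.normal(size=(n, 3))
y = 3.0 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(0.0, 1.0, n)

# An ordinary least-squares fit stands in for the flexible learner.
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

def mse(features):
    return float(np.mean((y - features @ beta) ** 2))

baseline = mse(X)

def permutation_vim(j, n_repeats=20):
    """Mean increase in prediction error when column j is shuffled,
    breaking its link to the outcome while leaving other columns intact."""
    increases = []
    for _ in range(n_repeats):
        Xp = X.copy()
        Xp[:, j] = rng.permutation(Xp[:, j])
        increases.append(mse(Xp) - baseline)
    return float(np.mean(increases))

vim = [permutation_vim(j) for j in range(3)]
```

    Like the paper's VIM, this quantity is defined by the fitted predictor rather than by a parametric model form, although it lacks the causal interpretation and efficiency properties of the targeted approach.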

  11. Seasonal Variability in European Radon Measurements

    Science.gov (United States)

    Groves-Kirkby, C. J.; Denman, A. R.; Phillips, P. S.; Crockett, R. G. M.; Sinclair, J. M.

    2009-04-01

    In temperate climates, domestic radon concentration levels are generally seasonally dependent, the level in the home reflecting the convolution of two time-dependent functions. These are the source soil-gas radon concentration itself, and the principal force driving radon into the building from the soil, namely the pressure difference between the interior and exterior environment. While the meteorological influence can be regarded as relatively uniform on a European scale, its variability being defined largely by the influence of North-Atlantic weather systems, soil-gas radon is generally more variable as it is essentially geologically dependent. Seasonal variability of domestic radon concentration can therefore be expected to exhibit geographical variability, as is indeed the case. To compensate for the variability of domestic radon levels when assessing the long-term radon health risks, the results of individual short-term measurements are generally converted to equivalent mean annual levels by application of a Seasonal Correction Factor (SCF). This is a multiplying factor, typically derived from measurements of a large number of homes, applied to the measured short-term radon concentration to provide a meaningful annual mean concentration for dose-estimation purposes. Following concern as to the universal applicability of a single SCF set, detailed studies in both the UK and France have reported location-specific SCF sets for different regions of each country. Further results indicate that SCFs applicable to the UK differ significantly from those applicable elsewhere in Europe and North America in both amplitude and phase, supporting the thesis that seasonal variability in indoor radon concentration cannot realistically be compensated for by a single national or international SCF scheme. Published data characterising the seasonal variability of European national domestic radon concentrations has been collated and analysed, with the objective of identifying
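
    The SCF adjustment itself is just a multiplication. A sketch with made-up factors (real SCF tables are region- and season-specific, which is exactly the issue the abstract raises):

```python
# Hypothetical seasonal correction factors (illustrative values only; real
# SCF tables are published per region by national radon programmes).
SCF = {"winter": 0.80, "spring": 1.00, "summer": 1.30, "autumn": 1.00}

def annual_mean_radon(short_term_bq_m3, season):
    """Estimate the annual mean radon level (Bq/m^3) by multiplying a
    short-term measurement by the correction factor for its season."""
    return short_term_bq_m3 * SCF[season]

# A summer screening reading understates the annual mean (SCF > 1), while
# a winter reading overstates it (SCF < 1).
summer_estimate = annual_mean_radon(150.0, "summer")
winter_estimate = annual_mean_radon(200.0, "winter")
```

    The paper's point is that the dictionary above cannot be a single national or international constant table: the correct factors vary with geology and climate.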

  12. The variability of piezoelectric measurements. Material and measurement method contributions

    International Nuclear Information System (INIS)

    Stewart, M.; Cain, M.

    2002-01-01

    The variability of piezoelectric materials measurements has been investigated in order to separate the contributions from intrinsic instrumental variability and the contributions from the variability in materials. The work has pinpointed several areas where weaknesses in the measurement methods result in high variability, and also shows that good correlation between piezoelectric parameters allows simpler measurement methods to be used. The Berlincourt method has been shown to be unreliable when testing thin discs; however, when testing thicker samples there is a good correlation between this and other methods. The high-field permittivity and low-field permittivity correlate well, so tolerances on low-field measurements would predict high-field performance. In trying to identify microstructural origins of samples that behave differently to others within a batch, no direct evidence was found to suggest that outliers originate from either differences in microstructure or crystallography. Some of the samples chosen as maximum outliers showed pin-holes, probably from electrical breakdown during poling, even though these defects would ordinarily be detrimental to piezoelectric output. (author)

  13. Importance of predictor variables for models of chemical function

    Data.gov (United States)

    U.S. Environmental Protection Agency — Importance of random forest predictors for all classification models of chemical function. This dataset is associated with the following publication: Isaacs, K., M....

  14. Measurement of very rapidly variable temperatures

    International Nuclear Information System (INIS)

    Elberg, S.; Mathonnet, P.

    1974-01-01

    Bibliographical research and visits to laboratories were undertaken in order to survey the different techniques used to measure rapidly variable temperatures, specifying the limits in maximum temperature and variation rate (time constant). On the basis of the bibliographical study these techniques were classified in three categories according to the physical meaning of their response time. Extension of the bibliographical research to methods using fast temperature variation measurement techniques, and visits to research and industrial laboratories, gave an idea of the problems raised by the application of these methods. The use of these techniques in fields other than those for which they were developed can sometimes be awkward in the case of thermometric probe devices, where the time constant cannot generally be specified. [fr]

  15. Spectral Data Captures Important Variability Between and Among Species and Functional Types

    Science.gov (United States)

    Townsend, P. A.; Serbin, S. P.; Kingdon, C.; Singh, A.; Couture, J. J.; Gamon, J. A.

    2013-12-01

    Narrowband spectral data in the visible, near and shortwave infrared (400-2500 nm) are being used increasingly in plant ecology to characterize the biochemical, physiological and water status of vegetation, as well as community composition. In particular, spectroscopic data have recently received considerable attention for their capacity to discriminate plants according to functional properties or 'optical types.' Such measurements can be acquired from airborne/satellite remote sensing imagery or field spectrometers and are commonly used to directly estimate or infer properties important to photosynthesis, carbon and water fluxes, nutrient dynamics, phenology, and disturbance. Spectral data therefore represent proxies for measurements that are otherwise time consuming or expensive to make, and - more importantly - provide the opportunity to characterize the spatial and temporal variability of taxonomic or functional groups. We have found that spectral variation within species and functional types can in fact exceed the variation between types. As such, we recommend that the traditional quantification of characteristics defining species and/or functional types must be modified to include the range of variability in those properties. We provide four examples of the importance of spectral data for describing within-species/functional type variation. First, within temperate forests, the spectral properties of foliage vary considerably with canopy position. This variability is strongly related to differences in specific leaf area between shade- and sun-lit leaves, and the resulting differences among leaves in strategies for light harvesting, photosynthesis, and leaf longevity. These results point to the need to better characterize leaf optical properties throughout a canopy, rather than basing the characterization of ecosystem functioning on only the sunlit portion of the canopy crown. Second, we show considerable differences in optical properties of foliage from

  16. The importance of local measurements for cosmology

    CERN Document Server

    Verde, Licia; Jimenez, Raul

    2013-01-01

    We explore how local, cosmology-independent measurements of the Hubble constant and the age of the Universe help to provide a powerful consistency check of the currently favored cosmological model (flat LambdaCDM) and model-independent constraints on cosmology. We use cosmic microwave background (CMB) data to define the model-dependent cosmological parameters, and add local measurements to assess consistency and determine whether extensions to the model are justified. At current precision, there is no significant tension between the locally measured Hubble constant and age of the Universe (with errors of 3% and 5% respectively) and the corresponding parameters derived from the CMB. However, if errors on the local measurements could be decreased by a factor of two, one could decisively conclude if there is tension or not. We also compare the local and CMB data assuming simple extensions of the flat, $\\Lambda$CDM model (including curvature, dark energy with a constant equation of state parameter not equal to -1...

  17. The active liquid Earth - importance of temporal and spatial variability

    Science.gov (United States)

    Arheimer, Berit

    2016-04-01

    The Planet Earth is indeed liquid and active - 71 percent of its surface is water-covered and this water never rests. Thanks to the water cycle, our planet's water supply is constantly moving from one place to another and from one form to another. Only 2.5% of the water is freshwater and it exists in the air as water vapor; it hits the ground as rain and snow; it flows on the surface from higher to lower altitudes in rivers, lakes, and glaciers; and it flows in the ground in soil, aquifers, and in all living organisms until it reaches the sea. On its way over the Earth's crust, some returns quickly to vapor again, while some is trapped and exposed to many "fill and spill" situations for a long journey. The variability in the water balance is crucial for hydrological understanding and modelling. The water cycle may appear simple, but magnitudes and rates in fluxes are very different from one place to another, resulting from variable drivers such as solar energy, precipitation and gravity in co-evolution with geology, soil, vegetation and fauna. The historical evolution, the temporal fluxes and diversity in space continue to fascinate hydrological scientists. Specific physical processes may be well known, but their boundary conditions, interactions and rate often remain unknown at a specific site and are difficult to monitor in nature. This results in mysterious features where trends in drivers do not match runoff, like the Sahelian Paradox or discharge to the Arctic Ocean. Humans have always interfered with the water cycle and engineering is fundamental for water regulation and re-allocation. Some 80% of the river flow from the northern part of the Earth is affected by fragmentation of the river channels by dams. In water management, there is always a tradeoff between upstream and downstream activities, not only regarding total water quantities but also for temporal patterns and water quality aspects. 
Sharing a water resource can generate conflicts but geopolitical

  18. The importance of measuring unmet healthcare needs.

    Science.gov (United States)

    Gauld, Robin; Raymont, Antony; Bagshaw, Philip F; Nicholls, M Gary; Frampton, Christopher M

    2014-10-17

    Major restructuring of the health sector has been undertaken in many countries, including New Zealand and England, yet objective assessment of the outcomes has rarely been recorded. In the absence of comprehensive objective data, the success or otherwise of health reforms has been inferred from narrowly-focussed data or anecdotal accounts. A recent example relates to a buoyant King's Fund report on the quest for integrated health and social care in Canterbury, New Zealand which prompted an equally supportive editorial article in the British Medical Journal (BMJ) suggesting it may contain lessons for England's National Health Service. At the same time, a report published in the New Zealand Medical Journal expressed concerns at the level of unmet healthcare needs in Canterbury. Neither report provided objective information about changes over time in the level of unmet healthcare needs in Canterbury. We propose that the performance of healthcare systems should be measured regularly, objectively and comprehensively through documentation of unmet healthcare needs as perceived by representative segments of the population at formal interview. Thereby the success or otherwise of organisational changes to a health system and its adequacy as demographics of the population evolve, even in the absence of major restructuring of the health sector, can be better documented.

  19. Using variable homography to measure emergent fibers on textile fabrics

    Science.gov (United States)

    Xu, Jun; Cudel, Christophe; Kohler, Sophie; Fontaine, Stéphane; Haeberlé, Olivier; Klotz, Marie-Louise

    2011-07-01

    A fabric's smoothness is a key factor to determine the quality of textile finished products and has great influence on the functionality of industrial textiles and high-end textile products. With popularization of the 'zero defect' industrial concept, identifying and measuring defective material in the early stage of production is of great interest for the industry. In the current market, many systems are able to achieve automatic monitoring and control of fabric, paper, and nonwoven material during the entire production process; however, online measurement of hairiness is still an open topic and highly desirable for industrial applications. In this paper we propose a computer vision approach, based on variable homography, which can be used to measure the emergent fiber's length on textile fabrics. The main challenges addressed in this paper are the application of variable homography to textile monitoring and measurement, as well as the accuracy of the estimated calculation. We propose that a fibrous structure can be considered as a two-layer structure and then show how variable homography can estimate the length of the fiber defects. Simulations are carried out to show the effectiveness of this method to measure the emergent fiber's length. The true lengths of selected fibers are measured precisely using a digital optical microscope, and then the same fibers are tested by our method. Our experimental results suggest that smoothness monitored by variable homography is an accurate and robust method for quality control of industrially important fabrics.
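
    The core geometric step, mapping the image endpoints of a fiber through a plane homography and measuring the length in world units, can be sketched as follows. The 3x3 matrix here is a purely hypothetical scale-only calibration; the paper's *variable* homography additionally adapts the mapping to the emergent fiber's layer above the fabric plane, which this sketch does not model:

```python
import numpy as np

# Hypothetical plane homography mapping image pixels to millimetres on the
# fabric surface (a pure 0.1 mm/pixel scale; a real calibration yields a
# general 3x3 matrix).
H = np.array([[0.1, 0.0, 0.0],
              [0.0, 0.1, 0.0],
              [0.0, 0.0, 1.0]])

def to_plane(point_px):
    """Map an image point (u, v) to plane coordinates (x, y) in mm."""
    u, v = point_px
    x, y, w = H @ np.array([u, v, 1.0])
    return np.array([x / w, y / w])

def fiber_length_mm(p_px, q_px):
    """Length of a straight fiber segment from its two image endpoints."""
    return float(np.linalg.norm(to_plane(p_px) - to_plane(q_px)))

length = fiber_length_mm((100, 200), (130, 240))   # 30 x 40 px -> 3 x 4 mm
```
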

  20. Variable Bone Density of Scaphoid: Importance of Subchondral Screw Placement.

    Science.gov (United States)

    Swanstrom, Morgan M; Morse, Kyle W; Lipman, Joseph D; Hearns, Krystle A; Carlson, Michelle G

    2018-02-01

    Background  Ideal internal fixation of the scaphoid relies on adequate bone stock for screw purchase, so knowledge of the regional bone density of the scaphoid is crucial. Questions/Purpose  The purpose of this study was to evaluate regional variations in scaphoid bone density. Materials and Methods  Three-dimensional CT models of fractured scaphoids were created and sectioned into proximal/distal segments and then into quadrants (volar/dorsal/radial/ulnar). Concentric shells in the proximal and distal pole were constructed in 2-mm increments moving from exterior to interior. Bone density was measured in Hounsfield units (HU). Results  Bone density of the distal scaphoid (453.2 ± 70.8 HU) was less than that of the proximal scaphoid (619.8 ± 124.2 HU). There was no difference in bone density between the four quadrants in either pole. In both poles, the first subchondral shell was the densest. In both the proximal and distal poles, bone density decreased significantly in all three deeper shells. Conclusion  The proximal scaphoid had a greater density than the distal scaphoid. Within the poles, there was no difference in bone density between the quadrants. The subchondral 2-mm shell had the greatest density. Bone density dropped off significantly between the first and second shell in both the proximal and distal scaphoids. Clinical Relevance  In scaphoid fracture ORIF, optimal screw placement engages the subchondral 2-mm shell, especially in the distal pole, which has an overall lower bone density and whose second shell has only two-thirds the density of the first shell.

  1. Optimization of Drilling Resistance Measurement (DRM) user-controlled variables

    OpenAIRE

    Tudor, Dumitrescu; Pesce, Giovanni; Ball, Richard

    2017-01-01

    Drilling Resistance Measurement (DRM) is recognised as an important on-site micro-invasive procedure for assessment of construction materials. This paper presents a detailed investigation of user-controlled variables and their influence on drilling resistance. The study proves that the ratio of penetration rate/rotational speed (PR/RPM) is proportional to drilling resistance. Data from Bath stone and an artificial reference stone demonstrates how different materials can be compared using thei...

  2. Methods to quantify variable importance: implications for the analysis of noisy ecological data

    OpenAIRE

    Murray, Kim; Conner, Mary M.

    2009-01-01

    Determining the importance of independent variables is of practical relevance to ecologists and managers concerned with allocating limited resources to the management of natural systems. Although techniques that identify explanatory variables having the largest influence on the response variable are needed to design management actions effectively, the use of various indices to evaluate variable importance is poorly understood. Using Monte Carlo simulations, we compared six different indices c...

  3. Measuring Repeatability of the Focus-variable Lenses

    Directory of Open Access Journals (Sweden)

    Jan Řezníček

    2014-12-01

    Full Text Available In the field of photogrammetry, the optical system, usually represented by the glass lens, is used for metric purposes. Therefore, the aberration characteristics of such a lens, inducing deviations from projective imaging, have to be well known. However, the most important property of the metric lens is the stability of its glass and mechanical elements, ensuring long-term reliability of the measured parameters. In the case of a focus-variable lens, the repeatability of the lens setup is important as well. Lenses with a fixed focal length are usually considered “fixed”, though in fact most of them contain one or more movable glass elements providing the focusing function. In cases where the lens is not equipped with fixing screws, the repeatability of the calibration parameters should be known. This paper derives simple mathematical formulas that can be used for measuring the repeatability of focus-variable lenses, and gives a demonstrative example of such a measurement. The given procedure has the advantage that only the required parameters are estimated; hence, no unwanted correlations with additional parameters exist. The test arrangement enables us to measure at each required magnification of the optical system, which is important in close-range photogrammetry.

  4. Importance of fishing as a segmentation variable in the application of a social worlds model

    Science.gov (United States)

    Gigliotti, Larry M.; Chase, Loren

    2017-01-01

    Market segmentation is useful for understanding and classifying the diverse range of outdoor recreation experiences sought by different recreationists. Although many different segmentation methodologies exist, many are complex and difficult to measure accurately during in-person intercepts, such as creel surveys. To address that gap in the literature, we propose a single-item measure of the importance of fishing as a surrogate for often overly or needlessly complex segmentation techniques. The importance of fishing item is a measure of the value anglers place on the activity or a coarse quantification of how central the activity is to the respondent's lifestyle (scale: 0 = not important, 1 = slightly, 2 = moderately, 3 = very, and 4 = fishing is my most important recreational activity). We suggest the importance scale may be a proxy measurement for segmenting anglers using the social worlds model as a theoretical framework. Vaske (1980) suggested that commitment to recreational activities may be best understood in relation to social group participation, and the social worlds model provides a rich theoretical framework for understanding social group segments. Unruh (1983) identified four types of actor involvement in social worlds: strangers, tourists, regulars, and insiders, differentiated by four characteristics (orientation, experiences, relationships, and commitment). We evaluated the importance of fishing as a segmentation variable using data collected by a mixed-mode survey of South Dakota anglers fishing in 2010. We contend that this straightforward measurement may be useful for segmenting outdoor recreation activities when more complicated segmentation schemes are not suitable. Further, this index, when coupled with the social worlds model, provides a valuable framework for understanding the segments and making management decisions.

  5. Limitations of the usual blood-pressure hypothesis and importance of variability, instability, and episodic hypertension.

    Science.gov (United States)

    Rothwell, Peter M

    2010-03-13

    Although hypertension is the most prevalent treatable vascular risk factor, how it causes end-organ damage and vascular events is poorly understood. Yet, a widespread belief exists that underlying usual blood pressure can alone account for all blood-pressure-related risk of vascular events and for the benefits of antihypertensive drugs, and this notion has come to underpin all major clinical guidelines on diagnosis and treatment of hypertension. Other potentially informative measures, such as variability in clinic blood pressure or maximum blood pressure reached, have been neglected, and effects of antihypertensive drugs on such measures are largely unknown. Clinical guidelines recommend that episodic hypertension is not treated, and the potential risks of residual variability in blood pressure in treated hypertensive patients have been ignored. This Review discusses shortcomings of the usual blood-pressure hypothesis, provides background to accompanying reports on the importance of blood-pressure variability in prediction of risk of vascular events and in accounting for benefits of antihypertensive drugs, and draws attention to clinical implications and directions for future research. Copyright 2010 Elsevier Ltd. All rights reserved.
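
    The neglected measures the review highlights, visit-to-visit variability and the maximum blood pressure reached, are straightforward to compute from a series of clinic readings. A sketch on hypothetical data (readings invented for illustration):

```python
import statistics

# Hypothetical visit-to-visit clinic systolic readings (mmHg) for one patient.
sbp = [142, 128, 155, 138, 171, 133, 149]

mean_sbp = statistics.mean(sbp)
sd_sbp = statistics.stdev(sbp)          # visit-to-visit variability (SD)
cv_sbp = 100.0 * sd_sbp / mean_sbp      # coefficient of variation, %
max_sbp = max(sbp)                      # maximum blood pressure reached
```

    Under the usual blood-pressure hypothesis only `mean_sbp` would matter; the review's argument is that `sd_sbp`, `cv_sbp` and `max_sbp` carry additional prognostic information.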

  6. Emittance measurements by variable quadrupole method

    International Nuclear Information System (INIS)

    Toprek, D.

    2005-01-01

    The beam emittance is a measure of both the beam size and the beam divergence, so its value cannot be measured directly. If the beam size is measured at different locations or under different focusing conditions, such that different parts of the phase-space ellipse are probed by the beam-size monitor, the beam emittance can be determined. An emittance measurement can be performed by several methods; here we consider the varying-quadrupole-setting method.
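
    A quadrupole-scan measurement of this kind can be sketched numerically: for a thin-lens quadrupole of strength q followed by a drift of length L to the monitor, the measured beam size squared is quadratic in q, and the fitted parabola coefficients invert back to the sigma matrix and hence the emittance. The beam parameters below are invented for the illustration:

```python
import numpy as np

# Hypothetical beamline: thin-lens quadrupole (strength q = 1/f) followed
# by a drift of length L to a profile monitor.
L = 2.0                                         # drift length [m]
sig11, sig12, sig22 = 4e-6, -1e-6, 1e-6         # sigma matrix at the quad
eps_true = np.sqrt(sig11 * sig22 - sig12 ** 2)  # rms emittance [m rad]

def size_squared(q):
    """Beam size squared at the monitor; quadratic in the quad strength q.
    Transfer row for drift * thin lens: (m11, m12) = (1 - L q, L)."""
    m11, m12 = 1.0 - L * q, L
    return m11 ** 2 * sig11 + 2.0 * m11 * m12 * sig12 + m12 ** 2 * sig22

# "Scan" the quadrupole, fit the parabola, and invert the coefficients
# back to the sigma matrix at the quadrupole entrance.
qs = np.linspace(-0.5, 0.5, 11)
A, B, C = np.polyfit(qs, [size_squared(q) for q in qs], 2)
s11 = A / L ** 2
s12 = -(B + 2.0 * L * s11) / (2.0 * L ** 2)
s22 = (C - s11 - 2.0 * L * s12) / L ** 2
eps_fit = np.sqrt(s11 * s22 - s12 ** 2)
```

    In a real measurement the "scan" values would be noisy monitor readings, and the quality of the parabola fit limits the emittance resolution.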

  7. Development of a localized probabilistic sensitivity method to determine random variable regional importance

    International Nuclear Information System (INIS)

    Millwater, Harry; Singh, Gulshan; Cortina, Miguel

    2012-01-01

    There are many methods to identify the important variable out of a set of random variables, i.e., “inter-variable” importance; however, to date there are no comparable methods to identify the “region” of importance within a random variable, i.e., “intra-variable” importance. Knowledge of the critical region of an input random variable (tail, near-tail, and central region) can provide valuable information towards characterizing, understanding, and improving a model through additional modeling or testing. As a result, an intra-variable probabilistic sensitivity method was developed and demonstrated for independent random variables that computes the partial derivative of a probabilistic response with respect to a localized perturbation in the CDF values of each random variable. These sensitivities are then normalized in absolute value with respect to the largest sensitivity within a distribution to indicate the region of importance. The methodology is implemented using the Score Function kernel-based method such that existing samples can be used to compute sensitivities for negligible cost. Numerical examples demonstrate the accuracy of the method through comparisons with finite difference and numerical integration quadrature estimates. - Highlights: ► Probabilistic sensitivity methodology. ► Determines the “region” of importance within random variables such as left tail, near tail, center, right tail, etc. ► Uses the Score Function approach to reuse the samples, hence, negligible cost. ► No restrictions on the random variable types or limit states.

  8. Measures of uncertainty, importance and sensitivity of the SEDA code

    International Nuclear Information System (INIS)

    Baron, J.; Caruso, A.; Vinate, H.

    1996-01-01

    The purpose of this work is the estimation of the uncertainty in the results of the SEDA code (Sistema de Evaluacion de Dosis en Accidentes) in accordance with its input data and parameters. The SEDA code has been developed by the Comision Nacional de Energia Atomica for the estimation of doses during emergencies in the vicinity of the Atucha and Embalse nuclear power plants. The user feeds the code with meteorological data, source terms and accident data (timing involved, release height, thermal content of the release, etc.). It is designed to be used during an emergency and to deliver fast results that enable decisions to be made. The uncertainty in the results of the SEDA code is quantified in the present paper. This uncertainty is associated both with the data the user inputs to the code and with the uncertain parameters of the code's own models. The method used consisted of the statistical characterization of the parameters and variables, assigning them adequate probability distributions. These distributions have been sampled with the Latin Hypercube Sampling method, which is a stratified multi-variable Monte-Carlo technique. The code has been run for each of the samples and, finally, a sample of results has been obtained. These results have been characterized from the statistical point of view (obtaining their mean, most probable value, distribution shape, etc.) for several distances from the source. Finally, the Partial Correlation Coefficient and Standard Regression Coefficient techniques have been used to obtain the relative importance of each input variable, and the sensitivity of the code to its variations. The measures of importance and sensitivity have been obtained for several distances from the source and various cases of atmospheric stability, making comparisons possible. 
    This work allows confidence to be placed in the results of the code, with their uncertainty attached, as a way of knowing the limits within which the results can vary in a real
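
    The Latin Hypercube Sampling step described above can be sketched in a few lines: each input's range is cut into equal-probability strata, one draw is taken per stratum, and the strata are shuffled independently per variable. This is a generic implementation on the unit hypercube, not the SEDA code's own sampler:

```python
import numpy as np

rng = np.random.default_rng(1)

def latin_hypercube(n_samples, n_vars):
    """Stratified sample on [0, 1]^d: each variable's range is cut into
    n_samples equal-probability strata, one draw is taken per stratum,
    and the strata are shuffled independently for each variable."""
    rows = np.arange(n_samples)[:, None]
    u = (rows + rng.random((n_samples, n_vars))) / n_samples
    for j in range(n_vars):
        rng.shuffle(u[:, j])     # column slices are views, shuffled in place
    return u

sample = latin_hypercube(100, 3)
```

    Each column of `sample` would then be pushed through the inverse CDF of the corresponding input distribution before running the code on every row.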

  9. Risk importance measures in the dynamic flowgraph methodology

    International Nuclear Information System (INIS)

    Tyrväinen, T.

    2013-01-01

    This paper presents new risk importance measures applicable to a dynamic reliability analysis approach with multi-state components. Dynamic reliability analysis methods are needed because traditional methods, such as fault tree analysis, can describe a system's dynamic behaviour only in a limited manner. Dynamic flowgraph methodology (DFM) is an approach used for analysing systems with time dependencies and feedback loops. The aim of DFM is to identify root causes of a top event, usually representing the system's failure. Components of DFM models are analysed at discrete time points and they can have multiple states. Traditional risk importance measures developed for static and binary logic are not applicable to DFM as such. Some importance measures have previously been developed for DFM, but their ability to describe how components contribute to the top event is fairly limited. The paper formulates dynamic risk importance measures that quantify the importance of component states and take the time aspect of DFM into account in a logical way that supports the interpretation of results. Dynamic risk importance measures are developed as generalisations of the Fussell-Vesely importance and the risk increase factor. -- Highlights: • New risk importance measures are developed for the dynamic flowgraph methodology. • Dynamic risk importance measures are formulated for states of components. • An approach to handle failure modes of a component in DFM is presented. • Dynamic risk importance measures take failure times into account. • A component's influence on the system's reliability can be analysed in detail

  10. Use of importance measures in risk-informed regulatory applications

    International Nuclear Information System (INIS)

    Cheok, Michael C.; Parry, Gareth W.; Sherry, Richard R.

    1998-01-01

    The use of importance measures to analyze PRA results is discussed. Commonly used importance measures are defined, and two issues that potentially limit their usefulness are addressed. First, there is no simple relationship between importance measures evaluated at the single-component level and those evaluated at the level of a group of components; as a result, some of the commonly used importance measures are not realistic measures of the sensitivity of the overall risk to parameter value changes. Second, importance measures do not typically take parameter uncertainties into account, which raises the question of the robustness of conclusions drawn from importance analyses. The issues are explored in the context of both ranking and categorization of structures, systems, and components (SSCs) with respect to risk-significance and safety-significance for use in risk-informed regulatory analyses

  11. Inter-observer variability in fetal biometric measurements.

    Science.gov (United States)

    Kilani, Rami; Aleyadeh, Wesam; Atieleh, Luay Abu; Al Suleimat, Abdul Mane; Khadra, Maysa; Hawamdeh, Hassan M

    2018-02-01

    To evaluate the inter-observer variability and reproducibility of ultrasound measurements of fetal biometric parameters. A prospective cohort study was conducted in two tertiary care hospitals in Amman, Jordan: Prince Hamza Hospital and Albashir Hospital. The participants were 192 women with a singleton pregnancy at a gestational age of 18-36 weeks. Transabdominal scans for fetal biometric parameter measurement were performed on study participants from November 2014 to March 2015. Women who agreed to participate underwent two ultrasound scans for head circumference, abdominal circumference and femur length. The correlation coefficient was calculated, and Bland-Altman plots were used to analyze the degree of measurement agreement between observers. Limits of agreement (± 2 SD for the differences in fetal biometry measurements, expressed as proportions of the mean of the measurements) were derived. The main outcome measure was the reproducibility of fetal biometric measurements by different observers. High inter-observer intra-class correlation coefficients (ICC) were found for femur length (0.990) and abdominal circumference (0.996), where Bland-Altman plots showed high degrees of agreement. The highest degree of agreement was noted in the measurement of abdominal circumference, followed by head circumference; the lowest degree of agreement was found for femur length measurement. A paired-sample t-test showed that the mean difference between duplicate measurements was not significant (P > 0.05). Fetal biometric parameter measurements may be reproducible by different operators in the clinical setting with similar results. Fetal head circumference, abdominal circumference and femur length were highly reproducible. Large organized studies are needed to ensure accurate fetal measurements, given the important clinical implications of inaccurate measurements. Copyright © 2018. Published by Elsevier B.V.
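    The Bland-Altman agreement analysis used in studies like this one can be sketched with synthetic paired measurements. The observer noise level and measurement range below are assumptions, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic paired femur-length measurements (mm) from two observers
true_fl = rng.uniform(25.0, 70.0, size=50)
obs1 = true_fl + rng.normal(0.0, 0.8, size=50)
obs2 = true_fl + rng.normal(0.0, 0.8, size=50)

diff = obs1 - obs2
bias = diff.mean()
sd = diff.std(ddof=1)
# 95% limits of agreement: bias +/- 1.96 SD of the paired differences
loa_low, loa_high = bias - 1.96 * sd, bias + 1.96 * sd
print(f"bias = {bias:.2f} mm, limits of agreement = ({loa_low:.2f}, {loa_high:.2f})")

# Pearson r between observers, a rough proxy for the two-way agreement ICC
r = np.corrcoef(obs1, obs2)[0, 1]
print(f"inter-observer r = {r:.3f}")
```

In a full analysis the differences would also be plotted against the pairwise means to check for magnitude-dependent bias.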

  12. The Variability of Automated QRS Duration Measurement

    Czech Academy of Sciences Publication Activity Database

    Vančura, V.; Wichterle, D.; Ulč, I.; Šmíd, J.; Brabec, Marek; Zárybnická, M.; Rokyta, R.

    2017-01-01

    Roč. 19, č. 4 (2017), s. 636-643 ISSN 1099-5129 Institutional support: RVO:67985807 Keywords: ECG * QRS complex * automated measurement Subject RIV: FA - Cardiovascular Diseases incl. Cardiothoracic Surgery OBOR OECD: Statistics and probability Impact factor: 4.530, year: 2016

  13. Mean importance measures for groups of events in fault trees

    Energy Technology Data Exchange (ETDEWEB)

    Haskin, F.E.; Huang, Min [New Mexico Univ., Albuquerque, NM (United States). Dept. of Chemical and Nuclear Engineering; Sasser, M.K.; Stack, D.W. [Los Alamos National Lab., NM (United States)

    1993-10-12

    The method of moments is applied to precisely determine the mean values of three importance measures: risk reduction, partial derivative, and variance reduction. Variance reduction calculations, in particular, are significantly improved by eliminating the imprecision associated with Monte Carlo estimates. The three importance measures are extended to permit analyses of the relative importance of groups of basic and initiating events. The partial derivative importance measure is extended by assessing the contribution of a group of events to the gradient of the top event frequency. The group importance measures are quantified for the overall fuel damage equation and for 14 dominant accident sequences from an independent probabilistic safety assessment of the K Production Reactor. This application demonstrates both the utility and the versatility of the group importance measures.

  14. Mean importance measures for groups of events in fault trees

    International Nuclear Information System (INIS)

    Haskin, F.E.; Huang, Min

    1993-01-01

    The method of moments is applied to precisely determine the mean values of three importance measures: risk reduction, partial derivative, and variance reduction. Variance reduction calculations, in particular, are significantly improved by eliminating the imprecision associated with Monte Carlo estimates. The three importance measures are extended to permit analyses of the relative importance of groups of basic and initiating events. The partial derivative importance measure is extended by assessing the contribution of a group of events to the gradient of the top event frequency. The group importance measures are quantified for the overall fuel damage equation and for 14 dominant accident sequences from an independent probabilistic safety assessment of the K Production Reactor. This application demonstrates both the utility and the versatility of the group importance measures

  15. A complex network-based importance measure for mechatronics systems

    Science.gov (United States)

    Wang, Yanhui; Bi, Lifeng; Lin, Shuai; Li, Man; Shi, Hao

    2017-01-01

    In view of the negative impact of functional dependency, this paper proposes an alternative importance measure, called Improved-PageRank (IPR), for measuring the importance of components in mechatronics systems. IPR is a meaningful extension of centrality measures in complex networks, which considers the usage reliability of components and the functional dependency between components to increase the usefulness of importance measures. Our work makes two important contributions. First, this paper integrates the literature on mechatronic architecture and complex networks theory to define the component network. Second, based on the notion of the component network, IPR is applied to the identification of important components. In addition, the IPR component importance measures, together with an algorithm that performs stochastic ordering of components to reflect the time-varying nature of component usage reliability and inter-component functional dependency, are illustrated with a component network of a bogie system consisting of 27 components.
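    The PageRank-style centrality that IPR extends can be sketched on a small component dependency graph via power iteration. The graph, edge set and damping factor below are illustrative, not the 27-component bogie network from the paper.

```python
import numpy as np

# Toy directed dependency graph: edge i -> j means component i's function
# depends on component j (edges are illustrative).
n = 5
A = np.zeros((n, n))
for i, j in [(0, 1), (0, 2), (1, 2), (2, 3), (3, 4), (4, 2)]:
    A[i, j] = 1.0

# Column-stochastic transition matrix; dangling nodes get uniform transitions
out_deg = A.sum(axis=1)
M = np.where(out_deg[:, None] > 0,
             A / np.maximum(out_deg[:, None], 1.0),
             1.0 / n).T

d = 0.85                          # damping factor
rank = np.full(n, 1.0 / n)
for _ in range(100):              # power iteration to the stationary vector
    rank = (1 - d) / n + d * M @ rank
print(np.round(rank, 4))
```

IPR would additionally weight the edges and nodes by usage reliability; here component 2, the most depended-upon node, ends up with the highest rank.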

  16. Mean importance measures for groups of events in fault trees

    International Nuclear Information System (INIS)

    Haskin, F.E.; Huang, Min

    1994-01-01

    The method of moments is applied to precisely determine the mean values of three importance measures: risk reduction, partial derivative, and variance reduction. Variance reduction calculations, in particular, are significantly improved by eliminating the imprecision associated with Monte Carlo estimates. The three importance measures are extended to permit analyses of the relative importance of groups of basic and initiating events. The partial derivative importance measure is extended by assessing the contribution of a group of events to the gradient of the top event frequency. The group importance measures are quantified for the overall fuel damage equation and for 14 dominant accident sequences from an independent probabilistic safety assessment of the K Production Reactor. This application demonstrates both the utility and the versatility of the group importance measures

  17. Bayesian modeling of measurement error in predictor variables

    NARCIS (Netherlands)

    Fox, Gerardus J.A.; Glas, Cornelis A.W.

    2003-01-01

    It is shown that measurement error in predictor variables can be modeled using item response theory (IRT). The predictor variables, which may be defined at any level of a hierarchical regression model, are treated as latent variables. The normal ogive model is used to describe the relation between

  18. Prognostic importance of glycaemic variability on hospital mortality in patients hospitalised in Internal Medicine Departments.

    Science.gov (United States)

    Sáenz-Abad, D; Gimeno-Orna, J A; Pérez-Calvo, J I

    2015-12-01

    The objective was to assess the prognostic importance of various glycaemic control measures on hospital mortality. Retrospective, analytical cohort study that included patients hospitalised in internal medicine departments with a diagnosis related to diabetes mellitus (DM), excluding acute decompensations. The clinical endpoint was hospital mortality. We recorded clinical, analytical and glycaemic control-related variables (scheduled insulin administration, plasma glycaemia at admission, HbA1c, mean glycaemia (MG) and in-hospital glycaemic variability and hypoglycaemia). The measurement of hospital mortality predictors was performed using univariate and multivariate logistic regression. A total of 384 patients (50.3% men) were included. The mean age was 78.5 (SD, 10.3) years. The DM-related diagnoses were type 2 diabetes (83.6%) and stress hyperglycaemia (6.8%). Thirty-one (8.1%) patients died while in hospital. In the multivariate analysis, the best model for predicting mortality (R(2)=0.326; P<.0001) consisted, in order of importance, of age (χ(2)=8.19; OR=1.094; 95% CI 1.020-1.174; P=.004), Charlson index (χ(2)=7.28; OR=1.48; 95% CI 1.11-1.99; P=.007), initial glycaemia (χ(2)=6.05; OR=1.007; 95% CI 1.001-1.014; P=.014), HbA1c (χ(2)=5.76; OR=0.59; 95% CI 0.33-1; P=.016), glycaemic variability (χ(2)=4.41; OR=1.031; 95% CI 1-1.062; P=.036), need for corticosteroid treatment (χ(2)=4.03; OR=3.1; 95% CI 1-9.64; P=.045), administration of scheduled insulin (χ(2)=3.98; OR=0.26; 95% CI 0.066-1; P=.046) and systolic blood pressure (χ(2)=2.92; OR=0.985; 95% CI 0.97-1.003; P=.088). An increase in initial glycaemia and in-hospital glycaemic variability predict the risk of mortality for hospitalised patients with DM. Copyright © 2015 Elsevier España, S.L.U. y Sociedad Española de Medicina Interna (SEMI). All rights reserved.

  19. Interobserver Variability of Ki-67 Measurement in Breast Cancer

    Directory of Open Access Journals (Sweden)

    Yul Ri Chung

    2016-03-01

    Full Text Available Background: As measurement of the Ki-67 proliferation index is an important part of breast cancer diagnostics, we conducted a multicenter study to examine the degree of concordance in Ki-67 counting and to find factors that lead to its variability. Methods: Thirty observers from thirty different institutions reviewed Ki-67–stained slides of 20 different breast cancers on whole sections and tissue microarray (TMA) via an online system. Ten of the 20 breast cancers had hot spots of Ki-67 expression. Each observer scored Ki-67 in two different ways: direct counting (average vs. hot spot method) and categorical estimation. The intraclass correlation coefficient (ICC) of the Ki-67 index was calculated for comparative analysis. Results: For direct counting, the ICC of TMA was slightly higher than that of whole sections using the average method (0.895 vs 0.858). The ICC of tumors with hot spots was lower than that of tumors without (0.736 vs 0.874). In tumors with hot spots, observers took an additional count from the hot spot; the ICC of whole sections using the hot spot method was still lower than that of TMA (0.737 vs 0.895). In categorical estimation, the Ki-67 index showed a wide distribution in some cases. Nevertheless, in tumors with hot spots, the range of distribution in Ki-67 categories was decreased with the hot spot method and on the TMA platform. Conclusions: Interobserver variability of the Ki-67 index for direct counting and categorical estimation was relatively high. Tumors with hot spots showed greater interobserver variability as opposed to those without, and restricting the measurement area yielded lower interobserver variability.

  20. Gait variability measurements in lumbar spinal stenosis patients: part B. Preoperative versus postoperative gait variability

    International Nuclear Information System (INIS)

    Papadakis, N C; Christakis, D G; Tzagarakis, G N; Chlouverakis, G I; Kampanis, N A; Stergiopoulos, K N; Katonis, P G

    2009-01-01

    The objective of this study was to assess the gait variability of lumbar spinal stenosis (LSS) patients and to evaluate its postoperative progression. The hypothesis was that LSS patients' preoperative gait variability in the frequency domain was higher than the corresponding postoperative variability. A tri-axial accelerometer was used for the gait measurement, and a spectral differential entropy algorithm was used to quantify gait variability. Twelve subjects with LSS were measured before and after surgery. Preoperative measurements were performed 2 days before surgery; postoperative measurements were performed 6 and 12 months after surgery. Preoperative gait variability was higher than the corresponding postoperative variability. Also, in most cases, gait variability appeared to decrease throughout the year
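    A generic spectral entropy of an acceleration signal illustrates the idea of quantifying gait variability in the frequency domain. This is a simplified stand-in for the paper's spectral differential entropy algorithm, using a synthetic signal with an assumed stride frequency and sampling rate.

```python
import numpy as np

def spectral_entropy(signal):
    """Shannon entropy (bits) of the normalised power spectrum."""
    psd = np.abs(np.fft.rfft(signal - signal.mean())) ** 2
    psd = psd / psd.sum()
    psd = psd[psd > 0]
    return float(-(psd * np.log2(psd)).sum())

fs = 100.0                                   # sampling rate, Hz (assumed)
t = np.arange(0.0, 10.0, 1.0 / fs)
rng = np.random.default_rng(1)
regular = np.sin(2 * np.pi * 2.0 * t)        # one dominant stride frequency
irregular = regular + rng.standard_normal(t.size)   # broadband "variable" gait

print(spectral_entropy(regular), spectral_entropy(irregular))
```

A narrow, periodic spectrum yields low entropy; a broadband, irregular one yields high entropy, matching the direction of the study's preoperative-to-postoperative comparison.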

  1. Important variables in explaining real-time peak price in the independent power market of Ontario

    International Nuclear Information System (INIS)

    Rueda, I.E.A.; Marathe, A.

    2005-01-01

    This paper uses a support vector machine (SVM) based learning algorithm to select the important variables that help explain the real-time peak electricity price in the Ontario market. The Ontario market was opened to competition only in May 2002. Due to the limited number of observations available, finding a set of variables that can explain the independent power market of Ontario (IMO) real-time peak price is a significant challenge for traders and analysts. Kernel regressions of the explanatory variables on the IMO real-time average peak price show that non-linear dependencies exist between the explanatory variables and the IMO price. This non-linear relationship, combined with the low variable-observation ratio, rules out conventional statistical analysis. Hence, we use an alternative machine learning technique to find the important explanatory variables for the IMO real-time average peak price. Results based on SVM sensitivity analysis find that the IMO's predispatch average peak price, the actual import peak volume, the peak load of the Ontario market and the net available supply after accounting for load (energy excess) are among the most important variables in explaining the real-time average peak price in the Ontario electricity market. (author)
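    A permutation-based sensitivity analysis over a nonlinear regression surrogate conveys the flavour of the SVM sensitivity ranking described here. Kernel ridge regression stands in for SVM regression to keep the sketch dependency-free, and the variable names and toy price model are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
# Hypothetical standardized stand-ins for IMO explanatory variables:
# columns: predispatch price, import volume, load, irrelevant noise
X = rng.normal(size=(n, 4))
y = X[:, 0] + np.tanh(X[:, 2]) + 0.1 * rng.normal(size=n)   # column 3 unused

def rbf_kernel(A, B, gamma=0.5):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

# Kernel ridge regression as a cheap nonlinear surrogate for SVM regression
K = rbf_kernel(X, X)
alpha = np.linalg.solve(K + 1e-2 * np.eye(n), y)
predict = lambda Z: rbf_kernel(Z, X) @ alpha

def importance(j, repeats=5):
    """Mean-squared-error increase when column j is randomly permuted."""
    base = ((predict(X) - y) ** 2).mean()
    inc = 0.0
    for _ in range(repeats):
        Xp = X.copy()
        Xp[:, j] = rng.permutation(Xp[:, j])
        inc += ((predict(Xp) - y) ** 2).mean() - base
    return inc / repeats

imps = [importance(j) for j in range(4)]
print(np.round(imps, 3))
```

Variables that the fitted surface actually uses show large error increases when shuffled; the deliberately irrelevant column ranks last.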

  2. Ecological niche models reveal the importance of climate variability for the biogeography of protosteloid amoebae.

    Science.gov (United States)

    Aguilar, María; Lado, Carlos

    2012-08-01

    Habitat availability and environmental preferences of species are among the most important factors in determining the success of dispersal processes and therefore in shaping the distribution of protists. We explored the differences in the fundamental niches and potential distributions of an ecological guild of slime moulds (protosteloid amoebae) in the Iberian Peninsula. A large set of samples collected along a north-east to south-west transect of approximately 1000 km across the peninsula was used to test the hypothesis that, together with the existence of suitable microhabitats, climate conditions may determine the probability of survival of species. Although protosteloid amoebae share similar morphologies and life history strategies, canonical correspondence analyses showed that they have varied ecological optima, and that climate conditions have an important effect on niche differentiation. Maxent environmental niche models provided consistent predictions of the probability of presence of the species based on climate data, and they were used to generate maps of potential distribution in an 'everything is everywhere' scenario. The most important climatic factors were, in both analyses, variables that measure changes in conditions throughout the year, confirming that the alternation of fruiting bodies, cysts and amoeboid stages in the life cycles of protosteloid amoebae constitutes an advantage for surviving in a changing environment. Microhabitat affinity seems to be influenced by climatic conditions, which suggests that the micro-environment may vary at a local scale and change together with the external climate at a larger scale.

  3. Clinical Trials With Large Numbers of Variables: Important Advantages of Canonical Analysis.

    Science.gov (United States)

    Cleophas, Ton J

    2016-01-01

    Canonical analysis assesses the combined effects of a set of predictor variables on a set of outcome variables, but it is little used in clinical trials despite the omnipresence of multiple variables. The aim of this study was to assess the performance of canonical analysis compared with traditional multivariate methods using multivariate analysis of covariance (MANCOVA). As an example, a simulated data file with 12 gene expression levels and 4 drug efficacy scores was used. The correlation coefficient between the 12 predictor and 4 outcome variables was 0.87 (P = 0.0001), meaning that 76% of the variability in the outcome variables was explained by the 12 covariates. Repeated testing after the removal of 5 unimportant predictor variables and 1 outcome variable produced virtually the same overall result. The MANCOVA identified identical unimportant variables, but it was unable to provide overall statistics. (1) Canonical analysis is remarkable because it can handle many more variables than traditional multivariate methods such as MANCOVA can. (2) At the same time, it accounts for the relative importance of the separate variables, their interactions and differences in units. (3) Canonical analysis provides overall statistics of the effects of sets of variables, whereas traditional multivariate methods only provide the statistics of the separate variables. (4) Unlike other methods for combining the effects of multiple variables, such as factor analysis/partial least squares, canonical analysis is scientifically entirely rigorous. (5) Limitations include that it is less flexible than factor analysis/partial least squares, because only 2 sets of variables are used and because multiple solutions instead of one are offered. We hope that this article will stimulate clinical investigators to start using this remarkable method.
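    Canonical correlations can be computed directly as the singular values of the whitened cross-covariance between the two variable sets. The sketch below uses simulated "gene expression" predictors sharing one latent factor with an outcome set; the sizes and noise levels are assumptions, not the study's simulated file.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
latent = rng.normal(size=n)
# Five noisy "predictor" copies of the latent factor, two "outcome" scores
X = np.column_stack([latent + 0.5 * rng.normal(size=n) for _ in range(5)])
Y = np.column_stack([2.0 * latent + 0.5 * rng.normal(size=n),
                     rng.normal(size=n)])          # second outcome unrelated

def canonical_correlations(X, Y, ridge=1e-8):
    Xc, Yc = X - X.mean(0), Y - Y.mean(0)
    m = Xc.shape[0]
    Sxx, Syy, Sxy = Xc.T @ Xc / m, Yc.T @ Yc / m, Xc.T @ Yc / m

    def inv_sqrt(S):
        w, V = np.linalg.eigh(S + ridge * np.eye(len(S)))
        return V @ np.diag(w ** -0.5) @ V.T

    # Singular values of the whitened cross-covariance are the canonical r's
    return np.linalg.svd(inv_sqrt(Sxx) @ Sxy @ inv_sqrt(Syy), compute_uv=False)

rho = canonical_correlations(X, Y)
print(np.round(rho, 3))
```

The first canonical correlation captures the shared latent factor; the second, pairing leftover noise with the unrelated outcome, stays small.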

  4. Event group importance measures for top event frequency analyses

    International Nuclear Information System (INIS)

    1995-01-01

    Three traditional importance measures, risk reduction, partial derivative, and variance reduction, have been extended to permit analyses of the relative importance of groups of underlying failure rates to the frequencies of resulting top events. The partial derivative importance measure was extended by assessing the contribution of a group of events to the gradient of the top event frequency. Given the moments of the distributions that characterize the uncertainties in the underlying failure rates, the expectation values of the top event frequency, its variance, and all of the new group importance measures can be quantified exactly for two familiar cases: (1) when all underlying failure rates are presumed independent, and (2) when pairs of failure rates based on common data are treated as being equal (totally correlated). In these cases, the new importance measures, which can also be applied to assess the importance of individual events, obviate the need for Monte Carlo sampling. The event group importance measures are illustrated using a small example problem and demonstrated by applications made as part of a major reactor facility risk assessment. These illustrations and applications indicate both the utility and the versatility of the event group importance measures

  5. Event group importance measures for top event frequency analyses

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-07-31

    Three traditional importance measures, risk reduction, partial derivative, and variance reduction, have been extended to permit analyses of the relative importance of groups of underlying failure rates to the frequencies of resulting top events. The partial derivative importance measure was extended by assessing the contribution of a group of events to the gradient of the top event frequency. Given the moments of the distributions that characterize the uncertainties in the underlying failure rates, the expectation values of the top event frequency, its variance, and all of the new group importance measures can be quantified exactly for two familiar cases: (1) when all underlying failure rates are presumed independent, and (2) when pairs of failure rates based on common data are treated as being equal (totally correlated). In these cases, the new importance measures, which can also be applied to assess the importance of individual events, obviate the need for Monte Carlo sampling. The event group importance measures are illustrated using a small example problem and demonstrated by applications made as part of a major reactor facility risk assessment. These illustrations and applications indicate both the utility and the versatility of the event group importance measures.

  6. Metabolic Syndrome and Importance of Associated Variables in Children and Adolescents in Guabiruba - SC, Brazil

    Directory of Open Access Journals (Sweden)

    Nilton Rosini

    2015-07-01

    Full Text Available Background: The risk factors that characterize metabolic syndrome (MetS) may be present in childhood and adolescence, increasing the risk of cardiovascular disease in adulthood. Objective: Evaluate the prevalence of MetS and the importance of its associated variables, including insulin resistance (IR), in children and adolescents in the city of Guabiruba-SC, Brazil. Methods: Cross-sectional study with 1011 students (6–14 years, 52.4% girls, 58.5% children). Blood samples were collected for measurement of biochemical parameters by routine laboratory methods. IR was estimated by the HOMA-IR index, and weight, height, waist circumference and blood pressure were determined. Multivariate logistic regression models were used to examine the associations between risk variables and MetS. Results: The prevalence of MetS, IR, overweight and obesity in the cohort were 14%, 8.5%, 21% and 13%, respectively. Among students with MetS, 27% had IR, 33% were overweight, 45.5% were obese and 22% were eutrophic. IR was more common in overweight (48%) and obese (41%) students when compared with eutrophic individuals (11%; p = 0.034). The variables with greatest influence on the development of MetS were obesity (OR = 32.7), overweight (OR = 6.1), IR (OR = 4.4; p ≤ 0.0001 for all) and age (OR = 1.15; p = 0.014). Conclusion: There was a high prevalence of MetS in the children and adolescents evaluated in this study. Students who were obese, overweight or insulin resistant had higher chances of developing the syndrome.
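    The HOMA-IR index used to estimate insulin resistance in this study follows a standard formula. The fasting values in the usage line are hypothetical, and the pediatric cut-off applied by the authors is not reproduced here.

```python
def homa_ir(fasting_glucose_mg_dl, fasting_insulin_uu_ml):
    """HOMA-IR = glucose [mg/dL] x insulin [uU/mL] / 405
    (equivalently, glucose [mmol/L] x insulin [uU/mL] / 22.5)."""
    return fasting_glucose_mg_dl * fasting_insulin_uu_ml / 405.0

print(round(homa_ir(90.0, 12.0), 2))   # hypothetical fasting values
```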

  7. A new importance measure for risk-informed decision making

    International Nuclear Information System (INIS)

    Borgonovo, E.; Apostolakis, G.E.

    2000-01-01

    Recently, several authors have pointed out that the traditional importance measures have limitations. In this study, the problem was investigated through an analysis at the parameter level, and a new measure was introduced. The measure is based on small parameter variations and is capable of accounting for the importance of a group of components/parameters. The definition, computational steps, and an application of the new importance measure for risk-informed decision making are presented. Unlike traditional importance measures, the differential importance measure (DIM) deals with changes in the various parameters that determine the unavailability/unreliability of a component, e.g., failure rates, common-cause failure rates, and individual human errors. The importance of the component unavailability/unreliability can be calculated from the importance of the parameters. DIM can be calculated for the frequency of initiating events, while the risk achievement worth (RAW) is limited to binary events, e.g., component unavailability. The changes in parameters are 'small', which is more realistic than the drastic assumption in RAW that the component is always down. DIM is additive, which allows the evaluation of the impact of changes, such as the relaxation of quality assurance requirements, that affect groups of parameters, e.g., the failure rates of a group of pumps. (M.N.)
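    Under the uniform-relative-changes convention, a parameter's DIM is its elasticity-weighted share of the total risk change, which is what makes the measure additive. A minimal sketch for an assumed unavailability model U = x0 + x1*x2 (parameter values are illustrative):

```python
import numpy as np

x = np.array([1e-3, 5e-2, 4e-2])     # illustrative parameter values

def U(x):
    # Assumed model: component 0 in series with the parallel pair (1, 2)
    return x[0] + x[1] * x[2]

grad = np.array([1.0, x[2], x[1]])   # partial derivatives of U at x
contrib = grad * x                    # dx_i proportional to x_i convention
dim = contrib / contrib.sum()         # differential importance measure
print(dim)                            # additive by construction: sums to 1
```

The DIM of any group of parameters is then just the sum of the individual entries, which is the additivity property the record emphasises.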

  8. Importance measures for use in PRAs and risk management

    International Nuclear Information System (INIS)

    Schmidt, E.R.; Jamali, K.M.; Parry, G.W.; Gibbon, S.H.

    1985-01-01

    There are many quantities estimated in probabilistic risk assessments (PRAs) to index the level of plant safety. If the PRA is to be used as a risk management tool to assist in the safe operation of the plant, it is essential that those elements of the plant design and its mode of operation that have the greatest impact on plant safety be identified. These elements may be identified by performing importance calculations. Certain decisions must be made before the importance calculation is carried out. The first is the definition of the events for which importance is to be evaluated; that is, the level of resolution at which the analysis is to be performed. The second decision, and the major subject of this paper, is the choice of importance measure. Many measures of importance have been proposed; this discussion is restricted to three: the risk achievement (or degradation) worth, the risk reduction worth, and the criticality importance. In the paper these measures of importance are defined, their interrelationships are discussed, and a generalized importance measure is introduced. The use of the three measures is compared and their advantages and disadvantages are discussed
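    The three measures named in this record, and one of their interrelationships, can be sketched on an assumed two-cut-set model (U = qA + qB*qC; all numbers illustrative):

```python
# Risk achievement worth, risk reduction worth and criticality importance
# for a toy model: series component A plus the parallel pair (B, C).
q = {"A": 1e-3, "B": 5e-2, "C": 4e-2}

def unavail(q):
    return q["A"] + q["B"] * q["C"]

U0 = unavail(q)
for c in q:
    up = unavail({**q, c: 1.0})      # component assumed always failed
    down = unavail({**q, c: 0.0})    # component assumed perfect
    raw = up / U0                    # risk achievement worth
    rrw = U0 / down                  # risk reduction worth
    crit = (up - down) * q[c] / U0   # criticality (Birnbaum x q / U)
    # For a model linear in q[c], criticality equals 1 - 1/RRW
    print(f"{c}: RAW={raw:.2f} RRW={rrw:.3f} criticality={crit:.3f}")
```

The printed relation illustrates how the measures interlock: criticality importance and risk reduction worth carry the same ranking information for coherent models of this form, while RAW answers a different, worst-case question.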

  9. The cost of travel time variability: three measures with properties

    DEFF Research Database (Denmark)

    Engelson, Leonid; Fosgerau, Mogens

    2016-01-01

    This paper explores the relationships between three types of measures of the cost of travel time variability: measures based on scheduling preferences and implicit departure time choice, Bernoulli type measures based on a univariate function of travel time, and mean-dispersion measures. We...

  10. Variable importance analysis based on rank aggregation with applications in metabolomics for biomarker discovery.

    Science.gov (United States)

    Yun, Yong-Huan; Deng, Bai-Chuan; Cao, Dong-Sheng; Wang, Wei-Ting; Liang, Yi-Zeng

    2016-03-10

    Biomarker discovery is an important goal in metabolomics; it is typically modeled as selecting the most discriminating metabolites for classification and is often referred to as variable importance analysis or variable selection. A number of variable importance analysis methods for discovering biomarkers in metabolomics studies have been proposed. However, different methods are likely to generate different variable ranking results because of their different principles: each method generates a variable ranking list just as an expert presents an opinion. The problem of inconsistency between different variable ranking methods is often ignored. To address this problem, a simple and ideal solution is that every ranking should be taken into account. In this study, a strategy called rank aggregation was employed. It is an indispensable tool for merging individual ranking lists into a single "super"-list reflective of the overall preference or importance within the population. This "super"-list is regarded as the final ranking for biomarker discovery, and it was used to select the best variable subset with the highest predictive classification accuracy. Nine methods were used, including three univariate filtering and six multivariate methods. When applied to two metabolomic datasets (a childhood overweight dataset and a tubulointerstitial lesions dataset), the results show that the performance of rank aggregation improved greatly, with higher prediction accuracy compared with using all variables. Moreover, it is also better than the penalized method least absolute shrinkage and selection operator (LASSO), with higher prediction accuracy or a smaller number of selected variables, which are more interpretable. Copyright © 2016 Elsevier B.V. All rights reserved.
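    Rank aggregation can be as simple as a Borda (mean-rank) scheme. The sketch below merges three hypothetical ranking lists into one "super"-list; the metabolite names and rankings are invented, and the paper's actual aggregation scheme may be more sophisticated.

```python
# Borda-style rank aggregation: sum each variable's positions across the
# individual ranking lists; lower total position = more important overall.
variables = ["m1", "m2", "m3", "m4", "m5"]
# Each row: one method's ranking (position 0 = most important variable)
rankings = [
    ["m2", "m1", "m3", "m5", "m4"],
    ["m1", "m2", "m4", "m3", "m5"],
    ["m2", "m3", "m1", "m5", "m4"],
]

scores = {v: 0.0 for v in variables}
for ranking in rankings:
    for pos, v in enumerate(ranking):
        scores[v] += pos

super_list = sorted(variables, key=lambda v: scores[v])
print(super_list)
```

In the biomarker setting, the top of the aggregated list would then be grown one variable at a time, keeping the subset with the best cross-validated classification accuracy.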

  11. A hierarchical procedure for calculation of risk importance measures

    International Nuclear Information System (INIS)

    Poern, K.; Dinsmore, S.C.

    1987-01-01

    Starting with a general importance definition based on conditional probabilities, a hierarchical process for calculating risk importance measures from a PSA's numerical results is developed. By the appropriate choice of events in the general definition, measures such as the risk achievement worth and the risk reduction worth can be calculated without requantifying the PSA's models. Required approximations are clearly defined and the subsequent constraints on the applicability of the process discussed. (orig.)

  12. The Validity of Attribute-Importance Measurement: A Review

    NARCIS (Netherlands)

    Ittersum, van K.; Pennings, J.M.E.; Wansink, B.; Trijp, van J.C.M.

    2007-01-01

    A critical review of the literature demonstrates a lack of validity among the ten most common methods for measuring the importance of attributes in behavioral sciences. The authors argue that one of the key determinants of this lack of validity is the multi-dimensionality of attribute importance.

  13. The relationship between psychosocial variables and measures of ...

    African Journals Online (AJOL)

    was found to be the most important psychosocial variable in the present study, correlating with several .... It includes eight activities of daily living on which patients have to ..... Effects of aerobic exercise versus stress management treatment in.

  14. AC power flow importance measures considering multi-element failures

    International Nuclear Information System (INIS)

    Li, Jian; Dueñas-Osorio, Leonardo; Chen, Changkun; Shi, Congling

    2017-01-01

    Quantifying the criticality of individual components of power systems is essential for overall reliability and management. This paper proposes an AC-based power flow element importance measure, while considering multi-element failures. The measure relies on a proposed AC-based cascading failure model, which captures branch overflow, bus load shedding, and branch failures, via AC power flow and optimal power flow analyses. Taking the IEEE 30, 57 and 118-bus power systems as case studies, we find that N-3 analyses are sufficient to measure the importance of a bus or branch. It is observed that for a substation bus, its importance is statistically proportional to its power demand, but this trend is not observed for power plant buses. While comparing with other reliability, functionality, and topology-based importance measures popular today, we find that a DC power flow model, although better correlated with the benchmark AC model as a whole, still fails to locate some critical elements. This is due to the focus of DC-based models on real power, which ignores reactive power. The proposed importance measure is intended to inform decision makers about key components in complex systems, while improving cascading failure prevention, system backup setting, and overall resilience. - Highlights: • We propose a novel importance measure based on joint failures and AC power flow. • A cascading failure model considers both AC power flow and optimal power flow. • We find that N-3 analyses are sufficient to measure the importance of an element. • Power demand impacts the importance of substations but less so that of generators. • DC models fail to identify some key elements, despite correlating with AC models.

  15. The Relative Importance of Job Factors: A New Measurement Approach.

    Science.gov (United States)

    Nealey, Stanley M.

    This paper reports on a new two-phase measurement technique that permits a direct comparison of the perceived relative importance of economic vs. non-economic factors in a job situation in accounting for personnel retention, the willingness to produce, and job satisfaction. The paired comparison method was used to measure the preferences of 91…

  16. Relative Importance of Political Instability and Economic Variables on Perceived Country Creditworthiness

    OpenAIRE

    Suk Hun Lee

    1993-01-01

This paper examines the relative importance of political instability and economic variables on perceived country creditworthiness. Our results indicate that both political instability and economic variables are taken into account in evaluating country creditworthiness; however, it appears that bankers assign a larger weight to economic performance, which we expect to reflect longer-term political stability. In addition, the frequency of changes in the regime and armed conflict, both proxying f...

  17. Screening of variable importance for optimizing electrodialytic remediation of heavy metals from polluted harbour sediments

    DEFF Research Database (Denmark)

    Pedersen, Kristine B.; Lejon, Tore; Ottosen, Lisbeth M.

    2015-01-01

    Using multivariate design and modelling, the optimal conditions for electrodialytic remediation (EDR) of heavy metals were determined for polluted harbour sediments from Hammerfest harbour located in the geographic Arctic region of Norway. The comparative importance of the variables, current......) was computed and variable importance in the projection was used to assess the influence of the experimental variables. Current density and remediation time proved to have the highest influence on the remediation of the heavy metals Cr, Cu, Ni, Pb and Zn in the studied experimental domain. In addition......, it was shown that excluding the acidification time improved the PLS model, indicating the importance of applying a limited experimental domain that covers the removal phases of each heavy metal in the specific sediment. Based on PLS modelling, the optimal conditions for remediating the Hammerfest sediment were...

  18. Measuring psychosocial variables that predict older persons' oral health behaviour.

    Science.gov (United States)

    Kiyak, H A

    1996-12-01

    The importance of recognising psychosocial characteristics of older people that influence their oral health behaviours and the potential success of dental procedures is discussed. Three variables and instruments developed and tested by the author and colleagues are presented. A measure of perceived importance of oral health behaviours has been found to be a significant predictor of dental service utilization in three studies. Self-efficacy regarding oral health has been found to be lower than self-efficacy regarding general health and medication use among older adults, especially among non-Western ethnic minorities. The significance of self-efficacy for predicting changes in caries and periodontal disease is described. Finally, a measure of expectations regarding specific dental procedures has been used with older people undergoing implant therapy. Studies with this instrument reveal that patients have concerns about the procedure far different than those focused on by dental providers. All three instruments can be used in clinical practice as a means of understanding patients' values, perceived oral health abilities, and expectations from dental care. These instruments can enhance dentist-patient rapport and improve the chances of successful dental outcomes for older patients.

  19. Violation of Bell's Inequality Using Continuous Variable Measurements

    Science.gov (United States)

    Thearle, Oliver; Janousek, Jiri; Armstrong, Seiji; Hosseini, Sara; Schünemann Mraz, Melanie; Assad, Syed; Symul, Thomas; James, Matthew R.; Huntington, Elanor; Ralph, Timothy C.; Lam, Ping Koy

    2018-01-01

    A Bell inequality is a fundamental test to rule out local hidden variable model descriptions of correlations between two physically separated systems. There have been a number of experiments in which a Bell inequality has been violated using discrete-variable systems. We demonstrate a violation of Bell's inequality using continuous variable quadrature measurements. By creating a four-mode entangled state with homodyne detection, we recorded a clear violation with a Bell value of B =2.31 ±0.02 . This opens new possibilities for using continuous variable states for device independent quantum protocols.

  20. Streamlining air import operations by trade facilitation measures

    Directory of Open Access Journals (Sweden)

    Yuri da Cunha Ferreira

    2017-12-01

Full Text Available Global operations are subject to considerable uncertainties. Given the Trade Facilitation Agreement that became effective in February 2017, the study of measures to streamline customs controls is urgent. This study aims to assess the impact of trade facilitation measures on import flows. An experimental study was performed in the largest cargo airport in South America through discrete-event simulation and design of experiments. The operational impacts of three trade facilitation measures on import flows by air are assessed. We shed light on the following trade facilitation measures: the use of X-ray equipment for physical inspection; an increase in the number of qualified companies in the trade facilitation program; and performance targets for customs officials. All trade facilitation measures used indicated the potential to provide more predictability, cost savings, time reduction, and increased security in the international supply chain.

  1. MEASUREMENT OPTIMALIZATION OF ZAKAT DISTRIBUTION AT LEMBAGA AMIL ZAKAT USING VARIABLE MEASUREMENT OF ECONOMY

    Directory of Open Access Journals (Sweden)

    Marissa Haque

    2016-09-01

Full Text Available The aim of this research is to optimize zakat distribution using economic variable measurement. Design/Methodology. A quantitative research method is used to analyze financial data, with the Optimization Model as the Z variable, Measurement of Economy as the Y variable and Objective Output as the X variable, using the AMOS program and SEM as the analysis tool to confirm that the model can be used as a measurement tool. Research result. Using several indicators to analyze each variable, the output and objective results obtained influence the optimization of the measurement of economy. Conclusion. The measurement of the optimization of zakat distribution using the measurement of economy variable, with exogenous independent variables (output and objective), can be used as a model to measure Lembaga Amil Zakat performance. Furthermore, this research needs further development of its indicators, especially in the area of the objective variable. Key words: Output, Objective, Measurement of Economy, Distribution Optimization, Zakat. JEL Classification: D64

  2. On the extension of Importance Measures to complex components

    International Nuclear Information System (INIS)

    Dutuit, Yves; Rauzy, Antoine

    2015-01-01

Importance Measures are indicators of the risk significance of the components of a system. They are widely used in various applications of Probabilistic Safety Analyses, off-line and on-line, in decision making for preventive and corrective purposes, as well as to rank components according to their contribution to the global risk. They are primarily defined for the case where the support model is a coherent fault tree and failures of components are described by basic events of this fault tree. In this article, we study their extension to complex components, i.e. components whose failures are modeled by a gate rather than just a basic event. Although quite natural, such an extension has not received much attention in the literature. We show that it raises a number of problems. The Birnbaum Importance Measure and the notion of Critical States concentrate these difficulties. We present alternative solutions for the extension of these notions. We discuss their respective advantages and drawbacks. This article gives a new point of view on the mathematical foundations of Importance Measures and helps to clarify their physical meaning. - Highlights: • We propose an extension of Importance Measures to complex components. • We define our extension in terms of minterms, i.e. states of the system. • We discuss the physical interpretation of Importance Measures in light of this interpretation.
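For the classical single-basic-event case that the article generalizes, the Birnbaum measure of component i is B_i = P(system works | i works) − P(system works | i fails). A minimal sketch for a toy coherent structure (the series-parallel system and the reliability numbers are invented for illustration, not taken from the article):

```python
from itertools import product

def system_works(a, b, c):
    # Toy coherent structure: component a in series with
    # the parallel pair (b, c). All arguments are 0/1.
    return a and (b or c)

def reliability(p):
    # Exact system reliability by enumerating all component states.
    r = 0.0
    for states in product([0, 1], repeat=3):
        prob = 1.0
        for works, pi in zip(states, p):
            prob *= pi if works else (1 - pi)
        r += prob * system_works(*states)
    return r

def birnbaum(p, i):
    # B_i = P(system works | i works) - P(system works | i fails)
    up, down = list(p), list(p)
    up[i], down[i] = 1.0, 0.0
    return reliability(up) - reliability(down)

p = [0.9, 0.8, 0.7]
ranks = sorted(range(3), key=lambda i: -birnbaum(p, i))
# The series component dominates: its failure alone fails the system.
```

The article's question is what replaces the single index i when a component's failure is itself a gate over several basic events; the enumeration over system states (minterms) above is the object its extension is defined on.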

  3. Measurement of neutron importance by a dynamic method

    International Nuclear Information System (INIS)

    Dmitriev, V.M.; Matusevich, E.S.; Regushevskij, V.I.; Sazonov, S.P.; Usikov, D.A.

    1977-01-01

A procedure is proposed for measuring the spatial distribution of neutron importance in a critical reactor by determining the parameters of its run-up with a constant neutron source. A 252Cf quasi-isotropic point source was used. The measurements were performed at a critical assembly with a highly enriched uranium core and beryllium reflector. Importance distributions in critical and subcritical assemblies were compared for various degrees of subcriticality. Absolute normalization of the importance was obtained, and some new integral reactor characteristics were determined experimentally on its basis. An experimental data acquisition and processing system was developed on the basis of the ELECTRONIKA-100 computer. An algorithm was also developed for statistical treatment of the data. The importance distributions in critical and subcritical assemblies proved to coincide up to rather deep subcriticality.

  4. Focusing on a Probability Element: Parameter Selection of Message Importance Measure in Big Data

    OpenAIRE

    She, Rui; Liu, Shanyun; Dong, Yunquan; Fan, Pingyi

    2017-01-01

Message importance measure (MIM) is applicable to characterizing the importance of information in big-data scenarios, similar to entropy in information theory. In fact, MIM with a variable parameter can affect the characterization of a distribution. Furthermore, by choosing an appropriate parameter of MIM, it is possible to emphasize the message importance of a certain probability element in a distribution. Therefore, parametric MIM can play a vital role in anomaly detection of bi...

  5. Use of risk importance measures in maintenance prioritization

    International Nuclear Information System (INIS)

    Dubreil Chambardel, A.; Ardorino, F.; Mauger, P.

    1997-01-01

An RCM method has been developed at EDF since 1990 to optimize maintenance through a prioritization of resources for equipment that is important in terms of safety, availability and maintenance costs. In 1994, the Nuclear Power Plant Operations Division decided to apply this method to the most important systems of the French PWRs. About 50 systems are in the scope of the RCM. Those that have a role in safety were ranked depending on their contribution to the risk of core melt provided by PSAs. The RCM studies on the 20 systems most important to safety are performed by the Nuclear Power Plant Operations Division; the other 30 systems are studied on sites. The RCM study first consists of identifying the equipment and failure modes significant to safety, availability or maintenance costs, and evaluating the performance of that equipment. These studies lead to the distinction of equipment and failure modes that are critical or non-critical to safety, availability and costs. The last part of the study consists of optimizing maintenance on that equipment. In this process, risk measures are used to help define the equipment and failure modes critical to safety. This is done by calculating risk importance measures provided by PSAs. We explain in this paper which measures of risk have been defined, how PSAs allow the calculation of those measures, and how we used those results in the RCM studies we performed. We also give extensions of the use of those measures in the process of defining optimized maintenance tasks. After having defined an RCM method for the French PWRs, the Nuclear Power Plant Operations Division decided to start a generalized program of maintenance optimization for the most important systems. The three criteria on which the method relies are safety, unit availability and maintenance costs. We present here the safety aspect of the method and more precisely the use of risk importance measures in the RCM process. (author)

  6. IMPORTANCE OF KINETIC MEASURES IN TRAJECTORY PREDICTION WITH OPTIMAL CONTROL

    Directory of Open Access Journals (Sweden)

    Ömer GÜNDOĞDU

    2001-02-01

    Full Text Available A two-dimensional sagittally symmetric human-body model was established to simulate an optimal trajectory for manual material handling tasks. Nonlinear control techniques and genetic algorithms were utilized in the optimizations to explore optimal lifting patterns. The simulation results were then compared with the experimental data. Since the kinetic measures such as joint reactions and moments are vital parameters in injury determination, the importance of comparing kinetic measures rather than kinematical ones was emphasized.

  7. Learner Variables Important for Success in L2 Listening Comprehension in French Immersion Classrooms

    Science.gov (United States)

    Vandergrift, Larry; Baker, Susan C.

    2018-01-01

    Listening comprehension, which is relatively straightforward for native language (L1) speakers, is often frustrating for second language (L2) learners. Listening comprehension is important to L2 acquisition, but little is known about the variables that influence the development of L2 listening skills. The goal of this study was to determine which…

  8. Measuring the surgical 'learning curve': methods, variables and competency.

    Science.gov (United States)

    Khan, Nuzhath; Abboudi, Hamid; Khan, Mohammed Shamim; Dasgupta, Prokar; Ahmed, Kamran

    2014-03-01

    To describe how learning curves are measured and what procedural variables are used to establish a 'learning curve' (LC). To assess whether LCs are a valuable measure of competency. A review of the surgical literature pertaining to LCs was conducted using the Medline and OVID databases. Variables should be fully defined and when possible, patient-specific variables should be used. Trainee's prior experience and level of supervision should be quantified; the case mix and complexity should ideally be constant. Logistic regression may be used to control for confounding variables. Ideally, a learning plateau should reach a predefined/expert-derived competency level, which should be fully defined. When the group splitting method is used, smaller cohorts should be used in order to narrow the range of the LC. Simulation technology and competence-based objective assessments may be used in training and assessment in LC studies. Measuring the surgical LC has potential benefits for patient safety and surgical education. However, standardisation in the methods and variables used to measure LCs is required. Confounding variables, such as participant's prior experience, case mix, difficulty of procedures and level of supervision, should be controlled. Competency and expert performance should be fully defined. © 2013 The Authors. BJU International © 2013 BJU International.

  9. On Association Measures for Continuous Variables and Correction for Chance

    NARCIS (Netherlands)

    Warrens, Matthijs J.

    2015-01-01

    This paper studies correction for chance for association measures for continuous variables. The set of linear transformations of Pearson's product-moment correlation is used as the domain of the correction for chance function. Examples of measures in this set are Tucker's congruence coefficient,

  10. Measured spatial variability of beach erosion due to aeolian processes.

    NARCIS (Netherlands)

    de Vries, S.; Verheijen, A.H.; Hoonhout, B.M.; Vos, S.E.; Cohn, Nicholas; Ruggiero, P; Aagaard, T.; Deigaard, R.; Fuhrman, D.

    2017-01-01

    This paper shows the first results of measured spatial variability of beach erosion due to aeolian processes during the recently conducted SEDEX2 field experiment at Long Beach, Washington, U.S.A.. Beach erosion and sedimentation were derived using series of detailed terrestrial LIDAR measurements

  11. Bayesian modeling of measurement error in predictor variables using item response theory

    NARCIS (Netherlands)

    Fox, Gerardus J.A.; Glas, Cornelis A.W.

    2000-01-01

This paper focuses on handling measurement error in predictor variables using item response theory (IRT). Measurement error is of great importance in the assessment of theoretical constructs, such as intelligence or school climate. Measurement error is modeled by treating the predictors as unobserved

  12. ECONOMIC IMPORTANCE OF THE PREVENTIVE MEASURES IN DENTISTRY.

    Science.gov (United States)

    Deljo, Emsudina; Sijercic, Zinaida; Mulaosmanovic, Amina; Musanovic, Alma; Prses, Nedim

    2016-10-01

Previous studies have shown that the state of oral health in the area of Podrinje Canton is very poor. Taking into account that two projects have been implemented in the municipality in the last five years, it is necessary to examine the impact of preventive measures in dentistry on oral health. a) To evaluate the impact of continuing education and local fluoridation on the state of oral health; b) To analyze the economic importance of preventive measures. For the purpose of researching the activities of continuing education on the importance of oral health and local fluoridation of teeth, and to determine the economic aspects of the application of preventive measures, 900 students from fourth to ninth grade were tested and reviewed. The children were divided into three groups of 300 students each: a) In the first group, continuous education about proper tooth brushing and the importance of oral hygiene, together with local fluoridation twice a year, was carried out during the last three years; b) In the second group, local fluoridation was carried out twice a year during the last three years, while in the third group there were no continuous prevention measures; c) A single questionnaire was used for all respondents. Data obtained in this study were analyzed by descriptive and inferential statistical methods. The importance of continuing education and local fluoridation is clearly reflected in the different values of the DMF index, which was the subject of research. In the first group, in which continuous education and local fluoridation were carried out, the value of the DMF index was 2.7; in the second group, with local fluoridation only, this value was 3.56; while in the third group, in which no preventive measures were implemented, the value of the DMF index was 5.93. From an economic point of view, preventive measures are the cheapest, most effective and best solution for maintaining oral health.

  13. Studies and research concerning BNFP. Identification and simplified modeling of economically important radwaste variables

    International Nuclear Information System (INIS)

    Ebel, P.E.; Godfrey, W.L.; Henry, J.L.; Postles, R.L.

    1983-09-01

    An extensive computer model describing the mass balance and economic characteristics of radioactive waste disposal systems was exercised in a series of runs designed using linear statistical methods. The most economically important variables were identified, their behavior characterized, and a simplified computer model prepared which runs on desk-top minicomputers. This simplified model allows the investigation of the effects of the seven most significant variables in each of four waste areas: Liquid Waste Storage, Liquid Waste Solidification, General Process Trash Handling, and Hulls Handling. 8 references, 1 figure, 12 tables

  14. Measuring and Predicting Tag Importance for Image Retrieval.

    Science.gov (United States)

    Li, Shangwen; Purushotham, Sanjay; Chen, Chen; Ren, Yuzhuo; Kuo, C-C Jay

    2017-12-01

Textual data such as tags and sentence descriptions are combined with visual cues to reduce the semantic gap for image retrieval applications in today's Multimodal Image Retrieval (MIR) systems. However, all tags are treated as equally important in these systems, which may result in misalignment between visual and textual modalities during MIR training. This will further lead to degraded retrieval performance at query time. To address this issue, we investigate the problem of tag importance prediction, where the goal is to automatically predict the tag importance and use it in image retrieval. To achieve this, we first propose a method to measure the relative importance of object and scene tags from image sentence descriptions. Using this as the ground truth, we present a tag importance prediction model to jointly exploit visual, semantic and context cues. The Structural Support Vector Machine (SSVM) formulation is adopted to ensure efficient training of the prediction model. Then, the Canonical Correlation Analysis (CCA) is employed to learn the relation between the image visual feature and tag importance to obtain robust retrieval performance. Experimental results on three real-world datasets show a significant performance improvement of the proposed MIR with Tag Importance Prediction (MIR/TIP) system over other MIR systems.

  15. Accounting for components interactions in the differential importance measure

    International Nuclear Information System (INIS)

    Zio, Enrico; Podofillini, Luca

    2006-01-01

A limitation of the importance measures (IMs) currently used in reliability and risk analyses is that they rank only individual components or basic events, and are not directly applicable to combinations or groups of components or basic events. To partially overcome this limitation, the differential importance measure (DIM) was recently introduced for use in risk-informed decision making. The DIM is a first-order sensitivity measure that ranks the parameters of the risk model according to the fraction of the total change in the risk that is due to a small change in the parameters' values, taken one at a time. However, it does not account for the effects of interactions among components. In this paper, a second-order extension of the DIM, named DIM-II, is proposed to account for the interactions of pairs of components when evaluating the change in system performance due to changes in the reliability parameters of the components. A numerical application is presented in which the informative contents of DIM and DIM-II are compared. The results confirm that in certain cases, when second-order interactions among components are accounted for, the importance ranking of the components may differ from that produced by a first-order sensitivity measure.
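The first-order DIM described above ranks each parameter by its share of the total risk change, DIM_i = (∂R/∂x_i · Δx_i) / Σ_j (∂R/∂x_j · Δx_j), so the shares sum to one by construction. A hedged sketch using finite differences on a toy series-system risk model (the risk function, parameter values and change sizes are invented for illustration, not the paper's example):

```python
def dim(risk, x, deltas, eps=1e-6):
    # First-order differential importance: the fraction of the total
    # risk change attributable to a small change in each parameter.
    grads = []
    for i in range(len(x)):
        xp = list(x)
        xp[i] += eps
        grads.append((risk(xp) - risk(x)) / eps)  # forward difference
    contrib = [g * d for g, d in zip(grads, deltas)]
    total = sum(contrib)
    return [c / total for c in contrib]

# Toy risk model: unreliability of a 3-component series system,
# R(q) = 1 - (1-q1)(1-q2)(1-q3), with q_i the failure probabilities.
risk = lambda q: 1 - (1 - q[0]) * (1 - q[1]) * (1 - q[2])
q = [0.01, 0.02, 0.05]
shares = dim(risk, q, deltas=[0.001] * 3)
# With equal deltas the least reliable component gets the largest share.
```

The paper's DIM-II would extend `contrib` with second-order (pairwise) terms of the Taylor expansion; the sketch stops at first order, which is exactly the limitation the record describes.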

  16. Importance measures in global sensitivity analysis of nonlinear models

    International Nuclear Information System (INIS)

    Homma, Toshimitsu; Saltelli, Andrea

    1996-01-01

The present paper deals with a new method of global sensitivity analysis of nonlinear models. This is based on a measure of importance to calculate the fractional contribution of the input parameters to the variance of the model prediction. Measures of importance in sensitivity analysis have been suggested by several authors, whose work is reviewed in this article. More emphasis is given to the developments of sensitivity indices by the Russian mathematician I.M. Sobol'. Given that Sobol's treatment of the measure of importance is the most general, his formalism is employed throughout this paper, where conceptual and computational improvements of the method are presented. The computational novelty of this study is the introduction of the 'total effect' parameter index. This index provides a measure of the total effect of a given parameter, including all the possible synergetic terms between that parameter and all the others. Rank transformation of the data is also introduced in order to increase the reproducibility of the method. These methods are tested on a few analytical and computer models. The main conclusion of this work is the identification of a sensitivity analysis methodology which is flexible, accurate and informative, and which can be achieved at reasonable computational cost.
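The first-order and 'total effect' indices discussed above can be estimated by Monte Carlo with the now-standard pick-freeze scheme (the particular estimators below follow later Saltelli/Jansen-type formulations, not necessarily the paper's original ones; the additive test function is invented for illustration):

```python
import numpy as np

def sobol_indices(f, d, n=100_000, seed=0):
    # Pick-freeze Monte Carlo estimates of first-order (S_i) and
    # total-effect (ST_i) indices for independent U(0,1) inputs.
    rng = np.random.default_rng(seed)
    A = rng.random((n, d))
    B = rng.random((n, d))
    fA, fB = f(A), f(B)
    var = np.var(np.concatenate([fA, fB]))
    S, ST = [], []
    for i in range(d):
        ABi = A.copy()
        ABi[:, i] = B[:, i]          # freeze all inputs except column i
        fABi = f(ABi)
        S.append(np.mean(fB * (fABi - fA)) / var)          # first order
        ST.append(0.5 * np.mean((fA - fABi) ** 2) / var)   # total effect
    return np.array(S), np.array(ST)

# Additive test function y = x0 + 2*x1: no interactions, so S_i ≈ ST_i,
# with analytic values S = (0.2, 0.8).
f = lambda X: X[:, 0] + 2.0 * X[:, 1]
S, ST = sobol_indices(f, d=2)
```

For a model with interactions, ST_i exceeds S_i and the gap measures the synergetic terms the record mentions.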

  17. THE RELATIVE IMPORTANCE OF FINANCIAL RATIOS AND NONFINANCIAL VARIABLES IN PREDICTING OF INSOLVENCY

    Directory of Open Access Journals (Sweden)

    Ivica Pervan

    2013-02-01

Full Text Available One of the most important decisions in every bank is approving loans to firms, which is based on evaluated credit risk and collateral. Namely, it is necessary to evaluate the risk that the client will be unable to repay the obligations according to the contract. After Beaver's (1967) and Altman's (1968) seminal papers, many authors extended the initial research by changing the methodology, samples, countries, etc. But the majority of business failure papers use financial ratios as predictors, while in real life banks combine financial and nonfinancial variables. In order to test the predictive power of nonfinancial variables, the authors compare two insolvency prediction models in this paper. The first model, which used financial ratios, resulted in a classification accuracy of 82.8%, while the combined model with financial and nonfinancial variables resulted in a classification accuracy of 88.1%.

  18. The gait standard deviation, a single measure of kinematic variability.

    Science.gov (United States)

    Sangeux, Morgan; Passmore, Elyse; Graham, H Kerr; Tirosh, Oren

    2016-05-01

Measurement of gait kinematic variability provides relevant clinical information in certain conditions affecting the neuromotor control of movement. In this article, we present a measure of overall gait kinematic variability, GaitSD, based on a combination of waveform standard deviations. The waveform standard deviation is the common numerator in established indices of variability such as Kadaba's coefficient of multiple correlation or Winter's waveform coefficient of variation. Gait data were collected on typically developing children aged 6-17 years. A large number of strides was captured for each child, on average 45 (SD: 11) for kinematics and 19 (SD: 5) for kinetics. We used a bootstrap procedure to determine the precision of GaitSD as a function of the number of strides processed. We compared the within-subject, stride-to-stride, variability with the, between-subject, variability of the normative pattern. Finally, we investigated the correlation between age and gait kinematic, kinetic and spatio-temporal variability. In typically developing children, the relative precision of GaitSD was 10% as soon as 6 strides were captured. As a comparison, spatio-temporal parameters required 30 strides to reach the same relative precision. The ratio of stride-to-stride to normative-pattern variability was smaller in kinematic variables (smallest for pelvic tilt, 28%) than in kinetic and spatio-temporal variables (largest for normalised stride length, 95%). GaitSD had a strong, negative correlation with age. We show that gait consistency may stabilise only at, or after, skeletal maturity. Copyright © 2016 Elsevier B.V. All rights reserved.

  19. ROLE AND IMPORTANCE OF KEY PERFORMANCE INDICATORS MEASUREMENT

    Directory of Open Access Journals (Sweden)

    Rade Stanković

    2011-03-01

Full Text Available Key performance indicators are financial and non-financial indicators that organizations use in order to estimate and fortify how successful they are in pursuing previously established long-lasting goals. Appropriate selection of the indicators that will be used for measuring is of the greatest importance. A process organization of the business needs to be established in order to realize such an effective and efficient system of performance measuring via KPIs. Process organization also implies customer orientation and the necessary flexibility under today's conditions of global competition. An explanation of process organization, the method of KPI selection, and a practical example of KPI measuring in Toyota dealerships are presented in this paper.

  20. R Package multiPIM: A Causal Inference Approach to Variable Importance Analysis

    Directory of Open Access Journals (Sweden)

    Stephan J Ritter

    2014-04-01

    Full Text Available We describe the R package multiPIM, including statistical background, functionality and user options. The package is for variable importance analysis, and is meant primarily for analyzing data from exploratory epidemiological studies, though it could certainly be applied in other areas as well. The approach taken to variable importance comes from the causal inference field, and is different from approaches taken in other R packages. By default, multiPIM uses a double robust targeted maximum likelihood estimator (TMLE of a parameter akin to the attributable risk. Several regression methods/machine learning algorithms are available for estimating the nuisance parameters of the models, including super learner, a meta-learner which combines several different algorithms into one. We describe a simulation in which the double robust TMLE is compared to the graphical computation estimator. We also provide example analyses using two data sets which are included with the package.

  1. Radioactivity measurement in imported food and food related items

    International Nuclear Information System (INIS)

    Sombrito, E.Z.; Santos, F.L.; Rosa, A.M. de la; Tangonan, M.C.; Bulos, A.D.; Nuguid, Z.F.

    1989-01-01

The Philippine Nuclear Research Institute (PNRI), formerly the Philippine Atomic Energy Commission (PAEC), undertook the radioactivity monitoring of imported food and food-related products after the Chernobyl Plant accident in April 1986. Food samples were analyzed for 137Cs and 134Cs by the gamma spectral method of analysis. This report deals with the measurement process and gives the results of the activity covering the period June 1986 to December 1987. (Auth.). 9 tabs., 7 figs., 4 refs

  2. Innovative approaches for addressing old challenges in component importance measures

    International Nuclear Information System (INIS)

    Rocco S, Claudio M.; Ramirez-Marquez, Jose Emmanuel

    2012-01-01

Importance measures (IM) are component-related indices that allow assessing how a component in a system affects one or more system-level performance functions. While several IM have been presented in the literature, challenges still remain with respect to the following: (1) multiple ranking—multiple perspective, (2) multi-component importance and (3) multi-function importance. To address these challenges, this paper proposes a set of innovative solutions based on several available techniques: Hasse diagrams, the Copeland score and multi-objective optimization. As such, the purpose of this research is twofold: first, to propose solutions and, second, to foster new research to address these challenges. Each of the proposed solutions is exemplified with a working example.

  3. Weighing evidence: quantitative measures of the importance of bitemark evidence.

    Science.gov (United States)

    Kittelson, J M; Kieser, J A; Buckingham, D M; Herbison, G P

    2002-12-01

    Quantitative measures of the importance of evidence such as the "likelihood ratio" have become increasingly popular in the courtroom. These measures have been used by expert witnesses formally to describe their certainty about a piece of evidence. These measures are commonly interpreted as the amount by which the evidence should revise the opinion of guilt, and thereby summarize the importance of a particular piece of evidence. Unlike DNA evidence, quantitative measures have not been widely used by forensic dentists to describe their certainty when testifying about bitemark evidence. There is, however, no inherent reason why they should not be used to evaluate bitemarks. The purpose of this paper is to describe the likelihood ratio as it might be applied to bitemark evidence. We use a simple bitemark example to define the likelihood ratio, its application, and interpretation. In particular we describe how the jury interprets the likelihood ratio from a Bayesian perspective when evaluating the impact of the evidence on the odds that the accused is guilty. We describe how the dentist would calculate the likelihood ratio based on frequentist interpretations. We also illustrate some of the limitations of the likelihood ratio, and show how those limitations apply to bitemark evidence. We conclude that the quality of bitemark evidence cannot be adequately summarized by the likelihood ratio, and argue that its application in this setting may be more misleading than helpful.
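The Bayesian interpretation mentioned above is mechanically simple: posterior odds = prior odds × likelihood ratio. A sketch with hypothetical numbers (the prior probability and the LR of 50 are invented for illustration and are not taken from any bitemark case):

```python
def update_odds(prior_odds, likelihood_ratio):
    # Bayes' rule in odds form: posterior odds = prior odds * LR.
    return prior_odds * likelihood_ratio

def prob_to_odds(p):
    return p / (1 - p)

def odds_to_prob(odds):
    return odds / (1 + odds)

# Hypothetical numbers: a 1% prior probability that the accused made
# the mark, and an expert-stated likelihood ratio of 50.
prior_odds = prob_to_odds(0.01)
posterior_odds = update_odds(prior_odds, 50)
posterior_prob = odds_to_prob(posterior_odds)  # about 0.34
```

The paper's caution applies here: the single number 50 says nothing about how reliably the comparison itself was made, which is why the authors argue the LR alone cannot summarize the quality of bitemark evidence.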

  4. Measurement and monitoring technologies are important SITE program component

    International Nuclear Information System (INIS)

    Anon.

    1988-01-01

    An ongoing component of the Superfund Innovative Technologies Evaluation (SITE) Program, managed by the US EPA at its Hazardous Waste Engineering Research Laboratory in Cincinnati, is the development and demonstration of new and innovative measurement and monitoring technologies applicable to Superfund site characterization. There are four important roles for monitoring and measurement technologies at Superfund sites: (1) to assess the extent of contamination at a site, (2) to supply data and information to determine impacts on human health and the environment, (3) to supply data to select the appropriate remedial action, and (4) to monitor the success or effectiveness of the selected remedy. The Environmental Monitoring Systems Laboratory in Las Vegas, Nevada (EMSL-LV) has been supporting the development of improved measurement and monitoring techniques in conjunction with the SITE Program, with a focus on two areas: immunoassay for toxic substances and fiber-optic sensing for in-situ analysis at Superfund sites.

  5. Determination of continuous variable entanglement by purity measurements.

    Science.gov (United States)

    Adesso, Gerardo; Serafini, Alessio; Illuminati, Fabrizio

    2004-02-27

    We classify the entanglement of two-mode Gaussian states according to their degree of total and partial mixedness. We derive exact bounds that determine maximally and minimally entangled states for fixed global and marginal purities. This characterization allows for an experimentally reliable estimate of continuous variable entanglement based on measurements of purity.

  6. What variables are important in predicting bovine viral diarrhea virus? A random forest approach.

    Science.gov (United States)

    Machado, Gustavo; Mendoza, Mariana Recamonde; Corbellini, Luis Gustavo

    2015-07-24

    Bovine viral diarrhea virus (BVDV) causes one of the most economically important diseases in cattle, and the virus is found worldwide. A better understanding of disease-associated factors is a crucial step towards the definition of strategies for control and eradication. In this study we trained a random forest (RF) prediction model and performed variable importance analysis to identify factors associated with BVDV occurrence. In addition, we assessed the influence of feature selection on RF performance and evaluated its predictive power relative to other popular classifiers and to logistic regression. We found that the RF classification model resulted in an average error rate of 32.03% for the negative class (negative for BVDV) and 36.78% for the positive class (positive for BVDV). The RF model presented an area under the ROC curve equal to 0.702. Variable importance analysis revealed that important predictors of BVDV occurrence were: (a) who inseminates the animals, (b) the number of neighboring farms that have cattle, and (c) whether rectal palpation is performed routinely. Our results suggest that the use of machine learning algorithms, especially RF, is a promising methodology for the analysis of cross-sectional studies, presenting satisfactory predictive power and the ability to identify predictors that represent potential risk factors for BVDV investigation. We examined classical predictors and found some new and hard-to-control practices that may lead to the spread of this disease within and among farms, mainly regarding poor or neglected reproduction management, which should be considered for disease control and eradication.
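As a sketch of the kind of analysis described above (random forest plus variable importance), the snippet below trains scikit-learn's RandomForestClassifier on synthetic data. The survey data are not public, so the feature names merely echo the predictors named in the abstract and the data are invented, with the first feature constructed to be the most informative:

```python
# Hedged sketch of RF variable-importance analysis on synthetic data
# (feature names echo the abstract's predictors; all data are invented).
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
X = rng.random((500, 3))
# outcome driven mostly by the first feature, plus a little noise
y = (X[:, 0] + 0.2 * rng.standard_normal(500) > 0.5).astype(int)

features = ["who_inseminates", "n_neighbor_farms", "rectal_palpation"]
rf = RandomForestClassifier(n_estimators=200, oob_score=True, random_state=0)
rf.fit(X, y)

# impurity-based importances, highest first
for name, imp in sorted(zip(features, rf.feature_importances_),
                        key=lambda t: -t[1]):
    print(f"{name}: {imp:.3f}")
print("OOB accuracy:", round(rf.oob_score_, 3))
```

On this construction the first feature dominates the importance ranking, mirroring how the study's top predictors stood out. (As the head of this page notes, impurity-based importances can be biased when predictors differ in scale or number of categories, so permutation importance is often preferred in practice.)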

  7. The importance of immune gene variability (MHC) in evolutionary ecology and conservation

    Directory of Open Access Journals (Sweden)

    Sommer Simone

    2005-10-01

    Genetic studies have typically inferred the effects of human impact by documenting patterns of genetic differentiation and levels of genetic diversity among potentially isolated populations using selectively neutral markers such as mitochondrial control region sequences, microsatellites or single nucleotide polymorphisms (SNPs). However, evolutionarily relevant and adaptive processes within and between populations can only be reflected by coding genes. In vertebrates, growing evidence suggests that genetic diversity is particularly important at the level of the major histocompatibility complex (MHC). MHC variants influence many important biological traits, including immune recognition, susceptibility to infectious and autoimmune diseases, individual odours, mating preferences, kin recognition, cooperation and pregnancy outcome. These diverse functions and characteristics place genes of the MHC among the best candidates for studies of mechanisms and significance of molecular adaptation in vertebrates. MHC variability is believed to be maintained by pathogen-driven selection, mediated either through heterozygote advantage or frequency-dependent selection. Up to now, most of our knowledge has derived from studies in humans or from model organisms under experimental, laboratory conditions. Empirical support for selective mechanisms in free-ranging animal populations in their natural environment is rare. In this review, I first introduce general information about the structure and function of MHC genes, as well as current hypotheses and concepts concerning the role of selection in the maintenance of MHC polymorphism. The evolutionary forces acting on the genetic diversity in coding and non-coding markers are compared. Then, I summarise empirical support for the functional importance of MHC variability in parasite resistance with emphasis on the evidence derived from free-ranging animal populations investigated in their natural habitat. Finally, I

  8. Measurement of Alpha Emitters Concentration in Imported Cigarettes

    International Nuclear Information System (INIS)

    Nasser Allah, Z.K.; Musa, W.A.; AL-Rawi, A.A.S.

    2011-01-01

    The aim of this study was to measure the alpha-emitter concentrations in 15 different kinds of imported cigarettes. The nuclear reaction used was U-235(n,f), obtained by bombarding U-235 with thermal neutrons from an (Am-Be) neutron source with a thermal flux of 5×10³ n·cm⁻²·s⁻¹. The results showed uranium concentrations varying from 0.041 ppm in the Five Stars brand to 2.374 ppm in the Machbeth (chocolate) 100's brand. All results obtained are within the limit levels given by UNSCEAR data.

  9. Measuring variability of procedure progression in proceduralized scenarios

    International Nuclear Information System (INIS)

    Kim, Yochan; Park, Jinkyun; Jung, Wondea

    2012-01-01

    Highlights: ► The VPP measure was developed to quantify how differently operators follow the procedures. ► Sources that cause variability in the ways a given procedure is followed were identified. ► The VPP values for the scenarios are positively related to the scenario performance time. ► The VPP measure is meaningful for explaining characteristics of several PSFs. -- Abstract: Various performance shaping factors (PSFs) have been presented to explain the contributors to unsafe acts in a human failure event or to predict the human error probability of new human performance. However, because most of these HRA parameters depend on the subjective knowledge and experience of HRA analysts, the results of an HRA do not provide unbiased standards for explaining variations in human performance or for comparing collected data with data from other analysts. To secure the validity of HRA results, we propose a quantitative measure that represents the variability of procedure progression (VPP) in proceduralized scenarios. A VPP measure shows how differently operators follow the steps of the procedures. This paper introduces the sources of the VPP measure and its relevance to PSFs. The assessment method for the VPP measure is also proposed, and application examples are shown with a comparison of performance times. Although more empirical studies should be conducted to reveal the relationship between the VPP measure and other PSFs, it is believed that the VPP measure provides evidence to quantitatively evaluate human performance variations and to cross-culturally compare collected data.

  10. Measures of Quantum Synchronization in Continuous Variable Systems

    Science.gov (United States)

    Mari, A.; Farace, A.; Didier, N.; Giovannetti, V.; Fazio, R.

    2013-09-01

    We introduce and characterize two different measures which quantify the level of synchronization of coupled continuous variable quantum systems. The two measures allow us to extend to the quantum domain the notions of complete and phase synchronization. The Heisenberg principle sets a universal bound to complete synchronization. The measure of phase synchronization is, in principle, unbounded; however, in the absence of quantum resources (e.g., squeezing) the synchronization level is bounded below a certain threshold. We elucidate some interesting connections between entanglement and synchronization and, finally, discuss an application based on quantum optomechanical systems.

  11. Tethered balloon-based measurements of meteorological variables and aerosols

    Science.gov (United States)

    Sentell, R. J.; Storey, R. W.; Chang, J. J. C.; Jacobsen, S. J.

    1976-01-01

    Tethered-balloon-based measurements of the vertical distributions of temperature, humidity, wind speed, and aerosol concentrations were taken over a 4-hour period beginning at sunrise on June 29, 1976, at Wallops Island, Virginia. Twelve consecutive profiles of each variable were obtained from the ground to about 500 meters. These measurements were taken in conjunction with a noise propagation study on the remotely arrayed acoustic range (ROMAAR) at Wallops Flight Center. An organized listing of these vertical soundings is presented. The tethered balloon system configuration utilized for these measurements is described.

  12. Environmental economic variables - what has been measured until now?

    International Nuclear Information System (INIS)

    Ahlroth, S.; Palm, V.

    2001-01-01

    Environmental accounting encompasses a variety of economic variables. They range from production values of different branches of industry, through fiscal instruments such as environmental taxes, to valuation studies of external effects of the economy. This paper tries to map out the different aspects of these variables and to point out their linkages and uses, viewed from an environmental accounting perspective. The estimated size of the different types of variables is also discussed, based mainly on Swedish studies and on a national scale. Included variables are GDP, exports and imports, environmental taxes, subsidies, environmental costs, remediation costs, environmental damage costs and examples of prevention costs. We divide the economic variables into four different types: (1) those recorded as the actors' payments on the market; (2) those that are part of the government budget; (3) those that serve as a valuation of the costs incurred on society; and (4) those that could be invested to prevent environmental damage. The sizes of the different costs are taken from a variety of studies, mainly Swedish, and put in relation to GDP or similar measures. A brief discussion of the Swedish situation compared with international figures is also given.

  13. Beta activity measurements in high, variable gamma backgrounds

    International Nuclear Information System (INIS)

    Stanga, D.; Sandu, E.; Craciun, L.

    1997-01-01

    In many cases beta activity measurements must be performed in high and variable gamma backgrounds. In such instances it is necessary to use well-shielded detectors, but this technique is limited to laboratory equipment and is frequently insufficient. To perform beta activity measurements in high and variable backgrounds in a simple manner, a software-aided counting technique has been developed and a counting system constructed. This technique combines different counting techniques with the traditional method of successively measuring the sample and the background. The counting system is based on a programmable multi-scaler endowed with appropriate software that allows all operations to be performed via keyboard in an interactive fashion. Two large-area proportional detectors were selected so as to have the same background and the same gamma response within 5%. A program has been developed for counting-data analysis and beta activity computation. The software-aided counting technique has been implemented for beta activity measurement in high and variable backgrounds. (authors)
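The sample/background alternation at the heart of such a technique reduces to a net-rate calculation with Poisson error propagation. A minimal sketch (this is generic counting statistics, not the authors' software; the counts below are invented):

```python
# Net beta count rate from alternating sample and background measurements,
# with 1-sigma Poisson uncertainty. Example numbers are invented.
from math import sqrt

def net_rate(gross_counts, t_gross, bkg_counts, t_bkg):
    """Return (net count rate, 1-sigma uncertainty) in counts per second."""
    rate = gross_counts / t_gross - bkg_counts / t_bkg
    # variance of each rate is N / t**2 for Poisson counts
    sigma = sqrt(gross_counts / t_gross**2 + bkg_counts / t_bkg**2)
    return rate, sigma

# example: 1200 counts in 60 s with the sample, 900 counts in 60 s background
rate, sigma = net_rate(1200, 60.0, 900, 60.0)
print(f"{rate:.2f} +/- {sigma:.2f} cps")  # prints 5.00 +/- 0.76 cps
```

When the gamma background varies in time, as in the abstract, the background term must come from a measurement adjacent in time to the sample count, which is exactly what the alternating scheme provides.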

  14. Posterior variability of inclusion shape based on tomographic measurement data

    International Nuclear Information System (INIS)

    Watzenig, Daniel; Fox, Colin

    2008-01-01

    We treat the problem of recovering the unknown shape of a single inclusion with unknown constant permittivity in an otherwise uniform background material, from uncertain measurements of trans-capacitance at electrodes outside the material. The ubiquitous presence of measurement noise implies that the practical measurement process is probabilistic, and the inverse problem is naturally stated as statistical inference. Formulating the inverse problem in a Bayesian inferential framework requires accurately modelling the forward map, measurement noise, and specifying a prior distribution for the cross-sectional material distribution. Numerical implementation of the forward map is via the boundary element method (BEM) taking advantage of a piecewise constant representation. Summary statistics are calculated using MCMC sampling to characterize posterior variability for synthetic and measured data sets.
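The MCMC sampling step can be illustrated with a minimal random-walk Metropolis sampler on a toy one-dimensional posterior. The paper's actual posterior involves a BEM forward map over inclusion shapes and is far more expensive; everything below is an invented stand-in:

```python
# Minimal random-walk Metropolis sampler characterizing posterior variability
# for a toy 1-D target (not the BEM-based posterior of the paper).
import math
import random

def metropolis(log_post, x0=0.0, step=0.5, n=20000, seed=0):
    """Random-walk Metropolis for a 1-D log-posterior; returns the chain."""
    rng = random.Random(seed)
    x, lp = x0, log_post(x0)
    samples = []
    for _ in range(n):
        y = x + rng.gauss(0.0, step)          # symmetric proposal
        lq = log_post(y)
        if math.log(rng.random()) < lq - lp:  # accept with prob min(1, q/p)
            x, lp = y, lq
        samples.append(x)
    return samples

# toy target: Gaussian posterior with mean 1.0 and standard deviation 0.3
samples = metropolis(lambda t: -0.5 * ((t - 1.0) / 0.3) ** 2)
post = samples[5000:]                          # discard burn-in
mean = sum(post) / len(post)
sd = (sum((s - mean) ** 2 for s in post) / len(post)) ** 0.5
print(round(mean, 2), round(sd, 2))
```

The post-burn-in mean and spread are exactly the kind of summary statistics the abstract refers to; for shape recovery the state would be a parameterized boundary rather than a scalar.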

  15. Estimation of road profile variability from measured vehicle responses

    Science.gov (United States)

    Fauriat, W.; Mattrand, C.; Gayton, N.; Beakou, A.; Cembrzynski, T.

    2016-05-01

    When assessing the statistical variability of fatigue loads acting throughout the life of a vehicle, the question of the variability of road roughness naturally arises, as both quantities are strongly related. For car manufacturers, gathering information on the environment in which vehicles evolve is a long and costly but necessary process to adapt their products to durability requirements. In the present paper, a data processing algorithm is proposed in order to estimate the road profiles covered by a given vehicle, from the dynamic responses measured on this vehicle. The algorithm based on Kalman filtering theory aims at solving a so-called inverse problem, in a stochastic framework. It is validated using experimental data obtained from simulations and real measurements. The proposed method is subsequently applied to extract valuable statistical information on road roughness from an existing load characterisation campaign carried out by Renault within one of its markets.
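The Kalman-filtering idea can be illustrated in one dimension. This is a drastically simplified sketch with a scalar random-walk state and invented noise levels, not the paper's full vehicle-model inversion:

```python
# 1-D sketch of Kalman filtering: estimate a hidden random-walk signal
# (a stand-in for "road height") from noisy measurements. Noise levels
# are invented; the paper inverts a full vehicle dynamics model.
import random

def kalman_1d(measurements, q=0.01, r=0.25):
    """Scalar Kalman filter for a random-walk state; returns the estimates."""
    x, p = 0.0, 1.0                  # state estimate and its variance
    estimates = []
    for z in measurements:
        p += q                       # predict: variance grows by process noise
        k = p / (p + r)              # Kalman gain
        x += k * (z - x)             # update with measurement z
        p *= (1.0 - k)
        estimates.append(x)
    return estimates

random.seed(1)
truth = 0.0
truths, zs = [], []
for _ in range(200):
    truth += random.gauss(0, 0.1)    # hidden random walk
    truths.append(truth)
    zs.append(truth + random.gauss(0, 0.5))  # noisy measurement

est = kalman_1d(zs)
raw_err = sum(abs(z - t) for z, t in zip(zs, truths)) / len(truths)
kf_err = sum(abs(x - t) for x, t in zip(est, truths)) / len(truths)
print(kf_err < raw_err)
```

With process and measurement noise matched to the simulated data, the filtered estimate tracks the hidden signal with a substantially smaller error than the raw measurements, which is the essence of the stochastic inverse-problem framing in the abstract.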

  16. Variable reflectivity signal mirrors and signal response measurements

    International Nuclear Information System (INIS)

    Vine, Glenn de; Shaddock, Daniel A; McClelland, David E

    2002-01-01

    Future gravitational wave detectors will include some form of signal mirror in order to alter the signal response of the device. We introduce interferometer configurations which utilize a variable reflectivity signal mirror allowing a tunable peak frequency and variable signal bandwidth. A detector configured with a Fabry-Perot cavity as the signal mirror is compared theoretically with one using a Michelson interferometer for a signal mirror. A system for the measurement of the interferometer signal responses is introduced. This technique is applied to a power-recycled Michelson interferometer with resonant sideband extraction. We present broadband measurements of the benchtop prototype's signal response for a range of signal cavity detunings. This technique is also applicable to most other gravitational wave detector configurations.

  17. Variable reflectivity signal mirrors and signal response measurements

    CERN Document Server

    Vine, G D; McClelland, D E

    2002-01-01

    Future gravitational wave detectors will include some form of signal mirror in order to alter the signal response of the device. We introduce interferometer configurations which utilize a variable reflectivity signal mirror allowing a tunable peak frequency and variable signal bandwidth. A detector configured with a Fabry-Perot cavity as the signal mirror is compared theoretically with one using a Michelson interferometer for a signal mirror. A system for the measurement of the interferometer signal responses is introduced. This technique is applied to a power-recycled Michelson interferometer with resonant sideband extraction. We present broadband measurements of the benchtop prototype's signal response for a range of signal cavity detunings. This technique is also applicable to most other gravitational wave detector configurations.

  18. Complex analyses of inverted repeats in mitochondrial genomes revealed their importance and variability.

    Science.gov (United States)

    Cechová, Jana; Lýsek, Jirí; Bartas, Martin; Brázda, Václav

    2018-04-01

    The NCBI database contains mitochondrial DNA (mtDNA) genomes from numerous species. We investigated the presence and locations of inverted repeat sequences (IRs) in these mtDNA sequences, which are known to be important for regulating nuclear genomes. IRs were identified in mtDNA in all species. IR lengths and frequencies correlate with evolutionary age and the greatest variability was detected in subgroups of plants and fungi and the lowest variability in mammals. IR presence is non-random and evolutionary favoured. The frequency of IRs generally decreased with IR length, but not for IRs 24 or 30 bp long, which are 1.5 times more abundant. IRs are enriched in sequences from the replication origin, followed by D-loop, stem-loop and miscellaneous sequences, pointing to the importance of IRs in regulatory regions of mitochondrial DNA. Data were produced using Palindrome analyser, freely available on the web at http://bioinformatics.ibp.cz. vaclav@ibp.cz. Supplementary data are available at Bioinformatics online.
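In the spirit of the IR scans described above (though this is not the Palindrome analyser tool itself), a toy inverted-repeat scanner can be written in a few lines; sequence, arm length and gap window below are invented:

```python
# Toy inverted-repeat (IR) scan: report positions where a k-mer arm is
# followed, within a bounded gap, by its reverse complement.
COMP = str.maketrans("ACGT", "TGCA")

def revcomp(s: str) -> str:
    """Reverse complement of an ACGT string."""
    return s.translate(COMP)[::-1]

def find_irs(seq: str, k: int = 6, max_gap: int = 10):
    """Return (left-arm start, right-arm start) pairs of detected IRs."""
    hits = []
    for i in range(len(seq) - k + 1):
        arm = seq[i:i + k]
        target = revcomp(arm)
        # search only within max_gap bases downstream of the left arm
        window = seq[i + k:i + k + max_gap + k]
        j = window.find(target)
        if j >= 0:
            hits.append((i, i + k + j))
    return hits

print(find_irs("AAGGGGGGTTTTCCCCCCAA"))  # prints [(2, 12)]
```

A production scan would also handle ambiguous bases, overlapping hits and both strands; the point here is only the arm/reverse-complement pairing that defines an IR.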

  19. Benefits of balancing method for component RAW importance measure

    International Nuclear Information System (INIS)

    Kim, Kil Yoo; Yang, Joon Eon

    2005-01-01

    In Risk-Informed Regulation and Applications (RIR&A), the determination of risk-significant Structures, Systems and Components (SSCs) plays an important role, and importance measures such as Fussell-Vesely (FV) and Risk Achievement Worth (RAW) are widely used in the determination of risk-significant SSCs. For example, in the Maintenance Rule, Graded Quality Assurance (GQA) and Option 2, FV and RAW are used in the categorization of SSCs. In GQA and Option 2 especially, the number of SSCs to be categorized is too large to handle, so the FVs and RAWs of the components are in practice derived in a convenient way from those of the basic events already available as PSA (Probabilistic Safety Assessment) results, instead of by re-evaluating the fault tree/event tree of the PSA model. That is, the group FVs and RAWs for the components are derived from the FVs and RAWs of the basic events which make up the group. Here, the basic events include random failure, Common Cause Failure (CCF), test and maintenance, etc., which make the system unavailable. A method called the 'Balancing Method', which can practically and correctly derive the component RAW from the basic event FVs and RAWs even if CCFs exist as basic events, was introduced in Ref.. However, the 'Balancing Method' has another advantage: it can also fairly accurately derive the component RAW from the fault tree without using the basic event FVs and RAWs.
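For illustration, the basic-event FV and RAW that such methods start from can be computed on a toy fault tree given as minimal cut sets, under the rare-event approximation. The cut sets and probabilities below are invented; this sketches the standard definitions, not the Balancing Method itself:

```python
# FV and RAW for basic events of a toy fault tree (rare-event approximation).
# Cut sets and basic-event probabilities are invented placeholders.
from functools import reduce

CUT_SETS = [("A", "B"), ("A", "C"), ("D",)]        # minimal cut sets
q = {"A": 1e-2, "B": 2e-2, "C": 5e-3, "D": 1e-4}   # basic-event probabilities

def top_event(qmap):
    """Top-event probability: sum of cut-set products (rare-event approx.)."""
    return sum(reduce(lambda acc, e: acc * qmap[e], cs, 1.0) for cs in CUT_SETS)

Q = top_event(q)
for e in sorted(q):
    fv = 1.0 - top_event({**q, e: 0.0}) / Q   # Fussell-Vesely
    raw = top_event({**q, e: 1.0}) / Q        # Risk Achievement Worth
    print(f"{e}: FV={fv:.3f} RAW={raw:.1f}")
```

Deriving a *component*-level RAW from these basic-event values is exactly where care is needed when CCF events span components, which is the problem the Balancing Method addresses.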

  20. Variability of Measured Runoff and Soil Loss from Field Plots

    Directory of Open Access Journals (Sweden)

    F. Asadzadeh

    2016-02-01

    Introduction: Field plots are widely used in studies related to the measurement of soil loss and the modeling of erosion processes. Research efforts are needed to investigate the factors affecting the quality of plot data. Spatial scale, or plot size, is one of these factors and directly affects runoff and soil loss measured by means of field plots. The effect of plot size on measured runoff or soil loss from natural plots is known as the plot scale effect. On the other hand, the variability of runoff and sediment yield from replicated field plots is a main source of uncertainty in measuring erosion from plots, and should be considered when interpreting plot data. Therefore, there is a demand for knowledge of the soil erosion processes occurring in plots of different sizes and of the factors that determine natural variability, as a basis for obtaining soil loss data of good quality. This study was carried out to investigate the combined effects of these two factors by measuring runoff and soil loss from replicated plots of different sizes. Materials and Methods: In order to evaluate the variability of runoff and soil loss data, seven plots differing in width and length were constructed on a uniform slope of 9%, in three replicates, at the Koohin Research Station in Qazvin province. The plots were ploughed up-and-down slope in September 2011. Each plot was isolated using soil beds with a height of 30 cm to direct the generated surface runoff to the lower part of the plot. Runoff collecting systems composed of gutters, pipes and tanks were installed at the end of each plot. During the two-year study period of 2011-2012, the plots were maintained in bare conditions, and runoff and soil loss were measured for each single event. Precipitation amounts and characteristics were directly measured by an automatic recording tipping-bucket rain gauge located about 200 m from the experimental plots. The entire runoff volume, including eroded sediment, was measured on

  1. Importance of measuring lactate levels in children with sepsis.

    Science.gov (United States)

    Anil, Nisha

    2017-10-10

    Sepsis is a major public health problem as well as one of the leading causes of preventable death in children, because of failure to recognise the early signs and symptoms and to resuscitate rapidly. Blood lactate levels are used to assess the severity of sepsis and the effectiveness of resuscitation. Lactate levels are easily obtainable and should be checked in all patients admitted with suspected sepsis within six hours of presentation. The test should be repeated four and eight hours post-diagnosis of sepsis. For the diagnosis of sepsis, patients' clinical symptoms, along with the combined analysis of partial pressure of oxygen, carbon dioxide and lactate levels, should be used. A multitude of factors can cause elevated lactate levels, so clinicians should interpret elevated levels cautiously by considering all other aetiologies. This article, which focuses on practice in Australia but makes reference to the UK, discusses the importance of measuring lactate levels in sepsis, the pathophysiology of lactate production, causes of elevated lactate levels, lactate measurement, nursing management of patients with elevated lactate levels, limitations of using lactate as a biomarker for diagnosing sepsis and implications for practice. ©2012 RCN Publishing Company Ltd. All rights reserved.

  2. Performance Measure as Feedback Variable in Image Processing

    Directory of Open Access Journals (Sweden)

    Ristić Danijela

    2006-01-01

    This paper extends the view of the image processing performance measure by presenting the use of this measure as an actual value in a feedback structure. The idea behind this is that the control loop built in this way drives the actual feedback value to a given set point. Since the performance measure depends explicitly on the application, the inclusion of feedback structures and the choice of appropriate feedback variables are presented on the example of optical character recognition in an industrial application. Metrics for the quantification of performance at different image processing levels are discussed. The issues that those metrics should address from both the image processing and the control points of view are considered. The performance measures of the individual processing algorithms that form a character recognition system are determined with respect to the overall system performance.
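A minimal sketch of such a loop, with an invented performance curve and a simple proportional controller (this is not the authors' OCR system; the curve, set point and gain are all assumptions):

```python
# Toy closed loop: a proportional controller adjusts an image-processing
# parameter (a binarization threshold) until the measured performance
# reaches a set point. The performance curve is invented.
def performance(threshold: float) -> float:
    """Hypothetical, unimodal recognition-rate curve peaking at threshold 128."""
    return max(0.0, 1.0 - ((threshold - 128.0) / 128.0) ** 2)

def control_loop(setpoint=0.95, threshold=60.0, gain=40.0, steps=100):
    """Drive the measured performance to the set point; return final state."""
    for _ in range(steps):
        error = setpoint - performance(threshold)   # feedback variable
        if abs(error) < 1e-4:
            break
        threshold += gain * error                   # proportional action
    return threshold, performance(threshold)

threshold, perf = control_loop()
print(f"threshold={threshold:.1f}, performance={perf:.3f}")
```

The controller converges to a threshold where the measured performance sits at the set point, illustrating the paper's point that the performance measure itself can serve as the feedback variable.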

  3. Environmental variables measured at multiple spatial scales exert uneven influence on fish assemblages of floodplain lakes

    Science.gov (United States)

    Dembkowski, Daniel J.; Miranda, Leandro E.

    2014-01-01

    We examined the interaction between environmental variables measured at three different scales (i.e., landscape, lake, and in-lake) and fish assemblage descriptors across a range of over 50 floodplain lakes in the Mississippi Alluvial Valley of Mississippi and Arkansas. Our goal was to identify important local- and landscape-level determinants of fish assemblage structure. Relationships between fish assemblage structure and variables measured at broader scales (i.e., landscape-level and lake-level) were hypothesized to be stronger than relationships with variables measured at finer scales (i.e., in-lake variables). Results suggest that fish assemblage structure in floodplain lakes was influenced by variables operating on three different scales. However, and contrary to expectations, canonical correlations between in-lake environmental characteristics and fish assemblage structure were generally stronger than correlations between landscape-level and lake-level variables and fish assemblage structure, suggesting a hierarchy of influence. From a resource management perspective, our study suggests that landscape-level and lake-level variables may be manipulated for conservation or restoration purposes, and in-lake variables and fish assemblage structure may be used to monitor the success of such efforts.

  4. Monitoring of airborne biological particles in outdoor atmosphere. Part 1: Importance, variability and ratios.

    Science.gov (United States)

    Núñez, Andrés; Amo de Paz, Guillermo; Rastrojo, Alberto; García, Ana M; Alcamí, Antonio; Gutiérrez-Bustillo, A Montserrat; Moreno, Diego A

    2016-03-01

    The first part of this review ("Monitoring of airborne biological particles in outdoor atmosphere. Part 1: Importance, variability and ratios") describes the current knowledge on the major biological particles present in the air regarding their global distribution, concentrations, ratios and influence of meteorological factors in an attempt to provide a framework for monitoring their biodiversity and variability in such a singular environment as the atmosphere. Viruses, bacteria, fungi, pollen and fragments thereof are the most abundant microscopic biological particles in the air outdoors. Some of them can cause allergy and severe diseases in humans, other animals and plants, with the subsequent economic impact. Despite the harsh conditions, they can be found from land and sea surfaces to beyond the troposphere and have been proposed to play a role also in weather conditions and climate change by acting as nucleation particles and inducing water vapour condensation. With regard to their global distribution, marine environments act mostly as a source for bacteria while continents additionally provide fungal and pollen elements. Within terrestrial environments, their abundances and diversity seem to be influenced by the land-use type (rural, urban, coastal) and their particularities. Temporal variability has been observed for all these organisms, mostly triggered by global changes in temperature, relative humidity, et cetera. Local fluctuations in meteorological factors may also result in pronounced changes in the airbiota. Although biological particles can be transported several hundred meters from the original source, and even intercontinentally, the time and final distance travelled are strongly influenced by factors such as wind speed and direction. [Int Microbiol 2016; 19(1):1-13]. Copyright© by the Spanish Society for Microbiology and Institute for Catalan Studies.

  5. Assessment of oil content and fatty acid composition variability in two economically important Hibiscus species.

    Science.gov (United States)

    Wang, Ming Li; Morris, Brad; Tonnis, Brandon; Davis, Jerry; Pederson, Gary A

    2012-07-04

    The Hibiscus genus encompasses more than 300 species, but kenaf (Hibiscus cannabinus L.) and roselle (Hibiscus sabdariffa L.) are the two most economically important species within the genus. Seeds from these two Hibiscus species contain a relatively high amount of oil with two unusual fatty acids: dihydrosterculic and vernolic acids. The fatty acid composition in the oil can directly affect oil quality and its utilization. However, the variability in oil content and fatty acid composition for these two species is unclear. For these two species, 329 available accessions were acquired from the USDA germplasm collection. Their oil content and fatty acid composition were determined by nuclear magnetic resonance (NMR) and gas chromatography (GC), respectively. Using NMR and GC analyses, we found that Hibiscus seeds on average contained 18% oil and seed oil was composed of six major fatty acids (each >1%) and seven minor fatty acids (each <1%). Hibiscus cannabinus seeds contained significantly higher amounts of oil (18.14%), palmitic (20.75%), oleic (28.91%), and vernolic acids (VA, 4.16%), and significantly lower amounts of stearic (3.96%), linoleic (39.49%), and dihydrosterculic acids (DHSA, 1.08%) than H. sabdariffa seeds (17.35%, 18.52%, 25.16%, 3.52%, 4.31%, 44.72%, and 1.57%, respectively). For edible oils, a higher oleic/linoleic (O/L) ratio and lower level of DHSA are preferred, and for industrial oils a high level of VA is preferred. Our results indicate that seeds from H. cannabinus may be of higher quality than H. sabdariffa seeds for these reasons. Significant variability in oil content and major fatty acids was also detected within both species. The variability in oil content and fatty acid composition revealed from this study will be useful for exploring seed utilization and developing new cultivars in these Hibiscus species.

  6. Important variables for parents' postnatal sense of security: evaluating a new Swedish instrument (the PPSS instrument).

    Science.gov (United States)

    Persson, Eva K; Dykes, Anna-Karin

    2009-08-01

    To evaluate dimensions of both parents' postnatal sense of security the first week after childbirth, and to determine associations between the PPSS instrument and different sociodemographic and situational background variables. Evaluative, cross-sectional design. 113 mothers and 99 fathers with children live born at term, from five hospitals in southern Sweden. Mothers and fathers had similar feelings concerning postnatal sense of security. Of the dimensions in the PPSS instrument, a sense of midwives'/nurses' empowering behaviour, a sense of one's own general well-being and a sense of the mother's well-being as experienced by the father were the most important dimensions for parents' experienced security. A sense of affinity within the family (for both parents) and a sense of manageable breast feeding (for mothers) were not significantly associated with their experienced security. A sense of participation during pregnancy and general anxiety were significantly associated background variables for postnatal sense of security for both parents. For the mothers, parity and a sense that the father was participating during pregnancy were also significantly associated. More focus on parents' participation during pregnancy, as well as on midwives'/nurses' empowering behaviour during the postnatal period, will be beneficial for both parents' postnatal sense of security.

  7. Evolution of dispersal in spatially and temporally variable environments: The importance of life cycles.

    Science.gov (United States)

    Massol, François; Débarre, Florence

    2015-07-01

    Spatiotemporal variability of the environment is bound to affect the evolution of dispersal, and yet model predictions strongly differ on this particular effect. Recent studies on the evolution of local adaptation have shown that the life cycle chosen to model the selective effects of spatiotemporal variability of the environment is a critical factor determining evolutionary outcomes. Here, we investigate the effect of the order of events in the life cycle on the evolution of unconditional dispersal in a spatially heterogeneous, temporally varying landscape. Our results show that the occurrence of intermediate singular strategies and disruptive selection are conditioned by the temporal autocorrelation of the environment and by the life cycle. Life cycles with dispersal of adults versus dispersal of juveniles, and with local versus global density regulation, give radically different evolutionary outcomes that include selection for total philopatry, evolutionary bistability, selection for intermediate stable states, and evolutionary branching points. Our results highlight the importance of accounting for life-cycle specifics when predicting the effects of the environment on evolutionarily selected trait values, such as dispersal, as well as the need to check the robustness of model conclusions against modifications of the life cycle. © 2015 The Author(s). Evolution © 2015 The Society for the Study of Evolution.

  8. Parametric Study on Important Variables of Aircraft Impact to Prestressed Concrete Containment Vessels

    Energy Technology Data Exchange (ETDEWEB)

    Shin, Sangshup; Hahm, Daegi; Choi, Inkil [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2013-05-15

    In this paper, finding the damage parameters requires many analysis cases and a reduction in computation time; thus, a revised version of Riera's method is used. Using this method, the response of Prestressed Concrete Containment Vessels (PCCVs) subjected to impact loading has been obtained, and the effects of the important parameters, velocity and mass, have been analyzed. To find the response of the PCCVs subjected to aircraft impact loads, variable forcing functions depending on velocity and fuel content are constructed. The velocity variation affects the response more than the fuel percentage, and we expect severe damage of PCCVs with the given material properties when subjected to aircraft impact loads above 200 m/s and 70% fuel.

  9. Resiliency as a component importance measure in network reliability

    International Nuclear Information System (INIS)

    Whitson, John C.; Ramirez-Marquez, Jose Emmanuel

    2009-01-01

    This paper seeks to define the concept of resiliency as a component importance measure related to network reliability. Resiliency can be defined as a composite of: (1) the ability of a network to provide service despite external failures and (2) the time to restore service in the presence of such failures. Although resiliency has been extensively studied in different research areas, this paper studies the specific aspects of quantifiable network resiliency when the network is experiencing potentially catastrophic failures from external events and/or influences, and when it is not known a priori which specific components within the network will fail. A formal definition of Category I resiliency is proposed and a step-by-step approach based on Monte-Carlo simulation to calculate it is defined. To illustrate the approach, two-terminal networks with varying degrees of redundancy have been considered. The results obtained for the test networks show that this new quantifiable concept of resiliency provides insight into the performance and topology of the network. Future uses for this work could include methods for safeguarding critical network components and optimizing the use of redundancy as a technique to improve network resiliency.
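
    The Monte-Carlo estimation step for plain two-terminal reliability can be sketched as follows; the paper's Category I resiliency measure additionally accounts for restoration time, which is omitted here, and the network layout, edge reliabilities, and trial count are illustrative assumptions:

```python
import random
from collections import defaultdict

def two_terminal_reliability(edges, rel, s, t, trials=20000, seed=1):
    """Monte-Carlo estimate of the probability that s and t remain connected
    when each edge e survives independently with probability rel[e]."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        adj = defaultdict(list)
        for e in edges:
            if rng.random() < rel[e]:  # edge survives this trial
                u, v = e
                adj[u].append(v)
                adj[v].append(u)
        # depth-first search from s; count the trial if t is reachable
        stack, seen = [s], {s}
        while stack:
            u = stack.pop()
            if u == t:
                hits += 1
                break
            for v in adj[u]:
                if v not in seen:
                    seen.add(v)
                    stack.append(v)
    return hits / trials

# Bridge network: redundancy between s and t raises two-terminal reliability.
edges = [("s", "a"), ("s", "b"), ("a", "b"), ("a", "t"), ("b", "t")]
rel = {e: 0.9 for e in edges}
r = two_terminal_reliability(edges, rel, "s", "t")
print(r)  # exact value for this bridge at p=0.9 is about 0.978
```

Repeating such runs while forcing individual components to fail is one simple way to rank components by their effect on network service.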

  10. The importance of histopathological and clinical variables in predicting the evolution of colon cancer.

    Science.gov (United States)

    Diculescu, Mircea; Iacob, Răzvan; Iacob, Speranţa; Croitoru, Adina; Becheanu, Gabriel; Popeneciu, Valentin

    2002-09-01

    It has been a consensus that prognostic factors should always be taken into account before planning treatment in colorectal cancer. A 5-year prospective study was conducted in order to assess the importance of several histopathological and clinical prognostic variables in the prediction of evolution in colon cancer. Some of the factors included in the analysis are still subject to dispute among different authors. 46 of 53 screened patients qualified to enter the study and underwent a potentially curative resection of the tumor, followed, when necessary, by adjuvant chemotherapy. Univariate and multivariate analyses were carried out in order to identify independent prognostic indicators. The endpoint of the study was considered the recurrence of the tumor or the detection of metastases. 65.2% of the patients had a good evolution during the follow-up period. Multivariate survival analysis performed with the Cox proportional hazards model identified 3 independent prognostic factors: Dukes stage (p = 0.00002), the grade of differentiation (p = 0.0009) and the weight loss index, representing the weight loss of the patient divided by the number of months during which it was actually lost (p = 0.02). Age under 40 years, sex, microscopic aspect of the tumor, tumor location, and degree of anemia were not identified by our analysis as having prognostic importance. Histopathological factors continue to be the most valuable source of information regarding the probable evolution of patients with colorectal cancer. Individual clinical symptoms or biological parameters such as erythrocyte sedimentation rate or hemoglobin level are of little or no prognostic value. More research is required on the impact of a performance status index (which could also include the weight loss index) as another reliable prognostic variable.
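
    The weight loss index defined above is simple arithmetic; a minimal sketch with hypothetical patient numbers (not taken from the study):

```python
def weight_loss_index(weight_loss_kg, months):
    """Weight loss index as defined in the abstract: total weight lost
    divided by the number of months over which it was actually lost."""
    if months <= 0:
        raise ValueError("months must be positive")
    return weight_loss_kg / months

# Hypothetical patient: 6 kg lost over 3 months
print(weight_loss_index(6.0, 3))  # 2.0 kg/month
```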

  11. Caliper variable sonde for thermal conductivity measurements in situ

    Energy Technology Data Exchange (ETDEWEB)

    Oelsner, C; Leischner, H; Pischel, S

    1968-01-01

    For the measurement of the thermal conductivity of the formations surrounding a borehole, a sonde with variable diameter (consisting of an inflatable rubber cylinder with heating wires embedded in its wall) is described. The conditions for the usual sonde made of metal are no longer fulfilled, but the solution to the problem of determining the thermal conductivity from the temperature rise is given, based on an approach by Carslaw and Jaeger, which contains Bessel functions of the second kind. It is shown that a simpler solution for large values of time can be obtained through the Laplace transformation, and the necessary series developments for computer application are also given. The sonde and the necessary measuring circuitry are described. Test measurements have indicated that the thermal conductivity can be determined with this sonde with a precision of ±10%.
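
    The simplification for large times mentioned above is, in textbook form, the Carslaw-and-Jaeger line-source result: the temperature rise grows linearly in ln t with slope q/(4πλ), so λ can be estimated from that slope. This is a generic illustration of slope-based conductivity estimation with synthetic data, not the sonde's exact Bessel-function solution:

```python
import math

def conductivity_from_slope(q_per_length, slope):
    """Late-time line-source approximation: dT/d(ln t) = q / (4*pi*lambda),
    so lambda = q / (4*pi*slope). q_per_length in W/m, slope in K per unit ln t."""
    return q_per_length / (4 * math.pi * slope)

# Synthetic check: generate late-time temperatures for lambda = 2.0 W/(m K)
lam_true, q = 2.0, 50.0                # W/(m K), W/m
times = [100, 200, 400, 800, 1600]     # s
temps = [q / (4 * math.pi * lam_true) * math.log(t) for t in times]

# Least-squares slope of T against ln t
xs = [math.log(t) for t in times]
n = len(xs)
mx, my = sum(xs) / n, sum(temps) / n
slope = sum((x - mx) * (y - my) for x, y in zip(xs, temps)) / \
        sum((x - mx) ** 2 for x in xs)
print(conductivity_from_slope(q, slope))  # recovers ~2.0
```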

  12. Multi-scale variability of storm Ophelia 2017: The importance of synchronised environmental variables in coastal impact.

    Science.gov (United States)

    Guisado-Pintado, Emilia; Jackson, Derek W T

    2018-07-15

    Low-frequency, high-magnitude storm events can dramatically alter coastlines, helping to relocate large volumes of sediment and changing the configuration of landforms. An increase in the number of intense cyclones occurring in the Northern Hemisphere since the 1970s is evident, with more northward tracking patterns developing. This brings added potential risk to coastal environments and infrastructure in northwest Europe, and therefore understanding how these high-energy storms impact sandy coasts in particular is important for future management. This study highlights the evolution of Storm (formerly Hurricane) Ophelia in October 2017 as it passed up and along the western seaboard of Ireland. Ophelia was the largest hurricane on record to form in the eastern Atlantic; using a range of environmental measurements and wave modelling, we describe its track and intensity over its duration whilst over Ireland. The impact on a stretch of sandy coast in NW Ireland during Storm Ophelia, when the winds were at their peak, is examined using terrestrial laser scanning surveys pre- and post-storm to describe local changes in intertidal and dune-edge dynamics. During maximum wind conditions (>35 knots), waves of no more than 2 m were recorded, with an oblique-to-parallel orientation and coincident with medium to low tide (around 0.8 m). Therefore, we demonstrate that anticipated widespread coastal erosion and damage may not always unfold as predicted. In fact, around 6000 m3 of net erosion occurred along the 420 m stretch of coastline, with maximum differences in beach topographic change of 0.8 m. The majority of the sediment redistribution occurred within the intertidal and lower beach zone, with some limited dune trimming in the southern section (10% of the total erosion). Asynchronous high water (tide levels) and localised offshore winds, as well as coastline orientation relative to the storm winds and waves, played a significant role in reducing the coastal erosional impact. Copyright © 2018 Elsevier B.V. All rights reserved.
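
    Pre- and post-storm laser-scanning surveys like these are typically compared by differencing elevation grids cell by cell. A minimal sketch of that net-volume computation (the toy grid values are invented, not the survey data):

```python
def net_volume_change(dem_pre, dem_post, cell_area):
    """Net volumetric change (m^3) between pre- and post-storm elevation
    grids (m); a negative total indicates net erosion."""
    total = 0.0
    for row_pre, row_post in zip(dem_pre, dem_post):
        for z0, z1 in zip(row_pre, row_post):
            total += (z1 - z0) * cell_area  # per-cell elevation change x area
    return total

# Toy 3x3 grids with 1 m^2 cells: lower beach loses sand, slight dune trimming
pre  = [[2.0, 2.0, 2.0], [1.5, 1.5, 1.5], [1.0, 1.0, 1.0]]
post = [[2.0, 1.9, 2.0], [1.3, 1.4, 1.5], [0.9, 1.0, 1.0]]
v = net_volume_change(pre, post, 1.0)
print(v)  # about -0.5 m^3, i.e. net erosion
```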

  13. Minimizing variability of cascade impaction measurements in inhalers and nebulizers.

    Science.gov (United States)

    Bonam, Matthew; Christopher, David; Cipolla, David; Donovan, Brent; Goodwin, David; Holmes, Susan; Lyapustina, Svetlana; Mitchell, Jolyon; Nichols, Steve; Pettersson, Gunilla; Quale, Chris; Rao, Nagaraja; Singh, Dilraj; Tougas, Terrence; Van Oort, Mike; Walther, Bernd; Wyka, Bruce

    2008-01-01

    The purpose of this article is to catalogue in a systematic way the available information about factors that may influence the outcome and variability of cascade impactor (CI) measurements of pharmaceutical aerosols for inhalation, such as those obtained from metered dose inhalers (MDIs), dry powder inhalers (DPIs) or products for nebulization; and to suggest ways to minimize the influence of such factors. To accomplish this task, the authors constructed a cause-and-effect Ishikawa diagram for a CI measurement and considered the influence of each root cause based on industry experience and thorough literature review. The results illustrate the intricate network of underlying causes of CI variability, with the potential for several multi-way statistical interactions. It was also found that significantly more quantitative information exists about impactor-related causes than about operator-derived influences, the contribution of drug assay methodology and product-related causes, suggesting a need for further research in those areas. The understanding and awareness of all these factors should aid in the development of optimized CI methods and appropriate quality control measures for aerodynamic particle size distribution (APSD) of pharmaceutical aerosols, in line with the current regulatory initiatives involving quality-by-design (QbD).

  14. Modeling intraindividual variability with repeated measures data methods and applications

    CERN Document Server

    Hershberger, Scott L

    2013-01-01

    This book examines how individuals behave across time and to what degree that behavior changes, fluctuates, or remains stable. It features the most current methods on modeling repeated measures data as reported by a distinguished group of experts in the field. The goal is to make the latest techniques used to assess intraindividual variability accessible to a wide range of researchers. Each chapter is written in a "user-friendly" style such that even the "novice" data analyst can easily apply the techniques. Each chapter features: a minimum discussion of mathematical detail; an empirical example...

  15. Importance and variability in processes relevant to environmental tritium ingestion dose models

    International Nuclear Information System (INIS)

    Raskob, W.; Barry, P.

    1997-01-01

    The Aiken List was devised in 1990 to help decide which transport processes should be investigated experimentally so as to derive the greatest improvement in the performance of environmental tritium assessment models. Each process was rated high, medium or low on each of two criteria. These were "Importance", which rated processes by how much each contributed to ingestion doses, and "State of Modelling", which rated the adequacy of the knowledge base on which models were built. The ratings, though unanimous, were nevertheless qualitative and subjective opinions. This paper describes how we have tried to quantify them. To do this, we use, as measures of "Importance", the sensitivities of predicted ingestion doses to changes in the values of parameters in the mathematical descriptions of individual processes. Measures of "State of Modelling" were taken from a recently completed BIOMOVS study of HTO transport model performance and were based either on how much predicted transport by individual processes differed amongst participating modellers or on the variety of different ways that modellers chose to describe individual processes. The tritium transport model UFOTRI was used, and because environmental transport of HTO varies according to the weather at and after release time, sensitivities were measured in a sample of all conditions likely to arise in central Europe. (Author)

  16. Measurement-Device Independency Analysis of Continuous-Variable Quantum Digital Signature

    Directory of Open Access Journals (Sweden)

    Tao Shang

    2018-04-01

    With the practical implementation of continuous-variable quantum cryptographic protocols, security problems resulting from measurement-device loopholes are being given increasing attention. At present, research on measurement-device independency analysis is limited to quantum key distribution protocols, while different protocols face different security problems. Considering the importance of quantum digital signature in quantum cryptography, in this paper we attempt to analyze the measurement-device independency of continuous-variable quantum digital signature, especially continuous-variable quantum homomorphic signature. Firstly, we calculate the upper bound of the error rate of a protocol. If it is negligible on the condition that all measurement devices are untrusted, the protocol is deemed to be measurement-device-independent. Then, we simplify the calculation by using the characteristics of continuous variables and prove the measurement-device independency of the protocol according to the calculation result. In addition, the proposed analysis method can be extended to other quantum cryptographic protocols besides continuous-variable quantum homomorphic signature.

  17. Robust Machine Learning Variable Importance Analyses of Medical Conditions for Health Care Spending.

    Science.gov (United States)

    Rose, Sherri

    2018-03-11

    To propose nonparametric double robust machine learning in variable importance analyses of medical conditions for health spending. 2011-2012 Truven MarketScan database. I evaluate how much more, on average, commercially insured enrollees with each of 26 of the most prevalent medical conditions cost per year after controlling for demographics and other medical conditions. This is accomplished within the nonparametric targeted learning framework, which incorporates ensemble machine learning. Previous literature studying the impact of medical conditions on health care spending has almost exclusively focused on parametric risk adjustment; thus, I compare my approach to parametric regression. My results demonstrate that multiple sclerosis, congestive heart failure, severe cancers, major depression and bipolar disorders, and chronic hepatitis are the most costly medical conditions on average per individual. These findings differed from those obtained using parametric regression. The literature may be underestimating the spending contributions of several medical conditions, which is a potentially critical oversight. If current methods are not capturing the true incremental effect of medical conditions, undesirable incentives related to care may remain. Further work is needed to directly study these issues in the context of federal formulas. © Health Research and Educational Trust.

  18. The application of PSA importance measures in risk-informed applications

    International Nuclear Information System (INIS)

    Chen Yan; Fu Zhiwei; Jing Jianping; Zhang Chunming; Liu Hongquan

    2012-01-01

    Importance measure analyses are among the main PSA approaches used in risk-informed applications. This paper reviews the various kinds of importance measures, examines in particular the meaning of the Fussell-Vesely (FV) and Risk Achievement Worth (RAW) importance measures, and introduces the application of importance measures to in-service testing and the categorization of SSCs; finally, it discusses the limitations of importance measure analyses. (authors)
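
    The FV and RAW measures discussed here have standard PSA definitions: FV is the fractional reduction in risk when a component is assumed perfectly reliable, and RAW is the ratio of risk with the component assumed failed to the nominal risk. A minimal sketch on a hypothetical two-cut-set system (the cut sets and failure probabilities are invented for illustration):

```python
def system_failure_prob(pA, pB, pC):
    """Minimal cut sets {A, B} and {C}: the system fails if A and B both
    fail, or C fails. Exact probability assuming independent failures."""
    return 1.0 - (1.0 - pA * pB) * (1.0 - pC)

p = {"A": 0.1, "B": 0.1, "C": 0.01}
R = system_failure_prob(p["A"], p["B"], p["C"])  # nominal risk

def risk_with(name, value):
    """Nominal risk with one component's failure probability overridden."""
    q = dict(p)
    q[name] = value
    return system_failure_prob(q["A"], q["B"], q["C"])

for comp in p:
    FV = (R - risk_with(comp, 0.0)) / R   # Fussell-Vesely
    RAW = risk_with(comp, 1.0) / R        # Risk Achievement Worth
    print(comp, round(FV, 3), round(RAW, 2))
```

Note how the single-component cut set {C} yields a much larger RAW than the redundant pair, which is why RAW is used to flag components whose failure would most degrade the plant.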

  19. Importance of the macroeconomic variables for variance prediction: A GARCH-MIDAS approach

    DEFF Research Database (Denmark)

    Asgharian, Hossein; Hou, Ai Jun; Javed, Farrukh

    2013-01-01

    This paper aims to examine the role of macroeconomic variables in forecasting the return volatility of the US stock market. We apply the GARCH-MIDAS (Mixed Data Sampling) model to examine whether information contained in macroeconomic variables can help to predict short-term and long-term components of return volatility...

  20. Variability in Adaptive Behavior in Autism: Evidence for the Importance of Family History

    Science.gov (United States)

    Mazefsky, Carla A.; Williams, Diane L.; Minshew, Nancy J.

    2008-01-01

    Adaptive behavior in autism is highly variable and strongly related to prognosis. This study explored family history as a potential source of variability in adaptive behavior in autism. Participants included 77 individuals (mean age = 18) with average or better intellectual ability and autism. Parents completed the Family History Interview about…

  1. Use of UAVs for Remote Measurement of Vegetation Canopy Variables

    Science.gov (United States)

    Rango, A.; Laliberte, A.; Herrick, J.; Steele, C.; Bestelmeyer, B.; Chopping, M. J.

    2006-12-01

    Remote sensing with different sensors has proven useful for measuring vegetation canopy variables at scales ranging from landscapes down to individual plants. For use at landscape scales, such as desert grasslands invaded by shrubs, it is possible to use multi-angle imagery from satellite sensors, such as MISR and CHRIS/Proba, with geometric optical models to retrieve fractional woody plant cover. Vegetation community states can be mapped using visible and near infrared ASTER imagery at 15 m resolution. At finer scales, QuickBird satellite imagery with approximately 60 cm resolution and piloted aircraft photography with 25-80 cm resolution can be used to measure shrubs above a critical size. Tests conducted with the QuickBird data in the Jornada basin of southern New Mexico have shown that 87% of all shrubs greater than 2 m2 were detected whereas only about 29% of all shrubs less than 2 m2 were detected, even at these high resolutions. Because there is an observational gap between satellite/aircraft measurements and ground observations, we have experimented with Unmanned Aerial Vehicles (UAVs) producing digital photography with approximately 5 cm resolution. We were able to detect all shrubs greater than 2 m2, and we were able to map small subshrubs indicative of rangeland deterioration, as well as remnant grass patches, for the first time. None of these could be identified on the 60 cm resolution data. Additionally, we were able to measure canopy gaps, shrub patterns, percent bare soil, and vegetation cover over mixed rangeland vegetation. This approach is directly applicable to rangeland health monitoring, and it provides a quantitative way to assess shrub invasion over time and to detect the depletion or recovery of grass patches. Further, if the UAV images have sufficient overlap, it may be possible to exploit the stereo viewing capabilities to develop a digital elevation model from the orthophotos, with a potential for extracting canopy height. 
We envision two...

  2. Blood Pressure Variability and Cognitive Function Among Older African Americans: Introducing a New Blood Pressure Variability Measure.

    Science.gov (United States)

    Tsang, Siny; Sperling, Scott A; Park, Moon Ho; Helenius, Ira M; Williams, Ishan C; Manning, Carol

    2017-09-01

    Although blood pressure (BP) variability has been reported to be associated with cognitive impairment, whether this relationship affects African Americans has been unclear. We sought correlations between systolic and diastolic BP variability and cognitive function in community-dwelling older African Americans, and introduced a new BP variability measure that can be applied to BP data collected in clinical practice. We assessed cognitive function in 94 cognitively normal older African Americans using the Mini-Mental State Examination (MMSE) and the Computer Assessment of Mild Cognitive Impairment (CAMCI). We used BP measurements taken at the patients' three most recent primary care clinic visits to generate three traditional BP variability indices, range, standard deviation, and coefficient of variation, plus a new index, random slope, which accounts for unequal BP measurement intervals within and across patients. MMSE scores did not correlate with any of the BP variability indices. Patients with greater diastolic BP variability were less accurate on the CAMCI verbal memory and incidental memory tasks. Results were similar across the four BP variability indices. In a sample of cognitively intact older African American adults, BP variability did not correlate with global cognitive function, as measured by the MMSE. However, higher diastolic BP variability correlated with poorer verbal and incidental memory. By accounting for differences in BP measurement intervals, our new BP variability index may help alert primary care physicians to patients at particular risk for cognitive decline.
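
    The three traditional visit-to-visit variability indices named above are straightforward to compute; a minimal sketch with hypothetical readings (the abstract's new random-slope index requires a mixed model over unequal measurement intervals and is not reproduced here):

```python
from statistics import mean, pstdev

def bp_variability(readings):
    """Traditional visit-to-visit BP variability indices for one patient's
    readings (mmHg): range, standard deviation, coefficient of variation."""
    rng = max(readings) - min(readings)
    sd = pstdev(readings)          # population SD over the visits
    cv = sd / mean(readings)       # SD scaled by the patient's mean level
    return {"range": rng, "sd": round(sd, 2), "cv": round(cv, 4)}

# Diastolic BP at a hypothetical patient's three most recent clinic visits
print(bp_variability([78, 86, 95]))
```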

  3. Volatile and intermediate volatility organic compounds in suburban Paris: variability, origin and importance for SOA formation

    International Nuclear Information System (INIS)

    Ait-Helal, W.; Borbon, A.; Beekmann, M.; Doussin, J.F.; Durand-Jolibois, R.; Grand, N.; Michoud, V.; Miet, K.; Perrier, S.; Siour, G.; Zapf, P.; Sauvage, S.; Fronval, I.; Leonardis, T.; Locoge, N.; Gouw, J.A. de; Colomb, A.; Gros, V.; Lopez, M.

    2014-01-01

    approaches, which are based on in situ observations of particular I/VOCs, emphasize the importance of intermediate-volatility compounds in SOA formation, and support previous results from chamber experiments and modeling studies. They also support the need to make speciated measurements of IVOCs systematic during field campaigns. (authors)

  4. Active Learning: The Importance of Developing a Comprehensive Measure

    Science.gov (United States)

    Carr, Rodney; Palmer, Stuart; Hagel, Pauline

    2015-01-01

    This article reports on an investigation into the validity of a widely used scale for measuring the extent to which higher education students employ active learning strategies. The scale is the active learning scale in the Australasian Survey of Student Engagement. This scale is based on the Active and Collaborative Learning scale of the National…

  5. GFT centrality: A new node importance measure for complex networks

    Science.gov (United States)

    Singh, Rahul; Chakraborty, Abhishek; Manoj, B. S.

    2017-12-01

    Identifying central nodes is crucial for designing efficient communication networks or recognizing key individuals in a social network. In this paper, we introduce Graph Fourier Transform Centrality (GFT-C), a metric that incorporates local as well as global characteristics of a node, to quantify the importance of a node in a complex network. The GFT-C of a reference node in a network is estimated from the GFT coefficients derived from the importance signal of the reference node. Our study reveals the superiority of GFT-C over traditional centralities such as degree centrality, betweenness centrality, closeness centrality, eigenvector centrality, and Google PageRank centrality, in the context of various arbitrary and real-world networks with different degree-degree correlations.
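
    For context, a minimal pure-Python sketch of one of the traditional baselines named above, eigenvector centrality via power iteration; GFT-C itself is built on the graph Fourier transform (Laplacian eigendecomposition) and is not reproduced here:

```python
def eigenvector_centrality(adj, iters=200):
    """Shifted power iteration (iterating with A + I avoids oscillation on
    bipartite graphs) for an undirected graph given as an adjacency dict
    {node: set(neighbours)}; scores are normalised so the maximum is 1."""
    x = {v: 1.0 for v in adj}
    for _ in range(iters):
        y = {v: x[v] + sum(x[u] for u in adj[v]) for v in adj}
        norm = max(y.values())
        x = {v: y[v] / norm for v in adj}
    return x

# Star graph: the hub should dominate under any sensible centrality.
star = {"hub": {"a", "b", "c"}, "a": {"hub"}, "b": {"hub"}, "c": {"hub"}}
c = eigenvector_centrality(star)
print(max(c, key=c.get))  # hub
```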

  6. The importance of measuring customer satisfaction in palliative care.

    Science.gov (United States)

    Turriziani, Adriana; Attanasio, Gennaro; Scarcella, Francesco; Sangalli, Luisa; Scopa, Anna; Genualdo, Alessandra; Quici, Stefano; Nazzicone, Giulia; Ricciotti, Maria Adelaide; La Commare, Francesco

    2016-03-01

    In recent decades, palliative care has become more and more focused on the evaluation of patients' and families' satisfaction with care. However, the evaluation of customer satisfaction in palliative care presents a number of issues, such as the involvement of both patients and their families, the frail condition of the patients and the complexity of their needs, and the lack of standard quality indicators and appropriate measurement tools. In this manuscript, we critically review existing evidence and literature on the evaluation of satisfaction in the palliative care context. Moreover, we provide - as a practical example - the preliminary results of our experience in this setting with the development of a dedicated tool for the measurement of satisfaction.

  7. Strategic measures for the two cities. Most important strategies - recommendations

    DEFF Research Database (Denmark)

    Herslund, Lise Byskov; Lund, Dorthe Hedensted; Yeshitela, Kumelachew

    The deliverable serves three purposes: First it lists core measures brought forward in literature and guidelines on how to make developing cities more resilient to climate change as well as some necessary conditions for effective implementation. Second it presents strategic measures for climate change adaptation based on stakeholder interactions and interviews in the two CLUVA cities: Dar es Salaam and Addis Ababa. Third it draws together the main points in recommendations on data needs and process for how to work towards more resilient cities. The deliverable combines the identification ... and challenges with flooding and drought into the main city plans as well as in land use and environmental planning, urban services like drainage and solid waste management and upgrading programs etc. Focus should be put on building networks between the stakeholders from integrated and on-going projects like ...

  8. Concept, characteristics, and applications of important electrical measuring techniques

    International Nuclear Information System (INIS)

    Amberg, C.; Czaika, N.; Andreae, G.

    1978-01-01

    In the field of electrical measuring techniques, the investigations were concentrated on transducers. We investigated the time-temperature behaviour of the following transducers: weldable, fully encapsulated high-temperature strain gauges; inductance and transformer displacement transducers; and weldable capacitive strain transducers with distance sensors. A literature review showing the state of the art regarding the influence of nuclear radiation was compiled. (orig./HP) [de]

  9. The importance of puberty for adolescent development: conceptualization and measurement.

    Science.gov (United States)

    Berenbaum, Sheri A; Beltz, Adriene M; Corley, Robin

    2015-01-01

    How and why are teenagers different from children and adults? A key question concerns the ways in which pubertal development shapes psychological changes in adolescence directly through changes to the brain and indirectly through the social environment. Empirical work linking pubertal development to adolescent psychological function draws from several different perspectives, often with varying approaches and a focus on different outcomes and mechanisms. The main themes concern effects of atypical pubertal timing on behavior problems during adolescence, effects of pubertal status (and associated hormones) on normative changes in behaviors that can facilitate or hinder development (especially risk-taking, social reorientation, and stress responsivity), and the role of puberty in triggering psychopathology in vulnerable individuals. There is also interest in understanding the ways in which changes in the brain reflect pubertal processes and underlie psychological development in adolescence. In this chapter, we consider the ways that puberty might affect adolescent psychological development, and why this is of importance to developmentalists. We describe the processes of pubertal development; summarize what is known about pubertal influences on adolescent development; consider the assumptions that underlie most work and the methodological issues that affect the interpretation of results; and propose research directions to help understand paths from puberty to behavior. Throughout, we emphasize the importance of pubertal change in all aspects of psychological development, and the ways in which puberty represents an opportunity to study the interplay of biological and social influences. © 2015 Elsevier Inc. All rights reserved.

  10. Importance and Impact of Preanalytical Variables on Alzheimer Disease Biomarker Concentrations in Cerebrospinal Fluid

    NARCIS (Netherlands)

    Le Bastard, Nathalie; De Deyn, Peter Paul; Engelborghs, Sebastiaan

    BACKGROUND: Analyses of cerebrospinal fluid (CSF) biomarkers (beta-amyloid protein, total tau protein, and hyperphosphorylated tau protein) are part of the diagnostic criteria of Alzheimer disease. Different preanalytical sample procedures contribute to the variability of CSF biomarker concentrations...

  11. The importance of localisation within 131I-retention measurements

    International Nuclear Information System (INIS)

    Greifeneder, G.; Aiginger, H.; Steger, F.; Flores, J.

    1996-01-01

    A widely used method to investigate thyroid carcinomas, as well as other disorders of the thyroid gland, is the measurement of the 131I-retention behaviour using a high-sensitivity whole-body counter (HWBC). Normally only the retention of orally administered radioiodine is monitored. However, it is possible to visualize the distribution of 131I in the human body (using a modified profile-scan technique) in addition to the retention, and this provides much better information on human metabolism, as will be demonstrated in the following.

  12. Original method to compute epipoles using variable homography: application to measure emergent fibers on textile fabrics

    Science.gov (United States)

    Xu, Jun; Cudel, Christophe; Kohler, Sophie; Fontaine, Stéphane; Haeberlé, Olivier; Klotz, Marie-Louise

    2012-04-01

    Fabric smoothness is a key factor in determining the quality of finished textile products and has great influence on the functionality of industrial textiles and high-end textile products. With the popularization of the zero-defect industrial concept, identifying and measuring defective material in the early stages of production is of great interest to the industry. In the current market, many systems are able to achieve automatic monitoring and control of fabric, paper, and nonwoven material during the entire production process; however, online measurement of hairiness is still an open topic and highly desirable for industrial applications. We propose a computer vision approach to compute epipoles using variable homography, which can be used to measure the length of emergent fibers on textile fabrics. The main challenges addressed in this paper are the application of variable homography to textile monitoring and measurement, as well as the accuracy of the estimated calculation. We propose that a fibrous structure can be considered as a two-layer structure, and we then show how variable homography combined with epipolar geometry can estimate the length of fiber defects. Simulations are carried out to show the effectiveness of this method. The true length of selected fibers is measured precisely using a digital optical microscope, and then the same fibers are tested by our method. Our experimental results suggest that smoothness monitoring by variable homography is an accurate and robust method of quality control for important industrial fabrics.

  13. Time measurement - technical importance of most exact clocks

    International Nuclear Information System (INIS)

    Goebel, E.O.; Riehle, F.

    2004-01-01

    The exactness of the best atomic clocks currently shows a temporal variation of 1 second in 30 million years. This means that we have reached the point of the most exact frequency and time measurement ever. In the past, the trend was towards increasing exactness at an ever faster pace. Will this trend continue? And who will profit from it? This article is meant to answer these questions. It does so by first presenting the level currently reached with the best atomic clocks and describing the research activities running worldwide with the aim of achieving even more exact clocks. In the second part, we present examples of various technical and research areas in which the most exact clocks are applied at present and in which even more exact ones will be needed in the future

  14. Clinical importance of radioisotope measurement of placental circulation

    International Nuclear Information System (INIS)

    Doszpod, J.; Vittay, P.; Csakany, M.Gy.; Misak, L.; Gati, I.

    1980-01-01

    Placental circulation was measured by scintigraphic technique after administration of 18.5 MBq (0.5 mCi) 113In. Scintigrams of the placenta and of the myometrium were taken at 3 s intervals for 3 minutes, and at 1 min intervals for a further 8 min. The placental perfusion index (PPI) was calculated on the basis of the time-activity histograms. The examination was carried out in 14 patients, with the following indications: toxicosis gravidarum, stenosis of the aortic valve, diabetes mellitus, placenta praevia, and suspicion of intrauterine retardation. In 13 cases the PPI was in accordance with the birth weight, whereas in one case a significant difference was found. Intrauterine death occurred in cases where the PPI was near 1.0, whereas healthy mature babies were born with PPIs above 1.5. (L.E.)

  15. Clinical importance of radioisotope measurement of placental circulation

    Energy Technology Data Exchange (ETDEWEB)

    Doszpod, J; Vittay, P; Csakany, M Gy; Misak, L; Gati, I [Orvostovabbkepzoe Intezet, Budapest (Hungary)

    1980-12-27

    Placental circulation was measured by scintigraphic technique after administration of 18.5 MBq (0.5 mCi) 113In. Scintigrams of the placenta and of the myometrium were taken at 3 s intervals for 3 minutes, and at 1 min intervals for a further 8 min. The placental perfusion index (PPI) was calculated on the basis of the time-activity histograms. The examination was carried out in 14 patients, with the following indications: toxicosis gravidarum, stenosis of the aortic valve, diabetes mellitus, placenta praevia, and suspicion of intrauterine retardation. In 13 cases the PPI was in accordance with the birth weight, whereas in one case a significant difference was found. Intrauterine death occurred in cases where the PPI was near 1.0, whereas healthy mature babies were born with PPIs above 1.5.

  16. On the importance of being bilingual: word stress processing in a context of segmental variability.

    Science.gov (United States)

    Abboub, Nawal; Bijeljac-Babic, Ranka; Serres, Josette; Nazzi, Thierry

    2015-04-01

    French-learning infants have language-specific difficulties in processing lexical stress due to the lack of lexical stress in French. These difficulties in discriminating between words with stress-initial (trochaic) and stress-final (iambic) patterns emerge by 10 months of age in the easier context of low variability (using a single item pronounced with a trochaic pattern vs. an iambic pattern) as well as in the more challenging context of high segmental variability (using lists of segmentally different trochaic and iambic items). These findings raise the question of stress pattern perception in simultaneous bilinguals learning French and a second language using stress at the lexical level. Bijeljac-Babic, Serres, Höhle, and Nazzi (2012) established that at 10 months of age, in the simpler context of low variability, such bilinguals have better stress discrimination abilities than French-learning monolinguals. The current study explored whether this advantage extends to the more challenging context of high segmental variability. Results first establish stress pattern discrimination in a group of bilingual 10-month-olds learning French and one language with (variable) lexical stress, but not in French-learning 10-month-old monolinguals. Second, discrimination in bilinguals appeared not to be affected by the language balance of the infants, suggesting that sensitivity to stress patterns might be maintained in these bilingual infants provided that at least 30% of their input is in a language with lexical stress.

  17. Putting the cart before the horse. A comment on Wagstaff on inequality measurement in the presence of binary variables

    NARCIS (Netherlands)

    G. Erreygers (Guido); T.G.M. van Ourti (Tom)

    2011-01-01

    Adam Wagstaff's (2011) recent paper sends a strong reminder that binary variables occur frequently in health inequality studies and that it is important to examine whether the standard measurement tools can be applied without any modification when the health variable happens to be

  18. ZnO crystals obtained by electrodeposition: Statistical analysis of most important process variables

    International Nuclear Information System (INIS)

    Cembrero, Jesus; Busquets-Mataix, David

    2009-01-01

    In this paper a comparative study, by means of a statistical analysis, of the main process variables affecting ZnO crystal electrodeposition is presented. ZnO crystals were deposited on two different substrates, silicon wafer and indium tin oxide. The control variables were substrate type, electrolyte concentration, temperature, deposition time and current density. The morphologies of the different deposits were observed using scanning electron microscopy. The percentage of substrate area covered by ZnO deposit was calculated by computational image analysis. The design of experiments was based on a two-level factorial analysis involving a series of 32 experiments and an analysis of variance. Statistical results reveal that the variables exerting a significant influence on the area covered by ZnO deposit are electrolyte concentration, substrate type and time of deposition, together with a two-factor interaction between temperature and current density. However, morphology is also influenced by the surface roughness of the substrates
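
    In a two-level factorial design like the 32-run design above, each main effect is estimated as the difference between the mean response at a factor's high and low levels, and interactions via the product of factor codings. A minimal sketch follows; the response model is invented for illustration (chosen to mirror the reported significant factors) and is not the paper's data.

```python
import itertools

# Factors coded -1/+1; response = hypothetical % of substrate area covered.
factors = ["concentration", "substrate", "time", "temperature", "current"]

def response(c, s, t, T, j):
    # Synthetic model: concentration, substrate and time matter,
    # plus a temperature x current interaction (as in the paper's findings).
    return 50 + 8 * c + 5 * s + 6 * t + 4 * T * j

runs = list(itertools.product([-1, 1], repeat=5))  # 2^5 = 32 runs
y = [response(*run) for run in runs]

def main_effect(k):
    """Mean response at the high level minus mean at the low level of factor k."""
    hi = [yi for run, yi in zip(runs, y) if run[k] == 1]
    lo = [yi for run, yi in zip(runs, y) if run[k] == -1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

def interaction_effect(k1, k2):
    """Two-factor interaction effect, using the product of the codings."""
    signs = [run[k1] * run[k2] for run in runs]
    return sum(s * yi for s, yi in zip(signs, y)) / (len(runs) / 2)

for k, name in enumerate(factors):
    print(name, main_effect(k))
print("temperature x current", interaction_effect(3, 4))
```

    Note how temperature and current show zero main effect in this model yet a non-zero interaction, which is exactly the pattern an analysis of variance on main effects alone would miss.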

  19. Importance of fruit variability in the assessment of apple quality by sensory evaluation

    DEFF Research Database (Denmark)

    Bavay, Cécile; Symoneaux, Ronan; Maître, Isabelle

    2013-01-01

    The assessment of produce quality is a major aspect of applied postharvest biology. Horticultural researchers working on the organoleptic quality of fruit need objective methods for the evaluation of sensory properties. The development of sensory methodologies specifically for apples highlighted the problem of handling variation due to fruit variability and assessor differences. The aim of this study was to investigate the weight of within-batch variability in the sensory evaluation of apples and to propose a methodology that accounts for this variability. Prior to sensory analysis, for three apple cultivars, apples were sorted into homogeneous acoustic firmness categories within each cultivar. The discrimination ability of the trained panel was observed not only between cultivars but also within each cultivar for crunchiness, firmness, juiciness and acidity. Following these results, a mixed...

  20. Sternal wound complications after primary isolated myocardial revascularization: the importance of the post-operative variables.

    NARCIS (Netherlands)

    Noyez, L.; Druten, J.A.M. van; Mulder, J.; Schroen, A.M.; Skotnicki, S.H.; Brouwer, R.

    2001-01-01

    OBJECTIVE: To select pre-, peri-, and post-operative variables predictive of sternal wound complications (SWC) in a clinical setting. METHODS: We analyzed pre-, peri-, and post-operative data of 3815 patients who underwent primary isolated bypass grafting. 100 patients (2.6%) had post-operative

  1. Hydrothermal activity at slow-spreading ridges: variability and importance of magmatic controls

    Science.gov (United States)

    Escartin, Javier

    2016-04-01

    Hydrothermal activity along mid-ocean ridge axes is ubiquitous, associated with mass, chemical, and heat exchanges between the deep lithosphere and the overlying envelopes, and sustaining chemosynthetic ecosystems at the seafloor. Compared with hydrothermal fields at fast-spreading ridges, those at slow-spreading ones show a large variability, as their location and nature are controlled or influenced by several inter-related parameters: a) tectonic setting, ranging from 'volcanic systems' (along the rift valley floor, volcanic ridges, seamounts) to 'tectonic' ones (rift-bounding faults, oceanic detachment faults); b) the nature of the host rock, owing to the compositional heterogeneity of slow-spreading lithosphere (basalt, gabbro, peridotite); c) the type of heat source (magmatic bodies at depth, hot lithosphere, serpentinization reactions); and d) the associated temperature of outflow fluids (high- vs. low-temperature venting and their relative proportion). A systematic review of the distribution and characteristics of hydrothermal fields along the slow-spreading Mid-Atlantic Ridge suggests that long-lived hydrothermal activity is concentrated either at oceanic detachment faults or along volcanic segments with evidence of robust magma supply to the axis. A detailed study of the magmatically robust Lucky Strike segment suggests that all present and past hydrothermal activity is found at the center of the segment. The association of these fields with central volcanoes, and the absence of indicators of hydrothermal activity along the remainder of the ridge segment, suggests that long-lived hydrothermal activity in these volcanic systems is maintained by the enhanced melt supply and the associated magma chamber(s) required to build these volcanic edifices. In this setting, hydrothermal outflow zones at the seafloor are systematically controlled by faults, indicating that hydrothermal fluids in the shallow crust exploit permeable fault zones to circulate. While

  2. VARIABILITY OF MANUAL AND COMPUTERIZED METHODS FOR MEASURING CORONAL VERTEBRAL INCLINATION IN COMPUTED TOMOGRAPHY IMAGES

    Directory of Open Access Journals (Sweden)

    Tomaž Vrtovec

    2015-06-01

    Objective measurement of coronal vertebral inclination (CVI) is of significant importance for evaluating spinal deformities in the coronal plane. The purpose of this study is to systematically analyze and compare manual and computerized measurements of CVI in cross-sectional and volumetric computed tomography (CT) images. Three observers independently measured CVI in 14 CT images of normal and 14 CT images of scoliotic vertebrae by using six manual and two computerized measurements. Manual measurements were obtained in coronal cross-sections by manually identifying the vertebral body corners, which served to measure CVI according to the superior and inferior tangents, left and right tangents, and mid-endplate and mid-wall lines. Computerized measurements were obtained in two dimensions (2D) and in three dimensions (3D) by manually initializing an automated method in vertebral centroids and then searching for the planes of maximal symmetry of vertebral anatomical structures. The mid-endplate lines were the most reproducible and reliable manual measurements (intra- and inter-observer variability of 0.7° and 1.2° standard deviation (SD), respectively). The computerized measurements in 3D were more reproducible and reliable (intra- and inter-observer variability of 0.5° and 0.7° SD, respectively), but were most consistent with the mid-wall lines (2.0° SD and 1.4° mean absolute difference). The manual CVI measurements based on mid-endplate lines and the computerized CVI measurements in 3D resulted in the lowest intra-observer and inter-observer variability; however, computerized CVI measurements reduce observer interaction.
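
    A mid-endplate-line measurement of the kind described above can be sketched as a small geometric computation: join the midpoints of the superior and inferior endplates and take the angle of that line from the vertical. This is an illustrative reconstruction, not the authors' implementation; the corner coordinates and the from-vertical angle convention are assumptions.

```python
import math

def mid_endplate_inclination(sup_left, sup_right, inf_left, inf_right):
    """Coronal vertebral inclination from the mid-endplate line.

    The line joins the midpoints of the superior and inferior endplates;
    inclination is its angle from the vertical axis, in degrees.
    Corner points are (x, y) pairs with y increasing upward.
    """
    sup_mid = ((sup_left[0] + sup_right[0]) / 2, (sup_left[1] + sup_right[1]) / 2)
    inf_mid = ((inf_left[0] + inf_right[0]) / 2, (inf_left[1] + inf_right[1]) / 2)
    dx = sup_mid[0] - inf_mid[0]
    dy = sup_mid[1] - inf_mid[1]
    return math.degrees(math.atan2(dx, dy))  # 0 degrees for an upright vertebra

# Upright vertebra (hypothetical coordinates):
print(mid_endplate_inclination((0, 10), (10, 10), (0, 0), (10, 0)))
# Same vertebra with the superior endplate shifted 2 units to the right:
print(mid_endplate_inclination((2, 10), (12, 10), (0, 0), (10, 0)))
```

    The intra-observer variability figures quoted above (tenths of a degree) then reflect how consistently observers can pick the four corner points.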

  3. Measuring Networking as an Outcome Variable in Undergraduate Research Experiences.

    Science.gov (United States)

    Hanauer, David I; Hatfull, Graham

    2015-01-01

    The aim of this paper is to propose, present, and validate a simple survey instrument to measure student conversational networking. The tool consists of five items that cover personal and professional social networks, and its basic principle is the self-reporting of degrees of conversation, with a range of specific discussion partners. The networking instrument was validated in three studies. The basic psychometric characteristics of the scales were established by conducting a factor analysis and evaluating internal consistency using Cronbach's alpha. The second study used a known-groups comparison and involved comparing outcomes for networking scales between two different undergraduate laboratory courses (one involving a specific effort to enhance networking). The final study looked at potential relationships between specific networking items and the established psychosocial variable of project ownership through a series of binary logistic regressions. Overall, the data from the three studies indicate that the networking scales have high internal consistency (α = 0.88), consist of a unitary dimension, can significantly differentiate between research experiences with low and high networking designs, and are related to project ownership scales. The ramifications of the networking instrument for student retention, the enhancement of public scientific literacy, and the differentiation of laboratory courses are discussed.
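
    The internal-consistency check mentioned above, Cronbach's alpha for a five-item scale, can be sketched with the standard textbook formula. The response matrix below is invented for illustration; this is not the authors' analysis code.

```python
import numpy as np

def cronbach_alpha(item_scores):
    """Cronbach's alpha for an (n_respondents x n_items) matrix of scores."""
    X = np.asarray(item_scores, dtype=float)
    k = X.shape[1]
    item_vars = X.var(axis=0, ddof=1)       # variance of each item
    total_var = X.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Hypothetical responses of 5 students to 5 networking items (1-5 Likert):
scores = [
    [4, 5, 4, 4, 5],
    [2, 2, 3, 2, 2],
    [5, 5, 5, 4, 5],
    [1, 2, 1, 1, 1],
    [3, 3, 4, 3, 3],
]
print(round(cronbach_alpha(scores), 2))  # close to 1 for consistent responses
```

    Values above roughly 0.8, such as the α = 0.88 reported for the networking scales, are conventionally taken to indicate high internal consistency.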

  4. The importance of variables and parameters in radiolytic chemical kinetics modeling

    International Nuclear Information System (INIS)

    Piepho, M.G.; Turner, P.J.; Reimus, P.W.

    1989-01-01

    Many of the pertinent radiochemical reactions are not completely understood, and most of the associated rate constants are poorly characterized. To help identify the important radiochemical reactions, rate constants, species, and environmental conditions, an importance theory code, SWATS (Sensitivity With Adjoint Theory - Sparse version)-LOOPCHEM, has been developed for the radiolytic chemical kinetics model in the radiolysis code LOOPCHEM. The LOOPCHEM code calculates the concentrations of various species in a radiolytic field over time. The SWATS-LOOPCHEM code efficiently calculates: the importance (relative to a defined response of interest) of each species concentration over time, the sensitivity of each parameter of interest, and the importance of each equation in the radiolysis model. The calculated results will be used to guide future experimental and modeling work for determining the importance of radiolysis to waste package performance. A demonstration (the importance of selected concentrations and the sensitivities of selected parameters) of the SWATS-LOOPCHEM code is provided for illustrative purposes

  5. Comparison of seasonal variability in European domestic radon measurements

    Science.gov (United States)

    Groves-Kirkby, C. J.; Denman, A. R.; Phillips, P. S.; Crockett, R. G. M.; Sinclair, J. M.

    2010-03-01

    Analysis of published data characterising seasonal variability of domestic radon concentrations in Europe and elsewhere shows significant variability between different countries and between regions where regional data is available. Comparison is facilitated by application of the Gini Coefficient methodology to reported seasonal variation data. Overall, radon-rich sedimentary strata, particularly high-porosity limestones, exhibit high seasonal variation, while radon-rich igneous lithologies demonstrate relatively constant, but somewhat higher, radon concentrations. High-variability regions include the Pennines and South Downs in England, Languedoc and Brittany in France, and especially Switzerland. Low-variability high-radon regions include the granite-rich Cornwall/Devon peninsula in England, and Auvergne and Ardennes in France, all components of the Devonian-Carboniferous Hercynian belt.
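
    The Gini coefficient methodology mentioned above can be sketched in a few lines: applied to a year of monthly radon values, it is 0 for a perfectly constant series and grows as the variation concentrates in a few months. The formula below is the standard mean-absolute-difference form and the radon values are hypothetical, not the study's data.

```python
import numpy as np

def gini(values):
    """Gini coefficient of a set of non-negative values.

    0 means all values are equal (no seasonal variability); values nearer 1
    indicate variability concentrated in a few observations.
    """
    x = np.sort(np.asarray(values, dtype=float))
    n = x.size
    if n == 0 or x.sum() == 0:
        return 0.0
    # Equivalent to half the relative mean absolute difference:
    # G = sum((2i - n - 1) * x_i) / (n * sum(x)) over sorted x, i = 1..n
    i = np.arange(1, n + 1)
    return float(np.sum((2 * i - n - 1) * x) / (n * x.sum()))

# Hypothetical monthly radon concentrations (Bq/m^3) for two dwellings:
flat = [100] * 12                     # constant all year
seasonal = [40, 45, 60, 80, 120, 180, 200, 190, 140, 90, 55, 45]
print(round(gini(flat), 3), round(gini(seasonal), 3))
```

    A dwelling on a high-porosity limestone would be expected to look like the second series; one on granite, closer to the first.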

  6. Comparison of seasonal variability in European domestic radon measurements

    Directory of Open Access Journals (Sweden)

    C. J. Groves-Kirkby

    2010-03-01

    Analysis of published data characterising seasonal variability of domestic radon concentrations in Europe and elsewhere shows significant variability between different countries and between regions where regional data is available. Comparison is facilitated by application of the Gini Coefficient methodology to reported seasonal variation data. Overall, radon-rich sedimentary strata, particularly high-porosity limestones, exhibit high seasonal variation, while radon-rich igneous lithologies demonstrate relatively constant, but somewhat higher, radon concentrations. High-variability regions include the Pennines and South Downs in England, Languedoc and Brittany in France, and especially Switzerland. Low-variability high-radon regions include the granite-rich Cornwall/Devon peninsula in England, and Auvergne and Ardennes in France, all components of the Devonian-Carboniferous Hercynian belt.

  7. Measuring The Variability Of Gamma-Ray Sources With AGILE

    International Nuclear Information System (INIS)

    Chen, Andrew W.; Vercellone, Stefano; Pellizzoni, Alberto; Tavani, Marco

    2005-01-01

    Variability in the gamma-ray flux above 100 MeV at various time scales is one of the primary characteristics of the sources detected by EGRET, both allowing the identification of individual sources and constraining the unidentified source classes. We present a detailed simulation of the capacity of AGILE to characterize the variability of gamma-ray sources, discussing the implications for source population studies

  8. Measuring lip force by oral screens Part 2: The importance of screen design, instruction and suction.

    Science.gov (United States)

    Wertsén, Madeleine; Stenberg, Manne

    2017-10-01

    The aim of this study was to find a reliable method for measuring lip force and to identify the most important factors that influence the measurements in terms of magnitude and variability. The hypothesis tested was that suction is involved, and thus the instruction and the design of the oral screen are of importance when measuring lip force. This is a methodological study in a healthy population, conducted in a general community. The screens tested were soft and hard prefabricated screens and two semi-individually made screens with a tube allowing air to pass. The screens and the instructions squeeze or suck were tested on 29 healthy adult volunteers, one at a time and on 4 occasions. The test order of the screens was randomized. Data were collected during 4 consecutive days, and the procedure was repeated after 1 month. The instruction was an important means of distinguishing between squeezing and sucking. The design of the screen affected the lip force, which increased in relation to the projected area of the screen. A screen design with a tube allowing air to pass made it possible to avoid suction when squeezing. By measuring with and without allowing air to pass, it was possible to distinguish between suction-related and non-suction-related lip force. The additional screen pressure when sucking was related to the ability to produce a negative intraoral pressure. In conclusion, lip force increases in relation to the projected area of the screen, sucking generally increases the measured lip force, and the additional screen pressure when sucking is related to the ability to produce a negative intraoral pressure.

  9. A new interpretation and validation of variance based importance measures for models with correlated inputs

    Science.gov (United States)

    Hao, Wenrui; Lu, Zhenzhou; Li, Luyi

    2013-05-01

    In order to explore the contributions by correlated input variables to the variance of the output, a novel interpretation framework of importance measure indices is proposed for a model with correlated inputs, which includes the indices of the total correlated contribution and the total uncorrelated contribution. The proposed indices accurately describe the connotations of the contributions by the correlated input to the variance of output, and they can be viewed as the complement and correction of the interpretation about the contributions by the correlated inputs presented in "Estimation of global sensitivity indices for models with dependent variables, Computer Physics Communications, 183 (2012) 937-946". Both of them contain the independent contribution by an individual input. Taking the general form of quadratic polynomial as an illustration, the total correlated contribution and the independent contribution by an individual input are derived analytically, from which the components and their origins of both contributions of correlated input can be clarified without any ambiguity. In the special case that no square term is included in the quadratic polynomial model, the total correlated contribution by the input can be further decomposed into the variance contribution related to the correlation of the input with other inputs and the independent contribution by the input itself, and the total uncorrelated contribution can be further decomposed into the independent part by interaction between the input and others and the independent part by the input itself. Numerical examples are employed and their results demonstrate that the derived analytical expressions of the variance-based importance measure are correct, and the clarification of the correlated input contribution to model output by the analytical derivation is very important for expanding the theory and solutions of uncorrelated input to those of the correlated one.
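
    The decomposition discussed above can be illustrated on the simplest correlated case. For Y = X1 + X2 with standard-normal inputs and Corr(X1, X2) = ρ, the total (correlated) contribution of X1 is Var(E[Y|X1]) = (1 + ρ)², and its total uncorrelated contribution is Var(Y) − Var(E[Y|X2]) = 1 − ρ². A minimal Monte Carlo sketch follows; the linear model and ρ = 0.6 are assumptions for illustration, not the paper's examples.

```python
import random
import statistics

# Toy model Y = X1 + X2 with standard-normal inputs and Corr(X1, X2) = rho.
rho = 0.6
rng = random.Random(1)
n = 200_000
x1 = [rng.gauss(0, 1) for _ in range(n)]
x2 = [rho * a + (1 - rho**2) ** 0.5 * rng.gauss(0, 1) for a in x1]
y = [a + b for a, b in zip(x1, x2)]

def var(v):
    return statistics.pvariance(v)

def cov(u, v):
    mu, mv = sum(u) / n, sum(v) / n
    return sum((a - mu) * (b - mv) for a, b in zip(u, v)) / n

# Total (correlated) contribution of X1: Var(E[Y|X1]) = Cov(X1, Y)^2 / Var(X1),
# exact here because E[Y|X1] is linear in X1; theory gives (1 + rho)^2 = 2.56.
full_x1 = cov(x1, y) ** 2 / var(x1)
# Total uncorrelated contribution of X1: Var(Y) - Var(E[Y|X2]);
# theory gives 1 - rho^2 = 0.64.
uncorr_x1 = var(y) - cov(x2, y) ** 2 / var(x2)
print(round(full_x1, 2), round(uncorr_x1, 2))
```

    The gap between the two numbers is exactly the part of the variance that X1 contributes only through its correlation with X2, which is the quantity the proposed indices are designed to isolate.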

  10. Measurement of variable magnetic reversal paths in electrically contacted pseudo-spin-valve rings

    International Nuclear Information System (INIS)

    Hayward, T J; Llandro, J; Schackert, F D O; Morecroft, D; Balsod, R B; Bland, J A C; Castano, F J; Ross, C A

    2007-01-01

    In this work we show that the measurement of single magnetic reversal events is of critical importance in order to correctly characterize the switching of magnetic microstructures. Magnetoresistance measurements are performed on two pseudo-spin-valve ring structures with high enough signal to noise to allow the probing of single reversal events. Using this technique we acquire 'switching spectra' which demonstrate that the rings exhibit a range of variable reversal paths, including a bistable reversal mechanism of the hard layer, where the two switching routes have substantially different switching fields. The signature of the variable reversal paths would have been obscured in field cycle averaged data and in the bistable case would cause a fundamental misinterpretation of the reversal behaviour

  11. Important aspects of Eastern Mediterranean large-scale variability revealed from data of three fixed observatories

    Science.gov (United States)

    Bensi, Manuel; Velaoras, Dimitris; Cardin, Vanessa; Perivoliotis, Leonidas; Pethiakis, George

    2015-04-01

    Long-term variations of temperature and salinity observed in the Adriatic and Aegean Seas seem to be regulated by larger-scale circulation modes of the Eastern Mediterranean (EMed) Sea, such as the recently discovered feedback mechanisms, namely the BiOS (Bimodal Oscillating System) and the internal thermohaline pump theories. These theories are the result of interpreting many years of observations, highlighting possible interactions between two key regions of the EMed. Although repeated oceanographic cruises carried out in the past or planned for the future are a very useful tool for understanding the interaction between the two basins (e.g. alternating dense water formation, salt ingressions), recent long time series of high-frequency (up to 1 h) sampling have added valuable information to the interpretation of internal mechanisms in both areas (i.e. mesoscale eddies, evolution of fast internal processes, etc.). During the last 10 years, three deep observatories were deployed and maintained in the Adriatic, Ionian, and Aegean Seas: they are, respectively, the E2-M3A, the Pylos, and the E1-M3A. All are part of the largest European network of Fixed Point Open Ocean Observatories (FixO3, http://www.fixo3.eu/). Herein, from the analysis of temperature, salinity, and potential density time series collected at the three sites from the surface down to the intermediate and deep layers, we will discuss the almost perfectly anti-correlated behavior of the Adriatic and the Aegean Seas. Our data, collected almost continuously since 2006, reveal that these observatories represent well the thermohaline variability of their own areas. Interestingly, temperature and salinity in the intermediate layer suddenly increased in the South Adriatic from the end of 2011, exactly when they started decreasing in the Aegean Sea. Moreover, Pylos data used together with additional ones (e.g. absolute dynamic topography, temperature and salinity data from other platforms) collected

  12. Conventional QT Variability Measurement vs. Template Matching Techniques: Comparison of Performance Using Simulated and Real ECG

    Science.gov (United States)

    Baumert, Mathias; Starc, Vito; Porta, Alberto

    2012-01-01

    Increased beat-to-beat variability in the QT interval (QTV) of ECG has been associated with increased risk for sudden cardiac death, but its measurement is technically challenging and currently not standardized. The aim of this study was to investigate the performance of commonly used beat-to-beat QT interval measurement algorithms. Three different methods (conventional, template stretching and template time shifting) were subjected to simulated data featuring typical ECG recording issues (broadband noise, baseline wander, amplitude modulation) and real short-term ECG of patients before and after infusion of sotalol, a QT interval prolonging drug. Among the three algorithms, the conventional algorithm was most susceptible to noise whereas the template time shifting algorithm showed superior overall performance on simulated and real ECG. None of the algorithms was able to detect increased beat-to-beat QT interval variability after sotalol infusion despite marked prolongation of the average QT interval. The QTV estimates of all three algorithms were inversely correlated with the amplitude of the T wave. In conclusion, template matching algorithms, in particular the time shifting algorithm, are recommended for beat-to-beat variability measurement of QT interval in body surface ECG. Recording noise, T wave amplitude and the beat-rejection strategy are important factors of QTV measurement and require further investigation. PMID:22860030
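
    The template time-shifting idea, finding the lag at which each beat's T wave best aligns with a fixed template, can be sketched with a cross-correlation. This is a simplified stand-in for the published algorithm; the synthetic Gaussian "T waves" and the sampling are assumptions for illustration.

```python
import numpy as np

def best_shift(template, beat):
    """Lag (in samples) at which `beat` best aligns with `template`,
    found by maximizing the full cross-correlation."""
    corr = np.correlate(beat, template, mode="full")
    return int(np.argmax(corr)) - (len(template) - 1)

# Synthetic Gaussian 'T wave' as template; the beat is the same wave
# delayed by 3 samples, so the recovered lag should be 3.
t = np.arange(200)
template = np.exp(-((t - 100.0) ** 2) / 200.0)
beat = np.exp(-((t - 103.0) ** 2) / 200.0)
print(best_shift(template, beat))
```

    In a QTV analysis the per-beat shifts (plus the template's QT interval) give a beat-to-beat QT series, whose variance is the quantity of interest; the dependence on T-wave amplitude noted above enters because a flatter T wave makes the correlation peak broader and the lag estimate noisier.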

  13. Conventional QT variability measurement vs. template matching techniques: comparison of performance using simulated and real ECG.

    Directory of Open Access Journals (Sweden)

    Mathias Baumert

    Increased beat-to-beat variability in the QT interval (QTV) of ECG has been associated with increased risk for sudden cardiac death, but its measurement is technically challenging and currently not standardized. The aim of this study was to investigate the performance of commonly used beat-to-beat QT interval measurement algorithms. Three different methods (conventional, template stretching and template time shifting) were subjected to simulated data featuring typical ECG recording issues (broadband noise, baseline wander, amplitude modulation) and real short-term ECG of patients before and after infusion of sotalol, a QT interval prolonging drug. Among the three algorithms, the conventional algorithm was most susceptible to noise whereas the template time shifting algorithm showed superior overall performance on simulated and real ECG. None of the algorithms was able to detect increased beat-to-beat QT interval variability after sotalol infusion despite marked prolongation of the average QT interval. The QTV estimates of all three algorithms were inversely correlated with the amplitude of the T wave. In conclusion, template matching algorithms, in particular the time shifting algorithm, are recommended for beat-to-beat variability measurement of QT interval in body surface ECG. Recording noise, T wave amplitude and the beat-rejection strategy are important factors of QTV measurement and require further investigation.

  14. Continuous measurement of heart rate variability following carbon ...

    African Journals Online (AJOL)

    Background: Previous studies of autonomic nervous system activity through analysis of heart rate variability (HRV) have demonstrated increased sympathetic activity during positive-pressure pneumoperitoneum. We employed an online, continuous method for rapid HRV analysis (MemCalc™, Tarawa, Suwa Trust, Tokyo, ...

  15. Continuous measurement of heart rate variability following carbon ...

    African Journals Online (AJOL)

    2010-07-16

    Power spectral analysis of the electrocardiographic R-R interval [heart rate variability (HRV)] is a well-known, non-invasive method for assessing autonomic nervous activity.1 Studies using HRV analysis during positive-pressure pneumoperitoneum (PPP) have demonstrated increased sympathetic ...

  16. Effect of variable measurement coil displacement on the determination of the harmonic content of a multipole

    International Nuclear Information System (INIS)

    Halbach, K.

    1976-01-01

    When one measures the harmonic content of a two dimensional field it is important that the measurement equipment does not simulate harmonics that are not present in the field. Of the several sources that can lead to this kind of false data, we discuss here the error caused by a displacement of the axis of rotation of the measuring coil system that varies with the rotation angle of the measuring system. The two most prominent reasons for such variable axis displacements are probably imperfect bearings, and bending of the coil support structure due to either gravity or imperfectly aligned bearings. In both cases, the displacement may vary along the length of the coil. However, since we are dealing with very small displacements, the treatment of the problem with two dimensional field analysis should be perfectly adequate to understand what the consequences are, and to calculate them quantitatively

  17. Variability in Measured Space Temperatures in 60 Homes

    Energy Technology Data Exchange (ETDEWEB)

    Roberts, D.; Lay, K.

    2013-03-01

    This report discusses the observed variability in indoor space temperature in a set of 60 homes located in Florida, New York, Oregon, and Washington. Temperature data were collected at 15-minute intervals for an entire year, including living room, master bedroom, and outdoor air temperature (Arena et al.). The data were examined to establish the average living room temperature for the set of homes for the heating and cooling seasons, the variability of living room temperature depending on climate, and the variability of indoor space temperature within the homes. The accuracy of software-based energy analysis depends on the accuracy of input values. Thermostat set point is one of the most influential inputs for building energy simulation. Several industry standards exist that recommend differing default thermostat settings for the heating and cooling seasons. These standards were compared to the values calculated for this analysis. The data examined for this report show that there is a definite difference between the climates and that the data do not agree well with any particular standard.

  18. Use of Sobol's quasirandom sequence generator for integration of modified uncertainty importance measure

    International Nuclear Information System (INIS)

    Homma, Toshimitsu; Saltelli, A.

    1995-01-01

    Sensitivity analysis of model output is relevant to a number of practices, including verification of models and computer code quality assurance. It deals with the identification of influential model parameters, especially in complex models implemented in computer programs with many uncertain input variables. In a recent article a new method for sensitivity analysis, named HIM*, based on a rank transformation of the uncertainty importance measure suggested by Hora and Iman, was shown to be very powerful for performing automated sensitivity analysis of model output, even in the presence of model non-monotonicity. The same was not true of other widely used non-parametric techniques such as standardized rank regression coefficients. A drawback of the HIM* method was the large dimension of the stochastic sample needed for its estimation, which made HIM* impracticable for systems with a large number of uncertain parameters. In the present note a more effective sampling algorithm, based on Sobol's quasirandom generator, is coupled with HIM*, thereby greatly reducing the sample size needed for an effective identification of influential variables. The performances of the new technique are investigated for two different benchmarks. (author)
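As a rough illustration of why low-discrepancy sampling reduces the sample size needed for such estimators, the sketch below integrates a smooth function over the unit square using a Halton sequence, a simpler relative of Sobol's generator chosen here only because it fits in a few lines. This is not the HIM* estimator itself, and all function names are hypothetical.

```python
# Quasi-Monte Carlo integration sketch: low-discrepancy points (here a
# Halton sequence, standing in for Sobol') typically converge faster
# than plain pseudo-random sampling for smooth integrands.

def radical_inverse(i, base):
    """Van der Corput radical inverse of integer i in the given base."""
    inv, f = 0.0, 1.0 / base
    while i > 0:
        inv += f * (i % base)
        i //= base
        f /= base
    return inv

def halton_2d(n):
    """First n points of the 2-D Halton sequence (bases 2 and 3)."""
    return [(radical_inverse(i, 2), radical_inverse(i, 3))
            for i in range(1, n + 1)]

def qmc_integrate(f, n):
    """Estimate the integral of f over the unit square with n Halton points."""
    pts = halton_2d(n)
    return sum(f(x, y) for x, y in pts) / n

# Example: the integral of x*y over [0,1]^2 is exactly 0.25.
estimate = qmc_integrate(lambda x, y: x * y, 4096)
```

With a few thousand quasi-random points the estimate is already within a fraction of a percent of the true value, which is the behaviour the note exploits to make HIM* affordable for many uncertain parameters.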

  19. Histological variability and the importance of clinicopathological correlation in cutaneous Rosai-Dorfman disease.

    Science.gov (United States)

    Gameiro, Ana; Gouveia, Miguel; Cardoso, José Carlos; Tellechea, Oscar

    2016-01-01

    Rosai-Dorfman disease is a benign histiocytic proliferative disorder of unknown etiology. The disease mainly affects lymph node tissue, although it is rarely confined to the skin. Here, we describe a 53-year-old woman with purely cutaneous Rosai-Dorfman disease. The patient presented with a large pigmented plaque on her left leg, and sparse erythematous papules on her face and arms. A complete clinical response was achieved with thalidomide, followed by recurrence at the initial site one year later. The histological examination displayed the typical features of Rosai-Dorfman disease in the recent lesions but not in the older lesions. In the setting of no lymphadenopathy, the histopathological features of Rosai-Dorfman disease are commonly misinterpreted. Therefore, awareness of the histological aspects present at different stages, not always featuring the hallmark microscopic signs of Rosai-Dorfman disease, is particularly important for a correct diagnosis of this rare disorder.

  20. Temporal variability of TEC deduced from groundbased measurements

    International Nuclear Information System (INIS)

    Mosert, M.; Ezquer, R.G.; Jadur, C.; Radicella, S.M.

    2001-01-01

    This paper presents a study of the behaviour of the integrated total electron content (ITEC) deduced from electron density profiles of two Argentine stations: Tucuman (26.9 S; 294.6 E) and San Juan (31.5 S; 290.4 E). The ITEC values have been obtained by the technique proposed by Reinisch and Huang (2000). The database includes electron density profiles derived from ionograms recorded at 4 typical hours of the day (00.00, 06.00, 12.00 and 18.00 LT) during different seasonal and solar activity conditions. An analysis of the day-to-day variability of ITEC has also been done. (author)

  1. Measurement problem and local hidden variables with entangled photons

    Directory of Open Access Journals (Sweden)

    Muchowski Eugen

    2017-12-01

    It is shown that there is no remote action with polarization measurements of photons in singlet state. A model is presented introducing a hidden parameter which determines the polarizer output. This model is able to explain the polarization measurement results with entangled photons. It is not ruled out by Bell’s Theorem.

  2. Familiarity and Within-Person Facial Variability: The Importance of the Internal and External Features.

    Science.gov (United States)

    Kramer, Robin S S; Manesi, Zoi; Towler, Alice; Reynolds, Michael G; Burton, A Mike

    2018-01-01

    As faces become familiar, we come to rely more on their internal features for recognition and matching tasks. Here, we assess whether this same pattern is also observed for a card sorting task. Participants sorted photos showing either the full face, only the internal features, or only the external features into multiple piles, one pile per identity. In Experiments 1 and 2, we showed the standard advantage for familiar faces: sorting was more accurate and showed very few errors in comparison with unfamiliar faces. However, for both familiar and unfamiliar faces, sorting was less accurate for external features and equivalent for internal and full faces. In Experiment 3, we asked whether external features can ever be used to make an accurate sort. Using familiar faces and instructions on the number of identities present, we nevertheless found worse performance for the external in comparison with the internal features, suggesting that less identity information was available in the former. Taken together, we show that full faces and internal features are similarly informative with regard to identity. In comparison, external features contain less identity information and produce worse card sorting performance. This research extends current thinking on the shift in focus, both in attention and importance, toward the internal features and away from the external features as familiarity with a face increases.

  3. How the choice of safety performance function affects the identification of important crash prediction variables.

    Science.gov (United States)

    Wang, Ketong; Simandl, Jenna K; Porter, Michael D; Graettinger, Andrew J; Smith, Randy K

    2016-03-01

    Across the nation, researchers and transportation engineers are developing safety performance functions (SPFs) to predict crash rates and develop crash modification factors to improve traffic safety at roadway segments and intersections. Generalized linear models (GLMs), such as Poisson or negative binomial regression, are most commonly used to develop SPFs, with annual average daily traffic as the primary roadway characteristic to predict crashes. However, while more complex to interpret, data mining models such as boosted regression trees have improved upon GLMs' crash prediction performance due to their ability to handle more data characteristics, accommodate non-linearities, and include interaction effects between the characteristics. An intersection data inventory of 36 safety-relevant parameters for three- and four-legged non-signalized intersections along state routes in Alabama was used to study the importance of intersection characteristics on crash rate and the interaction effects between key characteristics. Four different SPFs were investigated and compared: Poisson regression, negative binomial regression, regularized generalized linear model, and boosted regression trees. The models did not agree on which intersection characteristics were most related to the crash rate. The boosted regression tree model significantly outperformed the other models and identified several intersection characteristics as having strong interaction effects.

  4. THE RELIABILITY AND ACCURACY OF THE TRIPLE MEASUREMENTS OF ANALOG PROCESS VARIABLES

    Directory of Open Access Journals (Sweden)

    V. A. Anishchenko

    2017-01-01

    The increase in unit capacity of electric equipment, together with the growing complexity of technological processes and of their control and management in power plants and substations, demonstrates the need to improve the reliability and accuracy of the measurement information characterizing the state of the objects being managed. This objective is particularly important for nuclear power plants, where the price of inaccurate measurement of critical process variables is particularly high and an error might lead to irreparable consequences. Improving the reliability and accuracy of measurements, along with improvement of the element base, is provided by methods of operational validation. These methods are based on the use of information redundancy (structural, topological, temporal). In particular, information redundancy can be achieved by the simultaneous measurement of one analog variable by two devices (duplication) or three devices (triplication, i.e., triple redundancy). The problem of operational control of a triple-redundant system for measuring electrical analog variables (currents, voltages, active and reactive power and energy) is considered as a special case of signal processing by ordered sampling on the basis of majority transformation and transformations close to the majority one. Difficulties in monitoring the reliability of measurements are associated with two tasks. First, one needs to justify the degree of truncation of the distributions of random measurement errors and the allowable residuals of the pairwise differences of the measurement results. The second task is to form an algorithm for jointly processing the set of individual measurements deemed valid. The quality of control is characterized by the reliability (here used synonymously with validity) and the accuracy of the measuring system. Taken separately, these indicators might lead to opposite results. A compromise solution is therefore proposed.
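A minimal sketch of the two tasks described above (checking pairwise residuals against an allowable limit, then jointly processing the readings judged valid) might look as follows. The function name and threshold are hypothetical, and the median stands in for the majority-style transformation:

```python
def fuse_triplicated(readings, max_residual):
    """Validate and fuse three redundant measurements of one variable.

    A reading is kept only if it agrees with at least one other reading
    to within max_residual (pairwise-difference check); the surviving
    readings are then fused by their median (majority-style estimate).
    """
    valid = [
        r for i, r in enumerate(readings)
        if any(abs(r - other) <= max_residual
               for j, other in enumerate(readings) if j != i)
    ]
    if not valid:
        raise ValueError("no mutually consistent readings")
    valid.sort()
    mid = len(valid) // 2
    # Median of the surviving readings; mean of the middle pair if even.
    return valid[mid] if len(valid) % 2 else 0.5 * (valid[mid - 1] + valid[mid])

# One faulty channel (99.0) fails both pairwise checks and is rejected;
# the two consistent channels are averaged.
value = fuse_triplicated([10.1, 10.2, 99.0], max_residual=0.5)
```

The choice of `max_residual` corresponds to the first task in the abstract: it encodes how far the truncated error distributions allow two healthy channels to disagree.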

  5. Predicting local dengue transmission in Guangzhou, China, through the influence of imported cases, mosquito density and climate variability.

    Directory of Open Access Journals (Sweden)

    Shaowei Sang

    Each year there are approximately 390 million dengue infections worldwide. Weather variables have a significant impact on the transmission of Dengue Fever (DF), a mosquito-borne viral disease. DF in mainland China is characterized as an imported disease. Hence it is necessary to explore the roles of imported cases, mosquito density and climate variability in dengue transmission in China. The study aimed to identify the relationship between dengue occurrence and possible risk factors and to develop a predictive model for dengue control and prevention purposes. Three traditional suburbs and one district with an international airport in Guangzhou city were selected as the study areas. Autocorrelation and cross-correlation analysis were used to perform univariate analysis to identify possible risk factors, with relevant lagged effects, associated with local dengue cases. Principal component analysis (PCA) was applied to extract principal components, and PCA scores were used to represent the original variables to reduce multi-collinearity. Combining the univariate analysis and prior knowledge, time-series Poisson regression analysis was conducted to quantify the relationship between weather variables, Breteau Index, imported DF cases and local dengue transmission in Guangzhou, China. The goodness-of-fit of the constructed model was determined by pseudo-R2, the Akaike information criterion (AIC) and a residual test. There were a total of 707 notified local DF cases from March 2006 to December 2012, with a seasonal distribution from August to November. There were a total of 65 notified imported DF cases from 20 countries, with forty-six cases (70.8%) imported from Southeast Asia.
The model showed that local DF cases were positively associated with mosquito density, imported cases, temperature, precipitation, vapour pressure and minimum relative humidity, whilst being negatively associated with air pressure, with different time lags. Imported DF cases and mosquito
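The univariate screening step mentioned in the abstract, cross-correlating a weather series with case counts at candidate lags, can be sketched in a few lines. This is an illustrative pure-Python version with made-up data, not the authors' code:

```python
def pearson(a, b):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return cov / (va * vb) ** 0.5

def cross_correlation(x, y, max_lag):
    """Correlate y(t) with x(t - lag) for lag = 0..max_lag.

    Returns {lag: r}; the lag with the largest |r| is a candidate
    lagged effect to carry into the regression model.
    """
    out = {}
    for lag in range(max_lag + 1):
        xs, ys = x[:len(x) - lag] if lag else x, y[lag:]
        out[lag] = pearson(xs, ys)
    return out

# Synthetic check: y is x delayed by 2 steps, so lag 2 should dominate.
x = [0, 1, 0, 2, 0, 3, 0, 4, 0, 5, 0, 6]
y = [0, 0] + x[:-2]
ccf = cross_correlation(x, y, max_lag=3)
best = max(ccf, key=lambda k: abs(ccf[k]))
```

In the study's setting, `x` would be a weather variable or the Breteau Index and `y` the weekly local case count; significant lags found here become the lagged covariates of the Poisson model.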

  6. Test sensitivity is important for detecting variability in pointing comprehension in canines.

    Science.gov (United States)

    Pongrácz, Péter; Gácsi, Márta; Hegedüs, Dorottya; Péter, András; Miklósi, Adám

    2013-09-01

    Several articles have been recently published on dogs' (Canis familiaris) performance in two-way object choice experiments in which subjects had to find hidden food by utilizing human pointing. The interpretation of results has led to a vivid theoretical debate about the cognitive background of human gestural signal understanding in dogs, despite the fact that many important details of the testing method have not yet been standardized. We report three experiments that aim to reveal how some procedural differences influence adult companion dogs' performance in these tests. Utilizing a large sample in Experiment 1, we provide evidence that neither the keeping conditions (garden/house) nor the location of the testing (outdoor/indoor) affect a dog's performance. In Experiment 2, we compare dogs' performance using three different types of pointing gestures. Dogs' performance varied between momentary distal and momentary cross-pointing, but "low" and "high" performer dogs chose uniformly better than chance level if they responded to sustained pointing gestures with reinforcement (food reward and a clicking sound; "clicker pointing"). In Experiment 3, we show that single features of the aforementioned "clicker pointing" method can slightly improve dogs' success rate if they were added one by one to the momentary distal pointing method. These results provide evidence that although companion dogs show a robust performance at different testing locations regardless of their keeping conditions, the exact execution of the human gesture and additional reinforcement techniques have a substantial effect on the outcomes. Consequently, researchers should standardize their methodology before engaging in debates on the comparative aspects of socio-cognitive skills because the procedures they utilize may differ in sensitivity for detecting differences.

  7. Linear variable differential transformer and its uses for in-core fuel rod behavior measurements

    International Nuclear Information System (INIS)

    Wolf, J.R.

    1979-01-01

    The linear variable differential transformer (LVDT) is an electromechanical transducer which produces an ac voltage proportional to the displacement of a movable ferromagnetic core. When the core is connected to the cladding of a nuclear fuel rod, it is capable of producing extremely accurate measurements of fuel rod elongation caused by thermal expansion. The LVDT is used in the Thermal Fuels Behavior Program at the U.S. Idaho National Engineering Laboratory (INEL) for measurements of nuclear fuel rod elongation and as an indication of critical heat flux and the occurrence of departure from nucleate boiling. These types of measurements provide important information about the behavior of nuclear fuel rods under normal and abnormal operating conditions. The objective of the paper is to provide a complete account of recent advances made in LVDT design and experimental data from in-core nuclear reactor tests which use the LVDT.

  8. Longterm and spatial variability of Aerosol optical properties measured by sky radiometer in Japan sites

    Science.gov (United States)

    Aoki, K.

    2016-12-01

    Aerosols and clouds play an important role in climate change. We have carried out long-term monitoring of aerosol and cloud optical properties since the 1990s using sky radiometers (POM-01, 02; Prede Co. Ltd., Japan). In this presentation we provide information on the temporal and spatial variability of aerosol optical properties at Japanese sites (e.g., Sapporo, Toyama, Kasuga). Global distributions of aerosols have been derived from Earth-observation satellites and simulated in numerical models that assume optical parameters. However, these distributions are difficult to derive because of variability in time and space. Aerosol optical properties were therefore investigated using measurements from ground-based and ship-borne sky radiometers. The sky radiometer is an automatic instrument that takes observations only in daytime under clear-sky conditions; diffuse solar intensity was observed once every five or ten minutes. The aerosol optical properties were computed using SKYRAD.pack version 4.2. The retrieved aerosol optical properties (aerosol optical thickness, Ångström exponent, single scattering albedo, etc.) and volume size distributions clearly showed spatial and temporal variability over the Japan area. In this study, we present the temporal and spatial variability of aerosol optical properties at several Japanese sites, applied to the validation of satellites and numerical models. This project supports validation of JAXA's GCOM-C satellite, scheduled to be launched in early 2017.

  9. Night-to-night arousal variability and interscorer reliability of arousal measurements.

    Science.gov (United States)

    Loredo, J S; Clausen, J L; Ancoli-Israel, S; Dimsdale, J E

    1999-11-01

    Measurement of arousals from sleep is clinically important; however, their definition is not well standardized, and little data exist on reliability. The purpose of this study was to determine factors that affect arousal scoring reliability and night-to-night arousal variability. Night-to-night arousal variability and interscorer reliability were assessed in 20 subjects with and without obstructive sleep apnea undergoing attended polysomnography during two consecutive nights. Five definitions of arousal were studied, assessing the duration of electroencephalographic (EEG) frequency changes, increases in electromyographic (EMG) activity and leg movement, association with respiratory events, as well as the American Sleep Disorders Association (ASDA) definition of arousals. Interscorer reliability varied with the definition of arousal and ranged from an intraclass correlation (ICC) of 0.19 to 0.92. Arousals that included increases in EMG activity or leg movement had the greatest reliability, especially when associated with respiratory events (ICC 0.76 to 0.92). The ASDA arousal definition had high interscorer reliability (ICC 0.84). Reliability was lowest for arousals consisting of EEG changes lasting <3 seconds (ICC 0.19 to 0.37). Within-subject night-to-night arousal variability was low for all arousal definitions. In a heterogeneous population, interscorer arousal reliability is enhanced by increases in EMG activity, leg movements, and respiratory events and decreased by short-duration EEG arousals. The night-to-night variability of the arousal index was low for all definitions.
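One common formalization of the interscorer reliability quoted above is the intraclass correlation. A minimal sketch of ICC(2,1) (two-way random effects, absolute agreement, single rater), assuming a complete subjects-by-raters table, is:

```python
def icc_2_1(ratings):
    """ICC(2,1): two-way random effects, absolute agreement, single rater.

    `ratings` is a list of subjects, each a list of k scores (one per rater).
    Computed from the standard two-way ANOVA mean squares.
    """
    n, k = len(ratings), len(ratings[0])
    grand = sum(sum(row) for row in ratings) / (n * k)
    subj_means = [sum(row) / k for row in ratings]
    rater_means = [sum(row[j] for row in ratings) / n for j in range(k)]

    msr = k * sum((m - grand) ** 2 for m in subj_means) / (n - 1)   # subjects
    msc = n * sum((m - grand) ** 2 for m in rater_means) / (k - 1)  # raters
    sse = sum(
        (ratings[i][j] - subj_means[i] - rater_means[j] + grand) ** 2
        for i in range(n) for j in range(k)
    )
    mse = sse / ((n - 1) * (k - 1))                                 # residual
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Two scorers in perfect agreement give ICC = 1.
perfect = icc_2_1([[1, 1], [2, 2], [3, 3]])
# A constant offset between scorers lowers absolute agreement.
offset = icc_2_1([[1, 2], [2, 3], [3, 4]])
```

Because ICC(2,1) measures absolute agreement, a scorer who systematically counts one extra arousal per epoch is penalized even though the ranking of subjects is preserved, which is one reason different arousal definitions yield such different ICC values.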

  10. A bivariate measurement error model for semicontinuous and continuous variables: Application to nutritional epidemiology.

    Science.gov (United States)

    Kipnis, Victor; Freedman, Laurence S; Carroll, Raymond J; Midthune, Douglas

    2016-03-01

    Semicontinuous data in the form of a mixture of a large portion of zero values and continuously distributed positive values frequently arise in many areas of biostatistics. This article is motivated by the analysis of relationships between disease outcomes and intakes of episodically consumed dietary components. An important aspect of studies in nutritional epidemiology is that true diet is unobservable and commonly evaluated by food frequency questionnaires with substantial measurement error. Following the regression calibration approach for measurement error correction, unknown individual intakes in the risk model are replaced by their conditional expectations given mismeasured intakes and other model covariates. Those regression calibration predictors are estimated using short-term unbiased reference measurements in a calibration substudy. Since dietary intakes are often "energy-adjusted," e.g., by using ratios of the intake of interest to total energy intake, the correct estimation of the regression calibration predictor for each energy-adjusted episodically consumed dietary component requires modeling short-term reference measurements of the component (a semicontinuous variable), and energy (a continuous variable) simultaneously in a bivariate model. In this article, we develop such a bivariate model, together with its application to regression calibration. We illustrate the new methodology using data from the NIH-AARP Diet and Health Study (Schatzkin et al., 2001, American Journal of Epidemiology 154, 1119-1125), and also evaluate its performance in a simulation study. © 2015, The International Biometric Society.
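The regression calibration step can be illustrated in the simplest univariate case (the article's model is bivariate and far richer): regress the unbiased reference measurement on the mismeasured intake in the calibration substudy, then substitute fitted values for the mismeasured intakes in the risk model. All names and numbers below are hypothetical:

```python
def fit_linear(w, x):
    """Least-squares fit x ≈ a + b*w from calibration-substudy pairs."""
    n = len(w)
    mw, mx = sum(w) / n, sum(x) / n
    b = sum((wi - mw) * (xi - mx) for wi, xi in zip(w, x)) / \
        sum((wi - mw) ** 2 for wi in w)
    a = mx - b * mw
    return a, b

def regression_calibration(w_main, w_cal, x_cal):
    """Replace mismeasured intakes by predicted E[true intake | measured]."""
    a, b = fit_linear(w_cal, x_cal)
    return [a + b * wi for wi in w_main]

# Toy check: in the calibration data, reference = 0.5 * measured + 1.
w_cal = [2.0, 4.0, 6.0, 8.0]
x_cal = [2.0, 3.0, 4.0, 5.0]
calibrated = regression_calibration([10.0, 0.0], w_cal, x_cal)
```

The article's contribution is the harder case where one of the two intakes being calibrated is semicontinuous (a spike at zero plus a continuous part) and both must be modeled jointly; the linear predictor above is only the conceptual skeleton.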

  11. Measures of small business success/performance: Importance, reliability and usability

    Directory of Open Access Journals (Sweden)

    Leković Božidar

    2015-01-01

    The main objective of this paper is to assess the reliability of selected measures of success/performance of small enterprises, taking into account all the advantages and disadvantages of objective (traditional, financial) and subjective (personal perception) indicators. Realizing this basic objective will enable the removal of doubt regarding different categories of success/performance of small businesses, which is a secondary objective of the paper. The research methodology involves the use of parametric procedures, given the characteristics of the selected variables and the number of observations in the sample. Univariate ANOVA and Pearson's correlation coefficient are the main methods used. The basis of this study consists of data gathered from an e-survey of 260 entrepreneurs/owners/managers of small enterprises in an administrative sub-region of the Republic of Serbia. The analysis results show a correlation between the subjective estimates of success by owners/entrepreneurs/managers and objective performance indicators, which can be characterized as complementary, meaning that subjective assessments as well as formally stated performance indicators are realistic. In addition, an important result within the group of objective performance indicators is the difference between its two sub-groups, financial and non-financial indicators.

  12. 41 CFR 102-117.280 - What aspects of the TSP's performance are important to measure?

    Science.gov (United States)

    2010-07-01

    ...'s performance are important to measure? 102-117.280 Section 102-117.280 Public Contracts and... § 102-117.280 What aspects of the TSP's performance are important to measure? Important TSP performance...) Percentage of customer satisfaction reports on carrier performance. ...

  13. Applications of hybrid measurements with discrete and continuous variables

    DEFF Research Database (Denmark)

    Laghaout, Amine

    . This is what we do for two particular applications of quantum measurements: Bell tests and the amplification of Schrödinger cat states. This project also had an experimental component which was supposed to produce high-fidelity Schrödinger cat states. This goal turned out to be hampered by noise from the laser...... as well as a series of anomalous behaviors of the nonlinear crystal whereby no classical de-amplification, and therefore no squeezing, could be observed....

  14. Climate variables explain neutral and adaptive variation within salmonid metapopulations: The importance of replication in landscape genetics

    Science.gov (United States)

    Hand, Brian K.; Muhlfeld, Clint C.; Wade, Alisa A.; Kovach, Ryan; Whited, Diane C.; Narum, Shawn R.; Matala, Andrew P.; Ackerman, Michael W.; Garner, B. A.; Kimball, John S; Stanford, Jack A.; Luikart, Gordon

    2016-01-01

    Understanding how environmental variation influences population genetic structure is important for conservation management because it can reveal how human stressors influence population connectivity, genetic diversity and persistence. We used riverscape genetics modelling to assess whether climatic and habitat variables were related to neutral and adaptive patterns of genetic differentiation (population-specific and pairwise FST) within five metapopulations (79 populations, 4583 individuals) of steelhead trout (Oncorhynchus mykiss) in the Columbia River Basin, USA. Using 151 putatively neutral and 29 candidate adaptive SNP loci, we found that climate-related variables (winter precipitation, summer maximum temperature, winter highest 5% flow events and summer mean flow) best explained neutral and adaptive patterns of genetic differentiation within metapopulations, suggesting that climatic variation likely influences both demography (neutral variation) and local adaptation (adaptive variation). However, we did not observe consistent relationships between climate variables and FST across all metapopulations, underscoring the need for replication when extrapolating results from one scale to another (e.g. basin-wide to the metapopulation scale). Sensitivity analysis (leave-one-population-out) revealed consistent relationships between climate variables and FST within three metapopulations; however, these patterns were not consistent in two metapopulations likely due to small sample sizes (N = 10). These results provide correlative evidence that climatic variation has shaped the genetic structure of steelhead populations and highlight the need for replication and sensitivity analyses in land and riverscape genetics.
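The leave-one-population-out sensitivity analysis mentioned above amounts to refitting the climate-versus-FST relationship with each population removed and checking that the coefficient stays stable. A generic sketch, with made-up numbers standing in for real climate and FST values, is:

```python
def pearson(a, b):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    var_a = sum((x - ma) ** 2 for x in a)
    var_b = sum((y - mb) ** 2 for y in b)
    return cov / (var_a * var_b) ** 0.5

def leave_one_out(climate, fst):
    """Correlation recomputed with each population dropped in turn.

    A relationship is considered robust if the coefficient changes
    little no matter which population is left out.
    """
    out = []
    for i in range(len(climate)):
        c = climate[:i] + climate[i + 1:]
        f = fst[:i] + fst[i + 1:]
        out.append(pearson(c, f))
    return out

# Hypothetical metapopulation: FST rises nearly linearly with a climate index.
climate = [1.0, 2.0, 3.0, 4.0, 5.0]
fst = [0.10, 0.22, 0.29, 0.41, 0.50]
loo = leave_one_out(climate, fst)
```

With small samples (the abstract notes N = 10 in two metapopulations), a single dropped population can swing the coefficient substantially, which is exactly what this check is designed to expose.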

  15. Why is variability important for performance assessment and what are its consequences for site characterisation and repository design?

    International Nuclear Information System (INIS)

    Dverstorp, B.; Smith, P.A.; Zuidema, P.

    1998-01-01

    The importance of spatial variability is discussed in terms of its consequences for site characterisation and for repository design and safety. Variability is described in terms of various scales of discrete structural features and a pragmatic classification is proposed according to whether the features are: feasibility-determining (i.e. features within which repository construction and operation is not practical and which preclude long-term safety); layout-determining (i.e. features which, if avoided, would enhance long-term safety); safety-determining features (i.e. features which cannot be shown to be avoidable and which strongly influence the calculated long-term safety of the repository system). The significance with respect to the geosphere-transport barrier of small-scale pore structure within the various classes of feature is also discussed. The practical problems of characterising variability and modelling its effects on radionuclide transport are described. Key factors affecting groundwater flow and radionuclide transport are identified, models that incorporate spatial variability are described and the estimation of appropriate parameters for these models is discussed. (author)

  16. Importance of Non-invasive Right and Left Ventricular Variables on Exercise Capacity in Patients with Tetralogy of Fallot Hemodynamics.

    Science.gov (United States)

    Meierhofer, Christian; Tavakkoli, Timon; Kühn, Andreas; Ulm, Kurt; Hager, Alfred; Müller, Jan; Martinoff, Stefan; Ewert, Peter; Stern, Heiko

    2017-12-01

    Good quality of life correlates with a good exercise capacity in daily life in patients with tetralogy of Fallot (ToF). Patients after correction of ToF usually develop residual defects such as pulmonary regurgitation or stenosis of variable severity. However, the importance of different hemodynamic parameters and their impact on exercise capacity is unclear. We investigated several hemodynamic parameters measured by cardiovascular magnetic resonance (CMR) and echocardiography and evaluated which parameter has the most pronounced effect on maximal exercise capacity determined by cardiopulmonary exercise testing (CPET). 132 patients with ToF-like hemodynamics were tested during routine follow-up with CMR, echocardiography and CPET. Right and left ventricular volume data, ventricular ejection fraction and pulmonary regurgitation were evaluated by CMR. Echocardiographic pressure gradients in the right ventricular outflow tract and through the tricuspid valve were measured. All data were classified and correlated with the results of CPET evaluations of these patients. The analysis was performed using the Random Forest model. In this way, we calculated the importance of the different hemodynamic variables related to the maximal oxygen uptake in CPET (VO2 % predicted). Right ventricular pressure showed the most important influence on maximal oxygen uptake, whereas pulmonary regurgitation and right ventricular end-diastolic volume were not important hemodynamic variables to predict maximal oxygen uptake in CPET. Maximal exercise capacity was only very weakly influenced by right ventricular end-diastolic volume and not at all by pulmonary regurgitation in patients with ToF. The variable with the most pronounced influence was the right ventricular pressure.

  17. Contribution to the discussion of P.M. Fayers and David J. Hand: Causal variables, indicator variables and measurement scales: an example from quality of life

    DEFF Research Database (Denmark)

    Keiding, Niels

    2002-01-01

    Causal variables; Clinimetric scales; Composite scales; Construct validity; Measurement scales; Multi-item scales; Quality-of-life instruments.

  18. Enceladus Plume Morphology and Variability from UVIS Measurements

    Science.gov (United States)

    Hansen, Candice; Esposito, Larry; Colwell, Josh; Hendrix, Amanda; Portyankina, Ganna

    2017-10-01

    The Ultraviolet Imaging Spectrograph (UVIS) on the Cassini spacecraft has been observing Enceladus’ plume and its effect on the Saturnian environment since 2004. One solar and 7 stellar occultations have been observed between 2005 and 2017. On 27 March 2017 epsilon Canis Majoris (CMa) passed behind the plume of water vapor spewing from Enceladus’ tiger stripe fissures. With this occultation we have 6 cuts through the plume at a variety of orientations over 12 years. Following our standard procedure the column density along the line of sight from Enceladus to the star was determined and the water flux calculated [1]. The mean anomaly was 131, well away from the dust flux peak associated with Enceladus at an orbital longitude near apoapsis [2]. We find that the water vapor flux was ~160 kg/sec (this number will be refined when the final reconstructed trajectory is available). That puts it “in family” with the other occultations, with values that cluster around 200 kg/sec. It is at the low end, which may be consistent with the drop in particle output observed over the last decade [3]. UVIS results show that the supersonic collimated gas jets imbedded in the plume are the likely source of the variability in dust output [4], rather than overall flux from the tiger stripes. An occultation of epsilon Orionis was observed on 11 March 2016 when Enceladus was at a mean anomaly of 208. Although the bulk flux changed little the amount of water vapor coming from the Baghdad I supersonic jet increased by 25% relative to 2011. The Baghdad I jet was observed again in the 2017 epsilon CMa occultation, and the column density is half that of 2016, further bolstering the conclusion that the gas jets change output as a function of orbital longitude. UVIS results describing gas flux, jets, and general structure of the plume, the observables above the surface, are key to testing hypotheses for what is driving Enceladus’ eruptive activity below the surface. [1] Hansen, C. J. et

  19. Estimation of intersubject variability of cerebral blood flow measurements using MRI and positron emission tomography

    DEFF Research Database (Denmark)

    Henriksen, Otto Mølby; Larsson, Henrik B W; Hansen, Adam E

    2012-01-01

    PURPOSE: To investigate the within and between subject variability of quantitative cerebral blood flow (CBF) measurements in normal subjects using various MRI techniques and positron emission tomography (PET). MATERIALS AND METHODS: Repeated CBF measurements were performed in 17 healthy, young...

  20. Observer variability of absolute and relative thrombus density measurements in patients with acute ischemic stroke

    NARCIS (Netherlands)

    Santos, Emilie M. M.; Yoo, Albert J.; Beenen, Ludo F.; Berkhemer, Olvert A.; den Blanken, Mark D.; Wismans, Carrie; Niessen, Wiro J.; Majoie, Charles B.; Marquering, Henk A.; Fransen, Puck S. S.; Beumer, Debbie; van den Berg, Lucie A.; Lingsma, Hester F.; Schonewille, Wouter J.; Vos, Jan Albert; Nederkoorn, Paul J.; Wermer, Marieke J. H.; van Walderveen, Marianne A. A.; Staals, Julie; Hofmeijer, Jeannette; van Oostayen, Jacques A.; Lycklama à Nijeholt, Geert J.; Boiten, Jelis; Brouwer, Patrick A.; Emmer, Bart J.; de Bruijn, Sebastiaan F.; van Dijk, Lukas C.; Kappelle, L. Jaap; Lo, Rob H.; van Dijk, Ewoud J.; de Vries, Joost; de Kort, Paul L. M.; van den Berg, Jan S. P.; A A M van Hasselt, Boudewijn; Aerden, Leo A. M.; Dallinga, René J.; Visser, Marieke C.; Bot, Joseph C. J.; Vroomen, Patrick C.; Eshghi, Omid; Schreuder, Tobien H. C. M. L.; Heijboer, Roel J. J.; Keizer, Koos; Tielbeek, Alexander V.; Hertog, Heleen M. Den; Gerrits, Dick G.; van den Berg-Vos, Renske M.; Karas, Giorgos B.; Steyerberg, Ewout W.; Flach, H. Zwenneke; Sprengers, Marieke E. S.; Jenniskens, Sjoerd F. M.; van den Berg, René; Koudstaal, Peter J.; van Zwam, Wim H.; Roos, Yvo B. W. E. M.; van der Lugt, Aad; van Oostenbrugge, Robert J.; Dippel, Diederik W. J.

    2016-01-01

    Thrombus density may be a predictor for acute ischemic stroke treatment success. However, only limited data on observer variability for thrombus density measurements exist. This study assesses the variability and bias of four common thrombus density measurement methods by expert and non-expert

  1. Observer variability of absolute and relative thrombus density measurements in patients with acute ischemic stroke

    NARCIS (Netherlands)

    E.M.M. Santos (Emilie M.); A.J. Yoo (Albert J.); L.F.M. Beenen (Ludo); O.A. Berkhemer (Olvert); M.D. Den Blanken (Mark D.); C. Wismans (Carrie); W.J. Niessen (Wiro); C.B. Majoie (Charles); H. Marquering (Henk)

    2016-01-01

    Introduction: Thrombus density may be a predictor for acute ischemic stroke treatment success. However, only limited data on observer variability for thrombus density measurements exist. This study assesses the variability and bias of four common thrombus density measurement methods by

  2. Observer variability of absolute and relative thrombus density measurements in patients with acute ischemic stroke

    NARCIS (Netherlands)

    Santos, E.M.; Yoo, A.J.; Beenen, L.F.; Berkhemer, O.A.; Blanken, M.D. den; Wismans, C.; Niessen, W.J.; Majoie, C.B.; Marquering, H.A.; Dijk, E.J. van; et al.,

    2016-01-01

    INTRODUCTION: Thrombus density may be a predictor for acute ischemic stroke treatment success. However, only limited data on observer variability for thrombus density measurements exist. This study assesses the variability and bias of four common thrombus density measurement methods by expert and

  3. Comparison of global sensitivity analysis techniques and importance measures in PSA

    International Nuclear Information System (INIS)

    Borgonovo, E.; Apostolakis, G.E.; Tarantola, S.; Saltelli, A.

    2003-01-01

    This paper discusses application and results of global sensitivity analysis techniques to probabilistic safety assessment (PSA) models, and their comparison to importance measures. This comparison allows one to understand whether PSA elements that are important to the risk, as revealed by importance measures, are also important contributors to the model uncertainty, as revealed by global sensitivity analysis. We show that, due to epistemic dependence, uncertainty and global sensitivity analysis of PSA models must be performed at the parameter level. A difficulty arises, since standard codes produce the calculations at the basic event level. We discuss both the indirect comparison through importance measures computed for basic events, and the direct comparison performed using the differential importance measure and the Fussell-Vesely importance at the parameter level. Results are discussed for the large LOCA sequence of the Advanced Test Reactor PSA
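The basic-event importance measures named above can be illustrated with a toy calculation. The event names, cut sets, and probabilities below are invented, and the total risk uses the rare-event approximation; this is a sketch of the Fussell-Vesely idea, not the paper's actual model:

```python
# Hypothetical PSA fragment: minimal cut sets (MCS) over named basic events.
# All names and probabilities are invented for illustration.
p = {"pump_a": 1e-3, "pump_b": 1e-3, "valve": 5e-4, "diesel": 2e-2}
mcs = [{"pump_a", "pump_b"}, {"valve"}, {"pump_a", "diesel"}]

def cut_prob(cut):
    prob = 1.0
    for event in cut:
        prob *= p[event]
    return prob

# Rare-event approximation: total risk is the sum of the MCS probabilities.
risk = sum(cut_prob(c) for c in mcs)

def fussell_vesely(event):
    # Fraction of the total risk carried by cut sets containing the event.
    return sum(cut_prob(c) for c in mcs if event in c) / risk

for event in sorted(p):
    print(f"{event}: FV = {fussell_vesely(event):.4f}")
```

Here the single-element cut set dominates, so `valve` receives the largest Fussell-Vesely value; comparing such rankings against variance-based sensitivity indices is the kind of exercise the paper describes.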

  4. FIRE: an SPSS program for variable selection in multiple linear regression analysis via the relative importance of predictors.

    Science.gov (United States)

    Lorenzo-Seva, Urbano; Ferrando, Pere J

    2011-03-01

    We provide an SPSS program that implements currently recommended techniques and recent developments for selecting variables in multiple linear regression analysis via the relative importance of predictors. The approach consists of: (1) optimally splitting the data for cross-validation, (2) selecting the final set of predictors to be retained in the regression equation, and (3) assessing the behavior of the chosen model using standard indices and procedures. The SPSS syntax, a short manual, and data files related to this article are available as supplemental materials from brm.psychonomic-journals.org/content/supplemental.

  5. Variability in the measurement of hospital-wide mortality rates.

    Science.gov (United States)

    Shahian, David M; Wolf, Robert E; Iezzoni, Lisa I; Kirle, Leslie; Normand, Sharon-Lise T

    2010-12-23

    Several countries use hospital-wide mortality rates to evaluate the quality of hospital care, although the usefulness of this metric has been questioned. Massachusetts policymakers recently requested an assessment of methods to calculate this aggregate mortality metric for use as a measure of hospital quality. The Massachusetts Division of Health Care Finance and Policy provided four vendors with identical information on 2,528,624 discharges from Massachusetts acute care hospitals from October 1, 2004, through September 30, 2007. Vendors applied their risk-adjustment algorithms and provided predicted probabilities of in-hospital death for each discharge, as well as hospital-level observed and expected mortality rates. We compared the numbers and characteristics of discharges and hospitals included by each of the four methods. We also compared hospitals' standardized mortality ratios and classification of hospitals with mortality rates that were higher or lower than expected, according to each method. The proportions of discharges that were included by each method ranged from 28% to 95%, and the severity of patients' diagnoses varied widely. Because of their discharge-selection criteria, two methods calculated in-hospital mortality rates (4.0% and 5.9%) that were twice the state average (2.1%). Pairwise associations (Pearson correlation coefficients) of discharge-level predicted mortality probabilities ranged from 0.46 to 0.70. Hospital-performance categorizations varied substantially and were sometimes completely discordant. In 2006, a total of 12 of 28 hospitals that had higher-than-expected hospital-wide mortality when classified by one method had lower-than-expected mortality when classified by one or more of the other methods. Four common methods for calculating hospital-wide mortality produced substantially different results. This may have resulted from a lack of standardized national eligibility and exclusion criteria, different statistical methods, or
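The aggregate metric at issue is the standardized mortality ratio: observed deaths divided by the expected count summed from discharge-level predicted probabilities. A minimal sketch with fabricated numbers (not the Massachusetts data):

```python
# Fabricated discharge-level predicted mortality probabilities from a
# hypothetical risk-adjustment model (1,400 discharges in total).
predicted_probs = [0.01, 0.05, 0.30, 0.02] * 350
observed_deaths = 150  # invented count of in-hospital deaths

expected_deaths = sum(predicted_probs)
smr = observed_deaths / expected_deaths  # standardized mortality ratio
print(f"expected = {expected_deaths:.0f}, SMR = {smr:.2f}")
```

An SMR above 1 flags more deaths than the model predicts; the study's point is that different vendors' inclusion rules and models change both numerator and denominator, so the same hospital can land on either side of 1.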

  6. Soil Temperature Variability in Complex Terrain Measured Using Fiber-Optic Distributed Temperature Sensing

    Science.gov (United States)

    Seyfried, M. S.; Link, T. E.

    2013-12-01

    Soil temperature (Ts) exerts critical environmental controls on hydrologic and biogeochemical processes. Rates of carbon cycling, mineral weathering, infiltration and snow melt are all influenced by Ts. Although broadly reflective of the climate, Ts is sensitive to local variations in cover (vegetative, litter, snow), topography (slope, aspect, position), and soil properties (texture, water content), resulting in a spatially and temporally complex distribution of Ts across the landscape. Understanding and quantifying the processes controlled by Ts requires an understanding of that distribution. Relatively few spatially distributed field Ts data exist, partly because traditional Ts data are point measurements. A relatively new technology, fiber-optic distributed temperature sensing (FO-DTS), has the potential to provide such data but has not been rigorously evaluated in the context of remote, long-term field research. We installed FO-DTS in a small experimental watershed within the Reynolds Creek Experimental Watershed (RCEW) in the Owyhee Mountains of SW Idaho. The watershed is characterized by complex terrain and a seasonal snow cover. Our objectives are to: (i) evaluate the applicability of FO-DTS in remote field environments and (ii) describe the spatial and temporal variability of soil temperature in complex terrain influenced by a variable snow cover. We installed fiber-optic cable at a depth of 10 cm in contrasting snow accumulation and topographic environments and monitored temperature along 750 m with DTS. We found that the DTS can provide accurate Ts data (±0.4°C) that resolve Ts changes of about 0.03°C at a spatial scale of 1 m with occasional calibration, under conditions with an ambient temperature range of 50°C. We note that there are site-specific limitations related to cable installation and destruction by local fauna. The FO-DTS provides unique insight into the spatial and temporal variability of Ts in a landscape. We found strong seasonal

  7. Tack Measurements of Prepreg Tape at Variable Temperature and Humidity

    Science.gov (United States)

    Wohl, Christopher; Palmieri, Frank L.; Forghani, Alireza; Hickmott, Curtis; Bedayat, Houman; Coxon, Brian; Poursartip, Anoush; Grimsley, Brian

    2017-01-01

    NASA’s Advanced Composites Project has established the goal of achieving a 30 percent reduction in the timeline for certification of primary composite structures for application on commercial aircraft. Prepreg tack is one of several critical parameters affecting composite manufacturing by automated fiber placement (AFP). Tack plays a central role in the prevention of wrinkles and puckers that can occur during AFP, thus knowledge of tack variation arising from a myriad of manufacturing and environmental conditions is imperative for the prediction of defects during AFP. A full design of experiments was performed to experimentally characterize tack on 0.25-inch slit-tape tow IM7/8552-1 prepreg using probe tack testing. Several process parameters (contact force, contact time, retraction speed, and probe diameter) as well as environmental parameters (temperature and humidity) were varied such that the entire parameter space could be efficiently evaluated. Mid-point experimental conditions (i.e., parameters not at either extreme) were included to enable prediction of curvature in relationships, and repeat measurements were performed to characterize experimental error. Collectively, these experiments enable determination of primary dependencies as well as multi-parameter relationships. Slit-tape tow samples were mounted to the bottom plate of a rheometer parallel-plate fixture using a jig to prevent modification of the active area to be interrogated with the top plate, a polished stainless steel probe, during tack testing. The probe surface was slowly brought into contact with the prepreg surface until a pre-determined normal force was achieved (2-30 newtons). After a specified dwell time (0.02-10 seconds), during which the probe-substrate interaction was maintained under displacement control, the probe was retracted from the surface (0.1-50 millimeters per second). Initial results indicated a clear dependence of tack strength on several parameters, with a particularly

  8. Variability of vascular CT measurement techniques used in the assessment of abdominal aortic aneurysms

    International Nuclear Information System (INIS)

    England, Andrew; Niker, Amanda; Redmond, Claire

    2010-01-01

    Purpose: The aim of this project is to assess the variability of six CT measurement techniques for sizing abdominal aortic aneurysms (AAAs). Method: 37 CT scans with known AAAs were loaded onto a departmental picture archiving and communication system (PACS). A team of three observers, with experience in aortic CT measurements and the PACS, performed a series of 2D and 3D measurements on the abdominal aorta. Each observer was asked to measure 3 quantities: anterior-posterior AAA diameter, maximum oblique AAA diameter, and maximum aneurysm area, using both 2D and 3D techniques. In order to test intra-observer variability each observer was asked to repeat their measurements. All measurements were taken using electronic callipers, under standardised viewing conditions using previously calibrated equipment. 3D measurements were conducted using a computer-generated central luminal line (CLL). All measurements for this group were taken perpendicular to the CLL. Results: A total of 972 independent measurements were recorded by three observers. Mean intra-observer variability was lower for 2D diameter measurements (AP 1.3 ± 1.6 mm; 2D oblique 1.2 ± 1.3 mm) and 2D areas (0.7 ± 1.3 cm²) when compared to inter-observer variability (AP 1.7 ± 1.9 mm; oblique 1.6 ± 1.7 mm; area 1.1 ± 1.5 cm²). When comparing 2D with 3D measurements, differences were comparable except for 3D AP diameter and area, which had lower inter-observer variability than their 2D counterparts (AP 2D 1.7 ± 1.9 mm, 3D 1.3 ± 1.3 mm; area 2D 1.1 ± 1.5 cm², 3D 0.7 ± 0.7 cm²). 3D area measurement was the only technique which had equal variability for intra- and inter-observer measurements. Overall observer variability for the study was good, with 94-100% of all paired measurements within 5.00 mm/cm² or less. Using Pitman's test it can be confirmed that area measurements in the 3D plane have the least variability (r = 0.031) and 3D oblique measurements have the highest variability (r = 0
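The intra- versus inter-observer comparison in this abstract can be sketched with a toy computation: repeat readings by one observer versus first readings by two observers, summarized here as a mean absolute difference. All diameters below are fabricated:

```python
from statistics import mean

# Fabricated AAA diameter readings (mm) for four scans.
obs1_first  = [55.2, 61.0, 48.7, 70.3]   # observer 1, first read
obs1_repeat = [54.8, 61.5, 49.1, 69.8]   # observer 1, repeat read
obs2_first  = [56.0, 60.2, 48.0, 71.0]   # observer 2, first read

def mean_abs_diff(a, b):
    return mean(abs(x - y) for x, y in zip(a, b))

intra = mean_abs_diff(obs1_first, obs1_repeat)  # same observer, repeat reads
inter = mean_abs_diff(obs1_first, obs2_first)   # two observers, same scans
print(f"intra-observer: {intra:.2f} mm, inter-observer: {inter:.2f} mm")
```

As in the study, inter-observer disagreement in this toy example exceeds intra-observer disagreement; a full analysis would also report bias and limits of agreement rather than a single summary number.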

  9. Capturing Dynamics of Biased Attention: Are New Attention Variability Measures the Way Forward?

    Directory of Open Access Journals (Sweden)

    Anne-Wil Kruijt

    Full Text Available New indices, calculated on data from the widely used Dot Probe Task, were recently proposed to capture variability in biased attention allocation. We observed that it remains unclear which data pattern is meant to be indicative of dynamic bias and thus to be captured by these indices. Moreover, we hypothesized that the new indices are sensitive to SD differences at the response time (RT) level in the absence of bias. Randomly generated datasets were analyzed to assess properties of the Attention Bias Variability (ABV) and Trial Level Bias Score (TL-BS) indices. Sensitivity to created differences in (1) RT standard deviation, (2) mean RT, and (3) bias magnitude was assessed. In addition, two possible definitions of dynamic attention bias were explored by creating differences in (4) frequency of bias switching, and (5) bias magnitude in the presence of constant switching. ABV and TL-BS indices were found to be highly sensitive to increasing SD at the response time level, insensitive to increasing bias, linearly sensitive to increasing bias magnitude in the presence of bias switches, and non-linearly sensitive to increasing frequency of bias switches. The ABV index was also found responsive to increasing mean response times in the absence of bias. Recently proposed DPT-derived variability indices cannot uncouple measurement error from bias variability. Significant group differences may be observed even if there is no bias present in any individual dataset. This renders the new indices in their current form unfit for empirical purposes. Our discussion focuses on fostering debate and ideas for new research to validate the potentially very important notion of biased attention being dynamic.
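The indices' sensitivity to raw RT noise is easy to reproduce: below, trial-level bias scores are computed from simulated dot-probe RTs that contain no bias at all, yet the TL-BS variability is far from zero. The positional pairing rule and parameters are simplifications for illustration, not the exact published algorithm:

```python
import random

random.seed(1)

# Simulated dot-probe RTs (ms) with NO attention bias: congruent and
# incongruent trials are drawn from the same distribution.
congruent   = [random.gauss(500, 50) for _ in range(40)]
incongruent = [random.gauss(500, 50) for _ in range(40)]

# Trial-level bias scores via a simple positional pairing of trials.
tlbs = [i - c for i, c in zip(incongruent, congruent)]
mean_bias = sum(tlbs) / len(tlbs)
tlbs_sd = (sum((x - mean_bias) ** 2 for x in tlbs) / (len(tlbs) - 1)) ** 0.5

# Despite zero true bias, TL-BS variability is substantial (on the order of
# 50*sqrt(2) ms), because it is driven entirely by RT standard deviation.
print(f"mean bias = {mean_bias:.1f} ms, TL-BS SD = {tlbs_sd:.1f} ms")
```

This mirrors the paper's core argument: a variability index computed from difference scores inherits the trial-to-trial RT noise, so group differences in such indices need not reflect dynamic bias.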

  10. Measures of component importance in repairable multistate systems—a numerical study

    International Nuclear Information System (INIS)

    Natvig, Bent; Huseby, Arne B.; Reistadbakk, Mads O.

    2011-01-01

    Dynamic and stationary measures of importance of a component in a repairable multistate system are an important part of reliability theory. For multistate systems little has been published until now on such measures, even in the nonrepairable case. According to the Barlow–Proschan type measures a component is important if there is a high probability that a change in the component state causes a change in whether or not the system state is above a given state. On the other hand, the Natvig type measures focus on how a change in the component state affects the expected system uptime and downtime relative to the given system state. In the present paper we first review these measures, which can be estimated using advanced simulation methods. Extending earlier work from the binary to the multistate case, a numerical study of these measures is then given for two three-component systems and a bridge system, and the measures are also applied to an offshore oil and gas production system. In the multistate case the importance of a component is calculated separately for each component state. Thus it may happen that a component is very important at one state, and less important, or even irrelevant, at another. Unified measures combining the importances for all component states can be obtained by adding up the importance measures for each individual state. According to these unified measures a component can be important relative to a given system state but not to another. It can be seen that if the distributions of the total component times spent in the non-complete failure states for the multistate system and the component lifetimes for the binary system are identical, the Barlow–Proschan measure to the lowest system state simply reduces to the binary version of the measure. The extended Natvig measure, however, does not have this property. This indicates that the latter measure captures more information about the system.

  11. Between-centre variability versus variability over time in DXA whole body measurements evaluated using a whole body phantom

    Energy Technology Data Exchange (ETDEWEB)

    Louis, Olivia [Department of Radiology, AZ-VUB, Vrije Universiteit Brussel, Laarbeeklaan 101, 1090 Brussel (Belgium)]. E-mail: olivia.louis@az.vub.ac.be; Verlinde, Siska [Belgian Study Group for Pediatric Endocrinology (Belgium); Thomas, Muriel [Belgian Study Group for Pediatric Endocrinology (Belgium); De Schepper, Jean [Department of Pediatrics, AZ-VUB, Vrije Universiteit Brussel, Laarbeeklaan 101, 1090 Brussel (Belgium)

    2006-06-15

    This study aimed to compare the variability of whole body measurements, using dual energy X-ray absorptiometry (DXA), among geographically distinct centres versus that over time in a given centre. A Hologic-designed 28 kg modular whole body phantom was used, including high density polyethylene, gray polyvinylchloride and aluminium. It was scanned on seven Hologic QDR 4500 DXA devices, located in seven centres and was also repeatedly (n = 18) scanned in the reference centre, over a time span of 5 months. The mean between-centre coefficient of variation (CV) ranged from 2.0 (lean mass) to 5.6% (fat mass) while the mean within-centre CV ranged from 0.3 (total mass) to 4.7% (total area). Between-centre variability compared well with within-centre variability for total area, bone mineral content and bone mineral density, but was significantly higher for fat (p < 0.001), lean (p < 0.005) and total mass (p < 0.001). Our results suggest that, even when using the same device, the between-centre variability remains a matter of concern, particularly where body composition is concerned.

  12. Evaluation of the Wind Flow Variability Using Scanning Doppler Lidar Measurements

    Science.gov (United States)

    Sand, S. C.; Pichugina, Y. L.; Brewer, A.

    2016-12-01

    Better understanding of the wind flow variability at the heights of modern turbines is essential for accurate assessment of generated wind power and efficient turbine operations. Nowadays the wind energy industry often utilizes scanning Doppler lidar to measure wind-speed profiles at high spatial and temporal resolution. The study presents wind flow features captured by scanning Doppler lidars during the second Wind Forecast and Improvement Project (WFIP 2), sponsored by the Department of Energy (DOE) and National Oceanic and Atmospheric Administration (NOAA). This 18-month-long experiment in the Columbia River Basin aims to improve model wind forecasts complicated by mountain terrain, coastal effects, and numerous wind farms. To provide a comprehensive dataset to use for characterizing and predicting meteorological phenomena important to wind energy, NOAA deployed scanning, pulsed Doppler lidars to two sites in Oregon: one at Wasco, located upstream of all wind farms relative to the predominant westerly flow in the region, and one at Arlington, located in the middle of several wind farms. In this presentation we will describe lidar scanning patterns capable of providing data in conical or vertical-slice modes. These individual scans were processed to obtain 15-min averaged profiles of wind speed and direction in real time. Visualization of these profiles as time-height cross sections allows us to analyze the variability of these parameters with height, time and location, and reveal periods of rapid changes (ramp events). Examples of wind flow variability between the two lidar measurement sites, along with examples of reduced wind velocity downwind of operating turbines (wakes), will be presented.

  13. The Impact of Resonance Frequency Breathing on Measures of Heart Rate Variability, Blood Pressure, and Mood

    Directory of Open Access Journals (Sweden)

    Patrick R. Steffen

    2017-08-01

    Full Text Available Heart rate variability biofeedback (HRVB) significantly improves heart rate variability (HRV). Breathing at resonance frequency (RF, approximately 6 breaths/min) constitutes a key part of HRVB training and is hypothesized to be a pathway through which biofeedback improves HRV. No studies to date, however, have experimentally examined whether RF breathing impacts measures of HRV. The present study addressed this question by comparing three groups: the RF group breathed at their determined RF for 15 min; the RF + 1 group breathed at 1 breath/min higher than their determined RF for 15 min; and the third group sat quietly for 15 min. After this 15-min period, all groups participated in the Paced Auditory Serial Addition Task (PASAT) for 8 min, and then sat quietly during a 10-min recovery period. HRV, blood pressure, and mood were measured throughout the experiment. Groups were not significantly different on any of the measures at baseline. After the breathing exercise, the RF group reported higher positive mood than the other two groups and a significantly higher LF/HF HRV ratio relative to the control group, a key goal in HRVB training (p < 0.05). Additionally, the RF group showed lower systolic blood pressure during the PASAT and during the recovery period relative to the control group, with the RF + 1 group not being significantly different from either group (p < 0.05). Overall, RF breathing appears to play an important role in the positive effect HRVB has on measures of HRV.
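The time-domain HRV measures that commonly accompany the LF/HF ratio in studies like this are simple functions of the NN (inter-beat) interval series. A sketch with a fabricated eight-beat series:

```python
# Fabricated NN (normal-to-normal inter-beat) intervals in milliseconds.
nn = [812, 790, 850, 805, 870, 795, 860, 820]

n = len(nn)
mean_nn = sum(nn) / n
# SDNN: standard deviation of the NN intervals.
sdnn = (sum((x - mean_nn) ** 2 for x in nn) / (n - 1)) ** 0.5

diffs = [nn[i + 1] - nn[i] for i in range(n - 1)]
# RMSSD: root mean square of successive differences.
rmssd = (sum(d * d for d in diffs) / len(diffs)) ** 0.5
# pNN50: percentage of successive differences exceeding 50 ms.
pnn50 = 100 * sum(1 for d in diffs if abs(d) > 50) / len(diffs)

print(f"SDNN={sdnn:.1f} ms, RMSSD={rmssd:.1f} ms, pNN50={pnn50:.0f}%")
```

Real analyses use several minutes of artifact-corrected beats; the frequency-domain LF/HF ratio reported in the study additionally requires spectral estimation of the same series.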

  14. The Importance of and the Complexities Associated With Measuring Continuity of Care During Resident Training: Possible Solutions Do Exist.

    Science.gov (United States)

    Carney, Patricia A; Conry, Colleen M; Mitchell, Karen B; Ericson, Annie; Dickinson, W Perry; Martin, James C; Carek, Peter J; Douglass, Alan B; Eiff, M Patrice

    2016-04-01

    Evolutions in care delivery toward the patient-centered medical home have influenced important aspects of care continuity. Primary responsibility for a panel of continuity patients is a foundational requirement in family medicine residencies. In this paper we characterize challenges in measuring continuity of care in residency training in this new era of primary care. We synthesized the literature and analyzed information from key informant interviews and group discussions with residency faculty and staff to identify the challenges and possible solutions for measuring continuity of care during family medicine training. We specifically focused on measuring interpersonal continuity at the patient level, resident level, and health care team level. Challenges identified in accurately measuring interpersonal continuity of care during residency training include: (1) variability in empanelment approaches for all patients, (2) scheduling complexity in different types of visits, (3) variability in ability to attain continuity counts at the level of the resident, and (4) shifting make-up of health care teams, especially in residency training. Possible solutions for each challenge are presented. Philosophical issues related to continuity are discussed, including whether true continuity can be achieved during residency training and whether qualitative rather than quantitative measures of continuity are better suited to residencies. Measuring continuity of care in residency training is challenging but possible, though improvements in precision and assessment of the comprehensive nature of the relationships are needed. Definitions of continuity during training and the role continuity measurement plays in residency need further study.

  15. Component state-based integrated importance measure for multi-state systems

    International Nuclear Information System (INIS)

    Si, Shubin; Levitin, Gregory; Dui, Hongyan; Sun, Shudong

    2013-01-01

    Importance measures in reliability engineering are used to identify weak components and/or states in terms of their contribution to the reliable functioning of a system. Traditionally, importance measures do not consider the possible effect of groups of transition rates among different component states, which, however, has a great effect on the component probability distribution and should therefore be taken into consideration. This paper extends the integrated importance measure (IIM) to estimate the effect of a component residing in certain states on the performance of the entire multi-state system. This generalization of IIM describes in which state it is most worthy to keep the component in order to provide the desired level of system performance, and which component is the most important to keep in some state and above for improving the performance of the system. An application to an oil transportation system is presented to illustrate the use of the suggested importance measure

  16. Virtual continuity of the measurable functions of several variables, and Sobolev embedding theorems

    OpenAIRE

    Vershik, Anatoly; Zatitskiy, Pavel; Petrov, Fedor

    2013-01-01

    The classical Luzin theorem states that a measurable function of one variable is "almost" continuous. This is no longer so for functions of several variables. The search for the right analogue of the Luzin theorem leads to the notion of virtually continuous functions of several variables. This probably new notion appears implicitly in statements like embedding theorems and trace theorems for Sobolev spaces. In fact, it reveals their nature as theorems about virtual continuity. This notion is...

  17. A new measure of uncertainty importance based on distributional sensitivity analysis for PSA

    International Nuclear Information System (INIS)

    Han, Seok Jung; Tak, Nam Il; Chun, Moon Hyun

    1996-01-01

    The main objective of the present study is to propose a new measure of uncertainty importance based on distributional sensitivity analysis. The new measure is developed to utilize a metric distance obtained from cumulative distribution functions (cdfs). The measure is evaluated for two cases: one is a cdf given by a known analytical distribution and the other is given by an empirical distribution generated by a crude Monte Carlo simulation. To study its applicability, the present measure has been applied to two different cases. The results are compared with those of three existing methods. The present approach is a useful measure of uncertainty importance which is based on cdfs. The method is simple, and uncertainty importance can be calculated without any complex procedure. On the basis of the results obtained in the present work, the present method is recommended as a tool for the analysis of uncertainty importance
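The idea of scoring an input by a metric distance between cdfs can be sketched as follows: compare the empirical cdf of the output with all inputs varying against the cdf obtained with one input pinned at a nominal value, here using the Kolmogorov (sup) distance. The model, distributions, and sample sizes are all invented, and this is only one of several possible cdf metrics:

```python
import bisect
import random

random.seed(0)

def model(x1, x2):
    # Toy model; x2 carries more of the output variance than x1.
    return x1 + 2 * x2

def sample(fix_x2=None, n=5000):
    # Crude Monte Carlo sample of the model output, optionally pinning x2.
    out = []
    for _ in range(n):
        x1 = random.gauss(0, 1)
        x2 = fix_x2 if fix_x2 is not None else random.gauss(0, 1)
        out.append(model(x1, x2))
    return sorted(out)

def ks_distance(a, b):
    # Kolmogorov metric between two empirical cdfs (inputs must be sorted).
    return max(abs(bisect.bisect_right(a, t) / len(a) -
                   bisect.bisect_right(b, t) / len(b)) for t in a + b)

base = sample()              # all inputs varying
fixed = sample(fix_x2=0.0)   # x2 pinned at its nominal value

# A large distance means fixing x2 reshapes the output distribution a lot,
# i.e. x2 is important in this distributional sense.
print(f"cdf distance with x2 fixed: {ks_distance(base, fixed):.3f}")
```

Repeating the calculation with `x1` pinned instead would give a smaller distance, ranking `x2` as the more important input.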

  18. Measuring the chemical and cytotoxic variability of commercially available kava (Piper methysticum G. Forster).

    Directory of Open Access Journals (Sweden)

    Amanda C Martin

    Full Text Available Kava was formerly used world-wide as a popular botanical medicine to reduce anxiety, but reports of hepatotoxicity linked to consuming kava extracts in the late 1990s have resulted in global restrictions on kava use and have hindered kava-related research. Despite its presence on the United States Food and Drug Administration consumer advisory list for the past decade, export data from kava-producing countries imply that US kava imports, which are not publicly reported, are both increasing and of a fairly high volume. We have measured the variability in extract chemical composition and cytotoxicity towards human lung adenocarcinoma A549 cancer cells of 25 commercially available kava products. Results reveal a high level of variation in the chemical content and cytotoxicity of currently available kava products. As public interest in and use of kava products continues to increase in the United States, efforts to characterize products and expedite research of this potentially useful botanical medicine are necessary.

  19. Intraindividual change and variability in daily stress processes: Findings from two measurement-burst diary studies

    Science.gov (United States)

    Sliwinski, Martin J.; Almeida, David M.; Smyth, Joshua; Stawski, Robert S.

    2010-01-01

    There is little longitudinal information on aging-related changes in emotional responses to negative events. The present manuscript examined intraindividual change and variability in the within-person coupling of daily stress and negative affect (NA) using data from two measurement-burst daily diary studies. Three main findings emerged. First, average reactivity to daily stress increased longitudinally, and this increase was evident across most of the adult lifespan. Second, individual differences in emotional reactivity to daily stress exhibited long-term temporal stability, but this stability was greatest in midlife and decreased in old age. And third, reactivity to daily stress varied reliably within persons (across time), with individuals exhibiting higher levels of reactivity during times when they reported high levels of global subjective stress in the previous month. Taken together, the present results emphasize the importance of modeling dynamic psychosocial and aging processes that operate across different time scales for understanding age-related changes in daily stress processes. PMID:20025399

  20. Risk Importance Measures in the Design and Operation of Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Vrbanic I.; Samanta P.; Basic, I

    2017-10-31

    This monograph presents and discusses risk importance measures as quantified by the probabilistic risk assessment (PRA) models of nuclear power plants (NPPs) developed according to current standards and practices. Usually, PRA tools calculate risk importance measures related to a single "basic event" representing a particular failure mode. This is, then, reflected in many current PRA applications. The monograph focuses on the concept of "component-level" importance measures that take into account different failure modes of the component, including common-cause failures (CCFs). In the opening sections the role of risk assessment in safety analysis of an NPP is introduced, and a discussion is given of "traditional", mainly deterministic, design principles which have been established to assign a level of importance to a particular system, structure or component. This is followed by an overview of the main risk importance measures for risk increase and risk decrease from current PRAs. Basic relations which exist among the measures are shown. Some of the current practical applications of risk importance measures from the field of NPP design, operation and regulation are discussed. The core of the monograph provides a discussion of the theoretical background and practical aspects of the main risk importance measures at the level of a "component" as modeled in a PRA, starting from the simplest case, a single basic event, and going toward more complex cases with multiple basic events and involvement in CCF groups. The intent is to express the component-level importance measures via the importance measures and probabilities of the underlying single basic events, which are the inputs readily available from a PRA model and its results. Formulas are derived and discussed for some typical cases. The formulas and their results are demonstrated through some practical examples, done by means of a simplified PRA model developed in and run by the RiskSpectrum tool, which are presented in the appendices. The
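A component-level risk increase/decrease calculation of the kind the monograph describes can be sketched by setting all of a component's basic events (including its CCF contribution) to failed or to perfect and recomputing the risk. The fault-tree fragment, names, and probabilities are invented, and the risk uses the rare-event approximation:

```python
# Invented minimal-cut-set model: the pump has an independent failure mode
# and a common-cause (CCF) contribution; the valve fails independently.
p = {"pump_fail": 1e-3, "pump_ccf": 1e-4, "valve_fail": 5e-4}
mcs = [{"pump_fail", "valve_fail"}, {"pump_ccf"}, {"valve_fail"}]
pump_events = {"pump_fail", "pump_ccf"}  # all basic events of the "pump"

def risk(prob):
    # Rare-event approximation: sum of minimal-cut-set probabilities.
    total = 0.0
    for cut in mcs:
        q = 1.0
        for e in cut:
            q *= prob[e]
        total += q
    return total

base = risk(p)
failed = risk({e: (1.0 if e in pump_events else q) for e, q in p.items()})
perfect = risk({e: (0.0 if e in pump_events else q) for e, q in p.items()})

raw = failed / base     # risk achievement worth of the pump as a component
rrw = base / perfect    # risk reduction worth of the pump as a component
print(f"RAW = {raw:.0f}, RRW = {rrw:.3f}")
```

Setting all of the component's failure modes jointly (CCF included) generally yields a much larger RAW than any single basic event would, which is precisely why the monograph argues for component-level rather than basic-event-level measures.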

  1. Measurement Uncertainty in Racial and Ethnic Identification among Adolescents of Mixed Ancestry: A Latent Variable Approach

    Science.gov (United States)

    Tracy, Allison J.; Erkut, Sumru; Porche, Michelle V.; Kim, Jo; Charmaraman, Linda; Grossman, Jennifer M.; Ceder, Ineke; Garcia, Heidie Vazquez

    2010-01-01

    In this article, we operationalize identification of mixed racial and ethnic ancestry among adolescents as a latent variable to (a) account for measurement uncertainty, and (b) compare alternative wording formats for racial and ethnic self-categorization in surveys. Two latent variable models were fit to multiple mixed-ancestry indicator data from…

  2. Decreased Heart Rate Variability in HIV Positive Patients Receiving Antiretroviral Therapy: Importance of Blood Glucose and Cholesterol

    DEFF Research Database (Denmark)

    Askgaard, Gro; Kristoffersen, Ulrik Sloth; Mehlsen, Jesper

    2011-01-01

    whether autonomic dysfunction is present in an ART-treated HIV population and, if so, to identify factors of importance. METHODS: HIV patients receiving ART for at least 12 months (n = 97) and an age-matched control group of healthy volunteers (n = 52) were included. All were non-diabetic and had never… -intervals (RMSSD) or the percent of differences between adjacent NN intervals greater than 50 ms (pNN50). In the HIV positives, haemoglobin A1c correlated inversely with SDNN, RMSSD and pNN50 (p…) and cholesterol correlated inversely with RMSSD and pNN50 (p…). Neither CD4 cell count nor CD4 nadir correlated with time or phase domain HRV variables. CONCLUSIONS: Moderate autonomic dysfunction is present in HIV-positive patients even with suppressed viral load due to ART. The dysfunction is correlated with HbA1c and hypercholesterolemia but not with duration of HIV…
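    The time-domain HRV statistics named in this record (SDNN, RMSSD and pNN50) can be computed directly from a series of NN intervals. A minimal sketch with invented intervals in milliseconds (not patient data):

```python
import statistics

def hrv_time_domain(nn_ms):
    """SDNN, RMSSD and pNN50 from a list of NN intervals in milliseconds."""
    diffs = [b - a for a, b in zip(nn_ms, nn_ms[1:])]
    sdnn = statistics.stdev(nn_ms)                           # overall variability
    rmssd = (sum(d * d for d in diffs) / len(diffs)) ** 0.5  # beat-to-beat variability
    pnn50 = 100.0 * sum(abs(d) > 50 for d in diffs) / len(diffs)
    return sdnn, rmssd, pnn50

nn = [812, 790, 845, 805, 870, 790, 815]   # illustrative NN series (ms)
sdnn, rmssd, pnn50 = hrv_time_domain(nn)
print(f"SDNN={sdnn:.1f} ms  RMSSD={rmssd:.1f} ms  pNN50={pnn50:.0f}%")
```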

  3. Importance measures in nuclear PSA: how to control their uncertainty and develop new applications

    International Nuclear Information System (INIS)

    Duflot, N.

    2007-01-01

    This PhD thesis deals with the importance measures based on nuclear probabilistic safety analyses (PSA). With these indicators, the importance towards risk of the events considered in the PSA models can be measured. The first part of this thesis sets out the framework in which they are currently used. The information extracted from importance measures evaluation is used in industrial decision-making processes that may impact the safety of nuclear plants. In the second part of the thesis, we thus try to meet the requirements of reliability and simplicity with an approach minimising the uncertainties due to modelling. We also lay out a new truncation process for the set of minimal cut sets (MCS) corresponding to the baseline case, which allows a quick, automatic and precise calculation of the importance measures. As PSA are increasingly used in risk-informed decision-making approaches, we have examined the extension of importance measures to groups of basic events. The third part of the thesis therefore presents the definition of the importance of events such as the failure of a system or the loss of a function, as well as their potential applications. PSA being considered a useful tool to design new nuclear power plants, the fourth part of the thesis sketches out a design process based both on classical importance measures and on new ones. (author)

  4. Both Reaction Time and Accuracy Measures of Intraindividual Variability Predict Cognitive Performance in Alzheimer's Disease

    Directory of Open Access Journals (Sweden)

    Björn U. Christ

    2018-04-01

    Full Text Available Dementia researchers around the world prioritize the urgent need for sensitive measurement tools that can detect cognitive and functional change at the earliest stages of Alzheimer's disease (AD). Sensitive indicators of underlying neural pathology assist in the early detection of cognitive change and are thus important for the evaluation of early-intervention clinical trials. One method that may be particularly well-suited to help achieve this goal involves the quantification of intraindividual variability (IIV) in cognitive performance. The current study aimed to directly compare two methods of estimating IIV (fluctuations in accuracy-based scores vs. those in latency-based scores) to predict cognitive performance in AD. Specifically, we directly compared the relative sensitivity of reaction time (RT)- and accuracy-based estimates of IIV to cognitive compromise. The novelty of the present study, however, centered on the patients we tested [a group of patients with Alzheimer's disease (AD)] and the outcome measures we used (a measure of general cognitive function and a measure of episodic memory function). Hence, we compared intraindividual standard deviations (iSDs) from two RT tasks and three accuracy-based memory tasks in patients with possible or probable Alzheimer's dementia (n = 23) and matched healthy controls (n = 25). The main analyses modeled the relative contributions of RT vs. accuracy-based measures of IIV toward the prediction of performance on measures of (a) overall cognitive functioning, and (b) episodic memory functioning. Results indicated that RT-based IIV measures are better predictors of neurocognitive impairment (as indexed by overall cognitive and memory performance) than accuracy-based IIV measures, even after adjusting for the timescale of measurement. However, one accuracy-based IIV measure (derived from a recognition memory test) also differentiated patients with AD from controls, and significantly predicted episodic memory

  5. Measuring pH variability using an experimental sensor on an underwater glider

    Science.gov (United States)

    Hemming, Michael P.; Kaiser, Jan; Heywood, Karen J.; Bakker, Dorothee C. E.; Boutin, Jacqueline; Shitashima, Kiminori; Lee, Gareth; Legge, Oliver; Onken, Reiner

    2017-05-01

    Autonomous underwater gliders offer the capability of measuring oceanic parameters continuously at high resolution in both vertical and horizontal planes, with timescales that can extend to many months. An experimental ion-sensitive field-effect transistor (ISFET) sensor measuring pH on the total scale was attached to a glider during the REP14-MED experiment in June 2014 in the Sardinian Sea in the northwestern Mediterranean. During the deployment, pH was sampled at depths of up to 1000 m along an 80 km transect over a period of 12 days. Water samples were collected from a nearby ship and analysed for dissolved inorganic carbon concentration and total alkalinity to derive the pH for validating the ISFET sensor measurements. The vertical resolution of the pH sensor was good (1 to 2 m), but stability was poor and the sensor drifted in a non-monotonic fashion. In order to remove the sensor drift, a depth-constant, time-varying offset was applied throughout the water column for each dive, reducing the spread of the data by approximately two-thirds. Furthermore, the ISFET sensor required temperature- and pressure-based corrections, which were achieved using linear regression. Correcting for this decreased the apparent sensor pH variability by a further 13 to 31 %. Sunlight caused an apparent sensor pH decrease of up to 0.1 in surface waters around local noon, highlighting the importance of shielding the sensor from light in future deployments. The corrected pH from the ISFET sensor is presented along with potential temperature, salinity, potential density anomalies (σθ), and dissolved oxygen concentrations (c(O2)) measured by the glider, providing insights into the physical and biogeochemical variability in the Sardinian Sea. The pH maxima were identified close to the depth of the summer chlorophyll maximum, where high c(O2) values were also found. Longitudinal pH variations at depth (σθ > 28.8 kg m⁻³) highlighted the variability of water masses in the Sardinian
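    The temperature- and pressure-based correction described above amounts to an ordinary least-squares regression of the sensor-minus-reference pH offset on temperature and pressure. A self-contained sketch with synthetic data (the model coefficients and noise level are invented for illustration):

```python
import random

def fit_plane(x1, x2, y):
    """Least-squares fit of y ~ a + b*x1 + c*x2 via the 3x3 normal equations."""
    n = len(y)
    Sx1, Sx2, Sy = sum(x1), sum(x2), sum(y)
    S11 = sum(v * v for v in x1)
    S22 = sum(v * v for v in x2)
    S12 = sum(u * v for u, v in zip(x1, x2))
    S1y = sum(u * v for u, v in zip(x1, y))
    S2y = sum(u * v for u, v in zip(x2, y))
    A = [[n,   Sx1, Sx2, Sy],
         [Sx1, S11, S12, S1y],
         [Sx2, S12, S22, S2y]]
    for i in range(3):                     # Gauss-Jordan elimination
        piv = A[i][i]
        A[i] = [v / piv for v in A[i]]
        for k in range(3):
            if k != i:
                f = A[k][i]
                A[k] = [a - f * b for a, b in zip(A[k], A[i])]
    return A[0][3], A[1][3], A[2][3]       # intercept, temp slope, pressure slope

# Synthetic offset: true model 0.05 - 0.002*T + 1e-5*P plus Gaussian noise
rng = random.Random(0)
T = [rng.uniform(13, 28) for _ in range(200)]    # temperature / deg C
P = [rng.uniform(0, 1000) for _ in range(200)]   # pressure / dbar
offset = [0.05 - 0.002 * t + 1e-5 * p + rng.gauss(0, 0.001)
          for t, p in zip(T, P)]

a, b, c = fit_plane(T, P, offset)
print(f"a={a:.4f}  b={b:.5f}  c={c:.2e}")
```

    A real deployment would fit against ship-based reference samples rather than a known synthetic model; here the fit simply recovers the coefficients we put in.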

  6. On the choice of retrieval variables in the inversion of remotely sensed atmospheric measurements.

    Science.gov (United States)

    Ridolfi, Marco; Sgheri, Luca

    2013-05-06

    In this paper we introduce new variables that can be used to retrieve the atmospheric continuum emission in the inversion of remote sensing measurements. This modification tackles the so-called sloppy model problem. We test this approach on an extensive set of real measurements from the Michelson Interferometer for Passive Atmospheric Sounding. The newly introduced variables permit a more stable inversion and a smaller minimum of the cost function.

  7. Hidden measurements, hidden variables and the volume representation of transition probabilities

    OpenAIRE

    Oliynyk, Todd A.

    2005-01-01

    We construct, for any finite dimension $n$, a new hidden measurement model for quantum mechanics based on representing quantum transition probabilities by the volume of regions in projective Hilbert space. For $n=2$ our model is equivalent to the Aerts sphere model and serves as a generalization of it for dimensions $n \\geq 3$. We also show how to construct a hidden variables scheme based on hidden measurements and we discuss how joint distributions arise in our hidden variables scheme and th...

  8. An uncertainty importance measure using a distance metric for the change in a cumulative distribution function

    International Nuclear Information System (INIS)

    Chun, Moon-Hyun; Han, Seok-Jung; Tak, Nam-IL

    2000-01-01

    A simple measure of uncertainty importance using the entire change of cumulative distribution functions (CDFs) has been developed for use in probabilistic safety assessments (PSAs). The entire change of CDFs is quantified in terms of the metric distance between two CDFs. The metric distance measure developed in this study reflects the relative impact of distributional changes of inputs on the change of an output distribution, while most of the existing uncertainty importance measures reflect the magnitude of relative contribution of input uncertainties to the output uncertainty. The present measure has been evaluated analytically for various analytical distributions to examine its characteristics. To illustrate the applicability and strength of the present measure, two examples are provided. The first example is an application of the present measure to a typical problem of a system fault tree analysis and the second one is for a hypothetical non-linear model. Comparisons of the present result with those obtained by existing uncertainty importance measures show that the metric distance measure is a useful tool to express the measure of uncertainty importance in terms of the relative impact of distributional changes of inputs on the change of an output distribution.
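    The idea can be sketched numerically: estimate the output CDF by Monte Carlo under the base input distributions, re-estimate it after changing one input's distribution, and integrate the absolute difference between the two empirical CDFs. The model and distributions below are toy choices, not those of the paper:

```python
import random

def cdf_distance(xs, ys, grid):
    """Approximate integral of |F_xs(t) - F_ys(t)| dt over an evenly spaced grid."""
    def ecdf(sample, t):
        return sum(v <= t for v in sample) / len(sample)
    step = grid[1] - grid[0]
    return step * sum(abs(ecdf(xs, t) - ecdf(ys, t)) for t in grid)

def sample(seed, sd_a, sd_b, n=2000):
    rng = random.Random(seed)
    # Toy model: output = a + 2*b with independent normal inputs
    return [rng.gauss(0, sd_a) + 2 * rng.gauss(0, sd_b) for _ in range(n)]

grid = [i * 0.2 for i in range(-50, 51)]
base = sample(1, 1.0, 1.0)
d_a = cdf_distance(base, sample(1, 2.0, 1.0), grid)  # double sd of input a
d_b = cdf_distance(base, sample(1, 1.0, 2.0), grid)  # double sd of input b
print(f"uncertainty importance of a: {d_a:.3f}, of b: {d_b:.3f}")
# Input b carries weight 2 in the model, so changing its distribution
# moves the output CDF further: b is the more important input here.
```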

  9. Measuring systemic importance of financial institutions: An extreme value theory approach

    OpenAIRE

    Gravelle, Toni; Li, Fuchun

    2011-01-01

    In this paper, we define a financial institution's contribution to financial systemic risk as the increase in financial systemic risk conditional on the crash of the financial institution. The higher the contribution is, the more systemically important is the institution for the system. Based on relevant but different measurements of systemic risk, we propose a set of market-based measures on the systemic importance of financial institutions, each designed to capture certain aspects of system...

  10. Multi-attribute integrated measurement of node importance in complex networks.

    Science.gov (United States)

    Wang, Shibo; Zhao, Jinlou

    2015-11-01

    The measurement of node importance in complex networks is important to research on network stability and robustness, and it can also help ensure the security of the whole network. Most researchers have used a single indicator to measure node importance, so the results reflect only certain aspects of the network and lose information. Meanwhile, because network topologies differ, node importance should be described by combining the character of the network topology. Most of the existing evaluation algorithms cannot completely reflect the circumstances of complex networks, so this paper takes into account degree centrality, relative closeness centrality, the clustering coefficient, and topology potential, and proposes an integrated method for measuring node importance. This method reflects both the internal and external attributes of nodes and eliminates the influence of network structure on node importance. Experiments on the karate network and the dolphin network show that the integrated topology-based measure has a smaller range of measured results than a single indicator and is more universally applicable. Experiments also show that, when attacking the North American power grid and the Internet network, the method converges faster than other methods.
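    A pure-Python sketch of the multi-indicator idea, using only degree centrality and the local clustering coefficient with equal weights (an illustrative simplification; the paper's scheme also includes relative closeness centrality and topology potential):

```python
# Small undirected toy graph as an adjacency dict (illustrative data)
graph = {
    0: {1, 2, 3},
    1: {0, 2},
    2: {0, 1, 3},
    3: {0, 2, 4},
    4: {3},
}

def degree_centrality(g, v):
    return len(g[v]) / (len(g) - 1)

def clustering_coefficient(g, v):
    nbrs = list(g[v])
    k = len(nbrs)
    if k < 2:
        return 0.0
    links = sum(1 for i in range(k) for j in range(i + 1, k)
                if nbrs[j] in g[nbrs[i]])
    return 2 * links / (k * (k - 1))

def importance(g, v, w_deg=0.5, w_clu=0.5):
    # Equal-weight combination of two indicators; a fuller scheme would
    # also fold in closeness centrality and topology potential.
    return w_deg * degree_centrality(g, v) + w_clu * clustering_coefficient(g, v)

ranking = sorted(graph, key=lambda v: importance(graph, v), reverse=True)
print(ranking)
```

    With these weights the fully clustered node 1 outranks the higher-degree nodes 0 and 2, which illustrates how the choice of indicators and weights drives the ranking.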

  11. Intraobserver and interobserver variability in CT angiography and MR angiography measurements of the size of cerebral aneurysms

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hye Jeong [Hallym University College of Medicine, Department of Radiology, Kangnam Sacred Heart Hospital, Seoul (Korea, Republic of); Yoon, Dae Young; Lee, Hyung Jin [Hallym University College of Medicine, Department of Radiology, Kangdong Seong-Sim Hospital, Seoul (Korea, Republic of); Kim, Eun Soo [Hallym University College of Medicine, Department of Radiology, Hallym University Sacred Heart Hospital, Anyang, Gyeonggi-do (Korea, Republic of); Jeon, Hong Jun; Lee, Jong Young; Cho, Byung-Moon [Hallym University College of Medicine, Department of Neurosurgery, Kangdong Seong-Sim Hospital, Seoul (Korea, Republic of)

    2017-05-15

    Accurate and reliable measurement of aneurysm size is important for treatment planning. The purpose of this study was to determine intraobserver and interobserver variability of CTA and MRA for measurement of the size of cerebral aneurysms. Thirty patients with 33 unruptured cerebral aneurysms (saccular, >3 mm in their maximal dimension, with no daughter sacs or lobulations) who underwent 256-row multislice CTA, 3-D TOF MRA at 3.0T, and 3D rotational angiography (3DRA) were retrospectively analyzed. Three independent observers measured the neck, height, and width of the aneurysms using the CTA and MRA images. Intraobserver and interobserver variability of CTA and MRA measurements was evaluated using the standardized difference and intraclass correlation coefficient, with 3DRA measurements as the reference standard. In addition, the mean values of the measurements using CTA and MRA were compared with those using 3DRA. The overall intraobserver and interobserver standardized differences in CTA/MRA were 12.83-15.92%/13.48-17.45% and 14.08-17.00%/12.08-17.67%, respectively. The overall intraobserver and interobserver intraclass correlation coefficients of CTA/MRA were 0.88-0.98/0.84-0.96 and 0.86-0.98/0.85-0.95, respectively. Compared to the height and width measurements, measurements of the neck dimensions showed higher intraobserver and interobserver variability. The sizes of the cerebral aneurysms measured by CTA and MRA were 1.13-9.26% and 5.20-9.67% larger than those measured by 3DRA, respectively; however, these differences were not statistically significant. There were no noticeable differences between intraobserver and interobserver variability for both CTA- and MRA-based measurements of the size of cerebral aneurysms. (orig.)

  12. Flow-based vulnerability measures for network component importance: Experimentation with preparedness planning

    International Nuclear Information System (INIS)

    Nicholson, Charles D.; Barker, Kash; Ramirez-Marquez, Jose E.

    2016-01-01

    This work develops and compares several flow-based vulnerability measures to prioritize important network edges for the implementation of preparedness options. These network vulnerability measures quantify different characteristics and perspectives on enabling maximum flow, creating bottlenecks, and partitioning into cutsets, among others. The efficacy of these vulnerability measures to motivate preparedness options against experimental geographically located disruption simulations is measured. Results suggest that a weighted flow capacity rate, which accounts for both (i) the contribution of an edge to maximum network flow and (ii) the extent to which the edge is a bottleneck in the network, shows most promise across four instances of varying network sizes and densities. - Highlights: • We develop new flow-based measures of network vulnerability. • We apply these measures to determine the importance of edges after disruptions. • Networks of varying size and density are explored.

  13. Computed tomography dose and variability of airway dimension measurements: how low can we go?

    International Nuclear Information System (INIS)

    Jong, Pim A. de; Long, Frederick R.; Nakano, Yasutaka

    2006-01-01

    Quantitative CT shows promise as an outcome measure for cystic fibrosis (CF) lung disease in infancy, but must be accomplished at a dose as low as reasonably achievable. To determine the feasibility of ultra-low-dose CT for quantitative measurements of airway dimensions. Two juvenile pigs were anesthetized and their lungs scanned at 25 cmH₂O face-mask pressure in apnoea using beam currents of 5, 10, 20, 40 and 100 mAs. The lumen diameters and wall thicknesses of matched airways (n=22) at each dose were measured by two observers using validated software. Measurement variability at each dose was compared to that at 100 mAs (reference dose) for large and small airways (lumen diameter <2.5 mm). Lowering CT dose (mAs) affected measurement variability for lumen diameter of small and large airways (P<0.001) and for wall thickness of small (P<0.001), but not large (P=0.63), airways. To obtain the same measurement variability at 5 mAs as at 100 mAs, four to six small airways or one to three large airways have to be measured and averaged. Quantitative airway measurements are feasible on images obtained at as low as 5 mAs, but more airways need to be measured to compensate for greater measurement variability. (orig.)
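    The "measure more airways and average" recommendation follows from the standard-error rule: averaging n independent measurements shrinks random variability by a factor of √n, so matching a reference-dose coefficient of variation requires roughly n = (COV_low_dose / COV_reference)² airways. A sketch with illustrative numbers (not the study's values):

```python
import math

def airways_needed(cov_low_dose, cov_reference):
    """Independent airway measurements to average at low dose so the
    averaged COV matches the reference-dose COV."""
    return math.ceil((cov_low_dose / cov_reference) ** 2)

# Illustrative: low-dose COV twice the reference COV
print(airways_needed(0.28, 0.14))  # -> 4
print(airways_needed(0.20, 0.14))  # -> 3
```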

  14. Observer variability of lung function measurements in 2-6-yr-old children

    DEFF Research Database (Denmark)

    Klug, B; Nielsen, K G; Bisgaard, H

    2000-01-01

    by the interrupter technique measurements in young children are subject to influence by the observer, and the random variability between observers appears to be particularly great for respiratory resistance assessed by the interrupter technique. The authors suggest that the between-observer variability should… The aim of this study was to assess the within-observer and between-observer variability of lung function measurements in children aged 2-6 yrs. Two observers examined 22 asthmatic children independently according to a predefined protocol. Each observer obtained duplicate measurements… observers. The ratio SDw between observers/mean SDw within observers was 0.94, 1.25, 1.35 and 2.86 for Xrs,5, Rrs,5, sRaw and Rint, respectively, indicating greater between-observer variability of the latter. The systematic difference between observers assessed by the difference between observer means…

  15. Volume measurement variability in three-dimensional high-frequency ultrasound images of murine liver metastases

    International Nuclear Information System (INIS)

    Wirtzfeld, L A; Graham, K C; Groom, A C; MacDonald, I C; Chambers, A F; Fenster, A; Lacefield, J C

    2006-01-01

    The identification and quantification of tumour volume measurement variability is imperative for proper study design of longitudinal non-invasive imaging of pre-clinical mouse models of cancer. Measurement variability will dictate the minimum detectable volume change, which in turn influences the scheduling of imaging sessions and the interpretation of observed changes in tumour volume. In this paper, variability is quantified for tumour volume measurements from 3D high-frequency ultrasound images of murine liver metastases. Experimental B16F1 liver metastases were analysed in different size ranges including less than 1 mm³, 1-4 mm³, 4-8 mm³ and 8-70 mm³. The intra- and inter-observer repeatability was high over a large range of tumour volumes, but the coefficients of variation (COV) varied over the volume ranges. The minimum and maximum intra-observer COV were 4% and 14% for the 1-4 mm³ and <1 mm³ tumours, respectively. For tumour volumes measured by segmenting parallel planes, the maximum inter-slice distance that maintained acceptable measurement variability increased from 100 to 600 μm as tumour volume increased. Comparison of free breathing versus ventilated animals demonstrated that respiratory motion did not significantly change the measured volume. These results enable design of more efficient imaging studies by using the measured variability to estimate the time required to observe a significant change in tumour volume

  16. Acceptability of the Risk Importance Measures in Evaluation of a Change

    International Nuclear Information System (INIS)

    Dimitrijevic, V. B.; Chapman, J. R.

    1996-01-01

    In this paper, the authors discuss insights gained from evaluating changes to plant design and operational practices. Evaluation of a change is performed in order to provide an answer to two fundamental questions: what is the impact, and is the impact acceptable? In order to determine 'the acceptability of an impact', the risk-based technologies today provide various ranking schemes. They are based on the existing IPE studies or PSA models and use of standard risk importance measures. In 'ad hoc' applications of risk importance measures, the specific nature of the analyzed change is often neglected. This paper attempts to capture the most common problems in the application of importance measures, and defines the limits of this application. The authors' position is that risk importance information should not be used as the sole basis to accept or reject a change; ranking results should be relied on only after the basis for the rank is meaningfully established. (author)

  17. A decision-oriented measure of uncertainty importance for use in PSA

    International Nuclear Information System (INIS)

    Poern, Kurt

    1997-01-01

    For the interpretation of the results of probabilistic risk assessments it is important to have measures which identify not only the basic events that contribute most to the frequency of the top event but also the basic events that are the main contributors to the uncertainty in this frequency. Both types of measures, often called Importance Measure and Measure of Uncertainty Importance, respectively, have been the subject of interest for many researchers in the reliability field. The most frequent mode of uncertainty analysis in connection with probabilistic risk assessment has been to propagate the uncertainty of all model parameters up to an uncertainty distribution for the top event frequency. Various uncertainty importance measures have been proposed in order to point out the parameters that in some sense are the main contributors to the top event distribution. The new measure of uncertainty importance suggested here goes a step further in that it has been developed within a decision theory framework, thereby providing an indication of the basic event for which it would be most valuable, from the decision-making point of view, to procure more information

  18. State-of-the-art measurements in human body composition: A moving frontier of clinical importance

    Science.gov (United States)

    Gallagher, D.; Shaheen, I.; Zafar, K.

    2010-01-01

    The measurement of human body composition allows for the estimation of body tissues, organs, and their distributions in living persons without inflicting harm. From a nutritional perspective, the interest in body composition has increased multi-fold with the global increase in the prevalence of obesity and its complications. The latter has driven in part the need for improved measurement methods with greater sensitivity and precision. There is no single gold standard for body-composition measurements in-vivo. All methods incorporate assumptions that do not apply in all individuals, and the more accurate models are derived by using a combination of measurements, thereby reducing the importance of each assumption. This review will discuss why the measurement of body composition or human phenotyping is important; discuss new areas where the measurement of body composition (human phenotyping) is recognized as having important application; and will summarize recent advances made in new methodology. Reference will also be made to areas we cannot yet measure due to the lack of appropriate measurement methodologies, most especially measurement methods that provide information on kinetic states (not just static state) and metabolic function. PMID:21234275

  19. For the Love of Nature: Exploring the Importance of Species Diversity and Micro-Variables Associated with Favorite Outdoor Places.

    Science.gov (United States)

    Schebella, Morgan F; Weber, Delene; Lindsey, Kiera; Daniels, Christopher B

    2017-01-01

    Although the restorative benefits of nature are widely acknowledged, there is a limited understanding of the attributes of natural environments that are fundamental to restorative experiences. Faced with growing human populations and a greater awareness of the wellbeing benefits natural environments provide, park agencies and planners are increasingly challenged with balancing human and ecological outcomes in natural areas. This study examines the physical and experiential qualities of natural environments people referred to when describing their connection to their most valued natural environments in an online questionnaire. Recruited primarily via a public radio program, respondents were asked to identify their favorite places and explain what they loved about those places. Favorite places are considered exemplars of restorative environments and were classified based on an existing park typology. Reasons people liked particular sites were classified into three domains: setting, activity, or benefit. Content analysis was used to identify the attributes most commonly associated with favorite places. These attributes were then related to the four components of restorative environments according to Attention Restoration Theory. In contrast to previous research, we found that "fascination" was the most important component of favorite places. Possible reasons for this contrast, namely, respondents' median age, and the likelihood of a high degree of ecological literacy amongst the study population are discussed. South Australians' favorite environments comprise primarily hilly, wooded nature parks, and botanical gardens, in stark contrast to the vast arid areas that dominate the state. Micro-variables such as birds, plants, wildlife, native species, and biodiversity appear particularly important elements used to explain people's love of these sites. We discuss the implications of these findings and their potential value as an anchor for marketing campaigns seeking to

  20. For the Love of Nature: Exploring the Importance of Species Diversity and Micro-Variables Associated with Favorite Outdoor Places

    Directory of Open Access Journals (Sweden)

    Morgan F. Schebella

    2017-12-01

    Full Text Available Although the restorative benefits of nature are widely acknowledged, there is a limited understanding of the attributes of natural environments that are fundamental to restorative experiences. Faced with growing human populations and a greater awareness of the wellbeing benefits natural environments provide, park agencies and planners are increasingly challenged with balancing human and ecological outcomes in natural areas. This study examines the physical and experiential qualities of natural environments people referred to when describing their connection to their most valued natural environments in an online questionnaire. Recruited primarily via a public radio program, respondents were asked to identify their favorite places and explain what they loved about those places. Favorite places are considered exemplars of restorative environments and were classified based on an existing park typology. Reasons people liked particular sites were classified into three domains: setting, activity, or benefit. Content analysis was used to identify the attributes most commonly associated with favorite places. These attributes were then related to the four components of restorative environments according to Attention Restoration Theory. In contrast to previous research, we found that “fascination” was the most important component of favorite places. Possible reasons for this contrast, namely, respondents' median age, and the likelihood of a high degree of ecological literacy amongst the study population are discussed. South Australians' favorite environments comprise primarily hilly, wooded nature parks, and botanical gardens, in stark contrast to the vast arid areas that dominate the state. Micro-variables such as birds, plants, wildlife, native species, and biodiversity appear particularly important elements used to explain people's love of these sites. We discuss the implications of these findings and their potential value as an anchor for marketing

  1. How important is the recommended slow cuff pressure deflation rate for blood pressure measurement?

    Science.gov (United States)

    Zheng, Dingchang; Amoore, John N; Mieke, Stephan; Murray, Alan

    2011-10-01

    Cuff pressure deflation rate influences blood pressure (BP) measurement. However, there is little quantitative clinical evidence on its effect. Oscillometric pulses recorded from 75 subjects at the recommended deflation rate of 2-3 mmHg per second were analyzed. Some pulses were removed to realize six faster rates (2-7 times faster than the original). Systolic, diastolic, and mean arterial blood pressures (SBP, DBP, MAP) were determined from the original and six reconstructed oscillometric waveforms. Manual measurement was based on the appearance of oscillometric pulse peaks, and automatic measurement on two model envelopes (linear and polynomial) fitted to the sequence of oscillometric pulse amplitudes. The effects of deflation rate on BP determination and within-subject BP variability were analyzed. For SBP and DBP determined from the manual measurement, different deflation rates resulted in significant changes (both p …), whereas the automatic measurements showed no deflation rate effect (all p > 0.3). Faster deflation increased the within-subject BP variability (all p …). Manual BP measurement was thus sensitive to the deflation rate, while for the automatic model-based techniques the deflation rate had little effect.

  2. Correlation between measured energy expenditure and clinically obtained variables in trauma and sepsis patients.

    Science.gov (United States)

    Frankenfield, D C; Omert, L A; Badellino, M M; Wiles, C E; Bagley, S M; Goodarzi, S; Siegel, J H

    1994-01-01

    Indirect calorimetry is the preferred method for determining caloric requirements of patients, but availability of the device is limited by high cost. A study was therefore conducted to determine whether clinically obtainable variables could be used to predict metabolic rate. Patients with severe trauma or sepsis who required mechanical ventilation were measured by an open-circuit indirect calorimeter. Several clinical variables were obtained simultaneously. Measurements were repeated every 12 hours for up to 10 days. Twenty-six trauma and 30 sepsis patients were measured 423 times. Mean resting energy expenditure was 36 ± 7 kcal/kg (trauma) vs 45 ± 8 kcal/kg (sepsis) (p …) … types.

  3. Statistical Primer for Athletic Trainers: The Essentials of Understanding Measures of Reliability and Minimal Important Change.

    Science.gov (United States)

    Riemann, Bryan L; Lininger, Monica R

    2018-01-01

      To describe the concepts of measurement reliability and minimal important change. All measurements have some magnitude of error. Because clinical practice involves measurement, clinicians need to understand measurement reliability. The reliability of an instrument is integral in determining if a change in patient status is meaningful. Measurement reliability is the extent to which a test result is consistent and free of error. Three perspectives on reliability (relative reliability, systematic bias, and absolute reliability) are often reported. However, absolute reliability statistics, such as the minimal detectable difference, are most relevant to clinicians because they provide an expected error estimate. The minimal important difference is the smallest change in a treatment outcome that the patient would identify as important. Clinicians should use absolute reliability characteristics, preferably the minimal detectable difference, to determine the extent of error around a patient's measurement. The minimal detectable difference, coupled with an appropriately estimated minimal important difference, can assist the practitioner in identifying clinically meaningful changes in patients.
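The minimal detectable difference described in this record is conventionally derived from the standard error of measurement (SEM); a common formulation from the reliability literature (not quoted in the abstract itself) is:

```latex
\mathrm{SEM} = \mathrm{SD}_{\text{baseline}} \times \sqrt{1 - \mathrm{ICC}}, \qquad
\mathrm{MDC}_{95} = 1.96 \times \sqrt{2} \times \mathrm{SEM}
```

where ICC is the test-retest intraclass correlation coefficient, the factor 1.96 gives a 95% confidence bound, and the \sqrt{2} accounts for error being present in both of two repeated measurements.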

  4. Random Forest Variable Importance Spectral Indices Scheme for Burnt Forest Recovery Monitoring—Multilevel RF-VIMP

    Directory of Open Access Journals (Sweden)

    Sornkitja Boonprong

    2018-05-01

    Burnt forest recovery is normally monitored with a time-series analysis of satellite data because of its proficiency for large observation areas. Traditional methods, such as linear correlation plotting, have been proven to be effective, as forest recovery naturally increases with time. However, these methods are complicated and time consuming when the number of observed parameters increases. In this work, we present a random forest variable importance (RF-VIMP) scheme called multilevel RF-VIMP to compare and assess the relationships between 36 spectral indices (parameters) of burnt boreal forest recovery in the Great Xing’an Mountain, China. Six Landsat images were acquired in the same month 0, 1, 4, 14, 16, and 20 years after a fire, and 39,380 fixed-location samples were then extracted to calculate the effectiveness of the 36 parameters. Consequently, the proposed method was applied to find correlations between the forest recovery indices. The experiment showed that the proposed method is suitable for explaining the efficacy of those spectral indices in terms of discrimination and trend analysis, and for showing the satellite data and forest succession dynamics when applied in a time series. The results suggest that the tasseled cap transformation wetness and brightness components and the shortwave infrared bands (both 1 and 2) perform better than other indices for both classification and monitoring.

  5. Is temperature an important variable in recovery after mild traumatic brain injury? [version 1; referees: 2 approved]

    Directory of Open Access Journals (Sweden)

    Coleen M. Atkins

    2017-11-01

    With nearly 42 million mild traumatic brain injuries (mTBIs) occurring worldwide every year, understanding the factors that may adversely influence recovery after mTBI is important for developing guidelines in mTBI management. Extensive clinical evidence exists documenting the detrimental effects of elevated temperature levels on recovery after moderate to severe TBI. However, whether elevated temperature alters recovery after mTBI or concussion is an active area of investigation. Individuals engaged in exercise and competitive sports regularly experience body and brain temperature increases to hyperthermic levels, and these temperature increases are prolonged in hot and humid ambient environments. Thus, there is a strong potential for hyperthermia to alter recovery after mTBI in a subset of individuals at risk for mTBI. Preclinical mTBI studies have found that elevating brain temperature to 39°C before mTBI significantly increases neuronal death within the cortex and hippocampus and also worsens cognitive deficits. This review summarizes the pathology and behavioral problems of mTBI that are exacerbated by hyperthermia and discusses whether hyperthermia is a variable that should be considered after concussion and mTBI. Finally, underlying pathophysiological mechanisms responsible for hyperthermia-induced altered responses to mTBI and potential gender considerations are discussed.

  6. Disparities in lifestyle habits and health related factors of Montreal immigrants: is immigration an important exposure variable in public health?

    Science.gov (United States)

    Meshefedjian, Garbis A; Leaune, Viviane; Simoneau, Marie-Ève; Drouin, Mylène

    2014-10-01

    To study disparities in lifestyle habits and health characteristics of the Canadian-born population and of immigrants with different durations of residence. Data were extracted from the 2009-2010 public-use microdata files of the Canadian Community Health Survey, representing about 1.5 million people. Sixty-one percent of the study sample was born in Canada; 49% were male and 59% were below age 50. Amongst lifestyle habits, recent immigrants were less likely to be regular smokers, RR (95% CI) 0.56 (0.36-0.88), or frequent consumers of alcohol, 0.49 (0.27-0.89), but more likely to consume fewer fruits and vegetables, 1.26 (1.04-1.53), than those born in Canada. Amongst health-related factors, recent immigrants were less likely to be overweight, 0.79 (0.62-0.99), or to suffer from chronic diseases, 0.59 (0.44-0.80), but more likely to have limited access to family medicine, 1.24 (1.04-1.47), than the Canada-born population. Immigration status is an important population characteristic that influenced the distribution of health indicators. Prevention and promotion strategies should consider immigration status as an exposure variable in the development and implementation of public health programs.

  7. Measuring Instrument Constructs of Return Factors for Green Office Building Investments Variables Using Rasch Measurement Model

    Directory of Open Access Journals (Sweden)

    Isa Mona

    2016-01-01

    This paper is a preliminary study on rationalising green office building investments in Malaysia. The aim of this paper is to introduce the application of Rasch measurement model analysis to determine the validity and reliability of each construct in the questionnaire. In achieving this objective, a questionnaire survey consisting of six sections was developed, and a total of 106 responses were received from various investors who own and lease office buildings in Kuala Lumpur. The Rasch measurement analysis is used to control the quality of the item constructs in the instrument by measuring the specific objectivity within the same dimension, reducing ambiguous measures, and giving a realistic estimation of precision and implicit quality. The Rasch analysis consists of summary statistics, item unidimensionality, and item measures. Results show that item and person (respondent) reliability are 0.91 and 0.95, respectively.

  8. A Variable Impacts Measurement in Random Forest for Mobile Cloud Computing

    Directory of Open Access Journals (Sweden)

    Jae-Hee Hur

    2017-01-01

    Recently, the importance of mobile cloud computing has increased. Mobile devices can collect personal data from various sensors within a short period of time, and sensor-based data contain valuable information about users. Advanced computational power and data analysis technology based on cloud computing provide an opportunity to classify massive sensor data into given labels. The random forest algorithm is known as a black-box model whose internal process is hard to interpret. In this paper, we propose a method that analyzes variable impact in the random forest algorithm to clarify which variables affect classification accuracy the most. We apply the Shapley value with random forest to analyze variable impact. Under the assumption that every variable cooperates as a player in a cooperative game, the Shapley value fairly distributes the payoff among the variables. Our proposed method calculates the relative contributions of the variables within the classification process. In this paper, we analyze the influence of variables and rank the variables by their effect on classification accuracy. Our proposed method proves its suitability for data interpretation in a black-box model such as random forest, so that the algorithm is applicable in mobile cloud computing environments.
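The Shapley-value decomposition sketched in this abstract can be illustrated with a toy cooperative game. The feature names and accuracy payoffs below are hypothetical, and the exact enumeration over coalitions is only tractable for small numbers of variables:

```python
from itertools import combinations
from math import factorial

def shapley_values(players, payoff):
    """Exact Shapley values for a cooperative game.

    players: list of player (variable) names
    payoff:  function mapping a frozenset of players to a real payoff
    """
    n = len(players)
    values = {}
    for p in players:
        others = [q for q in players if q != p]
        total = 0.0
        for k in range(n):
            for coalition in combinations(others, k):
                s = frozenset(coalition)
                # coalition weight = |S|! (n - |S| - 1)! / n!
                w = factorial(k) * factorial(n - k - 1) / factorial(n)
                # marginal contribution of p to coalition S
                total += w * (payoff(s | {p}) - payoff(s))
        values[p] = total
    return values

# Hypothetical game: classification accuracy achieved by each feature subset
acc = {frozenset(): 0.50, frozenset({"a"}): 0.70,
       frozenset({"b"}): 0.60, frozenset({"a", "b"}): 0.90}

print(shapley_values(["a", "b"], lambda s: acc[s]))
```

For a random forest, payoff(S) would be the accuracy of a forest trained on feature subset S; by construction the Shapley values sum to the total accuracy gain of the full feature set over the empty set.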

  9. Importance measures and genetic algorithms for designing a risk-informed optimally balanced system

    International Nuclear Information System (INIS)

    Zio, Enrico; Podofillini, Luca

    2007-01-01

    This paper deals with the use of importance measures for the risk-informed optimization of system design and management. An optimization approach is presented in which the information provided by the importance measures is incorporated in the formulation of a multi-objective optimization problem to drive the design towards a solution which, besides being optimal from the points of view of economics and safety, is also 'balanced' in the sense that all components have similar importance values. The approach allows the identification of system designs without bottlenecks or unnecessarily high-performing components, and with test/maintenance activities calibrated according to the components' importance ranking. The approach is tested first against a multi-state system design optimization problem in which off-the-shelf components have to be properly allocated. Then, the more realistic problem of risk-informed optimization of the technical specifications of a safety system of a nuclear power plant is addressed.
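As a sketch of the kind of component importance measure such risk-informed approaches build on (not the authors' own code), the Birnbaum importance of a component is the difference in system reliability when that component is forced working versus forced failed. The series-parallel system below is hypothetical:

```python
from itertools import product

def system_works(state):
    """Hypothetical system: component 0 in series with the
    parallel pair (1, 2). state[i] is True if component i works."""
    return state[0] and (state[1] or state[2])

def reliability(p, condition=None):
    """System reliability given component success probabilities p.
    condition: optional dict {index: forced state}."""
    total = 0.0
    for state in product([True, False], repeat=len(p)):
        if condition and any(state[i] != v for i, v in condition.items()):
            continue  # state conflicts with the forced components
        prob = 1.0
        for i, s in enumerate(state):
            if condition and i in condition:
                continue  # forced components contribute no probability factor
            prob *= p[i] if s else 1 - p[i]
        if system_works(state):
            total += prob
    return total

def birnbaum(p, i):
    # I_B(i) = P(system works | i works) - P(system works | i fails)
    return reliability(p, {i: True}) - reliability(p, {i: False})

p = [0.9, 0.8, 0.7]
print([round(birnbaum(p, i), 4) for i in range(3)])
```

A 'balanced' design in the sense of the abstract would drive these importance values toward similar magnitudes across components.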

  10. Hypocretin measurement: shelf age of radioimmunoassay kit, but not freezer time, influences assay variability.

    Science.gov (United States)

    Keating, Glenda; Bliwise, Donald L; Saini, Prabhjyot; Rye, David B; Trotti, Lynn Marie

    2017-09-01

    The hypothalamic peptide hypocretin 1 (orexin A) may be assayed in cerebrospinal fluid to diagnose narcolepsy type 1. This testing is not commercially available, and factors contributing to assay variability have not previously been comprehensively explored. In the present study, cerebrospinal fluid hypocretin concentrations were determined in duplicate in 155 patient samples, across a range of sleep disorders. Intra-assay variability of these measures was analyzed. Inter-assay correlation between samples tested at Emory and at Stanford was high (r = 0.79). Shelf age of the radioimmunoassay kit influenced hypocretin values, such that kits closer to expiration exhibited significantly more variability.

  11. Measurement and control of optical nonlinearities of importance to glass laser fusion systems

    International Nuclear Information System (INIS)

    Kurnit, N.A.; Shimada, T.; Sorem, M.S.; Taylor, A.J.; Rodriguez, G.; Clement, T.S.; James, D.F.V.; Milonni, P.W.

    1996-01-01

    Results of a number of studies carried out at Los Alamos, both experimental and theoretical, of nonlinear optical phenomena important to the design of the National Ignition Facility are summarized. These include measurements of nonlinear index coefficients, Raman scattering in atmospheric oxygen, and theoretical studies of harmonic conversion. The measurements were made by two different techniques in order to increase confidence in the results. One method was an application of a recently developed technique for measuring the amplitude and phase of an ultrashort pulse by Frequency-Resolved Optical Gating (FROG). The other utilized a modified version of the Z-scan technique that measures beam distortion introduced by scanning a sample through the focus of a beam. The measurements by both techniques for fused silica were consistent with the lower range of previously measured values, indicating that it should not be necessary to further expand the beam size in the NIF to stay below the self-focusing threshold.

  12. Resolving meso-scale seabed variability using reflection measurements from an autonomous underwater vehicle.

    Science.gov (United States)

    Holland, Charles W; Nielsen, Peter L; Dettmer, Jan; Dosso, Stan

    2012-02-01

    Seabed geoacoustic variability is driven by geological processes that occur over a wide spectrum of space-time scales. While the acoustics community has some understanding of horizontal fine-scale geoacoustic variability, less than O(10^0) m, and large-scale variability, greater than O(10^3) m, there is a paucity of data resolving the geoacoustic meso-scale, O(10^0)-O(10^3) m. Measurements of the meso-scale along an ostensibly "benign" portion of the outer shelf reveal three classes of variability. The first class was expected and is due to horizontal variability of layer thicknesses; this was the only class that could be directly tied to seismic reflection data. The second class is due to rapid changes in layer properties and/or boundaries, occurring over scales of meters to hundreds of meters. The third class was observed as rapid variations of the angle/frequency-dependent reflection coefficient within a single observation and is suggestive of variability at scales of a meter or less. Though generally assumed to be negligible in acoustic modeling, the second and third classes are indicative of strong horizontal geoacoustic variability within a given layer. The observations give early insight into possible effects of horizontal geoacoustic variability on long-range acoustic propagation and reverberation. © 2012 Acoustical Society of America.

  13. Uncertainty Quantification Reveals the Importance of Data Variability and Experimental Design Considerations for in Silico Proarrhythmia Risk Assessment

    Directory of Open Access Journals (Sweden)

    Kelly C. Chang

    2017-11-01

    The Comprehensive in vitro Proarrhythmia Assay (CiPA) is a global initiative intended to improve drug proarrhythmia risk assessment using a new paradigm of mechanistic assays. Under the CiPA paradigm, the relative risk of drug-induced Torsade de Pointes (TdP) is assessed using an in silico model of the human ventricular action potential (AP) that integrates in vitro pharmacology data from multiple ion channels. Thus, modeling predictions of cardiac risk liability will depend critically on the variability in pharmacology data, and uncertainty quantification (UQ) must comprise an essential component of the in silico assay. This study explores UQ methods that may be incorporated into the CiPA framework. Recently, we proposed a promising in silico TdP risk metric (qNet), which is derived from AP simulations and allows separation of a set of CiPA training compounds into Low, Intermediate, and High TdP risk categories. The purpose of this study was to use UQ to evaluate the robustness of TdP risk separation by qNet. Uncertainty in the model parameters used to describe drug binding and ionic current block was estimated using the non-parametric bootstrap method and a Bayesian inference approach. Uncertainty was then propagated through AP simulations to quantify uncertainty in qNet for each drug. UQ revealed lower uncertainty and more accurate TdP risk stratification by qNet when simulations were run at concentrations below 5× the maximum therapeutic exposure (Cmax). However, when drug effects were extrapolated above 10× Cmax, UQ showed that qNet could no longer clearly separate drugs by TdP risk. This was because, for most of the pharmacology data, the amount of current block measured was <60%, preventing reliable estimation of IC50 values. The results of this study demonstrate that the accuracy of TdP risk prediction depends both on the intrinsic variability in ion channel pharmacology data as well as on experimental design considerations that preclude an…

  14. Serial Holter ST-segment monitoring after first acute myocardial infarction. Prevalence, variability, and long-term prognostic importance of transient myocardial ischemia

    DEFF Research Database (Denmark)

    Mickley, H; Nielsen, J R; Berning, J

    1998-01-01

    Based on serial Holter monitoring performed 7 times within 3 years after a first acute myocardial infarction, we assessed the prevalence, variability, and long-term clinical importance of transient myocardial ischemia (TMI), defined as episodes of ambulatory ST-segment depression. In all, 121 consecutive male patients … variability was found within and between patients …

  15. Measurement of fuel importance distribution in non-uniformly distributed fuel systems

    International Nuclear Information System (INIS)

    Yamane, Yoshihiro; Hirano, Yasushi; Yasui, Hazime; Izima, Kazunori; Shiroya, Seiji; Kobayashi, Keiji.

    1995-01-01

    A reactivity effect due to a spatial variation of nuclear fuel concentration is an important problem for nuclear criticality safety in a reprocessing plant. As theories estimating this reactivity effect, the Goertzel and fuel importance theories are well known. It has been shown that Goertzel's theory is valid in the range of our experiments, based on measurements of the reactivity effect and thermal neutron flux in non-uniformly distributed fuel systems. On the other hand, there have been no reports concerning systematic experimental studies on the flatness of fuel importance, which is a more general index than Goertzel's theory. It is derived from perturbation theory that the fuel importance is proportional to the reactivity change resulting from a change of a small amount of fuel mass. Using a uniform and three kinds of non-uniform fuel systems consisting of 93.2% enriched uranium plates and polyethylene plates, the fuel importance distributions were measured. As a result, it was found experimentally that the fuel importance distribution became flat as its reactivity effect became large. Therefore it was concluded that the flatness of the fuel importance distribution is a useful index for estimating the reactivity effect of a non-uniformly distributed fuel system. (author)

  16. The radiation budget of stratocumulus clouds measured by tethered balloon instrumentation: Variability of flux measurements

    Science.gov (United States)

    Duda, David P.; Stephens, Graeme L.; Cox, Stephen K.

    1990-01-01

    Measurements of longwave and shortwave radiation were made using an instrument package on the NASA tethered balloon during the FIRE Marine Stratocumulus experiment. Radiation data from two pairs of pyranometers were used to obtain vertical profiles of the near-infrared and total solar fluxes through the boundary layer, while a pair of pyrgeometers supplied measurements of the longwave fluxes in the cloud layer. The radiation observations were analyzed to determine heating rates and to measure the radiative energy budget inside the stratocumulus clouds during several tethered balloon flights. The radiation fields in the cloud layer were also simulated by a two-stream radiative transfer model, which used cloud optical properties derived from microphysical measurements and Mie scattering theory.

  17. Application of a new importance measure for parametric uncertainty in PSA

    International Nuclear Information System (INIS)

    Poern, K.

    1997-04-01

    The traditional approach to uncertainty analysis in PSA, with propagation of basic event uncertainties through the PSA model, generates as an end product the uncertainty distribution of the top event frequency. This distribution, however, is not of much value for the decision maker. Most decisions are made under uncertainty. What the decision maker needs, to enhance the decision-making quality, is an adequate uncertainty importance measure that indicates on which basic parameters it would be most valuable, as to the quality of the decision making in the specific situation, to procure more information. This paper describes an application of a new measure of uncertainty importance that has been developed in the ongoing joint Nordic project NKS/RAK-1:3. The measure is called 'decision oriented' because it is defined within a decision-theoretic framework. It is defined as the expected value of a certain additional information about each basic parameter, and it utilizes both the system structure and the complete uncertainty distributions of the basic parameters. The measure provides the analyst and the decision maker with diagnostic information pointing to the parameters on which more information would be most valuable to procure in order to enhance the decision-making quality. This uncertainty importance measure must not be confused with the more well-known, traditional importance measures of various kinds that are used to depict the contributions of each basic event or parameter (represented by point values) to the top event frequency. In this study the new measure is demonstrated through a real application on the top event: water overflow through steam generator safety valves caused by steam generator tube rupture. This application object is one of the event sequences that the aforementioned Nordic project has analysed with an integrated approach. The project has been funded by the Swedish Nuclear Power…

  18. Range and number-of-levels effects in derived and stated measures of attribute importance

    NARCIS (Netherlands)

    Verlegh, PWJ; Schifferstein, HNJ; Wittink, DR

    We study how the range of variation and the number of attribute levels affect five measures of attribute importance: full-profile conjoint estimates, ranges in attribute-level attractiveness ratings, regression coefficients, graded paired comparisons, and self-reported ratings. We find that all …

  19. Measures for Administration of the Import of Mechanical and Electronic Products

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    The Measures for the Administration of the Import of Mechanical and Electronic Products, co-formulated by the Ministry of Commerce, the General Administration of Customs, and the General Administration of Quality Supervision, Inspection and Quarantine, was hereby promulgated, and entered into force as of May 1, 2008.

  20. 75 FR 1110 - WTO Dispute Settlement Proceeding Regarding United States-Certain Measures Affecting Imports of...

    Science.gov (United States)

    2010-01-08

    ... OFFICE OF THE UNITED STATES TRADE REPRESENTATIVE [Docket No. WTO/DS399] WTO Dispute Settlement... Organization (``WTO Agreement'') concerning certain measures affecting imports of certain passenger vehicle and light truck tires from China. The request may be found at http://www.wto.org in document WT/DS399/2...

  1. Novel Variable Structure Measurement System with Intelligent Components for Flight Vehicles

    Directory of Open Access Journals (Sweden)

    Shen Kai

    2017-06-01

    The paper presents a method of developing a variable structure measurement system with intelligent components for flight vehicles. In order to find a distinguishing feature of a variable structure, a numerical criterion for selecting measuring sensors is proposed by quantifying the observability of different states of the system. Based on Peter K. Anokhin's theory of functional systems, a mechanism of "action acceptor" is built with intelligent components, e.g. self-organization algorithms. In this mechanism, first, prediction models of system states are constructed using self-organization algorithms; second, the predicted and measured values are compared; third, an optimal structure of the measurement system is determined based on the results of the comparison. According to the results of simulation with practical data and experiments obtained during field tests, the novel measurement system has the properties of high accuracy, reliable operation, and fault tolerance.

  2. Disaggregating measurement uncertainty from population variability and Bayesian treatment of uncensored results

    International Nuclear Information System (INIS)

    Strom, Daniel J.; Joyce, Kevin E.; Maclellan, Jay A.; Watson, David J.; Lynch, Timothy P.; Antonio, Cheryl L.; Birchall, Alan; Anderson, Kevin K.; Zharov, Peter

    2012-01-01

    In making low-level radioactivity measurements of populations, it is commonly observed that a substantial portion of net results are negative. Furthermore, the observed variance of the measurement results arises from a combination of measurement uncertainty and population variability. This paper presents a method for disaggregating measurement uncertainty from population variability to produce a probability density function (PDF) of possibly true results. To do this, simple, justifiable, and reasonable assumptions are made about the relationship of the measurements to the measurands (the 'true values'). The measurements are assumed to be unbiased, that is, that their average value is the average of the measurands. Using traditional estimates of each measurement's uncertainty to disaggregate population variability from measurement uncertainty, a PDF of measurands for the population is produced. Then, using Bayes's theorem, the same assumptions, and all the data from the population of individuals, a prior PDF is computed for each individual's measurand. These PDFs are non-negative, and their average is equal to the average of the measurement results for the population. The uncertainty in these Bayesian posterior PDFs is all Berkson with no remaining classical component. The methods are applied to baseline bioassay data from the Hanford site. The data include 90Sr urinalysis measurements on 128 people, 137Cs in vivo measurements on 5,337 people, and 239Pu urinalysis measurements on 3,270 people. The method produces excellent results for the 90Sr and 137Cs measurements, since there are nonzero concentrations of these global fallout radionuclides in people who have not been occupationally exposed. The method does not work for the 239Pu measurements in non-occupationally exposed people because the population average is essentially zero.
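The disaggregation step described in this record can be sketched with a simplified method-of-moments illustration (not the authors' full Bayesian treatment): the observed variance of the results is the sum of the population variance and the average measurement variance, so subtracting the known measurement variance estimates the population variability:

```python
import random
from statistics import variance

random.seed(1)

# Simulate a population of true values (measurands) and noisy measurements.
true_sd, meas_sd = 2.0, 1.5
measurands = [random.gauss(10.0, true_sd) for _ in range(20000)]
results = [m + random.gauss(0.0, meas_sd) for m in measurands]

# Observed variance combines population variability and measurement uncertainty:
# Var(result) = Var(measurand) + Var(error).  Subtracting the (known)
# measurement variance recovers an estimate of the population variability.
pop_var_est = variance(results) - meas_sd ** 2
print(round(pop_var_est, 2))  # should be near true_sd**2 = 4.0
```

The Bayesian step in the paper then goes further, using this population distribution as a prior to produce a non-negative posterior for each individual's measurand.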

  3. Seasonal variability of the Red Sea, from GRACE time-variable gravity and altimeter sea surface height measurements

    Science.gov (United States)

    Wahr, John; Smeed, David; Leuliette, Eric; Swenson, Sean

    2014-05-01

    Seasonal variability of sea surface height and mass within the Red Sea, occurs mostly through the exchange of heat with the atmosphere and wind-driven inflow and outflow of water through the strait of Bab el Mandab that opens into the Gulf of Aden to the south. The seasonal effects of precipitation and evaporation, of water exchange through the Suez Canal to the north, and of runoff from the adjacent land, are all small. The flow through the Bab el Mandab involves a net mass transfer into the Red Sea during the winter and a net transfer out during the summer. But that flow has a multi-layer pattern, so that in the summer there is actually an influx of cool water at intermediate (~100 m) depths. Thus, summer water in the southern Red Sea is warmer near the surface due to higher air temperatures, but cooler at intermediate depths (especially in the far south). Summer water in the northern Red Sea experiences warming by air-sea exchange only. The temperature profile affects the water density, which impacts the sea surface height but has no effect on vertically integrated mass. Here, we study this seasonal cycle by combining GRACE time-variable mass estimates, altimeter (Jason-1, Jason-2, and Envisat) measurements of sea surface height, and steric sea surface height contributions derived from depth-dependent, climatological values of temperature and salinity obtained from the World Ocean Atlas. We find good consistency, particularly in the northern Red Sea, between these three data types. Among the general characteristics of our results are: (1) the mass contributions to seasonal SSHT variations are much larger than the steric contributions; (2) the mass signal is largest in winter, consistent with winds pushing water into the Red Sea through the Strait of Bab el Mandab in winter, and out during the summer; and (3) the steric signal is largest in summer, consistent with summer sea surface warming.

  4. Impulsive buying tendency: Measuring important relationships with a new perspective and an indigenous scale

    Directory of Open Access Journals (Sweden)

    Anant Jyoti Badgaiyan

    2016-12-01

    With the opening up of the economy and the proliferation of mall culture, the economic relevance of impulsive buying behaviour has assumed significance. Impulsive buying behaviour is better understood by examining the impulsive buying tendency that shapes such behaviour and, since consumer behaviour differs across cultures, by incorporating an indigenous perspective in understanding and measuring the tendency. Studies were conducted to develop an Indian scale for measuring impulsive buying tendency and to validate it by examining its association with other relevant variables. A two-factor, 8-item scale was developed; a significant positive relationship was seen between impulsive buying tendency and impulsive buying behaviour, and a significant inverse relationship was found between impulsive buying tendency and self-control. Results also showed significant relationships between impulsive buying tendency and the two personality constructs of Conscientiousness and Extraversion.

  5. Estimations of natural variability between satellite measurements of trace species concentrations

    Science.gov (United States)

    Sheese, P.; Walker, K. A.; Boone, C. D.; Degenstein, D. A.; Kolonjari, F.; Plummer, D. A.; von Clarmann, T.

    2017-12-01

    In order to validate satellite measurements of atmospheric states, it is necessary to understand the range of random and systematic errors inherent in the measurements. On occasions where the measurements do not agree within those errors, a common "go-to" explanation is that the unexplained difference can be chalked up to "natural variability". However, the expected natural variability is often left ambiguous and rarely quantified. This study will look to quantify the expected natural variability of both O3 and NO2 between two satellite instruments: ACE-FTS (Atmospheric Chemistry Experiment - Fourier Transform Spectrometer) and OSIRIS (Optical Spectrograph and Infrared Imaging System). By sampling the CMAM30 (30-year specified dynamics simulation of the Canadian Middle Atmosphere Model) climate chemistry model throughout the upper troposphere and stratosphere at times and geolocations of coincident ACE-FTS and OSIRIS measurements at varying coincidence criteria, height-dependent expected values of O3 and NO2 variability will be estimated and reported on. The results could also be used to better optimize the coincidence criteria used in satellite measurement validation studies.

  6. Measuring myokines with cardiovascular functions: pre-analytical variables affecting the analytical output.

    Science.gov (United States)

    Lombardi, Giovanni; Sansoni, Veronica; Banfi, Giuseppe

    2017-08-01

    In the last few years, a growing number of molecules have been associated with an endocrine function of the skeletal muscle. Circulating myokine levels, in turn, have been associated with several pathophysiological conditions, including cardiovascular ones. However, data from different studies are often not completely comparable or are even discordant. This is due, at least in part, to the whole set of conditions related to the preparation of the patient prior to blood sampling, the blood sampling procedure, and sample processing and/or storage. This entire process constitutes the pre-analytical phase. The importance of the pre-analytical phase is often not considered; however, in routine diagnostics, 70% of errors occur in this phase. Moreover, errors made during the pre-analytical phase are carried over into the analytical phase and affect the final output. In research, for example, when samples are collected over a long time and by different laboratories, a standardized procedure for sample collection and a correct procedure for sample storage are essential. In this review, we discuss the pre-analytical variables potentially affecting the measurement of myokines with cardiovascular functions.

  7. A study of the effect of measurement error in predictor variables in nondestructive assay

    International Nuclear Information System (INIS)

    Burr, Tom L.; Knepper, Paula L.

    2000-01-01

    It is not widely known that ordinary least squares estimates exhibit bias if there are errors in the predictor variables. For example, enrichment measurements are often fit to two predictors: Poisson-distributed count rates in the region of interest and in the background. Both count rates have at least random variation due to counting statistics, so the parameter estimates will be biased. In this case, the effect of bias is a minor issue because there is almost no interest in the parameters themselves; instead, the parameters will be used to convert count rates into estimated enrichment. In other cases, this bias source is potentially more important. For example, in tomographic gamma scanning, there is an emission stage which depends on predictors (the 'system matrix') that are estimated with error during the transmission stage. In this paper, we provide background information on the impact and treatment of errors in predictors, present results of candidate methods of compensating for the effect, review some of the nondestructive assay situations where errors in predictors occur, and provide guidance for when errors in predictors should be considered in nondestructive assay.
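The bias this abstract describes is classical attenuation: random error in a predictor pulls the OLS slope toward zero by roughly the factor var(x)/(var(x) + var(error)). A minimal simulation sketch (the slope and noise levels are illustrative assumptions, not values from the paper):

```python
import random

random.seed(1)

# True model: y = 2.0 * x, with x later observed through measurement noise.
n = 10_000
true_slope = 2.0
x_true = [random.gauss(10.0, 2.0) for _ in range(n)]      # var(x) = 4
y = [true_slope * x + random.gauss(0.0, 1.0) for x in x_true]

def ols_slope(xs, ys):
    mx = sum(xs) / len(xs)
    my = sum(ys) / len(ys)
    sxy = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    sxx = sum((a - mx) ** 2 for a in xs)
    return sxy / sxx

# Fit against the true predictor, then against an error-contaminated one.
slope_clean = ols_slope(x_true, y)
x_noisy = [x + random.gauss(0.0, 2.0) for x in x_true]    # var(error) = 4
slope_noisy = ols_slope(x_noisy, y)

# Expected attenuation: 2.0 * 4 / (4 + 4) = 1.0, i.e. biased toward zero.
print(round(slope_clean, 2), round(slope_noisy, 2))
```

With equal predictor and error variances, the fitted slope is roughly halved even though the underlying relationship is unchanged.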

  8. Measuring pH variability using an experimental sensor on an underwater glider

    Directory of Open Access Journals (Sweden)

    M. P. Hemming

    2017-05-01

    Autonomous underwater gliders offer the capability of measuring oceanic parameters continuously at high resolution in both vertical and horizontal planes, on timescales that can extend to many months. An experimental ion-sensitive field-effect transistor (ISFET) sensor measuring pH on the total scale was attached to a glider during the REP14-MED experiment in June 2014 in the Sardinian Sea in the northwestern Mediterranean. During the deployment, pH was sampled at depths of up to 1000 m along an 80 km transect over a period of 12 days. Water samples were collected from a nearby ship and analysed for dissolved inorganic carbon concentration and total alkalinity to derive the pH for validating the ISFET sensor measurements. The vertical resolution of the pH sensor was good (1 to 2 m), but stability was poor and the sensor drifted in a non-monotonous fashion. In order to remove the sensor drift, a depth-constant time-varying offset was applied throughout the water column for each dive, reducing the spread of the data by approximately two-thirds. Furthermore, the ISFET sensor required temperature- and pressure-based corrections, which were achieved using linear regression. Correcting for this decreased the apparent sensor pH variability by a further 13 to 31%. Sunlight caused an apparent sensor pH decrease of up to 0.1 in surface waters around local noon, highlighting the importance of shielding the sensor from light in future deployments. The corrected pH from the ISFET sensor is presented along with potential temperature, salinity, potential density anomalies (σθ), and dissolved oxygen concentrations (c(O2)) measured by the glider, providing insights into the physical and biogeochemical variability in the Sardinian Sea. The pH maxima were identified close to the depth of the summer chlorophyll maximum, where high c(O2) values were also found. Longitudinal pH variations at depth (σθ > 28.8 kg m−3) highlighted the variability of
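A depth-constant, per-dive drift correction of the kind this abstract describes can be sketched as subtracting the median misfit between sensor and reference pH for each dive. The exact estimator used in the study may differ, and the values below are invented for illustration:

```python
from statistics import median

# One dive's profile from a drifting pH sensor, with co-located reference
# values (e.g. derived from bottle samples). All numbers are illustrative.
sensor_ph = [7.92, 7.95, 7.98, 8.01, 7.99]
reference_ph = [8.02, 8.05, 8.07, 8.10, 8.09]

# Depth-constant offset for this dive: the median sensor-minus-reference
# misfit, removed uniformly over the whole water column.
offset = median(s - r for s, r in zip(sensor_ph, reference_ph))
corrected = [s - offset for s in sensor_ph]
print(round(offset, 3))
```

Because the offset is recomputed per dive, a slow non-monotonic drift is removed while within-dive (depth-dependent) structure is preserved.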

  9. Variability of carotid artery measurements on 3-Tesla MRI and its impact on sample size calculation for clinical research.

    Science.gov (United States)

    Syed, Mushabbar A; Oshinski, John N; Kitchen, Charles; Ali, Arshad; Charnigo, Richard J; Quyyumi, Arshed A

    2009-08-01

    Carotid MRI measurements are increasingly being employed in research studies for atherosclerosis imaging. The majority of carotid imaging studies use 1.5 T MRI. Our objective was to investigate intra-observer and inter-observer variability in carotid measurements using high-resolution 3 T MRI. We performed 3 T carotid MRI on 10 patients (age 56 ± 8 years, 7 male) with atherosclerosis risk factors and ultrasound intima-media thickness ≥ 0.6 mm. A total of 20 transverse images of both right and left carotid arteries were acquired using a T2-weighted black-blood sequence. The lumen and outer wall of the common carotid and internal carotid arteries were manually traced; vessel wall area, vessel wall volume, and average wall thickness measurements were then assessed for intra-observer and inter-observer variability. Pearson and intraclass correlations were used in these assessments, along with Bland-Altman plots. For inter-observer variability, Pearson correlations ranged from 0.936 to 0.996 and intraclass correlations from 0.927 to 0.991. For intra-observer variability, Pearson correlations ranged from 0.934 to 0.954 and intraclass correlations from 0.831 to 0.948. Calculations showed that inter-observer variability and other sources of error would inflate sample size requirements for a clinical trial by no more than 7.9%, indicating that 3 T MRI is nearly optimal in this respect. In patients with subclinical atherosclerosis, 3 T carotid MRI measurements are highly reproducible, which has important implications for clinical trial design.
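The intraclass correlations this abstract reports can be illustrated with a minimal one-way ICC(1,1) computation. The readings below are invented, and ICC(1,1) is only one of several ICC forms (the abstract does not say which variant the study used):

```python
# Two observers measure vessel wall area (mm^2) on the same 6 scans.
# Values are invented for illustration, not taken from the study.
obs1 = [30.1, 42.5, 35.2, 50.8, 28.9, 45.3]
obs2 = [31.0, 41.8, 36.0, 49.9, 29.5, 44.7]

n = len(obs1)
pairs = list(zip(obs1, obs2))
grand = sum(obs1 + obs2) / (2 * n)

# One-way ANOVA mean squares: between-subject and within-subject (k = 2 raters).
ms_between = sum(2 * ((a + b) / 2 - grand) ** 2 for a, b in pairs) / (n - 1)
ms_within = sum((a - (a + b) / 2) ** 2 + (b - (a + b) / 2) ** 2
                for a, b in pairs) / n

# ICC(1,1) with k = 2: (MSB - MSW) / (MSB + (k - 1) * MSW)
icc = (ms_between - ms_within) / (ms_between + ms_within)
print(round(icc, 3))
```

When between-subject spread dominates the disagreement between observers, as here, the ICC approaches 1.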

  10. Kiloampere, Variable-Temperature, Critical-Current Measurements of High-Field Superconductors.

    Science.gov (United States)

    Goodrich, L F; Cheggour, N; Stauffer, T C; Filla, B J; Lu, X F

    2013-01-01

    We review variable-temperature, transport critical-current (Ic) measurements made on commercial superconductors over a range of critical currents from less than 0.1 A to about 1 kA. We have developed and used a number of systems to make these measurements over the last 15 years. Two exemplary variable-temperature systems with coil sample geometries will be described: a probe that is only variable-temperature and a probe that is variable-temperature and variable-strain. The most significant challenge for these measurements is temperature stability, since large amounts of heat can be generated by the flow of high current through the resistive sample fixture. Therefore, a significant portion of this review is focused on the reduction of temperature errors to less than ±0.05 K in such measurements. A key feature of our system is a pre-regulator that converts a flow of liquid helium to gas and heats the gas to a temperature close to the target sample temperature. The pre-regulator is not in close proximity to the sample and it is controlled independently of the sample temperature. This allows us to independently control the total cooling power, and thereby fine-tune the sample cooling power at any sample temperature. The same general temperature-control philosophy is used in all of our variable-temperature systems, but the addition of another variable, such as strain, forces compromises in design and results in some differences in operation and protocol. These aspects are analyzed to assess the extent to which the protocols for our systems might be generalized to other systems at other laboratories. Our approach to variable-temperature measurements is also placed in the general context of measurement-system design, and the perceived advantages and disadvantages of design choices are presented. To verify the accuracy of the variable-temperature measurements, we compared critical-current values obtained on a specimen immersed in liquid helium ("liquid" or Ic,liq) at 5

  11. Challenge in Enhancing the Teaching and Learning of Variable Measurements in Quantitative Research

    Science.gov (United States)

    Kee, Chang Peng; Osman, Kamisah; Ahmad, Fauziah

    2013-01-01

    Statistical analysis is one component that cannot be avoided in quantitative research. Initial observations noted that students in higher education institutions faced difficulty analysing quantitative data, which was attributed to confusion over the various variable measurement scales. This paper aims to compare the outcomes of two approaches applied in…

  12. Endotracheal temperature and humidity measurements in laryngectomized patients: intra- and inter-patient variability

    NARCIS (Netherlands)

    Scheenstra, R.J.; Muller, S.H.; Vincent, A.; Sinaasappel, M.; Zuur, J.K.; Hilgers, F.J.M.

    2009-01-01

    This study assesses intra- and inter-patient variability in endotracheal climate (temperature and humidity) and effects of heat and moister exchangers (HME) in 16 laryngectomized individuals, measured repeatedly (N = 47). Inhalation Breath Length (IBL) was 1.35 s without HME and 1.05 s with HME (P <

  14. Measure Guideline. Replacing Single-Speed Pool Pumps with Variable Speed Pumps for Energy Savings

    Energy Technology Data Exchange (ETDEWEB)

    Hunt, A. [Building Media and the Building America Retrofit Alliance (BARA), Wilmington, DE (United States); Easley, S. [Building Media and the Building America Retrofit Alliance (BARA), Wilmington, DE (United States)

    2012-05-01

    This measure guideline evaluates the potential energy savings from replacing traditional single-speed pool pumps with variable speed pool pumps, and provides a basic cost comparison between continued use of traditional pumps versus new pumps. A simple step-by-step process for inspecting the pool area and installing a new pool pump follows.
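The savings mechanism behind this class of retrofit rests on the pump affinity laws: flow scales roughly with pump speed while shaft power scales roughly with its cube, so moving the same water volume more slowly costs much less energy. A minimal arithmetic sketch (all power figures and runtimes are illustrative assumptions, not values from the guideline):

```python
# Idealized pump affinity laws: flow ~ speed, power ~ speed^3.
# Running at half speed for twice as long circulates the same volume
# for roughly a quarter of the energy. Figures below are illustrative.

full_speed_kw = 1.5          # single-speed pump input power (assumed)
hours_per_day = 8.0          # runtime at full speed (assumed)
speed_fraction = 0.5         # variable-speed setting

low_speed_kw = full_speed_kw * speed_fraction ** 3
low_hours = hours_per_day / speed_fraction     # same daily volume circulated

daily_single = full_speed_kw * hours_per_day   # kWh/day, single-speed
daily_variable = low_speed_kw * low_hours      # kWh/day, variable-speed
savings_pct = 100 * (1 - daily_variable / daily_single)
print(daily_single, daily_variable, round(savings_pct))
```

Real savings are smaller than this idealized 75% because motor efficiency falls at low speed and some loads (cleaners, heaters) still need higher flow, but the cubic law is why the measure pays back.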

  15. COMPARISON OF EXPERIMENTAL-DESIGNS COMBINING PROCESS AND MIXTURE VARIABLES .2. DESIGN EVALUATION ON MEASURED DATA

    NARCIS (Netherlands)

    DUINEVELD, C. A. A.; Smilde, A. K.; Doornbos, D. A.

    1993-01-01

    The construction of a small experimental design for a combination of process and mixture variables is a problem which has not been solved completely by now. In a previous paper we evaluated some designs with theoretical measures. This second paper evaluates the capabilities of the best of these

  17. Soil temperature variability in complex terrain measured using fiber-optic distributed temperature sensing

    Science.gov (United States)

    Soil temperature (Ts) exerts critical controls on hydrologic and biogeochemical processes but magnitude and nature of Ts variability in a landscape setting are rarely documented. Fiber optic distributed temperature sensing systems (FO-DTS) potentially measure Ts at high density over a large extent. ...

  18. Variability in measured current structure on the southwest continental shelf of India

    Digital Repository Service at National Institute of Oceanography (India)

    DineshKumar, P.K.; Srinivas, K.

    Key words: direct current measurements, tidal currents, southwest coast of India. The circulation pattern of the eastern Arabian Sea over the southwest continental shelf of India (inferred

  19. Spatial Variability Analysis of Within-Field Winter Wheat Nitrogen and Grain Quality Using Canopy Fluorescence Sensor Measurements

    Directory of Open Access Journals (Sweden)

    Xiaoyu Song

    2017-03-01

    Wheat grain protein content (GPC) is a key component when evaluating wheat nutrition. It is also important to determine wheat GPC before harvest for agricultural and food-processing enterprises in order to optimize the wheat grading process. Wheat GPC across a field is spatially variable due to the inherent variability of soil properties and position in the landscape. The objectives of this field study were: (i) to assess the spatial and temporal variability of wheat nitrogen (N) attributes related to the grain quality of winter wheat production through canopy fluorescence sensor measurements; and (ii) to examine the influence of the spatial variability of soil N and moisture across different growth stages on wheat grain quality. A geostatistical approach was used to analyze data collected from 110 georeferenced locations. In particular, Ordinary Kriging Analysis (OKA) was used to produce maps of wheat GPC, GPC yield, and wheat canopy fluorescence parameters, including the simple fluorescence ratio and Nitrogen Balance Indices (NBI). Soil nitrate-nitrogen (NO3-N) content and soil Time Domain Reflectometry (TDR) values in the study field were also interpolated through the OKA method. The fluorescence parameter maps, soil NO3-N and soil TDR maps obtained from the OKA output were compared with the wheat GPC and GPC yield maps in order to assess their relationships. The results of this study indicate that the NBI spatial variability map in the late stage of wheat growth can be used to distinguish areas that produce higher GPC.
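Ordinary Kriging, the interpolation used in this study, starts from an empirical semivariogram: half the mean squared difference between values at pairs of locations separated by a given lag distance. A minimal sketch with invented sample points (a full analysis would then fit a variogram model and solve the kriging system, which is omitted here):

```python
import math

# Invented georeferenced samples: (x, y) locations and a measured value
# (e.g. a fluorescence parameter). Not data from the study.
pts = [(0, 0), (1, 0), (0, 1), (2, 1), (1, 2), (3, 3)]
vals = [12.0, 12.5, 11.8, 13.4, 12.9, 14.8]

def semivariogram(lag, tol=0.5):
    """Empirical semivariance for pairs whose separation is within tol of lag."""
    contribs = []
    for i in range(len(pts)):
        for j in range(i + 1, len(pts)):
            d = math.dist(pts[i], pts[j])
            if abs(d - lag) <= tol:
                contribs.append(0.5 * (vals[i] - vals[j]) ** 2)
    return sum(contribs) / len(contribs) if contribs else float("nan")

# Semivariance typically rises with lag: nearby points are more alike.
for lag in (1.0, 2.0, 4.0):
    print(lag, round(semivariogram(lag), 3))
```

The rising semivariance with distance is the spatial autocorrelation that makes kriged maps of GPC and NBI meaningful.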

  20. Continuous performance task in ADHD: Is reaction time variability a key measure?

    Science.gov (United States)

    Levy, Florence; Pipingas, Andrew; Harris, Elizabeth V; Farrow, Maree; Silberstein, Richard B

    2018-01-01

    To compare the use of Continuous Performance Task (CPT) reaction time variability (intraindividual variability, or the standard deviation of reaction time) as a measure of vigilance in attention-deficit hyperactivity disorder (ADHD), and of stimulant medication response, utilizing a simple CPT X-task vs an A-X-task. Comparative analyses of two separate X-task vs A-X-task data sets, and subgroup analyses of performance on and off medication, were conducted. CPT X-task reaction time variability had a direct relationship to ADHD clinician severity ratings, unlike the CPT A-X-task. Variability in X-task performance was reduced by medication compared with the children's unmedicated performance, but this effect did not reach significance. When the coefficient of variation was applied, severity measures and medication response were significant for the X-task, but not for the A-X-task. The CPT X-task is a useful clinical screening test for ADHD and medication response. In particular, reaction time variability is related to default mode interference. The A-X-task is less useful in this regard.
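The two dispersion indices compared in this abstract differ only in normalization: the coefficient of variation divides the standard deviation of reaction time by the mean, removing overall-speed differences between children, which is one reason results can differ between the indices. A minimal sketch with invented reaction times:

```python
from statistics import mean, stdev

# Reaction times (ms) from one hypothetical CPT X-task session.
# Values are invented for illustration.
rts = [412, 388, 455, 630, 402, 397, 580, 410, 395, 620]

sd_rt = stdev(rts)          # intraindividual variability (SD of RT)
cv = sd_rt / mean(rts)      # coefficient of variation: SD scaled by mean RT
print(round(sd_rt, 1), round(cv, 3))
```

Two children with identical CVs can have very different raw SDs if one responds more slowly overall, so the choice of index matters when relating variability to severity ratings.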

  1. Importance measures in risk-informed decision making: Ranking, optimisation and configuration control

    Energy Technology Data Exchange (ETDEWEB)

    Vaurio, Jussi K., E-mail: jussi.vaurio@pp1.inet.fi [Prometh Solutions, Hiihtaejaenkuja 3K, 06100 Porvoo (Finland)

    2011-11-15

    This paper describes roles, extensions and applications of importance measures of components and configurations for making risk-informed decisions relevant to system operations, maintenance and safety. Basic importance measures and their relationships are described for independent and mutually exclusive events and for groups of events associated with common cause failures. The roles of importances are described mainly in two groups of activities: (a) ranking safety significance of systems, structures, components and human actions for preventive safety assurance activities, and (b) making decisions about permissible permanent and temporary configurations and allowed configuration times for regulation, technical specifications and for on-line risk monitoring. Criticality importance and sums of criticalities turn out to be appropriate measures for ranking and optimization. Several advantages are pointed out and consistent ranking of pipe segments for in-service inspection is provided as an example. Risk increase factor and its generalization risk gain are most appropriately used to assess corrective priorities and acceptability of a situation when components are already failed or when planning to take one or more components out of service for maintenance. Precise definitions are introduced for multi-failure configurations and it is shown how they can be assessed under uncertainties, in particular when common cause failures or success states may be involved. A general weighted average method is compared to other candidate methods in benchmark cases. It is the preferable method for prediction when a momentary configuration is known or only partially known. Potential applications and optimization of allowed outage times are described. The results show how to generalize and apply various importance measures to ranking and optimization and how to manage configurations in uncertain multi-failure situations. - Highlights: > Rigorous methods developed for using importances

  3. Comparing measured and modelled soil carbon: which site-specific variables are linked to high stability?

    Science.gov (United States)

    Robertson, Andy; Schipanski, Meagan; Ma, Liwang; Ahuja, Lajpat; McNamara, Niall; Smith, Pete; Davies, Christian

    2016-04-01

    Changes in soil carbon (C) stocks have been studied in depth over the last two decades, as net greenhouse gas (GHG) sinks are highlighted as a partial solution to the causes of climate change. However, the stability of this soil C is often overlooked when measuring these changes. Ultimately, a net sequestration in soils is far less beneficial if labile C is replacing more stable forms. To date there is no accepted framework for measuring soil C stability, and as a result there is considerable uncertainty associated with the simulated impacts of land management and land use change when using process-based systems models. However, a recent effort to equate measurable soil C fractions to model pools has generated data that help to assess the impacts of land management, and can ultimately help to reduce the uncertainty of model predictions. Our research compiles this existing fractionation data along with site metadata to create a simple statistical model able to quantify the relative importance of different site-specific conditions. Data were mined from 23 published studies and combined with original data to generate a dataset of 100+ land use change sites across Europe. For sites to be included, they required soil C fractions isolated using the Zimmermann et al. (2007) method and specific site metadata (mean annual precipitation, MAP; mean annual temperature, MAT; soil pH; land use; altitude). Of the sites, 75% were used to develop a generalized linear mixed model (GLMM), creating coefficients with which site parameters can be used to predict their influence on the measured soil fraction C stocks. The remaining 25% of sites were used to evaluate uncertainty and validate this empirical model. Further, four of the aforementioned sites were used to simulate soil C dynamics using the RothC, DayCent and RZWQM2 models.
A sensitivity analysis (4096 model runs for each variable applying Latin hypercube random sampling techniques) was then used to observe whether these models place

  4. Importance of the variability of hydrographic preconditioning for deep convection in the Gulf of Lion, NW Mediterranean

    Directory of Open Access Journals (Sweden)

    L. Grignon

    2010-06-01

    We study the variability of hydrographic preconditioning defined as the heat and salt contents in the Ligurian Sea before convection. The stratification is found to reach a maximum in the intermediate layer in December, whose causes and consequences for the interannual variability of convection are investigated. Further study of the interannual variability and correlation tests between the properties of the deep water formed and the winter surface fluxes support the description of convection as a process that transfers the heat and salt contents from the top and intermediate layers to the deep layer. A proxy for the rate of transfer is given by the final convective mixed layer depth, that is shown to depend equally on the surface fluxes and on the preconditioning. In particular, it is found that deep convection in winter 2004–2005 would have happened even with normal winter conditions, due to low pre-winter stratification.

  5. Applying Importance-Performance Analysis as a Service Quality Measure in Food Service Industry

    OpenAIRE

    Tzeng, Gwo-Hshiung; Chang, Hung-Fan

    2011-01-01

    As the global economy becomes service oriented, food service accounts for over 20% of service revenue, with an annual growth rate of more than 3%. Compared to physical products, service features are intangible, and production and sale occur simultaneously. It is not easy to measure the performance of a service. Therefore, the service quality of catering services is considered to be an important topic of service management. According to Market Intelligence & Consulting Institute (M...

  6. The importance of risk-aversion as a measurable psychological parameter governing risk-taking behaviour

    Science.gov (United States)

    Thomas, P. J.

    2013-09-01

    A utility function with risk-aversion as its sole parameter is developed and used to examine the well-known psychological phenomenon, whereby risk averse people adopt behavioural strategies that are extreme and apparently highly risky. The pioneering work of the psychologist, John W. Atkinson, is revisited, and utility theory is used to extend his mathematical model. His explanation of the psychology involved is improved by regarding risk-aversion not as a discrete variable with three possible states: risk averse, risk neutral and risk confident, but as continuous and covering a large range. A probability distribution is derived, the "motivational density", to describe the process of selecting tasks of different degrees of difficulty. An assessment is then made of practicable methods for measuring risk-aversion.
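The paper develops its own single-parameter utility function; as a generic illustration of how a single risk-aversion parameter governs choices under uncertainty, a constant-absolute-risk-aversion (CARA) utility (explicitly not the paper's function) and its certainty equivalent can be sketched:

```python
import math

# CARA utility with risk-aversion g > 0: u(x) = (1 - exp(-g*x)) / g.
# This is a standard textbook form, used here only as an illustration.
def u(x, g):
    return (1.0 - math.exp(-g * x)) / g

def certainty_equivalent(outcomes, probs, g):
    """Sure amount whose utility equals the gamble's expected utility."""
    eu = sum(p * u(x, g) for x, p in zip(outcomes, probs))
    # Invert u: x = -ln(1 - g*eu) / g
    return -math.log(1.0 - g * eu) / g

# A 50/50 gamble between 0 and 100: the more risk-averse the agent,
# the further the certainty equivalent falls below the mean of 50.
for g in (0.01, 0.05, 0.1):
    print(g, round(certainty_equivalent([0.0, 100.0], [0.5, 0.5], g), 1))
```

Treating risk-aversion as a continuous parameter, as the paper advocates, means behaviour like this certainty-equivalent curve varies smoothly rather than jumping between "averse", "neutral" and "confident" states.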

  8. Association between different measurements of blood pressure variability by ABP monitoring and ankle-brachial index.

    Science.gov (United States)

    Wittke, Estefânia; Fuchs, Sandra C; Fuchs, Flávio D; Moreira, Leila B; Ferlin, Elton; Cichelero, Fábio T; Moreira, Carolina M; Neyeloff, Jeruza; Moreira, Marina B; Gus, Miguel

    2010-11-05

    Blood pressure (BP) variability has been associated with cardiovascular outcomes, but there is no consensus about the most effective method to measure it by ambulatory blood pressure monitoring (ABPM). We evaluated the association between three different methods of estimating BP variability by ABPM and the ankle-brachial index (ABI). In a cross-sectional study of patients with hypertension, BP variability was estimated by the time rate index (the first derivative of SBP over time), the standard deviation (SD) of 24-hour SBP, and the coefficient of variability of 24-hour SBP. ABI was measured with a Doppler probe. The sample included 425 patients with a mean age of 57 ± 12 years, of whom 69.2% were women, 26.1% current smokers and 22.1% diabetics. Abnormal ABI (≤ 0.90 or ≥ 1.40) was present in 58 patients. The time rate index was 0.516 ± 0.146 mmHg/min in patients with abnormal ABI versus 0.476 ± 0.124 mmHg/min in patients with normal ABI (P = 0.007). In a logistic regression model, the time rate index was associated with ABI regardless of age (OR = 6.9, 95% CI = 1.1-42.1; P = 0.04). In a multiple linear regression model adjusting for age, SBP and diabetes, the time rate index was strongly associated with ABI (P < 0.01). None of the other indexes of BP variability were associated with ABI in univariate or multivariate analyses. The time rate index is a sensitive method to measure BP variability by ABPM. Its performance for risk stratification of patients with hypertension should be explored in longitudinal studies.
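The time rate index is defined as the first derivative of SBP over time; on discrete ABPM readings it can be approximated as the mean absolute change in SBP per minute between successive readings. A minimal sketch with invented readings (the study's exact computation may differ in detail):

```python
# Illustrative ABPM readings: minutes since start and systolic BP (mmHg).
times_min = [0, 20, 40, 60, 80, 100]
sbp = [128, 135, 131, 140, 137, 129]

# Mean absolute first difference of SBP per minute between readings.
rates = [abs(sbp[i + 1] - sbp[i]) / (times_min[i + 1] - times_min[i])
         for i in range(len(sbp) - 1)]
time_rate_index = sum(rates) / len(rates)   # mmHg/min
print(round(time_rate_index, 3))
```

Unlike the 24-hour SD, this index is sensitive to how rapidly pressure changes, not just how widely it is spread, which may explain its distinct association with ABI.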

  9. Measuring the importance of health domains in psoriasis – discrete choice experiment versus rating scales

    Directory of Open Access Journals (Sweden)

    Gutknecht M

    2018-03-01

    Background: Psoriasis affects different aspects of health-related quality of life (e.g., physical, psychological, and social impairments); these health domains can be of different importance for patients. The importance of domains can be measured with the Patient Benefit Index (PBI). This questionnaire weights the achievement of treatment goals on Likert scales (0, "not important at all", to 4, "very important") using the Patient Needs Questionnaire (PNQ). Treatment goals assessed with the PBI have been assigned to five health domains; the importance of each domain can be calculated as the average importance of the respective treatment goals. In this study, the PBI approach to deriving importance weights is contrasted with a discrete choice experiment (DCE), in order to determine the importance of health domains in psoriasis and to find out whether the resulting weights differ between the two methods. Methods: Adult patients with psoriasis completed both questionnaires (PNQ, DCE). The PBI domains were used as attributes in the DCE with the levels "did not help at all", "helped moderately", and "helped a lot". Results: Using the DCE, "improving physical functioning" was the most important health domain, followed by "improving psychological well-being". Using the PNQ, these domains were ranked in positions two and three, following "strengthening confidence in the therapy and in a possible healing". The latter

  10. Mutually unbiased coarse-grained measurements of two or more phase-space variables

    Science.gov (United States)

    Paul, E. C.; Walborn, S. P.; Tasca, D. S.; Rudnicki, Łukasz

    2018-05-01

    Mutual unbiasedness of the eigenstates of phase-space operators—such as position and momentum, or their standard coarse-grained versions—exists only in the limiting case of infinite squeezing. In Phys. Rev. Lett. 120, 040403 (2018), 10.1103/PhysRevLett.120.040403, it was shown that mutual unbiasedness can be recovered for periodic coarse graining of these two operators. Here we investigate mutual unbiasedness of coarse-grained measurements for more than two phase-space variables. We show that mutual unbiasedness can be recovered between periodic coarse graining of any two nonparallel phase-space operators. We illustrate these results through optics experiments, using the fractional Fourier transform to prepare and measure mutually unbiased phase-space variables. The differences between two and three mutually unbiased measurements is discussed. Our results contribute to bridging the gap between continuous and discrete quantum mechanics, and they could be useful in quantum-information protocols.

  11. Importance of education and competence maintenance in metrology field (measurement science)

    International Nuclear Information System (INIS)

    Dobiliene, J; Meskuotiene, A

    2015-01-01

    For certain tasks in the field of metrology, trained employees may be necessary to fulfil specific requirements. It is important to note that metrologists are responsible for the proper functioning of devices belonging to a wide spectrum of measurement fields. People who perform measurements related to our safety, security or everyday life must be able to rely on the trueness of their results and conclusions, provided their measuring instruments are reliable. So, with the purpose of reaching harmony between the ordinary man and the means he uses, it is very important to ensure the competence of the specialists responsible for implementing that harmony. Usually these specialists have a university degree and perform highly specialized tasks in science, industry or laboratories. Their tasks are quite narrow: for example, type approval of measuring instruments, or calibration and verification. Because the number of such employees and their tasks is relatively small in the field of legal metrology, this paper focuses on the significance of training and qualification of legal metrology officers.

  12. Observer variability of absolute and relative thrombus density measurements in patients with acute ischemic stroke

    International Nuclear Information System (INIS)

    Santos, Emilie M.M.; Yoo, Albert J.; Beenen, Ludo F.; Majoie, Charles B.; Berkhemer, Olvert A.; Blanken, Mark D. den; Wismans, Carrie; Niessen, Wiro J.; Marquering, Henk A.

    2016-01-01

    Thrombus density may be a predictor for acute ischemic stroke treatment success. However, only limited data on observer variability for thrombus density measurements exist. This study assesses the variability and bias of four common thrombus density measurement methods by expert and non-expert observers. For 132 consecutive patients with acute ischemic stroke, three experts and two trained observers determined thrombus density by placing three standardized regions of interest (ROIs) in the thrombus and corresponding contralateral arterial segment. Subsequently, absolute and relative thrombus densities were determined using either one or three ROIs. Intraclass correlation coefficient (ICC) was determined, and Bland-Altman analysis was performed to evaluate interobserver and intermethod agreement. Accuracy of the trained observer was evaluated with a reference expert observer using the same statistical analysis. The highest interobserver agreement was obtained for absolute thrombus measurements using three ROIs (ICCs ranging from 0.54 to 0.91). In general, interobserver agreement was lower for relative measurements, and for using one instead of three ROIs. Interobserver agreement of trained non-experts and experts was similar. Accuracy of the trained observer measurements was comparable to the expert interobserver agreement and was better for absolute measurements and with three ROIs. The agreement between the one ROI and three ROI methods was good. Absolute thrombus density measurement has superior interobserver agreement compared to relative density measurement. Interobserver variation is smaller when multiple ROIs are used. Trained non-expert observers can accurately and reproducibly assess absolute thrombus densities using three ROIs. (orig.)
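The agreement statistics named in this record (intraclass correlation and Bland-Altman analysis) can be sketched for the simple two-observer case as follows. This is a minimal illustration, not the study's actual pipeline: the density readings are invented, and a one-way random-effects ICC(1,1) is assumed, whereas the paper does not specify its ICC form.

```python
import numpy as np

# Hypothetical paired thrombus density readings (HU) from two observers, one row per patient.
readings = np.array([[50, 52], [60, 59], [70, 71], [80, 78], [90, 91], [55, 56]], float)
n, k = readings.shape

# One-way random-effects ICC(1,1) from ANOVA mean squares.
subject_means = readings.mean(axis=1)
grand_mean = readings.mean()
ms_between = k * np.sum((subject_means - grand_mean) ** 2) / (n - 1)
ms_within = np.sum((readings - subject_means[:, None]) ** 2) / (n * (k - 1))
icc = (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

# Bland-Altman bias and 95% limits of agreement between the two observers.
diffs = readings[:, 0] - readings[:, 1]
bias = diffs.mean()
loa = (bias - 1.96 * diffs.std(ddof=1), bias + 1.96 * diffs.std(ddof=1))
print(f"ICC = {icc:.3f}, bias = {bias:.2f} HU, LoA = ({loa[0]:.2f}, {loa[1]:.2f})")
```

With real data, the same skeleton extends to three ROIs per thrombus by averaging the ROI densities per observer before computing agreement.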

  13. Observer variability of absolute and relative thrombus density measurements in patients with acute ischemic stroke

    Energy Technology Data Exchange (ETDEWEB)

    Santos, Emilie M.M. [Erasmus MC - University Medical Center Rotterdam, Department of Radiology, P.O. Box 2040, Rotterdam (Netherlands); Department of Radiology, AMC, Amsterdam (Netherlands); Yoo, Albert J. [Texas Stroke Institute, Plano, TX (United States); Beenen, Ludo F.; Majoie, Charles B. [Department of Radiology, AMC, Amsterdam (Netherlands); Berkhemer, Olvert A. [Department of Radiology, AMC, Amsterdam (Netherlands); Department of Neurology, Erasmus MC, Rotterdam (Netherlands); Blanken, Mark D. den; Wismans, Carrie [AMC, Department of Biomedical Engineering and Physics, Amsterdam (Netherlands); Niessen, Wiro J. [Erasmus MC - University Medical Center Rotterdam, Department of Radiology, P.O. Box 2040, Rotterdam (Netherlands); Delft University of Technology, Faculty of Applied Sciences, Delft (Netherlands); Marquering, Henk A. [Department of Radiology, AMC, Amsterdam (Netherlands); AMC, Department of Biomedical Engineering and Physics, Amsterdam (Netherlands); Collaboration: on behalf of the MR CLEAN investigators

    2016-02-15

    Thrombus density may be a predictor for acute ischemic stroke treatment success. However, only limited data on observer variability for thrombus density measurements exist. This study assesses the variability and bias of four common thrombus density measurement methods by expert and non-expert observers. For 132 consecutive patients with acute ischemic stroke, three experts and two trained observers determined thrombus density by placing three standardized regions of interest (ROIs) in the thrombus and corresponding contralateral arterial segment. Subsequently, absolute and relative thrombus densities were determined using either one or three ROIs. Intraclass correlation coefficient (ICC) was determined, and Bland-Altman analysis was performed to evaluate interobserver and intermethod agreement. Accuracy of the trained observer was evaluated with a reference expert observer using the same statistical analysis. The highest interobserver agreement was obtained for absolute thrombus measurements using three ROIs (ICCs ranging from 0.54 to 0.91). In general, interobserver agreement was lower for relative measurements, and for using one instead of three ROIs. Interobserver agreement of trained non-experts and experts was similar. Accuracy of the trained observer measurements was comparable to the expert interobserver agreement and was better for absolute measurements and with three ROIs. The agreement between the one ROI and three ROI methods was good. Absolute thrombus density measurement has superior interobserver agreement compared to relative density measurement. Interobserver variation is smaller when multiple ROIs are used. Trained non-expert observers can accurately and reproducibly assess absolute thrombus densities using three ROIs. (orig.)

  14. Imported brucellosis in Denmark: Molecular identification and multiple-locus variable number tandem repeat analysis (MLVA) genotyping of the bacteria

    DEFF Research Database (Denmark)

    Aftab, H.; Dargis, R.; Christensen, J. J.

    2011-01-01

    A polymerase chain reaction was used to identify Brucella species isolated from humans in Denmark. Consecutive analysis of referred bacteria and re-examination of historical isolates identified all as Brucella melitensis. Multiple-locus variable number tandem repeat analysis (MLVA) placed the isolates in the previously defined 'East Mediterranean' B. melitensis group.

  15. The reproducibility and variability of sequential left ventricular ejection fraction measurements by the nuclear stethoscope

    International Nuclear Information System (INIS)

    Kurata, Chinori; Hayashi, Hideharu; Kobayashi, Akira; Yamazaki, Noboru

    1986-01-01

    We evaluated the reproducibility and variability of sequential left ventricular ejection fraction (LVEF) measurements by the nuclear stethoscope in 72 patients. The group as a whole demonstrated excellent reproducibility (r = 0.96). However, repeat LVEF measurements by the nuclear stethoscope at 5-minute intervals showed an absolute difference of around 9 %, at the 95 % confidence level, from one measurement to the next. This finding indicates that a change in LVEF greater than 9 % is necessary for determining an acute effect of an intervention in individual cases. (author)
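The 9 % figure quoted above corresponds to a repeatability coefficient, conventionally 1.96 times the standard deviation of test-retest differences. A minimal sketch of that calculation, with invented paired LVEF values rather than the study's data:

```python
import numpy as np

# Hypothetical paired LVEF measurements (%) taken 5 minutes apart in the same patients.
first = np.array([55.0, 62.0, 48.0, 70.0, 35.0, 60.0])
second = np.array([58.0, 59.0, 52.0, 66.0, 38.0, 63.0])

diffs = second - first
# Repeatability coefficient: ~95% of repeat differences fall within this bound.
repeatability = 1.96 * diffs.std(ddof=1)
print(f"repeatability coefficient = {repeatability:.1f} LVEF points")
```

A change smaller than this coefficient in a single patient cannot be distinguished from measurement noise.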

  16. Photoplethysmography pulse rate variability as a surrogate measurement of heart rate variability during non-stationary conditions

    International Nuclear Information System (INIS)

    Gil, E; Orini, M; Bailón, R; Laguna, P; Vergara, J M; Mainardi, L

    2010-01-01

    In this paper we assessed the possibility of using the pulse rate variability (PRV) extracted from the photoplethysmography signal as an alternative measurement of the heart rate variability (HRV) signal in non-stationary conditions. The study is based on analysis of the changes observed during a tilt table test in the heart rate modulation of 17 young subjects. First, the classical indices of HRV analysis were compared to the indices from PRV in intervals where stationarity was assumed. Second, the time-varying spectral properties of both signals were compared by time-frequency (TF) and TF coherence analysis. Third, the effect of replacing HRV with PRV in the assessment of the changes of the autonomic modulation of the heart rate was considered. Time-invariant HRV and PRV indices showed no statistically significant differences (p > 0.05) and high correlation (>0.97). Time-frequency analysis revealed that the TF spectra of both signals were highly correlated (0.99 ± 0.01); the difference between the instantaneous power in the LF and HF bands obtained from HRV and PRV was small (<10⁻³ s⁻²); their temporal patterns were highly correlated (0.98 ± 0.04 and 0.95 ± 0.06 in the LF and HF bands, respectively); and TF coherence in the LF and HF bands was high (0.97 ± 0.04 and 0.89 ± 0.08, respectively). Finally, the instantaneous power in the LF band was observed to significantly increase during head-up tilt by both HRV and PRV analysis. These results suggest that although some differences exist in the time-varying spectral indices extracted from HRV and PRV, mainly in the HF band associated with respiration, PRV could be used as a surrogate of HRV during non-stationary conditions, at least during the tilt table test
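The LF and HF band powers discussed above can be sketched with a plain-FFT periodogram on an evenly resampled tachogram. This is a simplified illustration, not the paper's time-frequency method: the synthetic signal, the 4 Hz resampling rate, and the band limits (0.04-0.15 Hz LF, 0.15-0.4 Hz HF) follow common HRV conventions.

```python
import numpy as np

fs = 4.0                      # resampling rate of the RR-interval series (Hz)
n = 1200                      # 300 s of data
t = np.arange(n) / fs
# Synthetic tachogram: 0.8 s mean RR with an LF (0.1 Hz) and an HF (0.25 Hz) oscillation.
rr = 0.8 + 0.05 * np.sin(2 * np.pi * 0.1 * t) + 0.03 * np.sin(2 * np.pi * 0.25 * t)

x = rr - rr.mean()
psd = (np.abs(np.fft.rfft(x)) ** 2) / (fs * n)   # one-sided periodogram (s^2/Hz)
psd[1:-1] *= 2                                   # fold negative frequencies
freqs = np.fft.rfftfreq(n, d=1 / fs)
df = freqs[1] - freqs[0]

lf_power = psd[(freqs >= 0.04) & (freqs < 0.15)].sum() * df
hf_power = psd[(freqs >= 0.15) & (freqs < 0.40)].sum() * df
print(f"LF = {lf_power:.2e} s^2, HF = {hf_power:.2e} s^2, LF/HF = {lf_power / hf_power:.2f}")
```

Because the test tones sit on exact FFT bins, the recovered band powers match the analytic sinusoid variances (amplitude squared over two) closely.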

  17. Measuring Zonal Transport Variability of the Antarctic Circumpolar Current Using GRACE Ocean Bottom Pressure

    Science.gov (United States)

    Makowski, J.; Chambers, D. P.; Bonin, J. A.

    2012-12-01

    Previous studies have suggested that ocean bottom pressure (OBP) can be used to measure the transport variability of the Antarctic Circumpolar Current (ACC). Using OBP data from the JPL ECCO model and the Gravity Recovery and Climate Experiment (GRACE), we examine the zonal transport variability of the ACC integrated between the major fronts from 2003 to 2010. The JPL ECCO data are used to determine average front positions for the time period studied, as well as where transport is mainly zonal. Statistical analysis will be conducted to determine the uncertainty of the GRACE observations using a simulated data set. We will also begin looking at low-frequency changes and how coherent transport variability is from region to region of the ACC. Correlations between bottom pressure south of the ACC and the average basin transports will also be calculated to determine the feasibility of using bottom pressure south of the ACC as a means of describing ACC dynamics and transport.

  18. Interobserver and Intraobserver Variability among Measurements of FDG PET/CT Parameters in Pulmonary Tumors

    Directory of Open Access Journals (Sweden)

    Gülgün Büyükdereli

    2016-06-01

    Full Text Available Background: 18F-fluorodeoxyglucose (FDG) positron emission tomography/computed tomography (PET/CT) provides information about the metabolic and morphologic status of malignancies. Tumor size and standardized uptake value (SUV) measurements are crucial for cancer treatment monitoring. Aims: The purpose of our study was to assess the variability of these measurements performed by observers evaluating lung tumors. Study Design: Retrospective cross-sectional study. Methods: FDG PET/CT images of 97 patients with pulmonary tumors were independently evaluated by two experienced nuclear medicine physicians. Primary tumor size (UDCT), maximum SUV (SUVmax), mean SUV (SUVmean) and maximum SUV normalized to liver mean SUV (SUVnliv max) were measured by each observer at two different times, with an interval of at least 2 weeks. Interobserver and intraobserver variabilities of the measurements were evaluated through statistical methods. Results: The size of the lesions varied from 0.81 to 13.6 cm (mean 4.29±2.24 cm). Very good agreement was shown with correlation, Bland-Altman and regression analysis for all measured PET/CT parameters. In the interobserver and intraobserver variability analyses, the Pearson correlation coefficients were greater than 0.96 and 0.98, respectively. Conclusion: Semi-quantitative measurements of pulmonary tumors were highly reproducible when determined by experienced physicians with clinically available software for routine FDG PET/CT evaluation. Consistency may be improved if the same observer performs serial measurements for any one patient.

  19. Importance of Performance Measurement and MCH Epidemiology Leadership to Quality Improvement Initiatives at the National, State and Local Levels.

    Science.gov (United States)

    Rankin, Kristin M; Gavin, Loretta; Moran, John W; Kroelinger, Charlan D; Vladutiu, Catherine J; Goodman, David A; Sappenfield, William M

    2016-11-01

    Purpose In recognition of the importance of performance measurement and MCH epidemiology leadership to quality improvement (QI) efforts, a plenary session dedicated to this topic was presented at the 2014 CityMatCH Leadership and MCH Epidemiology Conference. This paper summarizes the session and provides two applications of performance measurement to QI in MCH. Description Performance measures addressing processes of care are ubiquitous in the current health system landscape and the MCH community is increasingly applying QI processes, such as Plan-Do-Study-Act (PDSA) cycles, to improve the effectiveness and efficiency of systems impacting MCH populations. QI is maximally effective when well-defined performance measures are used to monitor change. Assessment MCH epidemiologists provide leadership to QI initiatives by identifying population-based outcomes that would benefit from QI, defining and implementing performance measures, assessing and improving data quality and timeliness, reporting variability in measures throughout PDSA cycles, evaluating QI initiative impact, and translating findings to stakeholders. MCH epidemiologists can also ensure that QI initiatives are aligned with MCH priorities at the local, state and federal levels. Two examples of this work, one highlighting use of a contraceptive service performance measure and another describing QI for peripartum hemorrhage prevention, demonstrate MCH epidemiologists' contributions throughout. Challenges remain in applying QI to complex community and systems-level interventions, including those aimed at improving access to quality care. Conclusion MCH epidemiologists provide leadership to QI initiatives by ensuring they are data-informed and supportive of a common MCH agenda, thereby optimizing the potential to improve MCH outcomes.

  20. Quantifying measurement uncertainty and spatial variability in the context of model evaluation

    Science.gov (United States)

    Choukulkar, A.; Brewer, A.; Pichugina, Y. L.; Bonin, T.; Banta, R. M.; Sandberg, S.; Weickmann, A. M.; Djalalova, I.; McCaffrey, K.; Bianco, L.; Wilczak, J. M.; Newman, J. F.; Draxl, C.; Lundquist, J. K.; Wharton, S.; Olson, J.; Kenyon, J.; Marquis, M.

    2017-12-01

    In an effort to improve wind forecasts for the wind energy sector, the Department of Energy and the NOAA funded the second Wind Forecast Improvement Project (WFIP2). As part of the WFIP2 field campaign, a large suite of in-situ and remote sensing instrumentation was deployed to the Columbia River Gorge in Oregon and Washington from October 2015 to March 2017. The array of instrumentation deployed included 915-MHz wind profiling radars, sodars, wind-profiling lidars, and scanning lidars. The role of these instruments was to provide wind measurements at high spatial and temporal resolution for model evaluation and improvement of model physics. To properly determine model errors, the uncertainties in instrument-model comparisons need to be quantified accurately. These uncertainties arise from several factors such as measurement uncertainty, spatial variability, and interpolation of model output to instrument locations, to name a few. In this presentation, we will introduce a formalism to quantify measurement uncertainty and spatial variability. The accuracy of this formalism will be tested using existing datasets such as the eXperimental Planetary boundary layer Instrumentation Assessment (XPIA) campaign. Finally, the uncertainties in wind measurement and the spatial variability estimates from the WFIP2 field campaign will be discussed to understand the challenges involved in model evaluation.

  1. Stepwise data envelopment analysis (DEA); choosing variables for measuring technical efficiency in Norwegian electricity distribution

    International Nuclear Information System (INIS)

    Kittelsen, S.A.C.

    1993-04-01

    Electric power distribution is an activity that in principle delivers a separate product to each customer. A specification of products for a utility as a whole leads potentially to a large number of product aspects including topographic and climatic conditions, and the level of disaggregation of factors and products may give the production and cost functions a high dimensionality. Some aggregation is therefore necessary. Non-parametric methods like Data Envelopment Analysis (DEA) have the advantage that they may give meaningful results when parametric methods would not have enough degrees of freedom, but will have related problems if the variables are collinear or are irrelevant. Although aggregate efficiency measures will not be much affected, rates of transformation will be corrupted and observations with extreme values may be measured as efficient by default. Little work has been done so far on the statistical properties of the non-parametric efficiency measure. This paper utilizes a suggestion by Rajiv Banker to measure the significance of the change in results when disaggregating or introducing an extra variable, and shows how one can let the data participate in deciding which variables should be included in the analysis. 32 refs., 7 figs., 4 tabs
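A minimal input-oriented, constant-returns DEA efficiency score of the kind this record builds on can be computed as a linear program. The sketch below assumes the standard CCR envelopment form and a toy one-input/one-output dataset; it is not the paper's Norwegian distribution data or its stepwise variable-selection procedure.

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input(X, Y, j0):
    """Input-oriented CCR efficiency of DMU j0. X: inputs (m x n), Y: outputs (s x n)."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.zeros(n + 1)
    c[0] = 1.0                                   # minimise the contraction factor theta
    A_in = np.hstack([-X[:, [j0]], X])           # sum_j lam_j * x_j <= theta * x_j0
    A_out = np.hstack([np.zeros((s, 1)), -Y])    # sum_j lam_j * y_j >= y_j0
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.concatenate([np.zeros(m), -Y[:, j0]])
    bounds = [(None, None)] + [(0, None)] * n    # theta free, lambdas nonnegative
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[0]

X = np.array([[2.0, 4.0, 3.0]])   # one input, three hypothetical utilities
Y = np.array([[2.0, 2.0, 3.0]])   # one output
scores = [dea_ccr_input(X, Y, j) for j in range(3)]
print([round(v, 3) for v in scores])
```

The second utility produces the same output as the first from twice the input, so its score is half that of the frontier units; disaggregating or adding variables, as the paper discusses, changes these scores only upward.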

  2. Using atmospheric 14CO to constrain OH variability: concept and potential for future measurements

    Science.gov (United States)

    Petrenko, V. V.; Murray, L. T.; Smith, A. W.

    2017-12-01

    The primary source of 14C-containing carbon monoxide (14CO) in the atmosphere is via 14C production from 14N by secondary cosmic rays, and the primary sink is removal by OH. Variations in the global abundance of 14CO that are not explained by variations in 14C production are mainly driven by variations in the global abundance of OH. Monitoring OH variability via methyl chloroform is becoming increasingly difficult as methyl chloroform abundance is continuing to decline. Measurements of atmospheric 14CO have previously been successfully used to infer OH variability. However, these measurements are currently only continuing at one location (Baring Head, New Zealand), which is insufficient to infer global trends. We propose to restart global 14CO monitoring with the aim of providing another constraint on OH variability. A new analytical system for 14CO sampling and measurements is in development, which will allow the required sample air volumes (previously ≥ 400 L) to be strongly reduced and will simplify field logistics. A set of test measurements is planned, with sampling at the Mauna Loa Observatory. Preliminary work with a state-of-the-art chemical transport model is identifying the most promising locations for global 14CO sampling.

  3. Variability of gastric emptying measurements in man employing standardized radiolabeled meals

    International Nuclear Information System (INIS)

    Brophy, C.M.; Moore, J.G.; Christian, P.E.; Egger, M.J.; Taylor, A.T.

    1986-01-01

    Radiolabeled liquid and solid portions of standardized 300-g meals were administered on four different study days to eight healthy subjects in an attempt to define the range of inter- and intrasubject variability in gastric emptying. Meal half emptying times, analysis of variance, and intraclass correlations were computed and compared within and between subjects. The mean solid half emptying time was 58 +/- 17 min (range 29-92), while the mean liquid half emptying time was 24 +/- 8 min (range 12-37). A nested random effects analysis of variance showed moderate intrasubject variability for solid emptying and high intrasubject variability for liquid emptying. The variability of solid and liquid emptying was comparable and relatively large when compared with other reports in the literature. The isotopic method for measuring gastric emptying is a valuable tool for investigating problems in gastric pathophysiology, particularly when differences between groups of subjects are sought. However, meal emptying time is a variable phenomenon in healthy subjects with significant inter- and intraindividual day-to-day differences. These day-to-day variations in gastric emptying must be considered in interpreting individual study results
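The intraclass correlations and random-effects decomposition used in this record can be illustrated in simplified one-way form, treating subjects as the random factor and study days as replicates; the half-emptying times below are invented for the sketch, and the full nested model in the paper is more elaborate.

```python
import numpy as np

# Hypothetical solid half-emptying times (min): 8 subjects x 4 study days.
base = np.array([40.0, 50.0, 60.0, 70.0, 80.0, 55.0, 65.0, 75.0])
day_shift = np.array([-3.0, 1.0, 2.0, 0.0])          # day-to-day wobble
times = base[:, None] + day_shift[None, :]
n_subj, n_days = times.shape

subj_means = times.mean(axis=1)
grand = times.mean()
ms_between = n_days * np.sum((subj_means - grand) ** 2) / (n_subj - 1)
ms_within = np.sum((times - subj_means[:, None]) ** 2) / (n_subj * (n_days - 1))

var_within = ms_within                                # intrasubject (day-to-day) variance
var_between = (ms_between - ms_within) / n_days       # intersubject variance
icc = var_between / (var_between + var_within)
print(f"between = {var_between:.1f}, within = {var_within:.1f}, ICC = {icc:.2f}")
```

A high ICC here means subjects differ more from each other than from their own repeat studies; the paper's point is that even so, the within-subject component is large enough to matter for individual results.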

  4. Variability of gastric emptying measurements in man employing standardized radiolabeled meals

    Energy Technology Data Exchange (ETDEWEB)

    Brophy, C.M.; Moore, J.G.; Christian, P.E.; Egger, M.J.; Taylor, A.T.

    1986-08-01

    Radiolabeled liquid and solid portions of standardized 300-g meals were administered on four different study days to eight healthy subjects in an attempt to define the range of inter- and intrasubject variability in gastric emptying. Meal half emptying times, analysis of variance, and intraclass correlations were computed and compared within and between subjects. The mean solid half emptying time was 58 +/- 17 min (range 29-92), while the mean liquid half emptying time was 24 +/- 8 min (range 12-37). A nested random effects analysis of variance showed moderate intrasubject variability for solid emptying and high intrasubject variability for liquid emptying. The variability of solid and liquid emptying was comparable and relatively large when compared with other reports in the literature. The isotopic method for measuring gastric emptying is a valuable tool for investigating problems in gastric pathophysiology, particularly when differences between groups of subjects are sought. However, meal emptying time is a variable phenomenon in healthy subjects with significant inter- and intraindividual day-to-day differences. These day-to-day variations in gastric emptying must be considered in interpreting individual study results.

  5. Pneumophonic coordination impairments in parkinsonian dysarthria: importance of aerodynamic parameters measurements.

    Science.gov (United States)

    Moustapha, S M; Alain, G; Robert, E; Bernard, T; Mourtalla, Kâ M; Lamine, G; François, V

    2012-01-01

    Among Parkinsonian axial signs, dysarthria represents an important disabling symptom that can lead to a significant reduction of oral communication. Several methods of dysarthria assessment have been used, but aerodynamic evaluation is rare in the literature. To highlight the importance of aerodynamic parameter measurements in the assessment of parkinsonian dysarthria. Using a dedicated system (EVA2), 24 parkinsonian patients were recorded after withdrawal of L-dopa for at least 12 h (condition called OFF DOPA) in order to evaluate intra-oral pressure (IOP), mean oral air flow (MOAF) and laryngeal resistance (LR) on six /p/ during realization of the sentence "Papa ne m'a pas parlé de beau-papa" ("Daddy did not speak to me about daddy-in-law"), which corresponds to a breath group. 50 control subjects were recorded in parallel in order to define reference measurements. The recordings revealed aerodynamic impairments in Parkinson's disease, evidenced by the fall in IOP and in MOAF in patients compared with control subjects. The difference between the two groups was statistically significant. In addition, a greater instability of LR in patients compared with control subjects was noted. Our results show that measurements of aerodynamic parameters, by reflecting the dysfunction induced by the disease, may well be relevant factors in parkinsonian dysarthria evaluation.

  6. Association between different measurements of blood pressure variability by ABP monitoring and ankle-brachial index

    Directory of Open Access Journals (Sweden)

    Moreira Leila B

    2010-11-01

    Full Text Available Abstract Background Blood pressure (BP) variability has been associated with cardiovascular outcomes, but there is no consensus about the most effective method for measuring it by ambulatory blood pressure monitoring (ABPM). We evaluated the association between three different methods of estimating BP variability by ABPM and the ankle-brachial index (ABI). Methods and Results In a cross-sectional study of patients with hypertension, BP variability was estimated by the time rate index (the first derivative of SBP over time), the standard deviation (SD) of 24-hour SBP, and the coefficient of variability of 24-hour SBP. ABI was measured with a Doppler probe. The sample included 425 patients with a mean age of 57 ± 12 years; 69.2% were women, 26.1% current smokers and 22.1% diabetics. Abnormal ABI (≤ 0.90 or ≥ 1.40) was present in 58 patients. The time rate index was 0.516 ± 0.146 mmHg/min in patients with abnormal ABI versus 0.476 ± 0.124 mmHg/min in patients with normal ABI (P = 0.007). In a logistic regression model, the time rate index was associated with ABI regardless of age (OR = 6.9, 95% CI = 1.1-42.1; P = 0.04). In a multiple linear regression model adjusting for age, SBP and diabetes, the time rate index was strongly associated with ABI (P ...). Conclusion The time rate index is a sensitive method for measuring BP variability by ABPM. Its performance for risk stratification of patients with hypertension should be explored in longitudinal studies.
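The time rate index defined in this record, the first derivative of SBP over time, reduces to the mean absolute slope between consecutive ABPM readings. A minimal sketch with invented readings (the reading times and values are illustrative, not study data):

```python
import numpy as np

# Hypothetical 24-h ABPM excerpt: reading times (min) and systolic BP (mmHg).
t_min = np.array([0.0, 15.0, 30.0, 45.0])
sbp = np.array([120.0, 126.0, 120.0, 129.0])

# Time rate index: mean absolute first derivative of SBP over time (mmHg/min).
rates = np.abs(np.diff(sbp) / np.diff(t_min))
time_rate_index = rates.mean()
print(f"time rate index = {time_rate_index:.3f} mmHg/min")
```

Using the actual inter-reading intervals, rather than assuming a fixed sampling period, matters because ABPM devices typically record less often at night.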

  7. Variability of indoor and outdoor VOC measurements: An analysis using variance components

    International Nuclear Information System (INIS)

    Jia, Chunrong; Batterman, Stuart A.; Relyea, George E.

    2012-01-01

    This study examines concentrations of volatile organic compounds (VOCs) measured inside and outside of 162 residences in southeast Michigan, U.S.A. Nested analyses apportioned four sources of variation: city, residence, season, and measurement uncertainty. Indoor measurements were dominated by seasonal and residence effects, accounting for 50 and 31%, respectively, of the total variance. Contributions from measurement uncertainty (<20%) and city effects (<10%) were small. For outdoor measurements, season, city and measurement variation accounted for 43, 29 and 27% of variance, respectively, while residence location had negligible impact (<2%). These results show that, to obtain representative estimates of indoor concentrations, measurements in multiple seasons are required. In contrast, outdoor VOC concentrations can use multi-seasonal measurements at centralized locations. Error models showed that uncertainties at low concentrations might obscure effects of other factors. Variance component analyses can be used to interpret existing measurements, design effective exposure studies, and determine whether the instrumentation and protocols are satisfactory. - Highlights: ► The variability of VOC measurements was partitioned using nested analysis. ► Indoor VOCs were primarily controlled by seasonal and residence effects. ► Outdoor VOC levels were homogeneous within neighborhoods. ► Measurement uncertainty was high for many outdoor VOCs. ► Variance component analysis is useful for designing effective sampling programs.
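The apportionment described above can be sketched with a two-way (residence x season) method-of-moments variance-component decomposition. The effect sizes are invented, and collapsing the paper's four-level nested design to two factors plus error is a simplifying assumption for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic log-concentrations: 6 residences x 4 seasons, additive effects + noise.
res_eff = np.array([1.5, -1.5, 0.5, -0.5, 1.0, -1.0])
sea_eff = np.array([3.0, -1.0, -2.0, 0.0])
y = res_eff[:, None] + sea_eff[None, :] + rng.normal(0.0, 0.3, size=(6, 4))
n_res, n_sea = y.shape

grand = y.mean()
ms_res = n_sea * np.sum((y.mean(axis=1) - grand) ** 2) / (n_res - 1)
ms_sea = n_res * np.sum((y.mean(axis=0) - grand) ** 2) / (n_sea - 1)
resid = y - y.mean(axis=1, keepdims=True) - y.mean(axis=0, keepdims=True) + grand
mse = np.sum(resid ** 2) / ((n_res - 1) * (n_sea - 1))

# Method-of-moments variance components, clipped at zero, as shares of the total.
var_res = max((ms_res - mse) / n_sea, 0.0)
var_sea = max((ms_sea - mse) / n_res, 0.0)
total = var_res + var_sea + mse
pct = {name: 100 * v / total
       for name, v in [("residence", var_res), ("season", var_sea), ("error", mse)]}
print({name: round(v, 1) for name, v in pct.items()})
```

With the chosen effects, the season share dominates, mirroring the paper's finding that indoor VOC variance is led by seasonal and residence effects.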

  8. Simultaneously measured pupillary light reflex and heart rate variability in healthy children

    International Nuclear Information System (INIS)

    Daluwatte, C; Yao, G; Miles, J H

    2012-01-01

    We investigated the potential inter-relationship between two measures of the autonomic nervous system, the pupillary light reflex (PLR) and heart rate variability (HRV), in healthy children aged 8–16 years. PLR was measured at both dark- and light-adapted conditions with various stimulation intensities. Simultaneously measured HRV was obtained in five different PLR testing phases: before the PLR test, the light-adapted PLR test, dark adaptation, the dark-adapted PLR test and after the PLR test. The frequency-domain HRV parameters measured during the PLR test were significantly different from those measured during rest. Both the regression analysis and factor analysis indicated that PLR and HRV parameters were not correlated, which suggests that they may provide complementary assessment of different aspects of the overall autonomic nervous system. (paper)

  9. Simultaneously measured pupillary light reflex and heart rate variability in healthy children

    Energy Technology Data Exchange (ETDEWEB)

    Daluwatte, C; Yao, G [Department of Biological Engineering, University of Missouri, Columbia, MO 65211 (United States); Miles, J H, E-mail: YaoG@missouri.edu [Child Health and Thompson Center for Autism and Neurodevelopmental Disorders, University of Missouri, Columbia, MO 65211 (United States)

    2012-06-15

    We investigated the potential inter-relationship between two measures of the autonomic nervous system, the pupillary light reflex (PLR) and heart rate variability (HRV), in healthy children aged 8–16 years. PLR was measured at both dark- and light-adapted conditions with various stimulation intensities. Simultaneously measured HRV was obtained in five different PLR testing phases: before the PLR test, the light-adapted PLR test, dark adaptation, the dark-adapted PLR test and after the PLR test. The frequency-domain HRV parameters measured during the PLR test were significantly different from those measured during rest. Both the regression analysis and factor analysis indicated that PLR and HRV parameters were not correlated, which suggests that they may provide complementary assessment of different aspects of the overall autonomic nervous system. (paper)

  10. Identification of voltage stability condition of a power system using measurements of bus variables

    Directory of Open Access Journals (Sweden)

    Durlav Hazarika

    2014-12-01

    Full Text Available Several online methods have been proposed for investigating the voltage stability condition of an interconnected power system using measurements of voltage and current phasors at a bus. For this purpose, phasor measurement units (PMUs) are used. A PMU is a device that measures the electrical waves on an electrical network, using a common time source (reference bus) for synchronisation. This study proposes a method for online monitoring of the voltage stability condition of a power system using measurements of bus variables, namely (i) real power, (ii) reactive power and (iii) bus voltage magnitude at a bus. These measurements could be extracted from a smart energy meter, so the financial cost of implementing the proposed method would be significantly lower than that of a PMU-based method.

  11. Measurement of transplanted pancreatic volume using computed tomography: reliability by intra- and inter-observer variability

    International Nuclear Information System (INIS)

    Lundqvist, Eva; Segelsjoe, Monica; Magnusson, Anders; Andersson, Anna; Biglarnia, Ali-Reza

    2012-01-01

    Background Unlike other solid organ transplants, pancreas allografts can undergo a substantial decrease in baseline volume after transplantation. This phenomenon has not been well characterized, as there are insufficient data on reliable and reproducible volume assessments. We hypothesized that characterization of pancreatic volume by means of computed tomography (CT) could be a useful method for clinical follow-up in pancreas transplant patients. Purpose To evaluate the feasibility and reliability of pancreatic volume assessment using CT scan in transplanted patients. Material and Methods CT examinations were performed on 21 consecutive patients undergoing pancreas transplantation. Volume measurements were carried out by two observers tracing the pancreatic contours in all slices. The observers performed the measurements twice for each patient. Differences in volume measurement were used to evaluate intra- and inter-observer variability. Results The intra-observer variability for the pancreatic volume measurements of Observers 1 and 2 was found to be in almost perfect agreement, with an intraclass correlation coefficient (ICC) of 0.90 (0.77-0.96) and 0.99 (0.98-1.0), respectively. Regarding inter-observer validity, the ICCs for the first and second measurements were 0.90 (range, 0.77-0.96) and 0.95 (range, 0.85-0.98), respectively. Conclusion CT volumetry is a reliable and reproducible method for measurement of transplanted pancreatic volume

  12. Measurement of transplanted pancreatic volume using computed tomography: reliability by intra- and inter-observer variability

    Energy Technology Data Exchange (ETDEWEB)

    Lundqvist, Eva; Segelsjoe, Monica; Magnusson, Anders [Uppsala Univ., Dept. of Radiology, Oncology and Radiation Science, Section of Radiology, Uppsala (Sweden)], E-mail: eva.lundqvist.8954@student.uu.se; Andersson, Anna; Biglarnia, Ali-Reza [Dept. of Surgical Sciences, Section of Transplantation Surgery, Uppsala Univ. Hospital, Uppsala (Sweden)

    2012-11-15

    Background Unlike other solid organ transplants, pancreas allografts can undergo a substantial decrease in baseline volume after transplantation. This phenomenon has not been well characterized, as there are insufficient data on reliable and reproducible volume assessments. We hypothesized that characterization of pancreatic volume by means of computed tomography (CT) could be a useful method for clinical follow-up in pancreas transplant patients. Purpose To evaluate the feasibility and reliability of pancreatic volume assessment using CT scan in transplanted patients. Material and Methods CT examinations were performed on 21 consecutive patients undergoing pancreas transplantation. Volume measurements were carried out by two observers tracing the pancreatic contours in all slices. The observers performed the measurements twice for each patient. Differences in volume measurement were used to evaluate intra- and inter-observer variability. Results The intra-observer variability for the pancreatic volume measurements of Observers 1 and 2 was found to be in almost perfect agreement, with an intraclass correlation coefficient (ICC) of 0.90 (0.77-0.96) and 0.99 (0.98-1.0), respectively. Regarding inter-observer validity, the ICCs for the first and second measurements were 0.90 (range, 0.77-0.96) and 0.95 (range, 0.85-0.98), respectively. Conclusion CT volumetry is a reliable and reproducible method for measurement of transplanted pancreatic volume.
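    Records 11 and 12 summarize observer agreement with the intraclass correlation coefficient (ICC). As a rough illustration of how such a coefficient is obtained from repeated measurements, here is a minimal one-way random-effects ICC sketch; the patient volumes below are invented for illustration, not study data:

```python
import numpy as np

def icc_oneway(ratings):
    """One-way random-effects ICC(1,1) for an (n subjects x k repeats) array."""
    ratings = np.asarray(ratings, dtype=float)
    n, k = ratings.shape
    grand = ratings.mean()
    row_means = ratings.mean(axis=1)
    # between-subject and within-subject mean squares
    msb = k * np.sum((row_means - grand) ** 2) / (n - 1)
    msw = np.sum((ratings - row_means[:, None]) ** 2) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

# two CT volume readings (ml) per patient by the same observer (hypothetical numbers)
volumes = [[78, 80], [102, 100], [65, 66], [90, 93], [110, 108]]
print(round(icc_oneway(volumes), 3))  # close to 1: near-perfect intra-observer agreement
```

    The 0.90 and 0.99 intra-observer ICCs in the abstract correspond to this near-1 regime, where repeat readings differ little relative to the spread between patients.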

  13. The importance of optical methods for non-invasive measurements in the skin care industry

    Science.gov (United States)

    Stamatas, Georgios N.

    2010-02-01

    Pharmaceutical and cosmetic industries are concerned with treating skin disease, as well as maintaining and promoting skin health. They are dealing with a unique tissue that defines our body in space. As such, skin not only provides the natural boundary with the environment, inhibiting body dehydration and the penetration of exogenous aggressors, but is also ideally situated for optical measurements. A plurality of spectroscopic and imaging methods is being used to understand skin physiology and pathology and to document the effects of topically applied products on the skin. The obvious advantage of such methods over traditional biopsy techniques is the ability to measure the cutaneous tissue in vivo and non-invasively. In this work, we review applications of various spectroscopy and imaging methods in skin research that are of interest to the cosmetic and pharmaceutical industries. Examples are given of the importance of optical techniques in acquiring new insights into acne pathogenesis and infant skin development.

  14. Monitoring and measurement of variables - assurance of observability and controllability of the reactor equipment

    International Nuclear Information System (INIS)

    Durnev, V.N.; Mitelman, M.G.

    2001-01-01

    This presentation summarizes the main conclusions from an analysis of the current situation regarding the observability and controllability of the reactor installation. A methodology for classifying the variables of the controlled object's state is given, together with proposals for selecting and substantiating measurement and inspection monitoring techniques. The main problems associated with ensuring observability and controllability of the reactor installation are presented for various operating modes. (Authors)

  15. Gait stability and variability measures show effects of impaired cognition and dual tasking in frail people

    Directory of Open Access Journals (Sweden)

    de Vries Oscar J

    2011-01-01

    Full Text Available Abstract Background Falls in frail elderly are a common problem with a rising incidence. Gait and postural instability are major risk factors for falling, particularly in geriatric patients. As walking requires attention, cognitive impairments are likely to contribute to an increased fall risk. An objective quantification of gait and balance ability is required to identify persons with a high tendency to fall. Recent studies have shown that stride variability is increased in the elderly and under dual-task conditions, and might be more sensitive for detecting fall risk than walking speed. In the present study we complemented stride-related measures with measures that quantify trunk movement patterns as indicators of dynamic balance ability during walking. The aim of the study was to quantify the effect of impaired cognition and dual tasking on gait variability and stability in geriatric patients. Methods Thirteen elderly with dementia (mean age: 82.6 ± 4.3 years) and thirteen without dementia (79.4 ± 5.55 years), recruited from a geriatric day clinic, walked at self-selected speed with and without performing a verbal dual task. The Mini Mental State Examination and the Seven Minute Screen were administered. Trunk accelerations were measured with an accelerometer. In addition to walking speed and the mean and variability of stride times, gait stability was quantified using stochastic dynamical measures, namely regularity (sample entropy), long-range correlations and local stability exponents of trunk accelerations. Results Dual tasking significantly (p Conclusions The observed trunk adaptations were a consistent instability factor. These results support the concept that changes in cognitive functions contribute to changes in the variability and stability of the gait pattern. Walking under dual-task conditions and quantifying gait using dynamical parameters can improve the detection of walking disorders and might help to identify those elderly who are able to adapt walking
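    One of the regularity measures named above, sample entropy, can be sketched in a few lines. The implementation below is a simplified textbook version applied to synthetic signals; it is not the study's code, and the self-match handling is the common approximation:

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """SampEn = -log(A/B), where A and B count matching templates of length m+1 and m."""
    x = np.asarray(x, dtype=float)
    tol = r * x.std()
    n = len(x)
    def matches(mm):
        tmpl = np.array([x[i:i + mm] for i in range(n - mm)])
        c = 0
        for i in range(len(tmpl)):
            dist = np.max(np.abs(tmpl - tmpl[i]), axis=1)  # Chebyshev distance
            c += np.sum(dist <= tol) - 1                   # exclude the self-match
        return c
    return -np.log(matches(m + 1) / matches(m))

rng = np.random.default_rng(0)
t = np.arange(300)
regular = np.sin(0.3 * t)            # a predictable, periodic trunk-acceleration pattern
irregular = rng.normal(size=300)     # an unpredictable pattern
print(sample_entropy(regular) < sample_entropy(irregular))  # True: regular signals score lower
```

    Lower sample entropy means more repetitive, predictable trunk movement, which is why the measure can separate stable from unstable gait.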

  16. Heart Rate Variability as a Measure of Airport Ramp-Traffic Controllers Workload

    Science.gov (United States)

    Hayashi, Miwa; Dulchinos, Victoria Lee

    2016-01-01

    Heart Rate Variability (HRV) has been reported to reflect a person's cognitive and emotional stress levels, and may offer an objective measure of a human operator's workload that can be recorded continuously and unobtrusively during task performance. The present paper compares HRV data collected during a human-in-the-loop simulation of airport ramp-traffic control operations with the controller participants' own verbal self-reported workload ratings.
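    As a concrete example of the kind of time-domain HRV statistic used in such studies, here is RMSSD (root mean square of successive differences) over beat-to-beat (RR) intervals; the interval values are invented:

```python
def rmssd(rr_ms):
    """Root mean square of successive RR-interval differences, in ms."""
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    return (sum(d * d for d in diffs) / len(diffs)) ** 0.5

# hypothetical RR intervals (ms); lower RMSSD is commonly read as lower vagal
# activity, i.e. higher stress/workload
print(round(rmssd([812, 790, 805, 783, 798, 820]), 1))  # -> 19.5
```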

  17. Importance of Viral Sequence Length and Number of Variable and Informative Sites in Analysis of HIV Clustering.

    Science.gov (United States)

    Novitsky, Vlad; Moyo, Sikhulile; Lei, Quanhong; DeGruttola, Victor; Essex, M

    2015-05-01

    To improve the methodology of HIV cluster analysis, we addressed how analysis of HIV clustering is associated with parameters that can affect the outcome of viral clustering. The extent of HIV clustering and tree certainty was compared between 401 HIV-1C near full-length genome sequences and subgenomic regions retrieved from the LANL HIV Database. Sliding window analysis was based on 99 windows of 1,000 bp and 45 windows of 2,000 bp. Potential associations between the extent of HIV clustering and sequence length and the number of variable and informative sites were evaluated. The near full-length genome HIV sequences showed the highest extent of HIV clustering and the highest tree certainty. At the bootstrap threshold of 0.80 in maximum likelihood (ML) analysis, 58.9% of near full-length HIV-1C sequences but only 15.5% of partial pol sequences (ViroSeq) were found in clusters. Among HIV-1 structural genes, pol showed the highest extent of clustering (38.9% at a bootstrap threshold of 0.80), although it was significantly lower than in the near full-length genome sequences. The extent of HIV clustering was significantly higher for sliding windows of 2,000 bp than 1,000 bp. We found a strong association between the sequence length and proportion of HIV sequences in clusters, and a moderate association between the number of variable and informative sites and the proportion of HIV sequences in clusters. In HIV cluster analysis, the extent of detectable HIV clustering is directly associated with the length of viral sequences used, as well as the number of variable and informative sites. Near full-length genome sequences could provide the most informative HIV cluster analysis. Selected subgenomic regions with a high extent of HIV clustering and high tree certainty could also be considered as a second choice.

  18. Variability of ferritin measurements in chronic kidney disease; implications for iron management.

    Science.gov (United States)

    Ford, Bradley A; Coyne, Daniel W; Eby, Charles S; Scott, Mitchell G

    2009-01-01

    Serum ferritin levels are a proxy measure of iron stores, and existing guidelines for managing anemia in hemodialysis patients suggest that serum ferritin concentrations should be maintained at >200 ng/ml. The KDOQI recommendations further state that there is insufficient evidence advocating routine intravenous iron when ferritin levels exceed 500 ng/ml. Here we determined the interassay differences and short-term intraindividual variability of serum ferritin measurements in patients on chronic hemodialysis to illustrate how these variances may affect treatment decisions. Intermethod variations of up to 150 ng/ml were found when comparing six commonly used ferritin assays on thirteen pools of serum from hemodialysis and nonhemodialysis patients. The intraindividual variability for ferritin in 60 stable hemodialysis patients ranged from 2% to 62% measured over an initial two-week period and from 3% to 52% over a six-week period. Our results suggest that single serum ferritin values should not be used to guide clinical decisions regarding treatment of chronic hemodialysis patients with intravenous iron, due to significant analytical and intraindividual variability.
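    The intraindividual variability reported above is a coefficient of variation over serial draws. A minimal sketch with invented ferritin values shows why a single value is a shaky basis for iron-dosing decisions around a 500 ng/ml cutoff:

```python
import statistics

def intraindividual_cv(values):
    """Coefficient of variation (%) of serial measurements in one patient."""
    return 100 * statistics.stdev(values) / statistics.mean(values)

# hypothetical serial ferritin values (ng/ml) in one stable hemodialysis patient:
# individual draws fall on both sides of the 500 ng/ml decision threshold
ferritin = [480, 610, 390, 550]
print(round(intraindividual_cv(ferritin), 1))
```

    Here the CV is around 19%, well inside the 2-62% range the study reports, yet any single draw could put the same patient above or below the treatment threshold.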

  19. Fault-tolerant measurement-based quantum computing with continuous-variable cluster states.

    Science.gov (United States)

    Menicucci, Nicolas C

    2014-03-28

    A long-standing open question about Gaussian continuous-variable cluster states is whether they enable fault-tolerant measurement-based quantum computation. The answer is yes. Initial squeezing in the cluster above a threshold value of 20.5 dB ensures that errors from finite squeezing acting on encoded qubits are below the fault-tolerance threshold of known qubit-based error-correcting codes. By concatenating with one of these codes and using ancilla-based error correction, fault-tolerant measurement-based quantum computation of theoretically indefinite length is possible with finitely squeezed cluster states.
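    For orientation, squeezing quoted in dB maps to a quadrature-noise variance relative to vacuum via s = -10·log10(V/V_vac); a quick check of what the 20.5 dB threshold implies:

```python
from math import log10

def squeezing_db(variance_ratio):
    """Squeezing in dB for a quadrature variance given as a fraction of vacuum noise."""
    return -10 * log10(variance_ratio)

# the 20.5 dB threshold corresponds to a noise variance below about 0.9% of vacuum
threshold_ratio = 10 ** (-20.5 / 10)
print(round(threshold_ratio, 5))
```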

  20. Measurement of the ratio Γbb/Γhad using event shape variables

    Science.gov (United States)

    Buskulic, D.; de Bonis, I.; Decamp, D.; Ghez, P.; Goy, C.; Lees, J.-P.; Minard, M.-N.; Pietrzyk, B.; Ariztizabal, F.; Comas, P.; Crespo, J. M.; Delfino, M.; Efthymiopoulos, I.; Fernandez, E.; Fernandez-Bosman, M.; Gaitan, V.; Garrido, Ll.; Mattison, T.; Pacheco, A.; Padilla, C.; Pascual, A.; Creanza, D.; de Palma, M.; Farilla, A.; Iaselli, G.; Maggi, G.; Natali, S.; Nuzzo, S.; Quattromini, M.; Ranieri, A.; Raso, G.; Romano, F.; Ruggieri, F.; Selvaggi, G.; Silvestris, L.; Tempesta, P.; Zito, G.; Chai, Y.; Hu, H.; Huang, D.; Huang, X.; Lin, J.; Wang, T.; Xie, Y.; Xie, D.; Xu, D.; Xu, R.; Zhang, J.; Zhang, L.; Zhao, W.; Blucher, E.; Bonvicini, G.; Boudreau, J.; Casper, D.; Drevermann, H.; Forty, R. W.; Ganis, G.; Gay, C.; Hagelberg, R.; Harvey, J.; Hilgart, J.; Jacobsen, R.; Jost, B.; Knobloch, J.; Lehraus, I.; Lohse, T.; Maggi, M.; Markou, C.; Martinez, M.; Mato, P.; Meinhard, H.; Minten, A.; Miotto, A.; Miquel, R.; Moser, H.-G.; Palazzi, P.; Pater, J. R.; Perlas, J. A.; Pusztaszeri, J.-F.; Ranjard, F.; Redlinger, G.; Rolandi, L.; Rothberg, J.; Ruan, T.; Saich, M.; Schlatter, D.; Schmelling, M.; Sefkow, F.; Tejessy, W.; Tomalin, I. R.; Veenhof, R.; Wachsmuth, H.; Wasserbaech, S.; Wiedenmann, W.; Wildish, T.; Witzeling, W.; Wotschack, J.; Ajaltouni, Z.; Badaud, F.; Bardadin-Otwinowska, M.; El Fellous, R.; Falvard, A.; Gay, P.; Guicheney, C.; Henrard, P.; Jousset, J.; Michel, B.; Montret, J.-C.; Pallin, D.; Perret, P.; Podlyski, F.; Proriol, J.; Prulhière, F.; Saadi, F.; Fearnley, T.; Hansen, J. B.; Hansen, J. D.; Hansen, J. R.; Hansen, P. H.; Møllerud, R.; Nilsson, B. S.; Kyriakis, A.; Simopoulou, E.; Siotis, I.; Vayaki, A.; Zachariadou, K.; Badier, J.; Blondel, A.; Bonneaud, G.; Brient, J. C.; Fouque, G.; Orteu, S.; Rougé, A.; Rumpf, M.; Tanaka, R.; Verderi, M.; Videau, H.; Candlin, D. J.; Parsons, M. 
I.; Veitch, E.; Focardi, E.; Moneta, L.; Parrini, G.; Corden, M.; Georgiopoulos, C.; Ikeda, M.; Levinthal, D.; Antonelli, A.; Baldini, R.; Bencivenni, G.; Bologna, G.; Bossi, F.; Campana, P.; Capon, G.; Cerutti, F.; Chiarella, V.; D'Ettorre-Piazzoli, B.; Felici, G.; Laurelli, P.; Mannocchi, G.; Murtas, F.; Murtas, G. P.; Passalacqua, L.; Pepe-Altarelli, M.; Picchi, P.; Colrain, P.; Ten Have, I.; Lynch, J. G.; Maitland, W.; Morton, W. T.; Raine, C.; Reeves, P.; Scarr, J. M.; Smith, K.; Smith, M. G.; Thompson, A. S.; Turnbull, R. M.; Brandl, B.; Braun, O.; Geweniger, C.; Hanke, P.; Hepp, V.; Kluge, E. E.; Maumary, Y.; Putzer, A.; Rensch, B.; Stahl, A.; Tittel, K.; Wunsch, M.; Beuselinck, R.; Binnie, D. M.; Cameron, W.; Cattaneo, M.; Colling, D. J.; Dornan, P. J.; Greene, A. M.; Hassard, J. F.; Lieske, N. M.; Moutoussi, A.; Nash, J.; Patton, S.; Payne, D. G.; Phillips, M. J.; San Martin, G.; Sedgbeer, J. K.; Wright, A. G.; Girtler, P.; Kuhn, D.; Rudolph, G.; Vogl, R.; Bowdery, C. K.; Brodbeck, T. J.; Finch, A. J.; Foster, F.; Hughes, G.; Jackson, D.; Keemer, N. R.; Nuttall, M.; Patel, A.; Sloan, T.; Snow, S. W.; Whelan, E. P.; Kleinknecht, K.; Raab, J.; Renk, B.; Sander, H.-G.; Schmidt, H.; Steeg, F.; Walther, S. M.; Wanke, R.; Wolf, B.; Bencheikh, A. M.; Benchouk, C.; Bonissent, A.; Carr, J.; Coyle, P.; Drinkard, J.; Etienne, F.; Nicod, D.; Papalexiou, S.; Payre, P.; Roos, L.; Rousseau, D.; Schwemling, P.; Talby, M.; Adlung, S.; Assmann, R.; Bauer, C.; Blum, W.; Brown, D.; Cattaneo, P.; Dehning, B.; Dietl, H.; Dydak, F.; Frank, M.; Halley, A. W.; Jakobs, K.; Lauber, J.; Lütjens, G.; Lutz, G.; Männer, W.; Richter, R.; Schröder, J.; Schwarz, A. S.; Settles, R.; Seywerd, H.; Stierlin, U.; Stiegler, U.; St. Denis, R.; Wolf, G.; Alemany, R.; Boucrot, J.; Callot, O.; Cordier, A.; Davier, M.; Duflot, L.; Grivaz, J.-F.; Heusse, Ph.; Jaffe, D. E.; Janot, P.; Kim, D. 
W.; Le Diberder, F.; Lefrançois, J.; Lutz, A.-M.; Schune, M.-H.; Veillet, J.-J.; Videau, I.; Zhang, Z.; Abbaneo, D.; Bagliesi, G.; Batignani, G.; Bottigli, U.; Bozzi, C.; Calderini, G.; Carpinelli, M.; Ciocci, M. A.; dell'Orso, R.; Ferrante, I.; Fidecaro, F.; Foà, L.; Forti, F.; Giassi, A.; Giorgi, M. A.; Gregorio, A.; Ligabue, F.; Lusiani, A.; Mannelli, E. B.; Marrocchesi, P. S.; Messineo, A.; Palla, F.; Rizzo, G.; Sanguinetti, G.; Spagnolo, P.; Steinberger, J.; Tenchini, R.; Tonelli, G.; Triggiani, G.; Vannini, C.; Venturi, A.; Verdini, P. G.; Walsh, J.; Betteridge, A. P.; Gao, Y.; Green, M. G.; March, P. V.; Mir, Ll. M.; Medcalf, T.; Quazi, I. S.; Strong, J. A.; West, L. R.; Botterill, D. R.; Clifft, R. W.; Edgecock, T. R.; Haywood, S.; Norton, P. R.; Thompson, J. C.; Bloch-Devaux, B.; Colas, P.; Duarte, H.; Emery, S.; Kozanecki, W.; Lançon, E.; Lemaire, M. C.; Locci, E.; Marx, B.; Perez, P.; Rander, J.; Renardy, J.-F.; Rosowsky, A.; Roussarie, A.; Schuller, J.-P.; Schwindling, J.; Si Mohand, D.; Vallage, B.; Johnson, R. P.; Litke, A. M.; Taylor, G.; Wear, J.; Ashman, J. G.; Babbage, W.; Booth, C. N.; Buttar, C.; Cartwright, S.; Combley, F.; Dawson, I.; Thompson, L. F.; Barberio, E.; Böhrer, A.; Brandt, S.; Cowan, G.; Grupen, C.; Lutters, G.; Rivera, F.; Schäfer, U.; Smolik, L.; Bosisio, L.; della Marina, R.; Giannini, G.; Gobbo, B.; Ragusa, F.; Bellantoni, L.; Chen, W.; Conway, J. S.; Feng, Z.; Ferguson, D. P. S.; Gao, Y. S.; Grahl, J.; Harton, J. L.; Hayes, O. J.; Nachtman, J. M.; Pan, Y. B.; Saadi, Y.; Schmitt, M.; Scott, I.; Sharma, V.; Shi, Z. H.; Turk, J. D.; Walsh, A. M.; Weber, F. V.; Lan Wu, Sau; Wu, X.; Zheng, M.; Zobernig, G.

    1993-09-01

    The branching fraction of Z → bb̄ relative to all hadronic decays of the Z has been measured using event shape variables to preferentially select Z → bb̄ events. The method chosen applies a combination of shape discriminators and the selection of high transverse momentum leptons to event hemispheres. From a sample of 440 000 hadronic Z decays collected with the ALEPH detector at LEP, the ratio Γbb/Γhad = 0.228 ± 0.005 (stat.) ± 0.005 (syst.) is measured. Supported by the US Department of Energy, contract DE-AC02-76ER00881.

  1. ECOGENETICS AND PHARMACOGENETICS: THE IMPORTANCE OF GENETIC POLYMORPHISMS IN THE VARIABILITY OF ORGANISMS RESPONSE TO ENVIRONMENTAL FACTORS

    Directory of Open Access Journals (Sweden)

    Cristian Tudose

    2005-08-01

    protect confidentiality and privacy of individual genetic information may make such research infeasible. In the present paper we offer some general considerations on the importance of the borderline disciplines that study these aspects (ecogenetics, pharmacogenetics and pharmacogenomics), emphasising the importance of human genome polymorphisms that affect drug efficiency and produce adverse reactions; finally, we review the most recent trends in pharmacogenomics related to the subject

  2. A new computational method of a moment-independent uncertainty importance measure

    International Nuclear Information System (INIS)

    Liu Qiao; Homma, Toshimitsu

    2009-01-01

    For a risk assessment model, the uncertainty in input parameters is propagated through the model and leads to uncertainty in the model output. The study of how the uncertainty in the output of a model can be apportioned to the uncertainty in the model inputs is the job of sensitivity analysis. Saltelli [Sensitivity analysis for importance assessment. Risk Analysis 2002;22(3):579-90] pointed out that a good sensitivity indicator should be global, quantitative and model free. Borgonovo [A new uncertainty importance measure. Reliability Engineering and System Safety 2007;92(6):771-84] further extended these three requirements by adding a fourth feature, moment-independence, and proposed a new sensitivity measure, δ_i. It evaluates the influence of the input uncertainty on the entire output distribution without reference to any specific moment of the model output. In this paper, a new computational method for δ_i is proposed. It is conceptually simple and easier to implement. The feasibility of this new method is demonstrated by applying it to two examples.
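    Borgonovo's δ_i is half the expected L1 distance between the unconditional output density and the density conditional on X_i. The abstract does not reproduce the paper's computational method; the sketch below is only a naive double-loop Monte Carlo estimate with histogram densities, on a toy additive model:

```python
import numpy as np

rng = np.random.default_rng(42)

def delta_measure(model, sample_x, i, n_outer=50, n_inner=5000, bins=50):
    """Naive estimate of Borgonovo's delta for input i: half the expected L1
    distance between the output density and the density conditional on X_i."""
    y_all = model(sample_x(n_inner))
    edges = np.linspace(-6, 6, bins + 1)   # generous fixed support for this toy model
    p_all, _ = np.histogram(y_all, bins=edges, density=True)
    widths = np.diff(edges)
    shifts = []
    for xi in sample_x(n_outer)[:, i]:
        xc = sample_x(n_inner)
        xc[:, i] = xi                      # condition on X_i = xi
        p_c, _ = np.histogram(model(xc), bins=edges, density=True)
        shifts.append(np.sum(np.abs(p_all - p_c) * widths))
    return 0.5 * np.mean(shifts)

model = lambda x: x[:, 0] + 0.1 * x[:, 1]   # X0 dominates the output
sample = lambda n: rng.standard_normal((n, 2))
d0 = delta_measure(model, sample, 0)
d1 = delta_measure(model, sample, 1)
print(d0 > d1)  # True: the influential input gets the larger delta
```

    The estimate needs no choice of output moment: it compares whole distributions, which is exactly the moment-independence property described above.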

  3. A min cut-set-wise truncation procedure for importance measures computation in probabilistic safety assessment

    Energy Technology Data Exchange (ETDEWEB)

    Duflot, Nicolas [Universite de technologie de Troyes, Institut Charles Delaunay/LM2S, FRE CNRS 2848, 12, rue Marie Curie, BP2060, F-10010 Troyes cedex (France)], E-mail: nicolas.duflot@areva.com; Berenguer, Christophe [Universite de technologie de Troyes, Institut Charles Delaunay/LM2S, FRE CNRS 2848, 12, rue Marie Curie, BP2060, F-10010 Troyes cedex (France)], E-mail: christophe.berenguer@utt.fr; Dieulle, Laurence [Universite de technologie de Troyes, Institut Charles Delaunay/LM2S, FRE CNRS 2848, 12, rue Marie Curie, BP2060, F-10010 Troyes cedex (France)], E-mail: laurence.dieulle@utt.fr; Vasseur, Dominique [EPSNA Group (Nuclear PSA and Application), EDF Research and Development, 1, avenue du Gal de Gaulle, 92141 Clamart cedex (France)], E-mail: dominique.vasseur@edf.fr

    2009-11-15

    A truncation process aims to determine which of the minimal cut-sets (MCS) produced by a probabilistic safety assessment (PSA) model are significant. Several truncation processes have been proposed for evaluating the probability of core damage while ensuring a fixed accuracy level. However, the evaluation of new risk indicators such as importance measures requires re-examining the truncation process to ensure that the produced estimates are accurate enough. In this paper a new truncation process is developed that permits estimating, from a single set of MCS, the importance measure of any basic event with the desired accuracy level. The main contribution of this new method is an MCS-wise truncation criterion involving two thresholds: an absolute threshold, and a new relative threshold on the potential probability of the MCS of interest. The method has been tested on a complete level 1 PSA model of a 900 MWe NPP developed by 'Electricite de France' (EDF), and the results presented in this paper indicate that, to reach the same accuracy level, the proposed method produces a set of MCS whose size is significantly reduced.
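    The importance measures whose accuracy is at stake are computed directly from the retained MCS. A toy sketch of one such measure, Fussell-Vesely importance under the rare-event approximation (the cut-sets and probabilities are invented), shows why truncating away low-probability MCS perturbs the estimate for events that appear only in those MCS:

```python
from itertools import chain

# hypothetical minimal cut-sets: set of basic events -> probability
mcs = {frozenset({"A", "B"}): 3e-6,
       frozenset({"C"}): 1e-7,
       frozenset({"A", "D"}): 4e-9}
top = sum(mcs.values())  # rare-event approximation of the top-event probability

def fussell_vesely(event):
    """Fraction of the risk carried by cut-sets containing the event."""
    return sum(p for s, p in mcs.items() if event in s) / top

for e in sorted(set(chain.from_iterable(mcs))):
    print(e, round(fussell_vesely(e), 4))
# truncating the 4e-9 cut-set would drive D's importance to exactly zero
```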

  4. A min cut-set-wise truncation procedure for importance measures computation in probabilistic safety assessment

    International Nuclear Information System (INIS)

    Duflot, Nicolas; Berenguer, Christophe; Dieulle, Laurence; Vasseur, Dominique

    2009-01-01

    A truncation process aims to determine which of the minimal cut-sets (MCS) produced by a probabilistic safety assessment (PSA) model are significant. Several truncation processes have been proposed for evaluating the probability of core damage while ensuring a fixed accuracy level. However, the evaluation of new risk indicators such as importance measures requires re-examining the truncation process to ensure that the produced estimates are accurate enough. In this paper a new truncation process is developed that permits estimating, from a single set of MCS, the importance measure of any basic event with the desired accuracy level. The main contribution of this new method is an MCS-wise truncation criterion involving two thresholds: an absolute threshold, and a new relative threshold on the potential probability of the MCS of interest. The method has been tested on a complete level 1 PSA model of a 900 MWe NPP developed by 'Electricite de France' (EDF), and the results presented in this paper indicate that, to reach the same accuracy level, the proposed method produces a set of MCS whose size is significantly reduced.

  5. Sequential Measurement of Intermodal Variability in Public Transportation PM2.5 and CO Exposure Concentrations.

    Science.gov (United States)

    Che, W W; Frey, H Christopher; Lau, Alexis K H

    2016-08-16

    A sequential measurement method is demonstrated for quantifying the variability in exposure concentration during public transportation. This method was applied in Hong Kong by measuring PM2.5 and CO concentrations along a route connecting 13 transportation-related microenvironments within 3-4 h. The study design takes into account ventilation, proximity to local sources, area-wide air quality, and meteorological conditions. Portable instruments were compacted into a backpack to facilitate measurement under crowded transportation conditions and to quantify personal exposure by sampling at nose level. The route included stops next to three roadside monitors to enable comparison of fixed site and exposure concentrations. PM2.5 exposure concentrations were correlated with the roadside monitors, despite differences in averaging time, detection method, and sampling location. Although highly correlated in temporal trend, PM2.5 concentrations varied significantly among microenvironments, with mean concentration ratios versus roadside monitor ranging from 0.5 for MTR train to 1.3 for bus terminal. Measured inter-run variability provides insight regarding the sample size needed to discriminate between microenvironments with increased statistical significance. The study results illustrate the utility of sequential measurement of microenvironments and policy-relevant insights for exposure mitigation and management.
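    The sample-size point can be made concrete with the usual two-sample normal approximation: the number of runs per microenvironment needed to resolve a mean difference Δ grows with the square of the inter-run SD. The numbers below are illustrative, not the study's:

```python
from math import ceil
from statistics import NormalDist

def runs_needed(sigma, delta, alpha=0.05, power=0.8):
    """Runs per microenvironment to detect a mean difference delta between two
    microenvironments, given an inter-run SD sigma (normal approximation)."""
    z = NormalDist().inv_cdf
    za, zb = z(1 - alpha / 2), z(power)
    return ceil(2 * (za + zb) ** 2 * (sigma / delta) ** 2)

# e.g. PM2.5: detecting a 10 ug/m3 difference with a 12 ug/m3 inter-run SD
print(runs_needed(12, 10))  # -> 23 runs; a 20 ug/m3 difference needs far fewer
```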

  6. Spatial Variability of Soil-Water Storage in the Southern Sierra Critical Zone Observatory: Measurement and Prediction

    Science.gov (United States)

    Oroza, C.; Bales, R. C.; Zheng, Z.; Glaser, S. D.

    2017-12-01

    Predicting the spatial distribution of soil moisture in mountain environments is confounded by multiple factors, including complex topography, spatial variability of soil texture, sub-surface flow paths, and snow-soil interactions. While remote-sensing tools such as passive-microwave monitoring can measure the spatial variability of soil moisture, they only capture near-surface soil layers. Large-scale sensor networks increasingly provide soil-moisture measurements at high temporal resolution across a broader range of depths than is accessible from remote sensing. It may be possible to combine these in-situ measurements with high-resolution LIDAR topography and canopy cover to estimate the spatial distribution of soil moisture at high spatial resolution at multiple depths. We study the feasibility of this approach using six years (2009-2014) of daily volumetric water content measurements at 10-, 30-, and 60-cm depths from the Southern Sierra Critical Zone Observatory. A non-parametric, multivariate regression algorithm, Random Forest, was used to predict the spatial distribution of depth-integrated soil-water storage, based on the in-situ measurements and a combination of node attributes (topographic wetness, northness, elevation, soil texture, and location with respect to canopy cover). We observe predictable patterns of predictor accuracy and independent-variable ranking during the six-year study period. Predictor accuracy is highest during the snow-cover and early recession periods but declines during the dry period. Soil texture has consistently high feature importance. Other landscape attributes exhibit seasonal trends: northness peaks during the wet-up period, while elevation and the topographic-wetness index peak during the recession and dry periods, respectively.
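    Variable rankings of the kind reported above are often read off permutation-style importance: shuffle one predictor and watch the model's skill drop. The sketch below applies that idea to a plain least-squares model on synthetic stand-ins for the node attributes (it is not the study's Random Forest pipeline):

```python
import numpy as np

rng = np.random.default_rng(1)

def permutation_importance(fit_predict, X, y, n_repeats=10):
    """Drop in R^2 when each column is shuffled; a larger drop = a more important predictor."""
    def r2(yhat):
        return 1 - np.sum((y - yhat) ** 2) / np.sum((y - y.mean()) ** 2)
    base = r2(fit_predict(X, y, X))
    imp = []
    for j in range(X.shape[1]):
        drops = []
        for _ in range(n_repeats):
            Xp = X.copy()
            rng.shuffle(Xp[:, j])              # break the link between column j and y
            drops.append(base - r2(fit_predict(X, y, Xp)))
        imp.append(np.mean(drops))
    return np.array(imp)

def linreg(Xtr, ytr, Xte):
    A = np.c_[Xtr, np.ones(len(Xtr))]
    coef, *_ = np.linalg.lstsq(A, ytr, rcond=None)
    return np.c_[Xte, np.ones(len(Xte))] @ coef

# synthetic stand-ins: 'texture' matters most, 'elevation' less, 'noise' not at all
n = 500
texture, elevation, noise = rng.normal(size=(3, n))
X = np.c_[texture, elevation, noise]
y = 2.0 * texture + 0.5 * elevation + 0.3 * rng.normal(size=n)
imp = permutation_importance(linreg, X, y)
print(imp.argmax())  # -> 0: the 'texture' stand-in dominates, echoing the abstract's finding
```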

  7. Link importance incorporated failure probability measuring solution for multicast light-trees in elastic optical networks

    Science.gov (United States)

    Li, Xin; Zhang, Lu; Tang, Ying; Huang, Shanguo

    2018-03-01

    The light-tree-based optical multicasting (LT-OM) scheme provides a spectrum- and energy-efficient method to accommodate emerging multicast services. Some studies focus on survivability technologies for LTs against a fixed number of link failures, such as single-link failure. However, few studies involve failure probability constraints when building LTs. It is worth noting that each link of an LT plays a role of different importance under failure scenarios. When calculating the failure probability of an LT, the importance of each of its links should be considered. We design a link importance incorporated failure probability measuring solution (LIFPMS) for multicast LTs under the independent failure model and the shared-risk-link-group failure model. Based on the LIFPMS, we formulate the minimum failure probability (MFP) problem for the LT-OM scheme. Heuristic approaches are developed to address the MFP problem in elastic optical networks. Numerical results show that the LIFPMS provides an accurate metric for calculating the failure probability of multicast LTs and enhances the reliability of the LT-OM scheme while accommodating multicast services.
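    The abstract does not spell out the LIFPMS itself; the toy comparison below merely illustrates the underlying idea under the independent-failure model: weighting each link's failure probability by how many destinations its failure disconnects gives a different, more informative number than treating every link alike. The topology and probabilities are invented:

```python
from math import prod

# hypothetical light-tree under an independent link-failure model:
# link -> (failure probability, destinations disconnected if that link fails)
tree = {"s-a": (1e-3, 3), "a-d1": (5e-4, 1), "a-b": (8e-4, 2),
        "b-d2": (5e-4, 1), "b-d3": (6e-4, 1)}
n_dest = 3

# plain measure: probability that any link fails (treats all links alike)
p_any = 1 - prod(1 - p for p, _ in tree.values())
# importance-weighted measure: expected fraction of destinations disconnected
weighted = sum(p * lost / n_dest for p, lost in tree.values())
print(p_any > weighted)  # True: the weighted measure discounts leaf links that strand one destination
```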

  8. The Use of Importance Measures for Quantification of Multi-unit Risk

    Energy Technology Data Exchange (ETDEWEB)

    Jang, Seung-Cheol; Lim, Ho-Gon [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-10-15

    In this paper, we focus on the quantification of multi-unit accident sequence frequencies, i.e., the conditional core damage probability (CCDP) for a multi-unit initiator (MUI) from the single-unit risk (SUR) model. The paper proposes a method for estimating the r-unit CCDP, considering the inter-unit dependency, using importance measures. Note that the 1st and 2nd terms in the left-hand side of the equation have different units of risk measure, i.e., reactor operating year and site operating year, respectively. The total risk of multi-unit reactor accidents arising from the independent accident sequences of each single unit (the 1st term on the right-hand side of Equation 1) can conservatively be approximated by the sum of the n single-unit risks. It corresponds to the traditional multi-unit risk profile concept in use since the post-PSA era. At the same time, Equation 1 shows that the multi-unit risk within a site with n units has been underestimated by the amount of the 2nd term (MUR by multi-unit initiators), which consists of three parts: 1) the frequency estimation of a MUI, 2) the quantification of the multi-unit accident sequence frequencies for a MUI, and 3) the multi-unit consequence analysis for a MUI. The proposed method can facilitate the treatment of inter-unit dependencies in the multi-unit risk model and can give a more comprehensive and more practicable technical platform for estimating multi-unit site risk.

  9. The Use of Importance Measures for Quantification of Multi-unit Risk

    International Nuclear Information System (INIS)

    Jang, Seung-Cheol; Lim, Ho-Gon

    2015-01-01

    In this paper, we focus on the quantification of multi-unit accident sequence frequencies, i.e., the conditional core damage probability (CCDP) for a multi-unit initiator (MUI) from the single-unit risk (SUR) model. The paper proposes a method for estimating the r-unit CCDP, considering the inter-unit dependency, using importance measures. Note that the 1st and 2nd terms in the left-hand side of the equation have different units of risk measure, i.e., reactor operating year and site operating year, respectively. The total risk of multi-unit reactor accidents arising from the independent accident sequences of each single unit (the 1st term on the right-hand side of Equation 1) can conservatively be approximated by the sum of the n single-unit risks. It corresponds to the traditional multi-unit risk profile concept in use since the post-PSA era. At the same time, Equation 1 shows that the multi-unit risk within a site with n units has been underestimated by the amount of the 2nd term (MUR by multi-unit initiators), which consists of three parts: 1) the frequency estimation of a MUI, 2) the quantification of the multi-unit accident sequence frequencies for a MUI, and 3) the multi-unit consequence analysis for a MUI. The proposed method can facilitate the treatment of inter-unit dependencies in the multi-unit risk model and can give a more comprehensive and more practicable technical platform for estimating multi-unit site risk.

  10. Variability in baseline laboratory measurements of the Brazilian Longitudinal Study of Adult Health (ELSA-Brasil).

    Science.gov (United States)

    Ladwig, R; Vigo, A; Fedeli, L M G; Chambless, L E; Bensenor, I; Schmidt, M I; Vidigal, P G; Castilhos, C D; Duncan, B B

    2016-08-01

    Multi-center epidemiological studies must ascertain that their measurements are accurate and reliable. For laboratory measurements, reliability can be assessed by investigating the reproducibility of measurements in the same individual. In this paper, we present results from the quality control analysis of the baseline laboratory measurements of the ELSA-Brasil study. The study enrolled 15,105 civil servants at 6 research centers in 3 regions of Brazil between 2008 and 2010, with multiple biochemical analytes measured at a central laboratory. Quality control was ascertained through standard laboratory evaluation of intra- and inter-assay variability and test-retest analysis in a subset of randomly chosen participants. An additional sample of urine or blood was collected from these participants, and these samples were handled in the same manner as the original ones, locally and at the central laboratory. Reliability was assessed with the intraclass correlation coefficient (ICC), estimated through a random-effects model. Coefficients of variation (CV) and Bland-Altman plots were additionally used to assess measurement variability. Laboratory intra- and inter-assay CVs varied from 0.86% to 7.77%. From the test-retest analyses, the ICCs were high for the majority of the analytes. Notably lower ICCs were observed for serum sodium (ICC=0.50; 95%CI=0.31-0.65) and serum potassium (ICC=0.73; 95%CI=0.60-0.83), due to the small biological range of these analytes. The CVs ranged from 1% to 14%. The Bland-Altman plots confirmed these results. The quality control analyses showed that the collection, processing and measurement protocols utilized in ELSA-Brasil produced reliable biochemical measurements.
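    The Bland-Altman analysis mentioned above reduces to the mean test-retest difference and its 95% limits of agreement. A minimal sketch with invented serum sodium values, whose narrow biological range is exactly what depresses the ICC even when absolute differences are tiny:

```python
import statistics

def bland_altman_limits(x1, x2):
    """Mean difference (bias) and 95% limits of agreement between paired measurements."""
    diffs = [a - b for a, b in zip(x1, x2)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# hypothetical test-retest serum sodium values (mmol/l) for five participants
test = [138, 141, 139, 142, 140]
retest = [139, 140, 140, 141, 139]
bias, (lo, hi) = bland_altman_limits(test, retest)
print(round(bias, 2), (round(lo, 2), round(hi, 2)))
```

    The differences here are at most 1 mmol/l, yet because between-subject spread is also only a few mmol/l, a random-effects ICC on the same data would look mediocre.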

  11. Modelling global water stress of the recent past: on the relative importance of trends in water demand and climate variability

    Science.gov (United States)

    Wada, Y.; van Beek, L. P. H.; Bierkens, M. F. P.

    2011-08-01

During the past decades, human water use more than doubled, yet available freshwater resources are finite. As a result, water scarcity has been prevalent in various regions of the world. Here, we present the first global assessment of past development of water scarcity considering not only climate variability but also growing water demand, desalinated water use and non-renewable groundwater abstraction over the period 1960-2001 at a spatial resolution of 0.5°. Agricultural water demand is estimated based on past extents of irrigated areas and livestock densities. We approximate past economic development based on GDP, energy and household consumption and electricity production, which are subsequently used together with population numbers to estimate industrial and domestic water demand. Climate variability is expressed by simulated blue water availability defined by freshwater in rivers, lakes and reservoirs by means of the global hydrological model PCR-GLOBWB. The results show a drastic increase in the global population living under water-stressed conditions (i.e., moderate to high water stress) due to the growing water demand, primarily for irrigation, which more than doubled from 1708/818 to 3708/1832 km3 yr-1 (gross/net) over the period 1960-2000. We estimate that 800 million people or 27% of the global population were under water-stressed conditions for 1960. This number increased to 2.6 billion or 43% for 2000. Our results indicate that increased water demand is the decisive factor for the heightened water stress, enhancing the intensity of water stress up to 200%, while climate variability is often the main determinant of onsets for extreme events, i.e., major droughts. However, our results also suggest that in several emerging and developing economies (e.g., India, Turkey, Romania and Cuba) some of the past observed droughts were anthropogenically driven due to increased water demand rather than being climate-induced. In those countries, it can be seen

  12. Modelling global water stress of the recent past: on the relative importance of trends in water demand and climate variability

    Science.gov (United States)

    Wada, Y.; van Beek, L. P. H.; Bierkens, M. F. P.

    2011-12-01

During the past decades, human water use has more than doubled, yet available freshwater resources are finite. As a result, water scarcity has been prevalent in various regions of the world. Here, we present the first global assessment of past development of water stress considering not only climate variability but also growing water demand, desalinated water use and non-renewable groundwater abstraction over the period 1960-2001 at a spatial resolution of 0.5°. Agricultural water demand is estimated based on past extents of irrigated areas and livestock densities. We approximate past economic development based on GDP, energy and household consumption and electricity production, which are subsequently used together with population numbers to estimate industrial and domestic water demand. Climate variability is expressed by simulated blue water availability defined by freshwater in rivers, lakes, wetlands and reservoirs by means of the global hydrological model PCR-GLOBWB. We thus define blue water stress by comparing blue water availability with corresponding net total blue water demand by means of the commonly used Water Scarcity Index. The results show a drastic increase in the global population living under water-stressed conditions (i.e. moderate to high water stress) due to growing water demand, primarily for irrigation, which has more than doubled from 1708/818 to 3708/1832 km3 yr-1 (gross/net) over the period 1960-2000. We estimate that 800 million people or 27% of the global population were living under water-stressed conditions for 1960. This number eventually increased to 2.6 billion or 43% for 2000. Our results indicate that increased water demand is a decisive factor for heightened water stress in various regions such as India and North China, enhancing the intensity of water stress up to 200%, while climate variability is often a main determinant of extreme events.
However, our results also suggest that in several emerging and developing economies
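The Water Scarcity Index comparison used in the record above can be sketched in a few lines. The demand and availability figures below are invented, and the 0.2/0.4 stress thresholds follow common usage of the index rather than this paper's exact definitions:

```python
# A minimal sketch of the commonly used Water Scarcity Index (WSI): net
# total blue water demand divided by blue water availability for a region.
# All regional values are hypothetical.

def water_scarcity_index(net_demand_km3, availability_km3):
    """WSI = demand / availability; higher means more stress."""
    return net_demand_km3 / availability_km3

def stress_class(wsi):
    if wsi >= 0.4:
        return "high stress"
    if wsi >= 0.2:
        return "moderate stress"
    return "low or no stress"

# Hypothetical regions: (name, net demand, availability), km3 per year.
regions = [("A", 12.0, 25.0), ("B", 3.0, 40.0), ("C", 30.0, 55.0)]
for name, demand, avail in regions:
    wsi = water_scarcity_index(demand, avail)
    print(f"region {name}: WSI = {wsi:.2f} -> {stress_class(wsi)}")
```

Note that under this index, stress rises either when demand grows (the dominant driver found in the study) or when climate variability temporarily depresses availability, which is how a demand trend and a drought can produce the same stress class.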

  13. Spatial variability of methane production and methanogen communities within a eutrophic reservoir: evaluating the importance of organic matter source and quantity

    Science.gov (United States)

Freshwater reservoirs are an important source of the greenhouse gas methane (CH4) to the atmosphere, but there is a wide range of estimates of global emissions, due in part to variability of methane emission rates within reservoirs. While morphological characteristics, including...

  14. A Study of the Relative Importance of Communication and Economic Variables in Diffusion: Dwarf Wheats on Unirrigated Small Holdings in Pakistan.

    Science.gov (United States)

    Rochin, Refugio I.

    The purpose of this paper is twofold: (1) it presents some empirical findings of the relative importance of both "economic" and "communication" variables in the diffusion of an innovation (dwarf wheats) in an unirrigated region of Pakistan which is densely populated by smallholders. The sample of farmers reported are…

  15. Gait variability measurements in lumbar spinal stenosis patients: part A. Comparison with healthy subjects

    International Nuclear Information System (INIS)

    Papadakis, N C; Christakis, D G; Tzagarakis, G N; Chlouverakis, G I; Kampanis, N A; Stergiopoulos, K N; Katonis, P G

    2009-01-01

The objective of this study is to compare the gait variability of patients with lumbar spinal stenosis (experimental group) with that of healthy individuals (control group). The hypothesis is that the preoperative gait variability of the experimental group is higher than that of the control group. The experimental group consisted of 35 adults (18 males, 17 females). The subjects of the experimental group suffered exclusively from spinal stenosis, confirmed by MRI scans. A tri-axial accelerometer sensor was used for the gait measurement, and a differential-entropy algorithm was used to quantify the gait acceleration signal. The Oswestry Low Back Pain Questionnaire was used to determine each subject's condition on the day of the measurement. Receiver operating characteristic (ROC) analysis was used to assess the diagnostic value of the method and to determine a cut-off value. There is a statistically significant difference in gait variability between the control group and the experimental group. ROC analysis determines a cut-off differential-entropy value. The cut-off value has a 97.6% probability of separating patients with spinal stenosis from healthy subjects. The Oswestry Low Back Pain Questionnaire correlates well with the spectral differential-entropy values
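The ROC-based cut-off selection can be sketched as follows, here using Youden's J statistic as the optimality criterion (the abstract does not state which criterion was used); the entropy scores are invented, with patients assumed to score higher than healthy subjects:

```python
# Choosing a diagnostic cut-off from a ROC curve, as done for the
# differential-entropy gait measure. Scores are hypothetical.

def roc_best_cutoff(patient_scores, healthy_scores):
    """Return (cutoff, sensitivity, specificity) maximizing Youden's J."""
    best = None
    for cut in sorted(set(patient_scores + healthy_scores)):
        sens = sum(s >= cut for s in patient_scores) / len(patient_scores)
        spec = sum(s < cut for s in healthy_scores) / len(healthy_scores)
        j = sens + spec - 1
        if best is None or j > best[0]:
            best = (j, cut, sens, spec)
    return best[1:]

patients = [0.82, 0.91, 0.77, 0.88, 0.95, 0.84]
healthy = [0.55, 0.61, 0.48, 0.66, 0.59, 0.70]

cut, sens, spec = roc_best_cutoff(patients, healthy)
print(f"cut-off = {cut:.2f}, sensitivity = {sens:.2f}, specificity = {spec:.2f}")
```

With these invented scores the two groups separate perfectly; in the study, the separation probability at the chosen cut-off was 97.6%.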

  16. Correlations between Sportsmen’s Morpho-Functional Measurements and Voice Acoustic Variables

    Directory of Open Access Journals (Sweden)

    Rexhepi Agron M.

    2016-12-01

Purpose. Since human voice characteristics are specific to each individual, numerous anthropological studies have sought significant relationships between voice and morpho-functional features. The goal of this study was to identify the correlations between seven morpho-functional variables and six voice acoustic parameters in sportsmen. Methods. Following the protocols of the International Biological Program, seven morpho-functional variables and six voice acoustic parameters were measured in 88 male professional athletes from Kosovo, aged 17-35 years, during the period of April-October 2013. The statistical analysis was accomplished through the SPSS program, version 20. The obtained data were analysed through descriptive parameters and with Spearman's method of correlation analysis. Results. Spearman's correlation analysis showed significant negative correlations (R = -0.215 to -0.613; p < 0.05) between three voice acoustic variables of the fundamental frequency of the voice sample (Mean, Minimum, and Maximum Pitch) and six morpho-functional measures (Body Height, Body Weight, Margaria-Kalamen Power Test, Sargent Jump Test, Pull-up Test, and VO2max.abs). Conclusions. The significant correlations imply that people with a higher stature have longer vocal cords and a lower voice. These results encourage investigations on predicting sportsmen's functional abilities on the basis of their voice acoustic parameters.
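Spearman's rank correlation, the method used in this record, can be computed from scratch as follows; the body-height and mean-pitch values are invented and chosen to illustrate the negative association the study reports:

```python
# Spearman's rank correlation: Pearson correlation of the ranks, with
# average ranks assigned to ties. Heights (cm) and mean pitch (Hz) below
# are hypothetical examples.

def ranks(values):
    """Average 1-based ranks, handling ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    rx, ry = ranks(x), ranks(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

height = [168, 175, 181, 190, 172, 185]
pitch = [132, 120, 115, 104, 126, 110]
print(f"rho = {spearman(height, pitch):.3f}")
```

Because the invented data are strictly monotone decreasing, rho comes out exactly -1; the study's real coefficients were more moderate (-0.215 to -0.613).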

  17. A protocol for measuring spatial variables in soft-sediment tide pools

    Directory of Open Access Journals (Sweden)

    Marina R. Brenha-Nunes

    2016-01-01

We present a protocol for measuring spatial variables in large (>50 m2) soft-sediment tide pools. Secondarily, we present the fish capture efficiency of a sampling protocol that is based on such spatial variables to calculate relative abundances. The area of the pool is estimated by summing the areas of basic geometric forms; the depth, by taking representative measurements of the depth variability of each pool sector, previously determined according to its perimeter; and the volume, by considering the pool as a prism. These procedures were a trade-off between the acquisition of reliable estimates and the minimization of both the operating cost and the time spent in the field. The fish sampling protocol is based on two consecutive stages: (1) two people search for fishes under structures (e.g., rocks and litter) in the pool and capture them with hand seines; (2) these structures are removed and then a beach seine is hauled over the whole pool. Our method is cheaper than others and fast to operate considering the time available during low tides. The method to sample fish is quite efficient, resulting in a capture efficiency of 89%.
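The area/depth/volume procedure can be sketched as follows. The decomposition into a rectangle, a triangle and a half circle, and all measurements, are invented examples of the "basic geometric forms" the protocol refers to:

```python
# Spatial variables for a soft-sediment tide pool: area as a sum of basic
# geometric forms, depth as the mean of per-sector soundings, and volume
# as a prism (area x mean depth). All numbers are illustrative.
import math

# Decompose the pool outline into simple shapes (dimensions in metres).
rectangle = 12.0 * 5.0            # length x width
triangle = 0.5 * 5.0 * 3.0        # half x base x height
half_circle = 0.5 * math.pi * 2.0 ** 2

area_m2 = rectangle + triangle + half_circle

# Representative depth readings (m), one per perimeter-defined sector.
sector_depths = [0.35, 0.50, 0.42, 0.28, 0.46]
mean_depth_m = sum(sector_depths) / len(sector_depths)

volume_m3 = area_m2 * mean_depth_m   # prism assumption
print(f"area = {area_m2:.1f} m2, volume = {volume_m3:.1f} m3")
```

The prism assumption deliberately trades accuracy for field speed, matching the protocol's stated compromise between reliable estimates and time spent at low tide.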

  18. Measures of the sheep-goat variable, transliminality, and their correlates.

    Science.gov (United States)

    Thalbourne, M A

    2001-04-01

    In this study a battery of pencil-and-paper tests was given to 125 first-year psychology students (27% men). This battery included as measures of belief in the paranormal (the so-called "sheep-goat variable") the Australian Sheep-Goat Scale, Tobacyk's Revised Paranormal Belief Scale (comprised of two scales--New Age Philosophy and Traditional Paranormal Beliefs), and the Anomalous Experience Inventory (comprised of five scales: Anomalous/Paranormal Experience, Belief, Ability, Fear, and Drug Use). Also included were the 29-item Transliminality Scale, a 35-item Kundalini Scale, an experimental 13-item Determinism/Free Will scale, and a number of single-question items aimed specifically at transliminality. The results were, first, that virtually all the measures of the sheep-goat variable were intercorrelated with each other (range, .34 to .77), thereby providing support for their convergent validity. Second, scores on the Kundalini Scale and Drug Use correlated significantly with scores on the sheep-goat variable, replicating previous findings. And, finally, many correlates of transliminality were found, again including scores on the Kundalini Scale and Drug Use (prescribed and illicit) as well as certain determinism-related beliefs. Beliefs, experiences, and behaviors associated with transliminality and the Kundalini experience may reflect a desire to escape a negative state of being.

  19. Equations of bark thickness and volume profiles at different heights with easy-measurement variables

    Energy Technology Data Exchange (ETDEWEB)

    Cellini, J. M.; Galarza, M.; Burns, S. L.; Martinez-Pastur, G. J.; Lencinas, M. V.

    2012-11-01

The objective of this work was to develop equations of bark thickness profile and bark volume at different heights using easy-measurement variables, taking as a study case Nothofagus pumilio forests growing in different site qualities and growth phases in Southern Patagonia. Data were collected from 717 harvested trees. Three models were fitted using multiple non-linear regression and a generalized linear model, by stepwise methodology, the iteratively reweighted least squares method for maximum likelihood estimation, and the Marquardt algorithm. The independent variables were diameter at 1.30 m height (DBH), relative height (RH) and growth phase (GP). The statistical evaluation was made through the adjusted coefficient of determination (r²-adj), standard error of the estimation (SEE), mean absolute error and residual analysis. All models presented good fitness with a significant correlation with the growth phase. A decrease in bark thickness was observed as the relative height increased. Moreover, a bark coefficient was developed to calculate volume with and without bark for individual trees, for which significant differences according to the site quality of the stands and the DBH class of the trees were observed. It can be concluded that the prediction of bark thickness and bark coefficient is possible using DBH, height, site quality and growth phase, common and easily measured variables used in forest inventories. (Author) 23 refs.
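A bark coefficient of the general kind described (the ratio of under-bark to over-bark stem volume) can be sketched as below. The taper data are invented, and the proportionality of section volume to diameter squared is a simplifying assumption, not the authors' fitted equations:

```python
# A sketch of a bark coefficient from paired over-bark diameter and bark
# thickness measurements at several relative heights. Values are invented;
# bark thickness decreases with relative height, as the study observed.

# (relative height, over-bark diameter cm, bark thickness cm)
sections = [(0.1, 40.0, 1.8), (0.3, 34.0, 1.4), (0.5, 28.0, 1.0),
            (0.7, 20.0, 0.7), (0.9, 10.0, 0.4)]

def section_volumes(sections):
    """Over- and under-bark volume proxies for equal-length sections
    (volume taken proportional to diameter squared)."""
    over = under = 0.0
    for _, d_ob, bark in sections:
        d_ub = d_ob - 2 * bark          # under-bark diameter
        over += d_ob ** 2
        under += d_ub ** 2
    return over, under

over, under = section_volumes(sections)
bark_coefficient = under / over
print(f"bark coefficient = {bark_coefficient:.3f}")
```

Multiplying an over-bark volume estimate by such a coefficient yields the under-bark volume, which is why the study stratifies the coefficient by site quality and DBH class.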

  20. The importance of deep, basinwide measurements in optimized Atlantic Meridional Overturning Circulation observing arrays

    Science.gov (United States)

    McCarthy, G. D.; Menary, M. B.; Mecking, J. V.; Moat, B. I.; Johns, W. E.; Andrews, M. B.; Rayner, D.; Smeed, D. A.

    2017-03-01

The Atlantic Meridional Overturning Circulation (AMOC) is a key process in the global redistribution of heat. The AMOC is defined as the maximum of the overturning stream function, which typically occurs near 30°N in the North Atlantic. The RAPID mooring array has provided full-depth, basinwide, continuous estimates of this quantity since 2004. Motivated by both the need to deliver near real-time data and optimization of the array to reduce costs, we consider alternative configurations of the mooring array. Results suggest that the variability observed since 2004 could be reproduced by a single tall mooring on the western boundary and a mooring to 1500 m on the eastern boundary. We consider the potential future evolution of the AMOC in two generations of the Hadley Centre climate models and a suite of additional CMIP5 models. The modeling studies show that deep, basinwide measurements are essential to capture correctly the future decline of the AMOC. We conclude that, while a reduced array could be useful for estimates of the AMOC on subseasonal to decadal time scales as part of a near real-time data delivery system, extreme caution must be applied to avoid the potential misinterpretation or absence of a climate time scale AMOC decline that is a key motivation for the maintenance of these observations. Plain Language Summary: The Atlantic Overturning Circulation is a system of ocean currents that carries heat northwards in the Atlantic. This heat is crucial to maintaining the mild climate of northwest Europe. The Overturning Circulation is predicted to slow in the future in response to man-made climate change. The RAPID program is designed to measure the Overturning Circulation using a number of fixed point observations spanning the Atlantic between the Canary Islands and the Bahamas. We look at whether we could reduce the number of these fixed point observations to continue to get accurate estimates of the overturning strength but for less cost. We conclude that

  1. Measuring perceptions related to e-cigarettes: Important principles and next steps to enhance study validity.

    Science.gov (United States)

    Gibson, Laura A; Creamer, MeLisa R; Breland, Alison B; Giachello, Aida Luz; Kaufman, Annette; Kong, Grace; Pechacek, Terry F; Pepper, Jessica K; Soule, Eric K; Halpern-Felsher, Bonnie

    2018-04-01

    Measuring perceptions associated with e-cigarette use can provide valuable information to help explain why youth and adults initiate and continue to use e-cigarettes. However, given the complexity of e-cigarette devices and their continuing evolution, measures of perceptions of this product have varied greatly. Our goal, as members of the working group on e-cigarette measurement within the Tobacco Centers of Regulatory Science (TCORS) network, is to provide guidance to researchers developing surveys concerning e-cigarette perceptions. We surveyed the 14 TCORS sites and received and reviewed 371 e-cigarette perception items from seven sites. We categorized the items based on types of perceptions asked, and identified measurement approaches that could enhance data validity and approaches that researchers may consider avoiding. The committee provides suggestions in four areas: (1) perceptions of benefits, (2) harm perceptions, (3) addiction perceptions, and (4) perceptions of social norms. Across these 4 areas, the most appropriate way to assess e-cigarette perceptions depends largely on study aims. The type and number of items used to examine e-cigarette perceptions will also vary depending on respondents' e-cigarette experience (i.e., user vs. non-user), level of experience (e.g., experimental vs. established), type of e-cigarette device (e.g., cig-a-like, mod), and age. Continuous formative work is critical to adequately capture perceptions in response to the rapidly changing e-cigarette landscape. Most important, it is imperative to consider the unique perceptual aspects of e-cigarettes, building on the conventional cigarette literature as appropriate, but not relying on existing conventional cigarette perception items without adjustment. Copyright © 2017 Elsevier Ltd. All rights reserved.

  2. Serodiagnosis of Borrelia miyamotoi disease by measuring antibodies against GlpQ and variable major proteins

    DEFF Research Database (Denmark)

    Koetsveld, J.; Kolyasnikova, N. M.; Wagemakers, A.

    2018-01-01

Objectives: Borrelia miyamotoi disease (BMD) is an emerging tick-borne disease in the Northern hemisphere. Serodiagnosis by measuring antibodies against glycerophosphodiester-phosphodiesterase (GlpQ) has been performed experimentally but has not been extensively clinically validated. Because we had previously shown the differential expression of antigenic variable major proteins (Vmps) in B. miyamotoi, our aim was to study antibody responses against GlpQ and Vmps in PCR-proven BMD patients and controls. Methods: We assessed seroreactivity against GlpQ and four Vmps in a well-described, longitudinal... and IgG between 21 and 50 days, after disease onset. Various combinations of GlpQ and Vmps increased sensitivity and/or specificity compared to single antigens. Notably, the GlpQ or variable large protein (Vlp)-15/16 combination yielded a sensitivity of 94.7% (95% CI: 75.4–99.7) 11–20 days after disease...

  3. Temporal variability in the importance of hydrologic, biotic, and climatic descriptors of dissolved oxygen dynamics in a shallow tidal-marsh creek

    Science.gov (United States)

    Nelson, N.; Munoz-Carpena, R.; Neale, P.; Tzortziou, M.; Megonigal, P.

    2017-12-01

Due to strong abiotic forcing, dissolved oxygen (DO) in shallow tidal creeks often disobeys the conventional explanation of general aquatic DO cycling as biologically-regulated. In the present work, we seek to quantify the relative importance of abiotic (hydrologic and climatic) and biotic (primary productivity as represented by chlorophyll-a) descriptors of tidal creek DO. By fitting multiple linear regression models of DO to hourly chlorophyll-a, water quality, hydrology, and weather data collected in a tidal creek of a Chesapeake Bay marsh (Maryland, USA), temporal shifts (summer to early winter) in the relative importance of tidal creek DO descriptors were uncovered. Moreover, this analysis identified an alternative approach to evaluating tidal stage as a driver of DO by dividing stage into two DO-relevant variables: stage above and below bankfull depth. Within the hydrologic variable class, stage below bankfull depth dominated as an important descriptor, thus highlighting the role of pore water drainage and mixing as influential processes forcing tidal creek DO. Study findings suggest that tidal creek DO dynamics are explained by a balance of hydrologic, climatic, and biotic descriptors during warmer seasons due to many of these variables (i.e., chlorophyll-a, water temperature) acting as tracers of estuarine-marsh water mixing; conversely, in early winter months when estuarine and marsh waters differ less distinctly, hydrologic variables increase in relative importance as descriptors of tidal creek DO. These findings underline important distinctions in the underlying mechanisms dictating DO variability in shallow tidal marsh-creek environments relative to open water estuarine systems.
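The stage split into two DO-relevant regressors can be sketched as below; the bankfull depth and stage values are invented for illustration:

```python
# Splitting tidal stage into two regressors for a DO regression model:
# stage below bankfull (capped) and stage above bankfull (overbank excess).
# The bankfull depth is hypothetical.

BANKFULL_M = 1.2

def split_stage(stage_m):
    below = min(stage_m, BANKFULL_M)       # drives pore-water drainage/mixing
    above = max(stage_m - BANKFULL_M, 0)   # overbank (marsh-flooding) component
    return below, above

for stage in [0.4, 1.0, 1.2, 1.8]:
    below, above = split_stage(stage)
    print(f"stage {stage:.1f} m -> below = {below:.1f}, above = {above:.1f}")
```

Entering these two variables separately in a multiple linear regression lets within-channel and overbank water levels carry different coefficients, which is how the study could identify stage below bankfull as the dominant hydrologic descriptor.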

  4. Importance of well logging measurements in the design of underground railway tunnels

    International Nuclear Information System (INIS)

    Kiss, E.Z.; Szlaboczky, P.

    1981-01-01

The paper shows how logs can be used in the construction of underground railway tunnels in Tertiary sediments. Even standard well logging techniques (electric conductivity, gamma logging) can provide important additional information on the wells if conclusions concerning construction technology are drawn from the logs. In the course of continuous research work, the application of well logs provides essential help if the measurements give in-situ information on absolute values of the well sections by revealing the various geological formations based on the distribution of characteristic parameters. Well logging increases the resolving power of the mechanical method of layer differentiation. Besides the usual geological interpretation of logs, the zones of shifting rocks, hard and friable formations, as well as intercalations leading to problems in construction technology can be pointed out. (author)

  5. Importance of measuring discharge and sediment transport in lesser tributaries when closing sediment budgets

    Science.gov (United States)

    Griffiths, Ronald E.; Topping, David J.

    2017-11-01

    Sediment budgets are an important tool for understanding how riverine ecosystems respond to perturbations. Changes in the quantity and grain size distribution of sediment within river systems affect the channel morphology and related habitat resources. It is therefore important for resource managers to know if a river reach is in a state of sediment accumulation, deficit or stasis. Many sediment-budget studies have estimated the sediment loads of ungaged tributaries using regional sediment-yield equations or other similar techniques. While these approaches may be valid in regions where rainfall and geology are uniform over large areas, use of sediment-yield equations may lead to poor estimations of loads in regions where rainfall events, contributing geology, and vegetation have large spatial and/or temporal variability. Previous estimates of the combined mean-annual sediment load of all ungaged tributaries to the Colorado River downstream from Glen Canyon Dam vary by over a factor of three; this range in estimated sediment loads has resulted in different researchers reaching opposite conclusions on the sign (accumulation or deficit) of the sediment budget for particular reaches of the Colorado River. To better evaluate the supply of fine sediment (sand, silt, and clay) from these tributaries to the Colorado River, eight gages were established on previously ungaged tributaries in Glen, Marble, and Grand canyons. Results from this sediment-monitoring network show that previous estimates of the annual sediment loads of these tributaries were too high and that the sediment budget for the Colorado River below Glen Canyon Dam is more negative than previously calculated by most researchers. As a result of locally intense rainfall events with footprints smaller than the receiving basin, floods from a single tributary in semi-arid regions can have large (≥ 10 ×) differences in sediment concentrations between equal magnitude flows. Because sediment loads do not

  6. Variables Associated with the Use of Coercive Measures on Psychiatric Patients in Spanish Penitentiary Centers

    Directory of Open Access Journals (Sweden)

    E. Girela

    2014-01-01

We have studied the use of coercive medical measures (forced medication, isolation, and mechanical restraint) in mentally ill inmates within two secure psychiatric hospitals (SPH) and three regular prisons (RP) in Spain. Variables related to adopted coercive measures were analyzed, such as type of measure, causes of indication, opinion of patient inmate, opinion of medical staff, and more frequent morbidity. A total of 209 patients (108 from SPH and 101 from RP) were studied. Isolation (41.35%) was the most frequent coercive measure, followed by mechanical restraint (33.17%) and forced medication (25.48%). The type of center has some influence; specifically in RP there is less risk of isolation and restraint than in SPH. Not having had any previous imprisonment reduces isolation and restraint risk while increasing the risk of forced medication, as do previous admissions to psychiatric inpatient units. Finally, the fact of having lived with a partner before imprisonment reduces the risk of forced medication and communication with the family decreases the risk of isolation. Patients subjected to a coercive measure exhibited a pronounced psychopathology and most of them had been subjected to such measures on previous occasions. The mere fact of external assessment of compliance with human rights slows down the incidence of coercive measures.

  7. Variables associated with the use of coercive measures on psychiatric patients in Spanish penitentiary centers.

    Science.gov (United States)

    Girela, E; López, A; Ortega, L; De-Juan, J; Ruiz, F; Bosch, J I; Barrios, L F; Luna, J D; Torres-González, F

    2014-01-01

We have studied the use of coercive medical measures (forced medication, isolation, and mechanical restraint) in mentally ill inmates within two secure psychiatric hospitals (SPH) and three regular prisons (RP) in Spain. Variables related to adopted coercive measures were analyzed, such as type of measure, causes of indication, opinion of patient inmate, opinion of medical staff, and more frequent morbidity. A total of 209 patients (108 from SPH and 101 from RP) were studied. Isolation (41.35%) was the most frequent coercive measure, followed by mechanical restraint (33.17%) and forced medication (25.48%). The type of center has some influence; specifically in RP there is less risk of isolation and restraint than in SPH. Not having had any previous imprisonment reduces isolation and restraint risk while increasing the risk of forced medication, as do previous admissions to psychiatric inpatient units. Finally, the fact of having lived with a partner before imprisonment reduces the risk of forced medication and communication with the family decreases the risk of isolation. Patients subjected to a coercive measure exhibited a pronounced psychopathology and most of them had been subjected to such measures on previous occasions. The mere fact of external assessment of compliance with human rights slows down the incidence of coercive measures.

  8. Fuzzy central tendency measure for time series variability analysis with application to fatigue electromyography signals.

    Science.gov (United States)

    Xie, Hong-Bo; Dokos, Socrates

    2013-01-01

A new method, namely fuzzy central tendency measure (fCTM) analysis, which enables measurement of the variability of a time series, is presented in this study. Tests on simulated data sets show that fCTM is superior to the conventional central tendency measure (CTM) in several respects, including improved relative consistency and robustness to noise. The proposed fCTM method was applied to electromyographic (EMG) signals recorded during sustained isometric contraction for tracking local muscle fatigue. The results showed that the fCTM increased significantly during the development of muscle fatigue, and it was more sensitive to the fatigue phenomenon than mean frequency (MNF), the most commonly used muscle fatigue indicator.
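The conventional central tendency measure and a fuzzy variant can be sketched as follows. The exponential membership function is a plausible choice, not necessarily the authors' exact formulation, and the signal is synthetic:

```python
# CTM on the first-difference scatter plot (d_i vs d_{i+1}): the fraction
# of points inside a radius around the origin. The fuzzy variant replaces
# the hard in/out count with a graded exponential membership.
import math

def diffs(series):
    return [b - a for a, b in zip(series, series[1:])]

def ctm(series, radius):
    d = diffs(series)
    pts = list(zip(d, d[1:]))                     # (d_i, d_{i+1}) pairs
    inside = sum(math.hypot(x, y) < radius for x, y in pts)
    return inside / len(pts)

def fuzzy_ctm(series, radius):
    d = diffs(series)
    pts = list(zip(d, d[1:]))
    # Graded membership: 1 at the origin, falling towards 0 outside the radius.
    return sum(math.exp(-(math.hypot(x, y) / radius) ** 2) for x, y in pts) / len(pts)

signal = [0.0, 0.2, 0.1, 0.3, 0.2, 0.5, 0.4, 0.6, 0.5, 0.7]
print(f"CTM  = {ctm(signal, 0.25):.3f}")
print(f"fCTM = {fuzzy_ctm(signal, 0.25):.3f}")
```

The graded membership is what gives the fuzzy version its smoother behaviour: points near the radius boundary contribute partially instead of flipping the count, which underlies the reported gains in relative consistency and noise robustness.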

  9. Circumpolar analysis of the Adélie Penguin reveals the importance of environmental variability in phenological mismatch

    Science.gov (United States)

Youngflesh, Casey; Jenouvrier, Stephanie; Li, Yun; Ji, Rubao; Ainley, David G.; Ballard, Grant; Barbraud, Christophe; Delord, Karine; Dugger, Catherine; Emmerson, Louise M.; Fraser, William R.; Hinke, Jefferson T.; Lyver, Phil O'B.; Olmastroni, Silvia; Southwell, Colin J.; Trivelpiece, Susan G.; Trivelpiece, Wayne Z.; Lynch, Heather J.

    2017-01-01

Evidence of climate-change-driven shifts in plant and animal phenology has raised concerns that certain trophic interactions may be increasingly mismatched in time, resulting in declines in reproductive success. Given the constraints imposed by extreme seasonality at high latitudes and the rapid shifts in phenology seen in the Arctic, we would also expect Antarctic species to be highly vulnerable to climate-change-driven phenological mismatches with their environment. However, few studies have assessed the impacts of phenological change in Antarctica. Using the largest database of phytoplankton phenology, sea-ice phenology, and Adélie Penguin breeding phenology and breeding success assembled to date, we find that, while a temporal match between penguin breeding phenology and optimal environmental conditions sets an upper limit on breeding success, only a weak relationship to the mean exists. Despite previous work suggesting that divergent trends in Adélie Penguin breeding phenology are apparent across the Antarctic continent, we find no such trends. Furthermore, we find no trend in the magnitude of phenological mismatch, suggesting that mismatch is driven by interannual variability in environmental conditions rather than climate-change-driven trends, as observed in other systems. We propose several criteria necessary for a species to experience a strong climate-change-driven phenological mismatch, of which several may be violated by this system.

  10. Effects of weight loss diet therapy on anthropometric measurements and biochemical variables in schizophrenic patients.

    Science.gov (United States)

    Urhan, Murat; Ergün, Can; Aksoy, Meral; Ayer, Ahmet

    2015-07-01

Prevalence of obesity in schizophrenic patients is two to three times higher than in the general population, and unhealthy dietary patterns, a sedentary lifestyle and antipsychotic medication use may contribute to the higher levels of obesity among schizophrenic patients. We evaluated the effects of diet therapy on weight loss and on anthropometric and biochemical variables in overweight or obese (body mass index, BMI ≥ 27 kg/m²) female schizophrenic patients who use antipsychotic medications and in healthy volunteers. Primary demographic variables were collected via questionnaire; blood samples and anthropometric measurements were obtained. Personalized diet recipes were prepared and nutritional education was shared. We logged the physical activity of the patients and maintained food consumption records at 3-day intervals. Participants were weighed every week; anthropometric measurements and blood samples were collected at the end of the first and second months. At the end of the study, reductions in body weight and other anthropometric measurements were statistically significant (P < 0.05). Reductions in body weight and BMI values for the patient group were −4.05 ± 1.73 kg and −1.62 ± 0.73 kg/m² and for the control group were −6.79 ± 1.80 kg and −2.55 ± 0.64 kg/m², respectively. When compared with the patient group, reductions in the anthropometric variables of the control group were statistically significant (P < 0.05). Fasting glucose, blood lipids, albumin and leptin levels decreased; insulin and homeostatic model assessment-measured insulin resistance (HOMA-IR) levels increased, though not significantly. Increases in the blood ghrelin levels for both groups were statistically significant (P < 0.05). Improvements to the diets of schizophrenic patients led to improvements in anthropometric measurements and biochemical variables and reduced the health risks caused by antipsychotic medications. Furthermore, we hypothesize that antipsychotic medications do not

  11. Applying Importance-Performance Analysis as a Service Quality Measure in Food Service Industry

    Directory of Open Access Journals (Sweden)

    Gwo-Hshiung Tzeng

    2011-09-01

    Full Text Available As the global economy becomes a service-oriented economy, food service accounts for over 20% of service revenue, with an annual growth rate of more than 3%. Compared to physical products, service features are intangible, and production and sale occur simultaneously, so it is not easy to measure the performance of a service. The service quality of catering services is therefore considered an important topic in service management. According to the Market Intelligence & Consulting Institute (MIC), blog text analysis can identify the top 10 restaurants discussed in blogs in Taiwan, i.e., the popular restaurants in the food service industry. This paper attempts to identify both the importance and performance of restaurant service quality in the Taiwan food service industry using the SERVQUAL and IPA models. We can conclude with certainty that the three methods (SERVQUAL, IF and IPA) are able to explain a significant amount of service quality. At the same time, the service quality factors of the IPA model offered more comprehensive consideration than those of SERVQUAL and IF.

  12. Variability of creatinine measurements in clinical laboratories: results from the CRIC study.

    Science.gov (United States)

    Joffe, Marshall; Hsu, Chi-yuan; Feldman, Harold I; Weir, Matthew; Landis, J R; Hamm, L Lee

    2010-01-01

    Estimating equations using serum creatinine (SCr) are often used to assess glomerular filtration rate (GFR). Such creatinine (Cr)-based formulae may produce biased estimates of GFR when using Cr measurements that have not been calibrated to reference laboratories. In this paper, we sought to examine the degree of this variation in Cr assays in several laboratories associated with academic medical centers affiliated with the Chronic Renal Insufficiency Cohort (CRIC) Study; to consider how best to correct for this variation; and to quantify the impact of such corrections on eligibility for participation in CRIC. Variability of Cr is of particular concern in the conduct of CRIC, a large multicenter study of subjects with chronic renal disease, because eligibility for the study depends on Cr-based assessment of GFR. A library of 5 large-volume plasma specimens from apheresis patients was assembled, representing levels of plasma Cr from 0.8 to 2.4 mg/dl. Samples from this library were used for measurement of Cr at each of the 14 CRIC laboratories repetitively over time. We used graphical displays and linear regression methods to examine the variability in Cr, and used linear regression to develop calibration equations. We also examined the impact of the various calibration equations on the proportion of subjects screened as potential participants who were actually eligible for the study. There was substantial variability in Cr assays across laboratories and over time. We developed calibration equations for each laboratory; these equations varied substantially among laboratories and somewhat over time in some laboratories. The laboratory site contributed the most to variability (51% of the variance unexplained by the specimen) and variation with time accounted for another 15%. In some laboratories, calibration equations resulted in differences in eligibility for CRIC of as much as 20%. The substantial variability in SCr assays across laboratories necessitates calibration
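
The calibration step described above, fitting a linear regression that maps each laboratory's Cr measurements onto reference values, can be sketched as follows. This is an illustration, not the CRIC analysis code; the paired specimen values below are hypothetical.

```python
# Illustrative per-laboratory creatinine calibration (hypothetical data):
# fit reference = a + b * local by ordinary least squares, then apply the
# fitted line to correct a new local measurement.

def fit_calibration(local, reference):
    """Ordinary least squares fit of reference = a + b * local."""
    n = len(local)
    mx = sum(local) / n
    my = sum(reference) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(local, reference))
    sxx = sum((x - mx) ** 2 for x in local)
    b = sxy / sxx
    a = my - b * mx
    return a, b

def calibrate(value, a, b):
    """Map a locally measured Cr value onto the reference scale."""
    return a + b * value

# Hypothetical repeated measurements of library specimens at one laboratory,
# paired with reference-laboratory values (mg/dl).
local = [0.9, 1.3, 1.7, 2.1, 2.6]
reference = [0.8, 1.2, 1.6, 2.0, 2.4]

a, b = fit_calibration(local, reference)
corrected = calibrate(1.5, a, b)
```

Because study eligibility depends on Cr-based GFR, even a small slope or intercept shift in such a calibration line can move borderline subjects across the eligibility threshold.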

  13. Continuous-variable measurement-device-independent quantum key distribution with virtual photon subtraction

    Science.gov (United States)

    Zhao, Yijia; Zhang, Yichen; Xu, Bingjie; Yu, Song; Guo, Hong

    2018-04-01

    The method of improving the performance of continuous-variable quantum key distribution protocols by postselection has recently been proposed and verified. In continuous-variable measurement-device-independent quantum key distribution (CV-MDI QKD) protocols, the measurement results are obtained from the untrusted third party, Charlie. There is still no effective method of improving CV-MDI QKD by postselection with untrusted measurement. We propose a method to improve the performance of the coherent-state CV-MDI QKD protocol by virtual photon subtraction via non-Gaussian postselection. The non-Gaussian postselection of transmitted data is equivalent to an ideal photon subtraction on the two-mode squeezed vacuum state, which is favorable for enhancing the performance of CV-MDI QKD. In the CV-MDI QKD protocol with non-Gaussian postselection, the two users select their own data independently. We demonstrate that the optimal performance of the renovated CV-MDI QKD protocol is obtained with the transmitted data selected only by Alice. By setting appropriate parameters of the virtual photon subtraction, the secret key rate and tolerable excess noise are both improved at long transmission distance. The method provides an effective optimization scheme for the application of CV-MDI QKD protocols.

  14. Variability of bronchial measurements obtained by sequential CT using two computer-based methods

    International Nuclear Information System (INIS)

    Brillet, Pierre-Yves; Fetita, Catalin I.; Mitrea, Mihai; Preteux, Francoise; Capderou, Andre; Dreuil, Serge; Simon, Jean-Marc; Grenier, Philippe A.

    2009-01-01

    This study aimed to evaluate the variability of lumen (LA) and wall area (WA) measurements obtained on two successive MDCT acquisitions using energy-driven contour estimation (EDCE) and full width at half maximum (FWHM) approaches. Both methods were applied to a database of segmental and subsegmental bronchi with LA > 4 mm², containing 42 bronchial segments of 10 successive slices that best matched on each acquisition. For both methods, the 95% confidence interval between repeated MDCT was between -1.59 and 1.5 mm² for LA, and -3.31 and 2.96 mm² for WA. The values of the coefficient of measurement variation (CV10, i.e., the percentage ratio of the standard deviation obtained from the 10 successive slices to their mean value) were strongly correlated between repeated MDCT data acquisitions (r > 0.72; p 2, whereas WA values were lower for bronchi with WA 2; no systematic EDCE underestimation or overestimation was observed for thicker-walled bronchi. In conclusion, variability between CT examinations and assessment techniques may impair measurements. Therefore, new parameters such as CV10 need to be investigated to study bronchial remodeling. Finally, EDCE and FWHM are not interchangeable in longitudinal studies. (orig.)

  15. Variability of standard artificial soils: Physico-chemical properties and phenanthrene desorption measured by means of supercritical fluid extraction

    International Nuclear Information System (INIS)

    Bielská, Lucie; Hovorková, Ivana; Komprdová, Klára; Hofman, Jakub

    2012-01-01

    The study is focused on artificial soil, which is supposed to be a standardized “soil-like” medium. We compared physico-chemical properties and extractability of phenanthrene (Phe) from 25 artificial soils prepared according to OECD standardized procedures at different laboratories. A substantial range of soil properties was found, even for parameters which should be standardized because they have an important influence on the bioavailability of pollutants (e.g. total organic carbon ranged from 1.4 to 6.1%). The extractability of Phe was measured by supercritical fluid extraction (SFE) at harsh and mild conditions. Highly variable Phe extractability from different soils (3–89%) was observed. The extractability was strongly related (R² = 0.87) to total organic carbon content, 0.1–2 mm particle size, and humic/fulvic acid ratio in the following multiple regression model: SFE (%) = 1.35 * sand (%) − 0.77 * TOC (%)² + 0.27 * HA/FA. - Highlights: ► We compared properties and extractability of Phe from 25 different artificial soils. ► Substantial range of soil properties was found, also for important parameters. ► Phe extractability was measured by supercritical fluid extraction (SFE) at 2 modes. ► Phe extractability was highly variable from different soils (3–89%). ► Extractability was strongly related to TOC, 0.1–2 mm particles, and HA/FA. - Significant variability in physico-chemical properties exists between artificial soils prepared at different laboratories and affects the behavior of contaminants in these soils.
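
The quoted multiple regression model can be evaluated directly. In the sketch below, the TOC term is read as TOC squared (that reading, and the example soil composition, are assumptions for illustration, not data from the study).

```python
# Illustrative evaluation of the regression model quoted in the abstract:
#   SFE (%) = 1.35 * sand (%) - 0.77 * TOC (%)**2 + 0.27 * HA/FA
# where sand is the 0.1-2 mm particle fraction, TOC is total organic
# carbon, and HA/FA is the humic/fulvic acid ratio.

def predicted_sfe(sand_pct, toc_pct, ha_fa_ratio):
    """Predicted phenanthrene extractability (%) under the quoted model."""
    return 1.35 * sand_pct - 0.77 * toc_pct ** 2 + 0.27 * ha_fa_ratio

# Hypothetical artificial soil: 50% sand, 3.0% TOC, HA/FA = 1.5
sfe = predicted_sfe(50.0, 3.0, 1.5)
```

Note how the negative TOC term dominates for carbon-rich soils, which matches the abstract's finding that extractability falls as organic carbon rises.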

  16. Robust shot-noise measurement for continuous-variable quantum key distribution

    Science.gov (United States)

    Kunz-Jacques, Sébastien; Jouguet, Paul

    2015-02-01

    We study a practical method to measure the shot noise in real time in continuous-variable quantum key distribution systems. The amount of secret key that can be extracted from the raw statistics depends strongly on this quantity since it affects in particular the computation of the excess noise (i.e., noise in excess of the shot noise) added by an eavesdropper on the quantum channel. Some powerful quantum hacking attacks relying on faking the estimated value of the shot noise to hide an intercept and resend strategy were proposed. Here, we provide experimental evidence that our method can defeat the saturation attack and the wavelength attack.

  17. The use of cognitive ability measures as explanatory variables in regression analysis.

    Science.gov (United States)

    Junker, Brian; Schofield, Lynne Steuerle; Taylor, Lowell J

    2012-12-01

    Cognitive ability measures are often taken as explanatory variables in regression analysis, e.g., as a factor affecting a market outcome such as an individual's wage, or a decision such as an individual's education acquisition. Cognitive ability is a latent construct; its true value is unobserved. Nonetheless, researchers often assume that a test score, constructed via standard psychometric practice from individuals' responses to test items, can be safely used in regression analysis. We examine problems that can arise, and suggest that an alternative approach, a "mixed effects structural equations" (MESE) model, may be more appropriate in many circumstances.

  18. Context matters! sources of variability in weekend physical activity among families: a repeated measures study

    Directory of Open Access Journals (Sweden)

    Robert J. Noonan

    2017-04-01

    Full Text Available Abstract Background Family involvement is an essential component of effective physical activity (PA) interventions in children. However, little is known about the PA levels and characteristics of PA among families. This study used a repeated measures design and multiple data sources to explore the variability and characteristics of weekend PA among families. Methods Families (including a ‘target’ child aged 9–11 years, their primary caregiver(s) and siblings aged 6–8 years) were recruited through primary schools in Liverpool, UK. Participants completed a paper-based PA diary and wore an ActiGraph GT9X accelerometer on their left wrist for up to 16 weekend days. ActiGraph .csv files were analysed using the R package GGIR version 1.1-4. Mean minutes of moderate-to-vigorous PA (MVPA) for each weekend of measurement were calculated using linear mixed models, and variance components were estimated for participant (inter-individual), weekend of measurement, and residual error (intra-individual). Intraclass correlation coefficients (ICC) were calculated from the proportion of total variance accounted for by inter-individual sources, and used as a measure of reliability. Diary responses were summed to produce frequency counts. To offer contextual insight into weekend PA among family units, demographic, accelerometer, and diary data were combined to form two case studies representative of low- and high-active families. Results Twenty-five participants from 7 families participated, including 7 ‘target’ children (mean age 9.3 ± 1.1 years; 4 boys), 6 siblings (mean age 7.2 ± 0.7 years; 4 boys) and 12 adults (7 mothers and 5 fathers). There was a high degree of variability in target children’s (ICC = 0.55), siblings’ (ICC = 0.38), and mothers’ MVPA (ICC = 0.58), but not in fathers’ MVPA (ICC = 0.83). Children’s weekend PA was mostly unstructured in nature and undertaken with friends, whereas a greater proportion of parents’ weekend
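
The ICC reported above is the share of total variance attributable to between-individual differences. A minimal sketch of that quantity, estimated here from a one-way ANOVA on hypothetical repeated MVPA values rather than the linear mixed models the study used, is:

```python
# Intraclass correlation as used in the abstract: the proportion of total
# variance due to between-person differences,
#   ICC ~ var_between / (var_between + var_within),
# estimated via the one-way random-effects ANOVA formula ICC(1).
# The MVPA values (min/day) below are hypothetical.

def icc_oneway(groups):
    """One-way random-effects ICC(1) from balanced repeated measures."""
    k = len(groups[0])              # measurements per person
    n = len(groups)                 # number of people
    grand = sum(sum(g) for g in groups) / (n * k)
    ss_between = k * sum((sum(g) / k - grand) ** 2 for g in groups)
    ss_within = sum((x - sum(g) / k) ** 2 for g in groups for x in g)
    ms_between = ss_between / (n - 1)
    ms_within = ss_within / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

# Hypothetical weekend MVPA minutes for four people over three weekends.
mvpa = [[60, 65, 58], [30, 35, 28], [90, 85, 95], [45, 50, 40]]
icc = icc_oneway(mvpa)
```

A high ICC (as for the fathers in the study) means weekend-to-weekend variation is small relative to stable differences between people, i.e. fewer measurement days suffice for a reliable estimate.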

  19. Measurement of high-departure aspheres using subaperture stitching with the Variable Optical Null (VON)

    Science.gov (United States)

    Kulawiec, Andrew; Murphy, Paul; DeMarco, Michael

    2010-10-01

    Aspheric surfaces are proven to provide significant benefits to a wide variety of optical systems, but the ability to produce high-precision aspheric surfaces has historically been limited by the ability (or lack thereof) to measure them. Traditionally, aspheric measurements have required dedicated null optics, but the cost, lead time, and calibration difficulty of using null optics has made the use of aspheres more challenging and less attractive. In the past three years, QED has developed the Subaperture Stitching Interferometer for Aspheres (SSI-A®) to help address this limitation, providing flexible aspheric measurement capability of up to 200 waves of aspheric departure from best-fit sphere. Some aspheres, however, have thousands of waves of departure. We have recently developed Variable Optical Null (VON) technology that can null much of the aspheric departure in a subaperture. The VON is automatically configurable and is adjusted to nearly null each specific subaperture of an asphere. This ability to nearly null a local subaperture of an asphere provides a significant boost in aspheric measurement capability, enabling aspheres with up to 1000 waves of departure to be measured, without the use of dedicated null optics. We outline the basic principles of subaperture stitching and VON technology, demonstrate the extended capability provided by the VON, and present measurement results from the new Aspheric Stitching Interferometer (ASI®).

  20. Spatial variability of carbon dioxide in the urban canopy layer and implications for flux measurements

    Science.gov (United States)

    Crawford, B.; Christen, A.

    2014-12-01

    This contribution reports CO2 mixing ratios measured in the urban canopy layer (UCL) of a residential neighborhood in Vancouver, BC, Canada and discusses the relevance of UCL CO2 temporal and spatial variability to local-scale eddy covariance (EC) fluxes measured above the UCL. Measurements were conducted from a mobile vehicle-mounted platform over a continuous, 26-h period in the long-term turbulent flux source area of an urban EC tower. Daytime mixing ratios were highest along arterial roads and largely a function of proximity to vehicle traffic CO2 sources. At night, there was a distinct negative correlation between potential air temperature and CO2 mixing ratios. The spatial distribution of CO2 was controlled by topography and micro-scale advective processes (i.e., cold-air pooling). Mobile CO2 measurements were then used to calculate CO2 storage changes (FS) in the UCL volume and compared to single-layer FS estimates calculated from the EC system. In total, five variations of FS were calculated. On average, the choice of FS calculation method affected net measured hourly emissions (FC) by 5.2%. Analysis of FS using a four-year dataset measured at the EC tower showed FS was 2.8% of hourly FC for this site on average. At this urban EC location, FS was relatively minor compared to FC and calculation of FS using a single-layer method was adequate, though FS still represents a potentially large uncertainty during individual hours.
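
A common single-layer formulation of the storage term FS multiplies the rate of change of CO2 molar density in the column below the sensor by the measurement height. The sketch below is illustrative only; the tower height, pressure and temperature values are assumptions, not the study's parameters.

```python
# Illustrative single-layer CO2 storage flux: FS = (dc/dt) * z_m, with the
# ppm mixing ratio converted to molar density via the ideal gas law.
# Hypothetical values; not data from the Vancouver study.

R = 8.314  # gas constant, J mol^-1 K^-1

def molar_density(ppm, pressure_pa, temp_k):
    """CO2 molar density (micromol m^-3) from a ppm mixing ratio."""
    air_mol_per_m3 = pressure_pa / (R * temp_k)  # mol of air per m^3
    return ppm * air_mol_per_m3                   # ppm = micromol per mol

def storage_flux(ppm_start, ppm_end, dt_s, z_m,
                 pressure_pa=101325.0, temp_k=288.15):
    """Single-layer FS (micromol m^-2 s^-1) below a sensor at height z_m."""
    dc = (molar_density(ppm_end, pressure_pa, temp_k)
          - molar_density(ppm_start, pressure_pa, temp_k))
    return dc / dt_s * z_m

# CO2 rising from 400 to 410 ppm over one hour below a 28 m tower.
fs = storage_flux(400.0, 410.0, 3600.0, 28.0)
```

The multi-layer variants discussed in the abstract replace the single column-average concentration with profile measurements at several heights, summing a term like this per layer.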

  1. Sensitivity of adaptive enrichment trial designs to accrual rates, time to outcome measurement, and prognostic variables

    Directory of Open Access Journals (Sweden)

    Tianchen Qian

    2017-12-01

    Full Text Available Adaptive enrichment designs involve rules for restricting enrollment to a subset of the population during the course of an ongoing trial. This can be used to target those who benefit from the experimental treatment. Trial characteristics such as the accrual rate and the prognostic value of baseline variables are typically unknown when a trial is being planned; these values are typically assumed based on information available before the trial starts. Because of the added complexity in adaptive enrichment designs compared to standard designs, it may be of special concern how sensitive the trial performance is to deviations from assumptions. Through simulation studies, we evaluate the sensitivity of Type I error, power, expected sample size, and trial duration to different design characteristics. Our simulation distributions mimic features of data from the Alzheimer's Disease Neuroimaging Initiative cohort study, and involve two subpopulations based on a genetic marker. We investigate the impact of the following design characteristics: the accrual rate, the time from enrollment to measurement of a short-term outcome and the primary outcome, and the prognostic value of baseline variables and short-term outcomes. To leverage prognostic information in baseline variables and short-term outcomes, we use a semiparametric, locally efficient estimator, and investigate its strengths and limitations compared to standard estimators. We apply information-based monitoring, and evaluate how accurately information can be estimated in an ongoing trial.
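
Operating characteristics such as Type I error are evaluated by Monte Carlo simulation: generate many trials under known conditions and count how the design behaves. The toy sketch below does this for a fixed (non-adaptive) two-arm trial under the null hypothesis; it is a drastically simplified stand-in for the adaptive enrichment simulations described above, not the authors' code.

```python
# Toy Monte Carlo estimate of Type I error for a two-arm trial with a
# two-sided z-test at the 5% level (known unit variance). All settings
# are hypothetical illustrations.
import random
import statistics

def one_trial(n_per_arm, effect=0.0):
    """Simulate one trial; return True if the z-test rejects the null.

    Setting effect > 0 would turn this into a power simulation instead.
    """
    control = [random.gauss(0.0, 1.0) for _ in range(n_per_arm)]
    treated = [random.gauss(effect, 1.0) for _ in range(n_per_arm)]
    se = (1.0 / n_per_arm + 1.0 / n_per_arm) ** 0.5
    z = (statistics.mean(treated) - statistics.mean(control)) / se
    return abs(z) > 1.96

random.seed(7)
n_sim = 2000
type_i_error = sum(one_trial(50) for _ in range(n_sim)) / n_sim
```

An adaptive enrichment simulation adds interim analyses and subpopulation enrollment rules inside `one_trial`, and then varies accrual rate, outcome delay, and prognostic value across simulation scenarios, exactly the sensitivities the study examines.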

  2. Hypersonic Boundary Layer Measurements with Variable Blowing Rates Using Molecular Tagging Velocimetry

    Science.gov (United States)

    Bathel, Brett F.; Danehy, Paul M.; Johansen, Craig T.; Jones, Stephen B.; Goyne, Christopher P.

    2012-01-01

    Measurements of mean and instantaneous streamwise velocity profiles in a hypersonic boundary layer with variable rates of mass injection (blowing) of nitrogen dioxide (NO2) were obtained over a 10-degree half-angle wedge model. The NO2 was seeded into the flow from a slot located 29.4 mm downstream of the sharp leading edge. The top surface of the wedge was oriented at a 20 degree angle in the Mach 10 flow, yielding an edge Mach number of approximately 4.2. The streamwise velocity profiles and streamwise fluctuating velocity component profiles were obtained using a three-laser NO2->NO photolysis molecular tagging velocimetry method. Observed trends in the mean streamwise velocity profiles and profiles of the fluctuating component of streamwise velocity as functions of the blowing rate are described. An effort is made to distinguish between the effect of blowing rate and wall temperature on the measured profiles. An analysis of the mean velocity profiles for a constant blowing rate is presented to determine the uncertainty in the measurement for different probe laser delay settings. Measurements of streamwise velocity were made to within approximately 120 μm of the model surface. The streamwise spatial resolution in this experiment ranged from 0.6 mm to 2.6 mm. An improvement in the spatial precision of the measurement technique has been made, with spatial uncertainties reduced by about a factor of 2 compared to previous measurements. For the quiescent flow calibration measurements presented, uncertainties as low as 2 m/s are obtained at 95% confidence for long delay times (25 μs). For the velocity measurements obtained with the wind tunnel operating, average single-shot uncertainties of less than 44 m/s are obtained at 95% confidence with a probe laser delay setting of 1 μs. The measurements were performed in the 31-inch Mach 10 Air Tunnel at the NASA Langley Research Center.

  3. MRI-based measurements of respiratory motion variability and assessment of imaging strategies for radiotherapy planning

    International Nuclear Information System (INIS)

    Blackall, J M; Ahmad, S; Miquel, M E; McClelland, J R; Landau, D B; Hawkes, D J

    2006-01-01

    Respiratory organ motion has a significant impact on the planning and delivery of radiotherapy (RT) treatment for lung cancer. Currently widespread techniques, such as 4D-computed tomography (4DCT), cannot be used to measure variability of this motion from one cycle to the next. In this paper, we describe the use of fast magnetic resonance imaging (MRI) techniques to investigate the intra- and inter-cycle reproducibility of respiratory motion and also to estimate the level of errors that may be introduced into treatment delivery by using various breath-hold imaging strategies during lung RT planning. A reference model of respiratory motion is formed to enable comparison of different breathing cycles at any arbitrary position in the respiratory cycle. This is constructed by using free-breathing images from the inhale phase of a single breathing cycle, then co-registering the images, and thereby tracking landmarks. This reference model is then compared to alternative models constructed from images acquired during the exhale phase of the same cycle and the inhale phase of a subsequent cycle, to assess intra- and inter-cycle variability ('hysteresis' and 'reproducibility') of organ motion. The reference model is also compared to a series of models formed from breath-hold data at exhale and inhale. Evaluation of these models is carried out on data from ten healthy volunteers and five lung cancer patients. Free-breathing models show good levels of intra- and inter-cycle reproducibility across the tidal breathing range. Mean intra-cycle errors in the position of organ surface landmarks were 1.5(1.4)-3.5(3.3) mm for volunteers and 2.8(1.8)-5.2(5.2) mm for patients. Equivalent measures of inter-cycle variability across this range are 1.7(1.0)-3.9(3.3) mm for volunteers and 2.8(1.8)-3.3(2.2) mm for patients. As expected, models based on breath-hold sequences do not represent normal tidal motion as well as those based on free-breathing data, with mean errors of 4

  4. Measurement and modeling of diel variability of polybrominated diphenyl ethers and chlordanes in air.

    Science.gov (United States)

    Moeckel, Claudia; Macleod, Matthew; Hungerbühler, Konrad; Jones, Kevin C

    2008-05-01

    Short-term variability of concentrations of polybrominated diphenyl ethers (PBDEs) and chlordanes in air at a semirural site in England over a 5 day period is reported. Four-hour air samples were collected during a period dominated by a high pressure system that produced stable diel (24-h) patterns of meteorological conditions such as temperature and atmospheric boundary layer height. PBDE and chlordane concentrations showed clear diel variability with concentrations in the afternoon and evening being 1.9 - 2.7 times higher than in the early morning. The measurements are interpreted using a multimedia mass balance model parametrized with forcing functions representing local temperature, atmospheric boundary layer height, wind speed and hydroxyl radical concentrations. Model results indicate that reversible, temperature-controlled air-surface exchange is the primary driver of the diel concentration pattern observed for chlordanes and PBDE 28. For higher brominated PBDE congeners (47, 99 and 100), the effect of variable atmospheric mixing height in combination with irreversible deposition on aerosol particles is dominant and explains the diel patterns almost entirely. Higher concentrations of chlordanes and PBDEs in air observed at the end of the study period could be related to likely source areas using back trajectory analysis. This is the first study to clearly document diel variability in concentrations of PBDEs in air over a period of several days. Our model analysis indicates that high daytime and low nighttime concentrations of semivolatile organic chemicals can arise from different underlying driving processes, and are not necessarily evidence of reversible air-surface exchange on a 24-h time scale.

  5. On the intrinsic timescales of temporal variability in measurements of the surface solar radiation

    Science.gov (United States)

    Bengulescu, Marc; Blanc, Philippe; Wald, Lucien

    2018-01-01

    This study is concerned with the intrinsic temporal scales of the variability in the surface solar irradiance (SSI). The data consist of decennial time series of daily means of the SSI obtained from high-quality measurements of the broadband solar radiation impinging on a horizontal plane at ground level, issued from different Baseline Surface Radiation Network (BSRN) ground stations around the world. First, embedded oscillations sorted in terms of increasing timescales of the data are extracted by empirical mode decomposition (EMD). Next, Hilbert spectral analysis is applied to obtain an amplitude-modulation-frequency-modulation (AM-FM) representation of the data. The time-varying nature of the characteristic timescales of variability, along with the variations in the signal intensity, are thus revealed. A novel, adaptive null hypothesis based on the general statistical characteristics of noise is employed in order to discriminate between the different features of the data, those that have a deterministic origin and those being realizations of various stochastic processes. The data have a significant spectral peak corresponding to the yearly variability cycle and feature quasi-stochastic high-frequency variability components, irrespective of the geographical location or of the local climate. Moreover, the amplitude of this latter feature is shown to be modulated by variations in the yearly cycle, which is indicative of nonlinear multiplicative cross-scale couplings. The study has possible implications on the modeling and the forecast of the surface solar radiation, by clearly discriminating the deterministic from the quasi-stochastic character of the data, at different local timescales.

  6. Measurement methods and variability assessment of the Norway spruce total leaf area: Implications for remote sensing

    NARCIS (Netherlands)

    Homolova, L.; Lukes, P.; Malenovsky, Z.; Lhotakova, Z.; Kaplan, V.; Hanus, J.

    2013-01-01

    Estimation of total leaf area (LAT) is important to express biochemical properties in plant ecology and remote sensing studies. A measurement of LAT is easy in broadleaf species, but it remains challenging in coniferous canopies. We proposed a new geometrical model to estimate Norway spruce LAT and

  7. Development of a scale to measure adherence to self-monitoring of blood glucose with latent variable measurement.

    Science.gov (United States)

    Wagner, J A; Schnoll, R A; Gipson, M T

    1998-07-01

    Adherence to self-monitoring of blood glucose (SMBG) is problematic for many people with diabetes. Self-reports of adherence have been found to be unreliable, and existing paper-and-pencil measures have limitations. This study developed a brief measure of SMBG adherence with good psychometric properties and a useful factor structure that can be used in research and in practice. A total of 216 adults with diabetes responded to 30 items rated on a 9-point Likert scale that asked about blood monitoring habits. In part I of the study, items were evaluated and retained based on their psychometric properties. The sample was divided into exploratory and confirmatory halves. Using the exploratory half, items with acceptable psychometric properties were subjected to a principal components analysis. In part II of the study, structural equation modeling was used to confirm the component solution with the entire sample. Structural modeling was also used to test the relationship between these components. It was hypothesized that the scale would produce four correlated factors. Principal components analysis suggested a two-component solution, and confirmatory factor analysis confirmed this solution. The first factor measures the degree to which patients rely on others to help them test and thus was named "social influence." The second component measures the degree to which patients use physical symptoms of blood glucose levels to help them test and thus was named "physical influence." Results of the structural model show that the components are correlated and make up the higher-order latent variable adherence. The resulting 15-item scale provides a short, reliable way to assess patient adherence to SMBG. Despite the existence of several aspects of adherence, this study indicates that the construct consists of only two components. This scale is an improvement on previous measures of adherence because of its good psychometric properties, its interpretable factor structure, and its

  8. Violation of Continuous-Variable Einstein-Podolsky-Rosen Steering with Discrete Measurements

    Science.gov (United States)

    Schneeloch, James; Dixon, P. Ben; Howland, Gregory A.; Broadbent, Curtis J.; Howell, John C.

    2013-03-01

    In this Letter, we derive an entropic Einstein-Podolsky-Rosen (EPR) steering inequality for continuous-variable systems using only experimentally measured discrete probability distributions and details of the measurement apparatus. We use this inequality to witness EPR steering between the positions and momenta of photon pairs generated in spontaneous parametric down-conversion. We examine the asymmetry between parties in this inequality, and show that this asymmetry can be used to reduce the technical requirements of experimental setups intended to demonstrate the EPR paradox. Furthermore, we develop a more stringent steering inequality that is symmetric between parties, and use it to show that the down-converted photon pairs also exhibit symmetric EPR steering.

  9. Conceptualising computerized adaptive testing for measurement of latent variables associated with physical objects

    International Nuclear Information System (INIS)

    Camargo, F R; Henson, B

    2015-01-01

    The notion that more or less of a physical feature affects, to differing degrees, users' impressions of an underlying attribute of a product has frequently been applied in affective engineering. However, those attributes exist only as a premise that cannot be measured directly and, therefore, inferences based on their assessment are error-prone. To establish and improve measurement of latent attributes, this paper presents the concept of a stochastic framework using the Rasch model for a wide range of independent variables referred to as an item bank. Based on an item bank, computerized adaptive testing (CAT) can be developed. A CAT system can converge on a sequence of items that bracket and convey information at a user's particular endorsement level. It is through item banking and CAT that the financial benefits of using the Rasch model in affective engineering can be realised.
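
The Rasch-model machinery behind item banking and CAT can be sketched as follows (an illustration under standard 1PL assumptions, not the authors' implementation): the probability of endorsing an item depends only on the gap between the person's latent location theta and the item's difficulty, and an adaptive test picks the next item carrying maximum Fisher information at the current estimate of theta.

```python
# Rasch (one-parameter logistic) model and a greedy CAT item-selection
# step. Item difficulties below are a hypothetical item bank.
import math

def endorse_prob(theta, b):
    """Rasch endorsement probability for latent location theta, item difficulty b."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def next_item(theta, bank):
    """Pick the item with maximum Fisher information I = p*(1-p) at theta."""
    def info(b):
        p = endorse_prob(theta, b)
        return p * (1.0 - p)
    return max(bank, key=info)

bank = [-2.0, -1.0, 0.0, 1.0, 2.0]   # hypothetical item difficulties
item = next_item(0.3, bank)           # item nearest the current estimate
```

Because I = p*(1-p) peaks at p = 0.5, the selected item is always the one whose difficulty is closest to the current ability estimate, which is what lets a CAT converge on a bracketing sequence of items with fewer administrations than a fixed test.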

  10. Measurement Variability of Vertical Scanning Interferometry Tool Used for Orbiter Window Defect Assessment

    Science.gov (United States)

    Padula, Santo, II

    2009-01-01

    The ability to measure orbiter window defects well enough to allow for window recertification has been an ongoing challenge for the orbiter vehicle program. The Columbia accident forced even tighter constraints on the criteria that must be met in order to recertify windows for flight. As a result, new techniques are being investigated to improve the reliability, accuracy and resolution of the defect detection process. The methodology devised in this work, which is based on the utilization of a vertical scanning interferometry (VSI) tool, shows great promise for meeting the ever increasing requirements for defect detection. This methodology has the potential for 10-100 times greater resolution of the true defect depth than can be obtained from the currently employed micrometer-based methodology. An added benefit is that it also produces a digital elevation map of the defect, thereby providing information about the defect morphology which can be used to ascertain the type of debris that induced the damage. However, in order to successfully implement such a tool, a greater understanding of the resolution capability and measurement repeatability must be obtained. This work focused on assessing the variability of the VSI-based measurement methodology and revealed that the VSI measurement tool was more repeatable and more precise than the current micrometer-based approach, even in situations where operator variation could affect the measurement. The analysis also showed that the VSI technique was relatively insensitive to the hardware and software settings employed, making the technique extremely robust and desirable.

  11. Measuring variability in trophic status in the Lake Waco/Bosque River Watershed

    Directory of Open Access Journals (Sweden)

    Rodriguez Angela D

    2008-01-01

    Full Text Available Abstract Background Nutrient management in rivers and streams is difficult due to the spatial and temporal variability of algal growth responses. The objectives of this project were to determine the spatial and seasonal in situ variability of trophic status in the Lake Waco/Bosque River watershed, determine the variability in the lotic ecosystem trophic status index (LETSI) at each site as indicators of the system's nutrient sensitivity, and determine if passive diffusion periphytometers could provide threshold algal responses to nutrient enrichment. Methods We used the passive diffusion periphytometer to measure in situ nutrient limitation and trophic status at eight sites in five streams in the Lake Waco/Bosque River Watershed in north-central Texas from July 1997 through October 1998. The chlorophyll a production in the periphytometers was used as an indicator of baseline chlorophyll a productivity and of maximum primary productivity (MPP) in response to nutrient enrichment (nitrogen and phosphorus). We evaluated the lotic ecosystem trophic status index (LETSI) using the ratio of baseline primary productivity to MPP, and evaluated the trophic class of each site. Results The rivers and streams in the Lake Waco/Bosque River Watershed exhibited varying degrees of nutrient enrichment over the 18-month sampling period. The North Bosque River at the headwaters (NB-02), located below the Stephenville, Texas wastewater treatment outfall, consistently exhibited the highest degree of water quality impact due to nutrient enrichment. Sites at the outlet of the watershed (NB-04 and NB-05) were the next most enriched sites. Trophic class varied for enriched sites over seasons. Conclusion Seasonality played a significant role in the trophic class and sensitivity of each site to nutrients. Managing rivers and streams for nutrients will require methods for measuring in situ responses and sensitivities to nutrient enrichment.
Nutrient enrichment periphytometers show
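
The LETSI evaluated in the record above is the ratio of baseline primary productivity to the nutrient-amended maximum primary productivity (MPP), both measured as chlorophyll a in the periphytometers. A minimal sketch of that index (the function name and example values are illustrative, not taken from the study):

```python
def letsi(baseline_chla, max_chla):
    """Lotic Ecosystem Trophic Status Index: ratio of baseline
    chlorophyll-a productivity to the maximum primary productivity
    (MPP) measured under nitrogen + phosphorus enrichment."""
    if max_chla <= 0:
        raise ValueError("MPP must be positive")
    return baseline_chla / max_chla

# A site whose unamended periphyton already produces most of the
# chlorophyll-a achievable under enrichment is nutrient-saturated
# (high LETSI); a low ratio indicates strong nutrient limitation.
print(letsi(8.0, 10.0))  # enriched site
print(letsi(1.0, 10.0))  # nutrient-limited site
```

A ratio near 1 therefore flags sites such as NB-02 above, where adding nutrients barely increases algal growth because the stream is already enriched.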

  12. Variability of skin autofluorescence measurement over 6 and 12 weeks and the influence of benfotiamine treatment.

    Science.gov (United States)

    Stirban, Alin; Pop, Alexandra; Fischer, Annelie; Heckermann, Sascha; Tschoepe, Diethelm

    2013-09-01

    Measurements of skin autofluorescence (SAF) allow for a simple and noninvasive quantification of tissue advanced glycation end-products (AGEs), a marker linked to the risk of diabetes complications. The aim of this study was to test the repeatability of SAF over 6 and 12 weeks and to test whether benfotiamine, a thiamine prodrug suggested to reduce AGE formation under hyperglycemic conditions, is able to attenuate SAF when administered over 6 weeks. In a double-blind, placebo-controlled, randomized, crossover study, 22 patients with type 2 diabetes mellitus (T2DM) received 900 mg/day benfotiamine or placebo for 6 weeks (with a washout period of 6 weeks in between). At the beginning and at the end of each treatment period, SAF was assessed in the fasting state, as well as 2, 4, and 6 h following a mixed test meal. The intra-individual and inter-individual variability of fasting SAF was, respectively, 6.9% and 24.5% within 6 weeks and 10.9% and 23.1% within 12 weeks. The corresponding variability calculated for triplicate comparisons was 9.9% and 27.7%. A short-term therapy with benfotiamine did not influence SAF significantly, nor did we find a significant postprandial SAF increase. In patients with T2DM, repeated SAF measurements spaced weeks apart have an intra-subject variability below 11%. Using these data, sample sizes were calculated for interventional studies aiming to reduce SAF. Benfotiamine treatment for 6 weeks did not significantly influence SAF; for this, a longer-term therapy is probably needed.
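
The intra-individual variability figures quoted above are within-subject coefficients of variation across repeated fasting visits, computed per subject and then averaged. A minimal sketch of that calculation (the SAF values below are invented for illustration):

```python
import statistics

def within_subject_cv(measurements_per_subject):
    """Within-subject coefficient of variation (%): per-subject
    SD/mean across repeated visits, averaged over subjects."""
    cvs = []
    for visits in measurements_per_subject:
        m = statistics.mean(visits)
        s = statistics.stdev(visits)  # sample SD over the visits
        cvs.append(100.0 * s / m)
    return statistics.mean(cvs)

# Fasting SAF (arbitrary units) for three subjects, three visits each.
saf = [
    [2.10, 2.25, 2.05],
    [2.60, 2.45, 2.70],
    [1.90, 2.00, 1.85],
]
print(round(within_subject_cv(saf), 1))
```

With a within-subject CV in hand, standard sample-size formulas can then be applied to power an interventional study, as the authors did.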

  13. Some metabolic and anthropometric variables in obese children by measuring serum insulin and leptin

    International Nuclear Information System (INIS)

    Nour Eldin, A.M.

    2004-01-01

    The present study aimed to assess serum leptin levels in obese children and to study their correlation with metabolic variables such as serum insulin and serum glucose. The study was conducted on 30 obese children aged 9-14 years with body mass index (BMI) > 27.8 kg/m². All children were subjected to history taking, clinical examination, anthropometric measurements and laboratory investigations, including fasting serum leptin, insulin and blood glucose. Serum leptin was significantly higher in obese children (102.3 ± 56.2 ng/ml) compared to non-obese ones (48.15 ± 26.1 ng/ml). Serum leptin was positively correlated with BMI (rs = 0.68, p < 0.01) and with serum insulin (rs = 0.59, p < 0.01). It is concluded that serum leptin is increased in obesity and that its concentration reflects body size. Moreover, the relation between leptin and insulin suggests a role for leptin in insulin resistance, a common metabolic disorder associated with obesity

  14. Mini-UAV based sensory system for measuring environmental variables in greenhouses.

    Science.gov (United States)

    Roldán, Juan Jesús; Joossen, Guillaume; Sanz, David; del Cerro, Jaime; Barrientos, Antonio

    2015-02-02

    This paper describes the design, construction and validation of a mobile sensory platform for greenhouse monitoring. The complete system consists of a sensory system on board a small quadrotor (i.e., a four rotor mini-UAV). The goals of this system include taking measures of temperature, humidity, luminosity and CO2 concentration and plotting maps of these variables. These features could potentially allow for climate control, crop monitoring or failure detection (e.g., a break in a plastic cover). The sensors have been selected by considering the climate and plant growth models and the requirements for their integration onboard the quadrotor. The sensors layout and placement have been determined through a study of quadrotor aerodynamics and the influence of the airflows from its rotors. All components of the system have been developed, integrated and tested through a set of field experiments in a real greenhouse. The primary contributions of this paper are the validation of the quadrotor as a platform for measuring environmental variables and the determination of the optimal location of sensors on a quadrotor.

  15. Mini-UAV Based Sensory System for Measuring Environmental Variables in Greenhouses

    Directory of Open Access Journals (Sweden)

    Juan Jesús Roldán

    2015-02-01

    Full Text Available This paper describes the design, construction and validation of a mobile sensory platform for greenhouse monitoring. The complete system consists of a sensory system on board a small quadrotor (i.e., a four rotor mini-UAV). The goals of this system include taking measures of temperature, humidity, luminosity and CO2 concentration and plotting maps of these variables. These features could potentially allow for climate control, crop monitoring or failure detection (e.g., a break in a plastic cover). The sensors have been selected by considering the climate and plant growth models and the requirements for their integration onboard the quadrotor. The sensors layout and placement have been determined through a study of quadrotor aerodynamics and the influence of the airflows from its rotors. All components of the system have been developed, integrated and tested through a set of field experiments in a real greenhouse. The primary contributions of this paper are the validation of the quadrotor as a platform for measuring environmental variables and the determination of the optimal location of sensors on a quadrotor.

  16. Can Wearable Devices Accurately Measure Heart Rate Variability? A Systematic Review.

    Science.gov (United States)

    Georgiou, Konstantinos; Larentzakis, Andreas V; Khamis, Nehal N; Alsuhaibani, Ghadah I; Alaska, Yasser A; Giallafos, Elias J

    2018-03-01

    A growing number of wearable devices claim to provide accurate, cheap and easily applicable heart rate variability (HRV) indices. This is mainly accomplished by using wearable photoplethysmography (PPG) and/or electrocardiography (ECG), through simple and non-invasive techniques, as a substitute for the gold-standard RR interval estimation from the electrocardiogram. Although the agreement between pulse rate variability (PRV) and HRV has been evaluated in the literature, the reported results are still inconclusive, especially for wearable devices. The purpose of this systematic review is to investigate whether wearable devices provide a reliable and precise measurement of classic HRV parameters at rest as well as during exercise. A search strategy was implemented to retrieve relevant articles from the MEDLINE and SCOPUS databases, as well as through internet searches. The 308 articles retrieved were reviewed for further evaluation according to the predetermined inclusion/exclusion criteria. Eighteen studies were included: sixteen of them integrated ECG-HRV technology and two of them PPG-PRV technology. All of them examined wearable device accuracy in rate variability (RV) detection during rest, while only eight of them did so during exercise. The correlation between classic ECG-derived HRV and the wearable RV ranged from very good to excellent during rest, yet it declined progressively as exercise level increased. Wearable devices may provide a promising alternative for measuring RV. However, more robust studies in non-stationary conditions are needed, using appropriate methodology in terms of the number of subjects involved and the acquisition and analysis techniques applied.
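
The "classic HRV parameters" such reviews compare are typically time-domain statistics over the RR (or pulse-to-pulse) interval series. A minimal sketch of two of them, assuming a pre-cleaned interval series in milliseconds (the example values are invented):

```python
import math

def sdnn(rr_ms):
    """SDNN: standard deviation of all normal RR intervals (ms)."""
    mean = sum(rr_ms) / len(rr_ms)
    return math.sqrt(sum((x - mean) ** 2 for x in rr_ms) / (len(rr_ms) - 1))

def rmssd(rr_ms):
    """RMSSD: root mean square of successive RR differences (ms)."""
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# An ECG-derived RR series (values invented). Validation studies
# compute such indices from both the reference ECG and the wearable,
# then correlate the two.
rr_ecg = [812, 830, 845, 790, 805, 825, 840, 800]
print(round(sdnn(rr_ecg), 1), round(rmssd(rr_ecg), 1))
```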

  17. Measuring behaviours for escaping from house fires: use of latent variable models to summarise multiple behaviours.

    Science.gov (United States)

    Ploubidis, G B; Edwards, P; Kendrick, D

    2015-12-15

    This paper reports the development and testing of a construct measuring parental fire safety behaviours for planning escape from a house fire. The methods comprised latent variable modelling of data on parental-reported fire safety behaviours and plans for escaping from a house fire, and multivariable logistic regression to quantify the association between the groups defined by the latent variable modelling and parental report of having a plan for escaping from a house fire. Data come from 1112 participants in a cluster randomised controlled trial set in children's centres in 4 study centres in the UK. A two-class model provided the best fit to the data, combining responses to five fire safety planning behaviours. The first group ('more behaviours for escaping from a house fire') comprised 86% of participants, who were most likely to have a torch, be aware of how their smoke alarm sounds, have external door and window keys accessible, and keep exits clear. The second group ('fewer behaviours for escaping from a house fire') comprised 14% of participants, who were less likely to report these five behaviours. After adjusting for potential confounders, participants allocated to the 'more behaviours for escaping from a house fire' group were 2.5 times more likely to report having an escape plan (OR 2.48; 95% CI 1.59-3.86) than those in the 'fewer behaviours for escaping from a house fire' group. Multiple fire safety behaviour questions can thus be combined into a single binary summary measure of fire safety behaviours for escaping from a house fire. Our findings will be useful to future studies wishing to use a single measure of fire safety planning behaviour as an outcome or exposure. Trial registration: NCT 01452191, registered 13/10/2011.

  18. Measurement of circulating cell-derived microparticles by flow cytometry: sources of variability within the assay.

    Science.gov (United States)

    Ayers, Lisa; Kohler, Malcolm; Harrison, Paul; Sargent, Ian; Dragovic, Rebecca; Schaap, Marianne; Nieuwland, Rienk; Brooks, Susan A; Ferry, Berne

    2011-04-01

    Circulating cell-derived microparticles (MPs) have been implicated in several disease processes and elevated levels are found in many pathological conditions. The detection and accurate measurement of MPs, although attracting widespread interest, is hampered by a lack of standardisation. The aim of this study was to establish a reliable flow cytometric assay to measure distinct subtypes of MPs in disease and to identify any significant causes of variability in MP quantification. Circulating MPs within plasma were identified by their phenotype (platelet-, endothelial- or leukocyte-derived) and their annexin-V positivity (AnnV+). The influence of key variables (i.e. time between venepuncture and centrifugation, washing steps, the number of centrifugation steps, freezing/long-term storage and temperature of thawing) on MP measurement was investigated. Increasing time between venepuncture and centrifugation leads to increased MP levels. Washing samples results in decreased AnnV+ MPs (P=0.002) and platelet-derived MPs (PMPs) (P=0.002). Double centrifugation of MPs prior to freezing decreases numbers of AnnV+ MPs (P=0.0004) and PMPs (P=0.0004). A single freeze-thaw cycle of samples led to an increase in AnnV+ MPs (P=0.0020) and PMPs (P=0.0039). Long-term storage of MP samples at -80 °C resulted in decreased MP levels. This study found that minor protocol changes significantly affected MP levels. This is one of the first studies attempting to standardise a method for obtaining and measuring circulating MPs. Standardisation will be essential for successful development of MP technologies, allowing direct comparison of results between studies and leading to a greater understanding of MPs in disease. Crown Copyright © 2010. Published by Elsevier Ltd. All rights reserved.

  19. Health status measurement in COPD: the minimal clinically important difference of the clinical COPD questionnaire

    Directory of Open Access Journals (Sweden)

    van den Berg JWK

    2006-04-01

    Full Text Available Abstract Background Patient-reported outcome (PRO) questionnaires are being increasingly used in COPD clinical studies. The challenge facing investigators is to determine what change is significant, i.e., what is the minimal clinically important difference (MCID). This study aimed to identify the MCID for the clinical COPD questionnaire (CCQ) in terms of patient referencing, criterion referencing, and the standard error of measurement (SEM). Methods Patients were ≥40 years of age, diagnosed with COPD, had a smoking history of >10 pack-years, and were participating in a randomized, controlled clinical trial comparing intravenous and oral prednisolone in patients admitted with an acute exacerbation of COPD. The CCQ was completed on Days 1–7 and 42. A Global Rating of Change (GRC) assessment was taken to establish the MCID by patient referencing. For criterion referencing, health events during a period of 1 year after Day 42 were included in this analysis. Results 210 patients were recruited, 168 completed the CCQ questionnaire on Day 42. The MCID of the CCQ total score, as indicated by patient referencing in terms of the GRC, was 0.44. The MCID of the CCQ in terms of criterion referencing for the major outcomes was 0.39, and calculation of the SEM resulted in a value of 0.21. Conclusion This investigation, which is the first to determine the MCID of a PRO questionnaire via more than one approach, indicates that the MCID of the CCQ total score is 0.4.
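
The SEM approach mentioned above has a standard closed form, SEM = SD_baseline × sqrt(1 − reliability), where reliability is typically Cronbach's alpha or a test-retest ICC. The record does not report the SD and reliability behind its 0.21, so the inputs below are illustrative only:

```python
import math

def sem(sd_baseline, reliability):
    """Standard error of measurement for a questionnaire score.

    Score changes smaller than roughly one SEM are often treated as
    measurement noise rather than a clinically meaningful difference.
    """
    if not 0.0 <= reliability <= 1.0:
        raise ValueError("reliability must lie in [0, 1]")
    return sd_baseline * math.sqrt(1.0 - reliability)

# Illustrative inputs: a baseline SD of 0.95 CCQ points with
# reliability 0.95 yields an SEM close to the 0.21 reported above.
print(round(sem(0.95, 0.95), 2))
```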

  20. Pragmatic characteristics of patient-reported outcome measures are important for use in clinical practice.

    Science.gov (United States)

    Kroenke, Kurt; Monahan, Patrick O; Kean, Jacob

    2015-09-01

    Measures for assessing patient-reported outcomes (PROs) that may have initially been developed for research are increasingly being recommended for use in clinical practice as well. Although psychometric rigor is essential, this article focuses on pragmatic characteristics of PROs that may enhance uptake into clinical practice. Three sources were drawn on in identifying pragmatic criteria for PROs: (1) selected literature review including recommendations by other expert groups; (2) key features of several model public domain PROs; and (3) the authors' experience in developing practical PROs. Eight characteristics of a practical PRO include: (1) actionability (i.e., scores guide diagnostic or therapeutic actions/decision making); (2) appropriateness for the relevant clinical setting; (3) universality (i.e., for screening, severity assessment, and monitoring across multiple conditions); (4) self-administration; (5) item features (number of items and bundling issues); (6) response options (option number and dimensions, uniform vs. varying options, time frame, intervals between options); (7) scoring (simplicity and interpretability); and (8) accessibility (nonproprietary, downloadable, available in different languages and for vulnerable groups, and incorporated into electronic health records). Balancing psychometric and pragmatic factors in the development of PROs is important for accelerating the incorporation of PROs into clinical practice. Published by Elsevier Inc.

  1. Important Variables When Screening for Students at Suicidal Risk: Findings from the French Cohort of the SEYLE Study

    Directory of Open Access Journals (Sweden)

    Jean-Pierre Kahn

    2015-09-01

    Full Text Available Due to early detection of mental ill-health being an important suicide preventive strategy, the multi-centre EU funded “Saving and Empowering Young Lives in Europe” (SEYLE) study compared three school-based mental health promotion programs to a control group. In France, 1007 students with a mean age of 15.2 years were recruited from 20 randomly assigned schools. This paper explores the French results of the SEYLE’s two-stage screening program (ProfScreen) and of the cross-program suicidal emergency procedure. Two-hundred-thirty-five ProfScreen students were screened using 13 psychopathological and risk behaviour scales. Students considered at risk because of a positive finding on one or more scales were offered a clinical interview and, if necessary, referred for treatment. A procedure for suicidal students (emergency cases) was set up to detect emergencies in the whole cohort (n = 1007). Emergency cases were offered the same clinical interview as the ProfScreen students. The interviewers documented their reasons for referrals in a short report. 16.2% of the ProfScreen students (38/235) were referred to treatment and 2.7% of the emergency cases (27/1007) were also referred to treatment due to high suicidal risk. Frequent symptoms in those students referred for evaluation were depression, alcohol misuse, non-suicidal self-injuries (NSSI), and suicidal behaviours. According to the multivariate regression analysis of ProfScreen, the results show that the best predictors for treatment referral were NSSI (OR 2.85), alcohol misuse (OR 2.80), and depressive symptoms (OR 1.13). Analysis of the proportion for each scale of students referred to treatment showed that poor social relationships (60%), anxiety (50%), and suicidal behaviours (50%) generated the highest rate of referrals. Qualitative analysis of clinician’s motivations to refer a student to mental health services revealed that depressive symptoms (51%), anxiety (38%), suicidal behaviours (40%), and

  2. Important Variables When Screening for Students at Suicidal Risk: Findings from the French Cohort of the SEYLE Study

    Science.gov (United States)

    Kahn, Jean-Pierre; Tubiana, Alexandra; Cohen, Renaud F.; Carli, Vladimir; Wasserman, Camilla; Hoven, Christina; Sarchiapone, Marco; Wasserman, Danuta

    2015-01-01

    Due to early detection of mental ill-health being an important suicide preventive strategy, the multi-centre EU funded “Saving and Empowering Young Lives in Europe” (SEYLE) study compared three school-based mental health promotion programs to a control group. In France, 1007 students with a mean age of 15.2 years were recruited from 20 randomly assigned schools. This paper explores the French results of the SEYLE’s two-stage screening program (ProfScreen) and of the cross-program suicidal emergency procedure. Two-hundred-thirty-five ProfScreen students were screened using 13 psychopathological and risk behaviour scales. Students considered at risk because of a positive finding on one or more scales were offered a clinical interview and, if necessary, referred for treatment. A procedure for suicidal students (emergency cases) was set up to detect emergencies in the whole cohort (n = 1007). Emergency cases were offered the same clinical interview as the ProfScreen students. The interviewers documented their reasons for referrals in a short report. 16.2% of the ProfScreen students (38/235) were referred to treatment and 2.7% of the emergency cases (27/1007) were also referred to treatment due to high suicidal risk. Frequent symptoms in those students referred for evaluation were depression, alcohol misuse, non-suicidal self-injuries (NSSI), and suicidal behaviours. According to the multivariate regression analysis of ProfScreen, the results show that the best predictors for treatment referral were NSSI (OR 2.85), alcohol misuse (OR 2.80), and depressive symptoms (OR 1.13). Analysis of the proportion for each scale of students referred to treatment showed that poor social relationships (60%), anxiety (50%), and suicidal behaviours (50%) generated the highest rate of referrals. Qualitative analysis of clinician’s motivations to refer a student to mental health services revealed that depressive symptoms (51%), anxiety (38%), suicidal behaviours (40%), and

  3. Important Variables When Screening for Students at Suicidal Risk: Findings from the French Cohort of the SEYLE Study.

    Science.gov (United States)

    Kahn, Jean-Pierre; Tubiana, Alexandra; Cohen, Renaud F; Carli, Vladimir; Wasserman, Camilla; Hoven, Christina; Sarchiapone, Marco; Wasserman, Danuta

    2015-09-30

    Due to early detection of mental ill-health being an important suicide preventive strategy, the multi-centre EU funded "Saving and Empowering Young Lives in Europe" (SEYLE) study compared three school-based mental health promotion programs to a control group. In France, 1007 students with a mean age of 15.2 years were recruited from 20 randomly assigned schools. This paper explores the French results of the SEYLE's two-stage screening program (ProfScreen) and of the cross-program suicidal emergency procedure. Two-hundred-thirty-five ProfScreen students were screened using 13 psychopathological and risk behaviour scales. Students considered at risk because of a positive finding on one or more scales were offered a clinical interview and, if necessary, referred for treatment. A procedure for suicidal students (emergency cases) was set up to detect emergencies in the whole cohort (n = 1007). Emergency cases were offered the same clinical interview as the ProfScreen students. The interviewers documented their reasons for referrals in a short report. 16.2% of the ProfScreen students (38/235) were referred to treatment and 2.7% of the emergency cases (27/1007) were also referred to treatment due to high suicidal risk. Frequent symptoms in those students referred for evaluation were depression, alcohol misuse, non-suicidal self-injuries (NSSI), and suicidal behaviours. According to the multivariate regression analysis of ProfScreen, the results show that the best predictors for treatment referral were NSSI (OR 2.85), alcohol misuse (OR 2.80), and depressive symptoms (OR 1.13). Analysis of the proportion for each scale of students referred to treatment showed that poor social relationships (60%), anxiety (50%), and suicidal behaviours (50%) generated the highest rate of referrals. 
Qualitative analysis of clinician's motivations to refer a student to mental health services revealed that depressive symptoms (51%), anxiety (38%), suicidal behaviours (40%), and negative life

  4. A new reliability measure based on specified minimum distances before the locations of random variables in a finite interval

    International Nuclear Information System (INIS)

    Todinov, M.T.

    2004-01-01

    A new reliability measure is proposed and equations are derived which determine the probability of existence of a specified set of minimum gaps between random variables following a homogeneous Poisson process in a finite interval. Using the derived equations, a method is proposed for specifying the upper bound of the random variables' number density which guarantees that the probability of clustering of two or more random variables in a finite interval remains below a maximum acceptable level. It is demonstrated that even for moderate number densities the probability of clustering is substantial and should not be neglected in reliability calculations. In the important special case where the random variables are failure times, models have been proposed for determining the upper bound of the hazard rate which guarantees a set of minimum failure-free operating intervals before the random failures, with a specified probability. A model has also been proposed for determining the upper bound of the hazard rate which guarantees a minimum availability target. Using the models proposed, a new strategy, models and reliability tools have been developed for setting quantitative reliability requirements which consist of determining the intersection of the hazard rate envelopes (hazard rate upper bounds) which deliver a minimum failure-free operating period before random failures, a risk of premature failure below a maximum acceptable level and a minimum required availability. It is demonstrated that setting reliability requirements solely based on an availability target does not necessarily mean a low risk of premature failure. Even at a high availability level, the probability of premature failure can be substantial. For industries characterised by a high cost of failure, the reliability requirements should involve a hazard rate envelope limiting the risk of failure below a maximum acceptable level
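
For the special case of a fixed number n of points placed uniformly in an interval of length L (the conditioned view of a homogeneous Poisson process), the probability that every gap between neighbouring points is at least s has the classical closed form P = (1 − (n − 1)s/L)^n for (n − 1)s ≤ L. The sketch below checks this against simulation; it is a standard spacings result, not the paper's more general derivation for sets of different minimum gaps:

```python
import random

def p_all_gaps_at_least(n, s, length):
    """Closed-form P(all n-1 gaps >= s) for n uniform points in [0, L]."""
    slack = 1.0 - (n - 1) * s / length
    return max(slack, 0.0) ** n

def mc_estimate(n, s, length, trials=100_000, seed=1):
    """Monte Carlo check: draw n uniform points, sort, test all gaps."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        pts = sorted(rng.uniform(0, length) for _ in range(n))
        if all(b - a >= s for a, b in zip(pts, pts[1:])):
            hits += 1
    return hits / trials

n, s, length = 5, 0.05, 1.0
print(p_all_gaps_at_least(n, s, length), mc_estimate(n, s, length))
```

Even at this moderate density (five points, minimum gap one twentieth of the interval), the clustering probability 1 − 0.8^5 ≈ 0.67 is substantial, consistent with the abstract's warning that clustering should not be neglected.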

  5. Assessing soil hydrological variability at the cm- to dm-scale using air permeameter measurements

    Science.gov (United States)

    Beerten, K.; Vandersmissen, N.; Rogiers, B.; Mallants, D.

    2012-04-01

    Soils and surficial sediments are crucial elements in the hydrological cycle since they are the medium through which infiltrating precipitation percolates to the aquifer. At the same time, soil horizons and shallow stratigraphy may act as hydraulic barriers that can promote runoff or interflow and hamper deep infiltration. For most catchments little is known about the small-scale horizontal and vertical variability of soil hydrological properties. Such information is however required to calculate detailed soil water flow paths and estimate small-scale spatial variability in recharge and run-off. We present the results from field air permeameter measurements to assess the small-scale variability of saturated hydraulic conductivity in heterogeneous 2-D soil profiles. To this end, several outcrops in the unsaturated zone (sandy soils with podzolisation) of an interfluve in the Kleine Nete river catchment (Campine area, Northern Belgium) were investigated using a hand-held permeameter. Measurements were done every 10 cm on ~2 x 1 m or ~2 x 0.5 m grids. The initial results of the measurements (air permeability K_air, in millidarcy) are recalculated to saturated hydraulic conductivity (Ks, in m/s) using specific transfer functions (Loll et al., 1999; Iversen et al., 2003). Validation of the results is done with independent lab-based constant-head Ks measurements. The results show that field-based Ks values generally range between 10^-3 m/s and 10^-7 m/s within one profile, but extremely high values (up to 10^-1 m/s) have been measured as well. The lowest values are found in the organic- and silt-rich Bh horizon of podzol soils observed within the profiles (~10^-6-10^-7 m/s), while the highest values are observed in overlying dune sands less than 40 cm deep (up to 10^-3 m/s with outliers to 10^-1 m/s). Comparison of field- and laboratory-based Ks data reveals there is fair agreement between both methods, apart from several outliers. Scatter plots indicate that almost all points
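
The transfer functions cited above (Loll et al., 1999; Iversen et al., 2003) are log-log linear regressions from air permeability to saturated hydraulic conductivity. A sketch of that functional form only; the coefficients must be taken from the cited papers, and the values used below are placeholders, NOT the published ones:

```python
import math

def ks_from_air_permeability(k_air_md, log_intercept, log_slope):
    """Log-log transfer function from air permeability (millidarcy)
    to saturated hydraulic conductivity Ks (m/s):

        log10(Ks) = log_intercept + log_slope * log10(k_air)

    Coefficients are passed in rather than hard-coded because the
    published values depend on the calibration data set.
    """
    return 10.0 ** (log_intercept + log_slope * math.log10(k_air_md))

# Purely illustrative coefficients (not from Loll et al.):
print(f"{ks_from_air_permeability(1000.0, -9.0, 1.3):.2e}")
```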

  6. Remote sensing of Essential Biodiversity Variables: new measurements linking ecosystem structure, function and composition

    Science.gov (United States)

    Schimel, D.; Pavlick, R.; Stavros, E. N.; Townsend, P. A.; Ustin, S.; Thompson, D. R.

    2017-12-01

    Remote sensing can inform a wide variety of essential biodiversity variables (EBVs), including measurements that define primary productivity, forest structure, biome distribution, plant communities, land use/land cover change and climate drivers of change. Emerging remote sensing technologies can add significantly to remote sensing of EBVs, providing new, large-scale insights on plant and habitat diversity itself, as well as on the causes and consequences of biodiversity change. All current biodiversity assessments identify major data gaps: insufficient coverage in critical regions, limited observations to monitor change over time with very limited revisit of sample locations, and taxon-specific biases. Remote sensing cannot fill many of the gaps in global biodiversity observations, but spectroscopic measurements in terrestrial and marine environments can aid in assessing plant/phytoplankton functional diversity and efficiently reveal patterns in space as well as changes over time, and, by making use of chlorophyll fluorescence, reveal associated patterns in photosynthesis. LIDAR and RADAR measurements quantify ecosystem structure and can precisely define changes due to growth, disturbance and land use. Current satellite-based EBVs have taken advantage of the extraordinary time series from LANDSAT and MODIS, but new measurements more directly reveal ecosystem structure, function and composition. We will present results from pre-space airborne studies showing the synergistic ability of a suite of new remote observation techniques to quantify biodiversity and ecosystem function, and to show how these change during major disturbance events.

  7. Evaluation of Heart Rate Variability by means of Laser Doppler Vibrometry measurements

    International Nuclear Information System (INIS)

    Cosoli, G; Casacanditella, L; Tomasini, EP; Scalise, L

    2015-01-01

    Heart Rate Variability (HRV) analysis studies the physiological variability of the Heart Rate (HR), which is related to the health condition of the subject. HRV is assessed by measuring heart periods (HP) over a time window of >5 minutes (1)-(2). HPs are determined from signals of different nature: electrocardiogram (ECG), photoplethysmogram (PPG), phonocardiogram (PCG) or vibrocardiogram (VCG) (3)-(4)-(5). The fundamental aspect is the identification of a feature in each heartbeat that allows cardiac periods to be computed accurately (such as the R peaks in the ECG), so that all the typical HRV evaluations can be performed on those intervals. VCG is a non-contact technique (4), very attractive in medicine, which detects the vibrations on the skin surface (e.g. on the carotid artery) resulting from vascular blood motion consequent to the electrical signal (ECG). In this paper, we propose the use of VCG for the measurement of a signal related to HRV and the use of a novel algorithm based on signal geometry (7) to detect signal peaks, in order to accurately determine cardiac periods and the Poincaré plot (9)-(10). The results reported are comparable to those obtained with the gold standard (ECG) and reported in the literature (3)-(5). We report mean HP values of 832±54 ms and 832±55 ms by means of ECG and VCG, respectively. Moreover, this algorithm allows us to identify particular features of the ECG and VCG signals, so that in the future we will be able to evaluate specific correlations between the two.
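
The Poincaré plot mentioned above scatters each heart period against the preceding one; its standard descriptors SD1 (short-term variability, spread across the identity line) and SD2 (long-term variability, spread along it) follow directly from interval statistics. A sketch of one common formulation (the interval values are invented; some implementations omit the mean-difference correction in SD1):

```python
import math

def poincare_sd1_sd2(hp_ms):
    """SD1/SD2 descriptors of the Poincaré plot from heart periods (ms).

    SD1 = sqrt(var(successive differences) / 2)  -- short-term variability
    SD2 = sqrt(2 * var(intervals) - SD1**2)      -- long-term variability
    """
    n = len(hp_ms)
    mean = sum(hp_ms) / n
    var = sum((x - mean) ** 2 for x in hp_ms) / (n - 1)
    diffs = [b - a for a, b in zip(hp_ms, hp_ms[1:])]
    dmean = sum(diffs) / len(diffs)
    dvar = sum((d - dmean) ** 2 for d in diffs) / (len(diffs) - 1)
    sd1 = math.sqrt(dvar / 2.0)
    sd2 = math.sqrt(max(2.0 * var - dvar / 2.0, 0.0))
    return sd1, sd2

# Heart periods (ms) with a slow drift, so SD2 (long-term) exceeds SD1.
hp = [800, 810, 815, 825, 840, 850, 845, 835]
sd1, sd2 = poincare_sd1_sd2(hp)
print(round(sd1, 1), round(sd2, 1))
```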

  8. Difference infiltrometer: a method to measure temporally variable infiltration rates during rainstorms

    Science.gov (United States)

    Moody, John A.; Ebel, Brian A.

    2012-01-01

    We developed a difference infiltrometer to measure time series of non-steady infiltration rates during rainstorms at the point scale. The infiltrometer uses two tipping-bucket rain gages. One gage measures rainfall onto, and the other measures runoff from, a small circular plot about 0.5 m in diameter. The small size allows the infiltration rate to be computed as the difference of the cumulative rainfall and cumulative runoff without having to route water through a large plot. Difference infiltrometers were deployed in an area burned by the 2010 Fourmile Canyon Fire near Boulder, Colorado, USA, and data were collected during the summer of 2011. The difference infiltrometer demonstrated the capability to capture different magnitudes of infiltration rates and the temporal variability associated with convective (high intensity, short duration) and cyclonic (low intensity, long duration) rainstorms. Data from the difference infiltrometer were used to estimate the saturated hydraulic conductivity of soil affected by the heat from a wildfire. The difference infiltrometer is portable, can be deployed in rugged, steep terrain, and does not require the transport of water, as many rainfall simulators require, because it uses natural rainfall. It can be used to assess infiltration models, determine runoff coefficients, identify rainfall depth or rainfall intensity thresholds to initiate runoff, estimate parameters for infiltration models, and compare remediation treatments on disturbed landscapes. The difference infiltrometer can be linked with other types of soil monitoring equipment in long-term studies for detecting temporal and spatial variability at multiple time scales and in nested designs where it can be linked to hillslope and basin-scale runoff responses.
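
    The core computation is simple: with rainfall and runoff recorded as cumulative depths over the same small plot, cumulative infiltration is their difference, and the non-steady infiltration rate is its time derivative. A sketch with made-up tip-count data (all names and numbers are illustrative, not from the paper):

```python
def infiltration_rates(times_min, cum_rain_mm, cum_runoff_mm):
    """Non-steady infiltration rates (mm/h) between consecutive observation
    times, from cumulative rainfall and runoff depths on the same plot."""
    cum_infil = [r - q for r, q in zip(cum_rain_mm, cum_runoff_mm)]
    rates = []
    for i in range(1, len(times_min)):
        dt_h = (times_min[i] - times_min[i - 1]) / 60.0
        rates.append((cum_infil[i] - cum_infil[i - 1]) / dt_h)
    return rates

# Illustrative storm: runoff begins only after an initial abstraction,
# and the infiltration rate declines as the soil wets up.
t = [0, 5, 10, 15, 20]                  # minutes since storm start
rain = [0.0, 3.0, 8.0, 12.0, 14.0]      # cumulative rainfall (mm)
runoff = [0.0, 0.0, 1.5, 4.0, 5.5]      # cumulative runoff (mm)
rates = infiltration_rates(t, rain, runoff)
```

    The declining tail of such a series is what would be fitted to estimate saturated hydraulic conductivity.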

  9. System importance measures: A new approach to resilient systems-of-systems

    Science.gov (United States)

    Uday, Payuna

    Resilience is the ability to withstand and recover rapidly from disruptions. While this attribute has been the focus of research in several fields, in the case of system-of-systems (SoSs), addressing resilience is particularly interesting and challenging. As infrastructure SoSs, such as power, transportation, and communication networks, grow in complexity and interconnectivity, measuring and improving the resilience of these SoSs is vital in terms of safety and providing uninterrupted services. The characteristics of systems-of-systems make analysis and design of resilience challenging. However, these features also offer opportunities to make SoSs resilient using unconventional methods. In this research, we present a new approach to the process of resilience design. The core idea behind the proposed design process is a set of system importance measures (SIMs) that identify systems crucial to overall resilience. Using the results from the SIMs, we determine appropriate strategies from a list of design principles to improve SoS resilience. The main contribution of this research is the development of an aid to design that provides specific guidance on where and how resources need to be targeted. Based on the needs of an SoS, decision-makers can iterate through the design process to identify a set of practical and effective design improvements. We use two case studies to demonstrate how the SIM-based design process can inform decision-making in the context of SoS resilience. The first case study focuses on a naval warfare SoS and describes how the resilience framework can leverage existing simulation models to support end-to-end design. We proceed through stages of the design approach using an agent-based model (ABM) that enables us to demonstrate how simulation tools and analytical models help determine the necessary inputs for the design process and, subsequently, inform decision-making regarding SoS resilience. The second case study considers the urban

  10. Recent Methods for Measuring Dopamine D3 receptor Occupancy In Vivo: Importance for Drug Development

    Directory of Open Access Journals (Sweden)

    Bernard eLe Foll

    2014-07-01

    Full Text Available There is considerable interest in developing highly selective dopamine D3 receptor ligands for a variety of mental health disorders. Dopamine D3 receptors have been implicated in Parkinson's Disease, schizophrenia, anxiety, depression, and substance use disorders. The most concrete evidence suggests a role for the D3 receptor in drug-seeking behaviors. D3 receptors belong to the D2-like receptor family, and traditionally the functional roles of the D3 and D2 receptors have been difficult to differentiate. Over the past 10-15 years a number of compounds selective for D3 over D2 receptors have been developed. However, translating these findings into clinical research has been difficult as many of these compounds cannot be used in humans. Therefore, the functional data involving the D3 receptor in drug addiction mostly come from preclinical studies. Recently, with the advent of [11C]-(+)-PHNO, it has become possible to image D3 receptors in the human brain with increased selectivity and sensitivity. This is a significant innovation over traditional methods such as [11C]-raclopride that cannot differentiate between D2 and D3 receptors. The use of [11C]-(+)-PHNO will allow for further delineation of the role of D3 receptors. Here, we review recent evidence that the role of the D3 receptor has functional importance and is distinct from the role of the D2 receptor. We then introduce the utility of analyzing [11C]-(+)-PHNO binding by region of interest. This novel methodology can be used in preclinical and clinical approaches for the measurement of occupancy of both D3 and D2 receptors. Evidence that [11C]-(+)-PHNO can provide insights into the function of D3 receptors in addiction is also presented.

  11. The effects of metronome breathing on the variability of autonomic activity measurements.

    Science.gov (United States)

    Driscoll, D; Dicicco, G

    2000-01-01

    Many chiropractors hypothesize that spinal manipulation affects the autonomic nervous system (ANS). However, the ANS responses to chiropractic manipulative therapy are not well documented, and more research is needed to support this hypothesis. This study represents a step toward the development of a reliable method by which to document that chiropractic manipulative therapy does affect the ANS by exploring the use of paced breathing as a way to reduce the inherent variability in ANS measurements. To examine the hypothesis that the variability of ANS measurements would be reduced if breathing were paced to a metronome at 12 breaths/min. The study was performed at Parker College Research Institute. Eight normotensive subjects were recruited from the student body and staff. Respiration frequency was measured through a strain gauge. A 3-lead electrocardiogram (ECG) was used to register the electric activity of the heart, and arterial tonometry monitors were used to record the left and right radial artery blood pressures. Signals were recorded on an IBM-compatible computer with a sampling frequency of 100 Hz. Normal breathing was used for the first 3 recordings, and breathing was paced to a metronome for the final 3 recordings at 12 breaths/min. Fourier analysis was performed on the beat-by-beat fluctuations of the ECG-determined R-R interval and systolic arterial pressure (SBP). Low-frequency fluctuations (LF; 0.04-0.15 Hz) reflected sympathetic activity, whereas high-frequency fluctuations (HF; 0.15-0.4 Hz) represented parasympathetic activity. Sympathovagal indices were determined from the ratio of the two bandwidths (LF/HF). The coefficient of variation (CV%) for autonomic parameters was calculated ([SD/average] x 100%) to compare breathing normally and breathing to a metronome with respect to variability. One-way analysis of variance was used to detect differences. 
Metronome breathing did not produce any significant changes in blood pressure for the
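
    The frequency-domain indices described above (Fourier power in the LF and HF bands of the beat-to-beat series, and their ratio) can be sketched as follows. The sampling rate, synthetic R-R series, and detrending are illustrative assumptions; a real analysis would use an evenly resampled tachogram and a windowed periodogram:

```python
import math

def band_power(signal, fs, f_lo, f_hi):
    """Integrate periodogram power of a mean-removed signal over [f_lo, f_hi) Hz,
    using a naive discrete Fourier transform (fine for short series)."""
    n = len(signal)
    m = sum(signal) / n
    x = [v - m for v in signal]   # remove the mean (DC) level
    power = 0.0
    for k in range(1, n // 2):
        f = k * fs / n
        if f_lo <= f < f_hi:
            re = sum(x[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
            im = sum(x[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
            power += (re * re + im * im) / n
    return power

# Illustrative evenly resampled R-R series (s) at fs = 4 Hz: a 0.1 Hz (LF)
# and a 0.25 Hz (HF) oscillation on a 0.85 s mean interval.
fs = 4.0
rr = [0.85 + 0.02 * math.sin(2 * math.pi * 0.1 * t / fs)
           + 0.01 * math.sin(2 * math.pi * 0.25 * t / fs) for t in range(256)]
lf = band_power(rr, fs, 0.04, 0.15)
hf = band_power(rr, fs, 0.15, 0.40)
lf_hf = lf / hf   # sympathovagal index
```

    Note that paced breathing at 12 breaths/min places the respiratory peak at 0.2 Hz, squarely inside the HF band, which is one rationale for the metronome protocol.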

  12. MEASURING X-RAY VARIABILITY IN FAINT/SPARSELY SAMPLED ACTIVE GALACTIC NUCLEI

    Energy Technology Data Exchange (ETDEWEB)

    Allevato, V. [Department of Physics, University of Helsinki, Gustaf Haellstroemin katu 2a, FI-00014 Helsinki (Finland); Paolillo, M. [Department of Physical Sciences, University Federico II, via Cinthia 6, I-80126 Naples (Italy); Papadakis, I. [Department of Physics and Institute of Theoretical and Computational Physics, University of Crete, 71003 Heraklion (Greece); Pinto, C. [SRON Netherlands Institute for Space Research, Sorbonnelaan 2, 3584-CA Utrecht (Netherlands)

    2013-07-01

    We study the statistical properties of the normalized excess variance of a variability process characterized by a "red-noise" power spectral density (PSD), as in the case of active galactic nuclei (AGNs). We perform Monte Carlo simulations of light curves, assuming both a continuous and a sparse sampling pattern and various signal-to-noise ratios (S/Ns). We show that the normalized excess variance is a biased estimate of the variance even in the case of continuously sampled light curves. The bias depends on the PSD slope and on the sampling pattern, but not on the S/N. We provide a simple formula to account for the bias, which yields unbiased estimates with an accuracy better than 15%. We show that normalized excess variance estimates based on single light curves (especially for sparse sampling and S/N < 3) are highly uncertain (even if corrected for bias) and we propose instead the use of an "ensemble estimate", based on multiple light curves of the same object, or on the use of light curves of many objects. These estimates have symmetric distributions, known errors, and can also be corrected for biases. We use our results to estimate the ability to measure the intrinsic source variability in current data, and show that they could also be useful in planning the observing strategy of future surveys, such as those provided by X-ray missions studying distant and/or faint AGN populations and, more generally, in the estimation of the variability amplitude of sources that will result from future surveys such as Pan-STARRS and LSST.
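
    The estimator under study is the standard normalized excess variance: the sample variance minus the mean squared measurement error, normalized by the squared mean flux. A minimal sketch of that definition (the paper's bias-correction formula is not reproduced here, and the light curve below is illustrative):

```python
from statistics import mean

def norm_excess_variance(flux, err):
    """sigma_NXS^2 = (S^2 - <err^2>) / <x>^2 for a light curve with
    fluxes `flux` and per-point measurement errors `err`."""
    n = len(flux)
    mu = mean(flux)
    s2 = sum((x - mu) ** 2 for x in flux) / (n - 1)   # sample variance S^2
    mean_err2 = sum(e * e for e in err) / n            # mean squared error
    return (s2 - mean_err2) / mu ** 2

# Illustrative light curve: intrinsic scatter well above the errors,
# so the excess variance comes out positive.
flux = [10.0, 12.0, 9.0, 11.0, 10.0, 13.0, 8.0, 11.0]
err = [0.5] * len(flux)
sigma_nxs2 = norm_excess_variance(flux, err)
```

    For noise-dominated light curves this estimator can go negative, one symptom of the single-light-curve uncertainty that motivates the ensemble estimates proposed in the paper.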

  13. Monitoring variables affecting positron emission tomography measurements of cerebral blood flow in anaesthetized pigs

    DEFF Research Database (Denmark)

    Alstrup, Aage Kristian Olsen; Zois, Nora Elisabeth; Simonsen, Mette

    2018-01-01

    Background Positron emission tomography (PET) imaging of anaesthetized pig brains is a useful tool in neuroscience. Stable cerebral blood flow (CBF) is essential for PET, since variations can affect the distribution of several radiotracers. However, the effect of physiological factors regulating...... and the monitoring parameters. Results No significant statistical correlations were found between CBF and the nine monitoring variables. However, we found that arterial carbon dioxide tension (PaCO2) and body temperature were important predictors of CBF that should be observed and kept constant. In addition, we...... found that long-duration anaesthesia was significantly correlated with high heart rate, low arterial oxygen tension, and high body temperature, but not with CBF. Conclusions The findings indicate that PaCO2 and body temperature are crucial for maintaining stable levels of CBF and thus optimizing PET...

  14. Biogenic carbon in combustible waste: waste composition, variability and measurement uncertainty.

    Science.gov (United States)

    Larsen, Anna W; Fuglsang, Karsten; Pedersen, Niels H; Fellner, Johann; Rechberger, Helmut; Astrup, Thomas

    2013-10-01

    Obtaining accurate data for the contents of biogenic and fossil carbon in thermally-treated waste is essential for determination of the environmental profile of waste technologies. Relations between the variability of waste chemistry and the biogenic and fossil carbon emissions are not well described in the literature. This study addressed the variability of biogenic and fossil carbon in combustible waste received at a municipal solid waste incinerator. Two approaches were compared: (1) radiocarbon dating ((14)C analysis) of carbon dioxide sampled from the flue gas, and (2) mass and energy balance calculations using the balance method. The ability of the two approaches to accurately describe short-term day-to-day variations in carbon emissions, and the extent to which these short-term variations could be explained by controlled changes in waste input composition, were evaluated. Finally, the measurement uncertainties related to the two approaches were determined. Two flue gas sampling campaigns at a full-scale waste incinerator were included: one during normal operation and one with controlled waste input. Estimation of the carbon contents of the main waste types received was included. Both the (14)C method and the balance method represented promising methods able to provide good quality data for the ratio between biogenic and fossil carbon in waste. The relative uncertainty in the individual experiments was 7-10% (95% confidence interval) for the (14)C method and slightly lower for the balance method.
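
    The principle behind the (14)C method lends itself to a one-line calculation: fossil carbon contains no (14)C, so the biogenic share of the flue-gas CO2 scales linearly with its measured (14)C content. A sketch (the reference value for contemporary biomass is an assumed illustration; the study would use its own reference):

```python
def biogenic_carbon_fraction(pmc_flue_gas, pmc_biomass_ref):
    """Biogenic fraction of flue-gas CO2 carbon from its 14C content,
    both expressed in percent modern carbon (pMC). Fossil carbon
    contributes zero pMC, so the measured value scales linearly
    with the biogenic share."""
    return pmc_flue_gas / pmc_biomass_ref

# Illustration: flue gas measured at 55 pMC against contemporary biomass
# at ~110 pMC (elevated above 100 by bomb-pulse carbon) gives a 50% share.
frac = biogenic_carbon_fraction(55.0, 110.0)
```

    The hard part in practice, and the source of the 7-10% uncertainty quoted above, lies in the flue-gas sampling and in fixing the biomass reference value, not in this arithmetic.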

  15. A Systematic Review of Statistical Methods Used to Test for Reliability of Medical Instruments Measuring Continuous Variables

    Directory of Open Access Journals (Sweden)

    Rafdzah Zaki

    2013-06-01

    Full Text Available   Objective(s): Reliability measures precision, or the extent to which test results can be replicated. This is the first systematic review to identify statistical methods used to measure the reliability of equipment measuring continuous variables. This study also aims to highlight inappropriate statistical methods used in reliability analyses and their implications for medical practice.   Materials and Methods: In 2010, five electronic databases were searched between 2007 and 2009 to look for reliability studies. A total of 5,795 titles were initially identified. Only 282 titles were potentially related, and finally 42 fitted the inclusion criteria. Results: The Intra-class Correlation Coefficient (ICC) is the most popular method, with 25 (60%) studies having used this method, followed by comparison of means (8 studies, or 19%). Out of the 25 studies using the ICC, only 7 (28%) reported the confidence intervals and the type of ICC used. Most studies (71%) also tested the agreement of instruments. Conclusion: This study finds that the Intra-class Correlation Coefficient is the most popular method used to assess the reliability of medical instruments measuring continuous outcomes. There are also inappropriate applications and interpretations of statistical methods in some studies. It is important for medical researchers to be aware of this issue and be able to correctly perform analyses in reliability studies.
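
    Since the ICC is the method at issue, a concrete sketch may help. Below is one common variant, ICC(2,1) (two-way random effects, absolute agreement, single measurement), computed from a subjects-by-raters table via the standard mean-squares formula; as the review notes, studies should state which ICC form they used, because the variants give different values:

```python
def icc_2_1(ratings):
    """ICC(2,1): two-way random effects, absolute agreement, single measurement.
    `ratings` is a list of subjects, each a list of k scores (one per rater)."""
    n = len(ratings)            # number of subjects
    k = len(ratings[0])         # number of raters
    grand = sum(sum(r) for r in ratings) / (n * k)
    row_means = [sum(r) / k for r in ratings]
    col_means = [sum(r[j] for r in ratings) / n for j in range(k)]
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)
    ss_total = sum((x - grand) ** 2 for r in ratings for x in r)
    ss_err = ss_total - ss_rows - ss_cols
    msr = ss_rows / (n - 1)                 # between-subjects mean square
    msc = ss_cols / (k - 1)                 # between-raters mean square
    mse = ss_err / ((n - 1) * (k - 1))      # residual mean square
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Illustrative two-rater data: strong but imperfect agreement.
icc = icc_2_1([[8, 9], [6, 5], [7, 8], [4, 4], [9, 9]])
```

    Perfect agreement yields an ICC of exactly 1; the illustrative table above gives a value just above 0.93.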

  16. Understanding Spatiotemporal Variability of Fine Particulate Matter in an Urban Environment Using Combined Fixed and Mobile Measurements

    Science.gov (United States)

    Sullivan, R.; Pryor, S. C.; Barthelmie, R. J.; Filippelli, G. M.

    2013-12-01

    periods of high PM2.5 concentrations (> 35 micrograms per cubic meter) were approx. 20-25% lower than during all observational periods, indicating the key importance of local sources and stagnation in generating extreme PM2.5 concentrations.
    ● There is substantial spatial correlation in PM2.5 concentrations at the four measurement sites, but the absolute concentrations exhibit large site-to-site variability.
    These findings highlight the high spatiotemporal variability of PM2.5 concentrations in urban environments. Thus, an innovative approach to mobile sampling is being conducted during August 2013. Continuous geo-referenced measurements of PM2.5 are being made using pDR samplers carried on bicycle transects across the city. The resulting data will be used to quantify the true spatiotemporal variability of, and therefore population exposure to, PM2.5, and will further be used to investigate major sources of PM2.5 in Indianapolis and to investigate causes of extreme concentrations.

  17. Sources of variability in OSL dose measurements using single grains of quartz

    International Nuclear Information System (INIS)

    Thomsen, K.J.; Murray, A.S.; Boetter-Jensen, L.

    2005-01-01

    In luminescence-based measurements of dose distributions in unheated mineral samples, the observed spread in dose values is usually attributed to four main factors: fluctuations in the number of photons counted, incomplete zeroing of any prior trapped charge (including signals arising from thermal transfer), heterogeneity in dosimetry, and instrument reproducibility. For correct interpretation of measured dose distributions in retrospective dosimetry, it is important to understand the relative importance of these components, and to establish whether other factors also contribute to the observed spread. In this preliminary study, dose distributions have been studied using single grains of heated and laboratory irradiated quartz. By heating the sample, the contribution from incomplete zeroing was excluded and at the same time the sample was sensitised. The laboratory gamma irradiation was designed to deliver a uniform dose to the sample. Thus it was anticipated that statistical fluctuations in the number of photons counted and instrument reproducibility, both quantifiable entities, should be able to account for all the observed variance in the measured dose distributions. We examine this assumption in detail, and discuss the origins and importance of the residual variance in our data

  18. Intra and interobserver variability of renal allograft ultrasound volume and resistive index measurements

    International Nuclear Information System (INIS)

    Mancini, Marcello; Liuzzi, Raffaele; Daniele, Stefania; Raffio, Teresa; Salvatore, Marco; Sabbatini, Massimo; Cianciaruso, Bruno; Ferrara, Liberato Aldo

    2005-01-01

    Purpose: The aim of the present study was to evaluate the repeatability and reproducibility of the Doppler Resistive Index (R.I.) and of ultrasound renal volume measurement in renal transplants. Materials and methods: Twenty-six consecutive patients (18 men, 8 women), mean age 42.8±12.4 years (M±SD) (range 22-65 years), were studied twice by each of two trained sonographers using a color Doppler ultrasound scanner. Twelve of them had normal allograft function (defined as stable serum creatinine levels ≤123.76 μmol/L), whilst the remaining 14 had decreased allograft function (serum creatinine 132.6-265.2 μmol/L). Results were given as the mean of 6 measurements performed at the upper, middle and lower pole of the kidney. Intra- and interobserver variability was assessed by the repeatability coefficient and the coefficient of variation (CV). Results: Regarding the Resistive Index measurement, the repeatability coefficient was between 0.04 and 0.06 and the coefficient of variation was
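
    The two statistics used here can be sketched as follows: a repeatability coefficient from paired repeat measurements (Bland-Altman convention, 1.96·sqrt(2) times the within-subject SD) and the coefficient of variation as (SD/mean)·100%. The RI values below are illustrative, not from the study:

```python
import math
from statistics import mean

def repeatability_coefficient(pairs):
    """1.96 * sqrt(2) * within-subject SD, with the within-subject variance
    estimated as sum(d^2) / (2n) from the paired differences d."""
    diffs = [a - b for a, b in pairs]
    sw2 = sum(d * d for d in diffs) / (2 * len(diffs))
    return 1.96 * math.sqrt(2) * math.sqrt(sw2)

def cv_percent(values):
    """Coefficient of variation, (SD / mean) * 100%."""
    mu = mean(values)
    sd = math.sqrt(sum((v - mu) ** 2 for v in values) / (len(values) - 1))
    return 100.0 * sd / mu

# Illustrative repeat Resistive Index measurements (observer 1 vs observer 2).
ri_pairs = [(0.68, 0.70), (0.72, 0.71), (0.65, 0.67), (0.70, 0.69), (0.74, 0.75)]
rc = repeatability_coefficient(ri_pairs)
cv = cv_percent([v for pair in ri_pairs for v in pair])
```

    With these made-up pairs the repeatability coefficient lands near 0.03, i.e. in the same range as the 0.04-0.06 reported above.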

  19. Past and present variability of the solar-terrestrial system: measurement, data analysis and theoretical models

    Energy Technology Data Exchange (ETDEWEB)

    Cini Castagnoli, G.; Provenzale, A. [eds.

    1997-12-31

    The course Past and present variability of the solar-terrestrial system: measurement, data analysis and theoretical models is explicitly devoted to these issues. A solar cycle ago, in summer 1985, G. Cini organized a similar school, at a time when this field was in a very early stage of development and definitely fewer high-quality measurements were available. After eleven years, the field has grown toward becoming a robust scientific discipline, new data have been obtained, and new ideas have been proposed by both solar physicists and climate dynamicists. For this reason, the authors felt that it was the right time to organize a new summer school, with the aim of formalizing the developments that have taken place during these years, and also of speculating, and maybe dreaming, of new results that will be achieved in the upcoming years. The papers of the lectures have now been collected in this volume. First, in order to know what one is talking about, reliable data must be obtained from terrestrial archives, and the records that have been measured must be properly dated. To these crucial aspects is devoted the first part of the book, dealing with various types of proxy data and with the difficult issue of the dating of the records.

  20. Biogenic carbon in combustible waste: Waste composition, variability and measurement uncertainty

    DEFF Research Database (Denmark)

    Larsen, Anna Warberg; Fuglsang, Karsten; Pedersen, Niels H.

    2013-01-01

    described in the literature. This study addressed the variability of biogenic and fossil carbon in combustible waste received at a municipal solid waste incinerator. Two approaches were compared: (1) radiocarbon dating (14C analysis) of carbon dioxide sampled from the flue gas, and (2) mass and energy......, the measurement uncertainties related to the two approaches were determined. Two flue gas sampling campaigns at a full-scale waste incinerator were included: one during normal operation and one with controlled waste input. Estimation of carbon contents in the main waste types received was included. Both the 14C...... method and the balance method represented promising methods able to provide good quality data for the ratio between biogenic and fossil carbon in waste. The relative uncertainty in the individual experiments was 7–10% (95% confidence interval) for the 14C method and slightly lower for the balance method....

  1. Cortisol Variability and Self-reports in the Measurement of Work-related Stress

    DEFF Research Database (Denmark)

    Karlson, Björn; Eek, Frida; Hansen, Åse Marie

    2011-01-01

    We examined whether a high cortisol awakening response (CAR) and low cortisol decline over the day (CDD) are related to self-reported work stress and well-being, and whether there are gender differences in these relationships. Three hundred eighty-three working men and women responded to a survey...... measuring job stress factors, mastery at work, symptoms and well-being. Salivary cortisol was sampled at awakening, after 45 min and at 21:00, from which the variables CAR and CDD were defined. A high CAR was associated with lower perceived job control and work mastery, and poorer well-being. Low CDD...... men, a similar comparison showed those with low CDD to have poorer scores on job stress factors and symptom load. We conclude that individuals displaying high CAR or low CDD differ from those not displaying these cortisol profiles in self-report of work stress and well-being, and that gender...

  2. Continuous-variable measurement-device-independent quantum key distribution with photon subtraction

    Science.gov (United States)

    Ma, Hong-Xin; Huang, Peng; Bai, Dong-Yun; Wang, Shi-Yu; Bao, Wan-Su; Zeng, Gui-Hua

    2018-04-01

    It has been found that non-Gaussian operations can be applied to increase and distill entanglement between Gaussian entangled states. We show the successful use of a non-Gaussian operation, in particular the photon subtraction operation, in the continuous-variable measurement-device-independent quantum key distribution (CV-MDI-QKD) protocol. The proposed method can be implemented based on existing technologies. Security analysis shows that the photon subtraction operation can remarkably increase the maximal transmission distance of the CV-MDI-QKD protocol, which precisely makes up for a shortcoming of the original CV-MDI-QKD protocol, and that one-photon subtraction has the best performance. Moreover, the proposed protocol provides a feasible method for the experimental implementation of the CV-MDI-QKD protocol.

  3. Gas permeation measurement under defined humidity via constant volume/variable pressure method

    KAUST Repository

    Jan Roman, Pauls

    2012-02-01

    Many industrial gas separations in which membrane processes are feasible entail high water vapour contents, as in CO2 separation from flue gas in carbon capture and storage (CCS), or in biogas/natural gas processing. Studying the effect of water vapour on gas permeability through polymeric membranes is essential for materials design and optimization of these membrane applications. In particular, for amine-based CO2-selective facilitated transport membranes, water vapour is necessary for carrier-complex formation (Matsuyama et al., 1996; Deng and Hägg, 2010; Liu et al., 2008; Shishatskiy et al., 2010) [1-4]. Conventional polymeric membrane materials can also vary in their permeation behaviour due to water-induced swelling (Potreck, 2009) [5]. Here we describe a simple approach to gas permeability measurement in the presence of water vapour, in the form of a modified constant volume/variable pressure method (pressure increase method). © 2011 Elsevier B.V.
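
    In the constant volume/variable pressure method, permeability follows from the steady-state pressure rise dp/dt in a known downstream volume. A hedged sketch of the standard working equation (the units chosen and the example numbers are illustrative assumptions, not values from the paper):

```python
def permeability_barrer(dpdt_cmhg_s, v_cm3, l_cm, a_cm2, p_feed_cmhg, t_k):
    """Permeability in Barrer from the constant-volume/variable-pressure
    (pressure increase) method:
        P = (V * l) / (A * T * p_feed) * (273.15 / 76) * dp/dt
    with V the downstream volume (cm3), l the film thickness (cm),
    A the membrane area (cm2), T in K, and pressures in cmHg.
    The 273.15/76 factor converts the permeate amount to cm3(STP)."""
    p = (v_cm3 * l_cm) / (a_cm2 * t_k * p_feed_cmhg) * (273.15 / 76.0) * dpdt_cmhg_s
    return p / 1e-10   # 1 Barrer = 1e-10 cm3(STP)*cm / (cm2*s*cmHg)

# Illustrative run: 30 cm3 downstream volume, 100 um film, 10 cm2 area,
# 1 atm feed (76 cmHg), 30 C, steady pressure rise of 1e-5 cmHg/s.
p_barrer = permeability_barrer(1e-5, 30.0, 0.01, 10.0, 76.0, 303.15)
```

    Under defined humidity, the same arithmetic applies once the water-vapour partial pressure is subtracted from the driving force, which is the essence of the modification described here.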

  4. Self-referenced continuous-variable measurement-device-independent quantum key distribution

    Science.gov (United States)

    Wang, Yijun; Wang, Xudong; Li, Jiawei; Huang, Duan; Zhang, Ling; Guo, Ying

    2018-05-01

    We propose a scheme to remove the demand of transmitting a high-brightness local oscillator (LO) in the continuous-variable measurement-device-independent quantum key distribution (CV-MDI QKD) protocol, which we call self-referenced (SR) CV-MDI QKD. We show that our scheme is immune to side-channel attacks, such as the calibration attacks, the wavelength attacks and the LO fluctuation attacks, which all exploit the security loopholes introduced by transmitting the LO. Besides, the proposed scheme waives the necessity of complex multiplexers and demultiplexers, which can greatly simplify the QKD process and improve the transmission efficiency. The numerical simulations under collective attacks show that all the improvements brought about by our scheme come only at the expense of a slight shortening of the transmission distance. This scheme offers an available method to mend the security loopholes incurred by transmitting the LO in CV-MDI QKD.

  5. Finite-size analysis of continuous-variable measurement-device-independent quantum key distribution

    Science.gov (United States)

    Zhang, Xueying; Zhang, Yichen; Zhao, Yijia; Wang, Xiangyu; Yu, Song; Guo, Hong

    2017-10-01

    We study the impact of the finite-size effect on the continuous-variable measurement-device-independent quantum key distribution (CV-MDI QKD) protocol, mainly considering the finite-size effect on the parameter estimation procedure. The central-limit theorem and the maximum likelihood estimation theorem are used to estimate the parameters. We also analyze the relationship between the number of exchanged signals and the optimal modulation variance in the protocol. It is proved that when Charlie's position is close to Bob's, the CV-MDI QKD protocol has the farthest transmission distance in the finite-size scenario. Finally, we discuss the impact of finite-size effects related to practical detection in the CV-MDI QKD protocol. The overall results indicate that the finite-size effect has a great influence on the secret-key rate of the CV-MDI QKD protocol and should not be ignored.

  6. Carpet-dust chemicals as measures of exposure: Implications of variability

    Directory of Open Access Journals (Sweden)

    Whitehead Todd P

    2012-03-01

    Full Text Available Abstract Background There is increasing interest in using chemicals measured in carpet dust as indicators of chemical exposures. However, investigators have rarely sampled dust repeatedly from the same households, and therefore little is known about the variability of chemical levels within and between households in dust samples. Results We analyzed 9 polycyclic aromatic hydrocarbons, 6 polychlorinated biphenyls, and nicotine in 68 carpet-dust samples from 21 households in agricultural communities of Fresno County, California, collected from 2003-2005. Chemical concentrations were quantified in ng per g of dust. Conclusions Our findings suggest that attenuation bias should be relatively modest when using these semi-volatile carpet-dust chemicals as exposure surrogates in epidemiologic studies.

  7. Development of a time-variable nuclear pulser for half life measurements

    International Nuclear Information System (INIS)

    Zahn, Guilherme S.; Domienikan, Claudio; Carvalhaes, Roberto P. M.; Genezini, Frederico A.

    2013-01-01

    In this work a time-variable pulser system with an exponentially-decaying pulse frequency is presented, which was developed using the low-cost, open-source Arduino microcontroller platform. In this system, the microcontroller produces a TTL signal at the selected rate and a pulse shaper board adjusts it so that it can be fed into an amplifier as a conventional pulser signal; both the decay constant and the initial pulse rate can be adjusted using a user-friendly control software, and the pulse amplitude can be adjusted using a potentiometer on the pulse shaper board. The pulser was tested using several combinations of initial pulse rate and decay constant, and the results show that the system is stable and reliable, and is suitable for use in half-life measurements.

  8. Development of a time-variable nuclear pulser for half life measurements

    Energy Technology Data Exchange (ETDEWEB)

    Zahn, Guilherme S.; Domienikan, Claudio; Carvalhaes, Roberto P. M.; Genezini, Frederico A. [Instituto de Pesquisas Energeticas e Nucleares - IPEN-CNEN/SP. P.O. Box 11049, Sao Paulo, 05422-970 (Brazil)

    2013-05-06

    In this work a time-variable pulser system with an exponentially-decaying pulse frequency is presented, which was developed using the low-cost, open-source Arduino microcontroller platform. In this system, the microcontroller produces a TTL signal at the selected rate and a pulse shaper board adjusts it so that it can be fed into an amplifier as a conventional pulser signal; both the decay constant and the initial pulse rate can be adjusted using a user-friendly control software, and the pulse amplitude can be adjusted using a potentiometer on the pulse shaper board. The pulser was tested using several combinations of initial pulse rate and decay constant, and the results show that the system is stable and reliable, and is suitable for use in half-life measurements.
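
    The timing logic of such a pulser has a closed form: for a rate decaying with a chosen half-life, the n-th pulse fires when the integrated rate reaches n. A sketch of that scheduling calculation (in Python for illustration rather than Arduino C; the deterministic schedule is an assumption, since the papers do not state whether the firmware also randomizes intervals):

```python
import math

def pulse_times(rate0_hz, half_life_s, n_pulses):
    """Emission times (s) for a pulser whose rate decays as
    rate0 * 2**(-t / half_life): the n-th pulse fires when the
    integrated rate first reaches n."""
    tau = half_life_s / math.log(2)     # mean lifetime from the half-life
    total = rate0_hz * tau              # pulses emitted as t -> infinity
    if n_pulses >= total:
        raise ValueError("pulser can never emit that many pulses")
    # Invert N(t) = rate0 * tau * (1 - exp(-t / tau)) at N = 1, 2, ..., n.
    return [-tau * math.log(1.0 - n / total) for n in range(1, n_pulses + 1)]

# Illustrative settings: 100 Hz initial rate, 60 s half-life.
times = pulse_times(100.0, 60.0, 50)
```

    The inter-pulse interval starts near 1/rate0 and grows as the rate decays, mimicking the counting behaviour of a decaying radioactive source.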

  9. α-Cut method based importance measure for criticality analysis in fuzzy probability – Based fault tree analysis

    International Nuclear Information System (INIS)

    Purba, Julwan Hendry; Sony Tjahyani, D.T.; Widodo, Surip; Tjahjono, Hendro

    2017-01-01

    Highlights:
    •FPFTA deals with epistemic uncertainty using fuzzy probability.
    •Criticality analysis is important for reliability improvement.
    •An α-cut method based importance measure is proposed for criticality analysis in FPFTA.
    •The α-cut method based importance measure utilises α-cut multiplication, α-cut subtraction, and the area defuzzification technique.
    •Benchmarking confirms that the proposed method is feasible for criticality analysis in FPFTA.
    Abstract: Fuzzy probability-based fault tree analysis (FPFTA) has been recently developed and proposed to deal with the limitations of conventional fault tree analysis. In FPFTA, the reliabilities of basic events, intermediate events and the top event are characterized by fuzzy probabilities. Furthermore, the quantification of the FPFTA is based on the fuzzy multiplication rule and the fuzzy complementation rule to propagate uncertainties from the basic events to the top event. Since the objective of fault tree analysis is to improve the reliability of the system being evaluated, it is necessary to find the weakest path in the system. For this purpose, criticality analysis can be implemented. Various importance measures, which are based on conventional probabilities, have been developed and proposed for criticality analysis in fault tree analysis. However, none of those importance measures can be applied for criticality analysis in FPFTA, which is based on fuzzy probability. To be fully applied in nuclear power plant probabilistic safety assessment, FPFTA needs to have its corresponding importance measure. The objective of this study is to develop an α-cut method based importance measure to evaluate and rank the importance of basic events for criticality analysis in FPFTA. To demonstrate the applicability of the proposed measure, a case study is performed and its results are then benchmarked against the results generated by four well-known importance measures in conventional fault tree analysis. The results
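
    The α-cut machinery the measure relies on reduces to interval arithmetic at each α-level. A minimal sketch for triangular fuzzy probabilities and an AND gate; this illustrates only the α-cut multiplication step, not the authors' full importance measure or defuzzification:

```python
def alpha_cut(tfn, alpha):
    """Alpha-cut interval [lo, hi] of a triangular fuzzy number (a, m, b),
    where m is the modal value and [a, b] the support."""
    a, m, b = tfn
    return (a + alpha * (m - a), b - alpha * (b - m))

def and_gate(cut1, cut2):
    """Fuzzy AND gate at one alpha-level: interval multiplication of the
    alpha-cuts (endpoint products suffice since probabilities are >= 0)."""
    lo1, hi1 = cut1
    lo2, hi2 = cut2
    return (lo1 * lo2, hi1 * hi2)

# Illustrative triangular fuzzy basic-event probabilities.
p1 = (0.01, 0.02, 0.03)
p2 = (0.001, 0.002, 0.004)
top = and_gate(alpha_cut(p1, 0.5), alpha_cut(p2, 0.5))
```

    At α = 1 each cut collapses to the modal value, so the gate output collapses to the crisp product of the modes; sweeping α from 0 to 1 reconstructs the full fuzzy top-event probability level by level.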

  10. The use of Fluorescence Resonance Energy Transfer (FRET) peptides for measurement of clinically important proteolytic enzymes

    Directory of Open Access Journals (Sweden)

    Adriana K. Carmona

    2009-09-01

    Full Text Available Proteolytic enzymes have a fundamental role in many biological processes and are associated with multiple pathological conditions. Therefore, targeting these enzymes may be important for a better understanding of their function and for the development of therapeutic inhibitors. Fluorescence Resonance Energy Transfer (FRET) peptides are convenient tools for the study of peptidase specificity, as they allow monitoring of the reaction on a continuous basis, providing a rapid method for the determination of enzymatic activity. Hydrolysis of a peptide bond between the donor/acceptor pair generates fluorescence that permits the measurement of the activity of nanomolar concentrations of the enzyme. The assays can be performed directly in a fluorimeter cuvette or adapted for determinations in a 96-well fluorescence plate reader. The synthesis of FRET peptides containing ortho-aminobenzoic acid (Abz) as the fluorescent group and 2,4-dinitrophenyl (Dnp) or N-(2,4-dinitrophenyl)ethylenediamine (EDDnp) as the quencher was optimized by our group and became an important line of research at the Department of Biophysics of the Federal University of São Paulo. Recently, Abz/Dnp FRET peptide libraries were developed, allowing high-throughput screening of peptidase substrate specificity. This review presents the consolidation of our research activities undertaken between 1993 and 2008 on the synthesis of peptides and the study of peptidase specificities.

  11. Assessment of the intraday variability of anthropometric measurements in the work environment: a pilot study.

    Science.gov (United States)

    Bragança, Sara; Arezes, Pedro; Carvalho, Miguel; Ashdown, Susan P; Leão, Celina

    2017-05-19

    Sitting for long periods of time, during both work and leisure, is typical behavior in modern society. Especially at work, where there is not much flexibility, adopting a sitting posture for the entire day can cause both short-term and long-term effects. As workers' productivity and well-being rely on working conditions, evaluating the effects caused by work postures assumes a very important role. The purpose of this article was to evaluate the variation of some anthropometric measurements during one typical workday to understand whether the known long-term effects can also be seen and quantified in an 8-h period. Twenty participants were measured before and after work, using traditional anthropometry equipment. The data from the two repetitions were compared using statistical tests. The results showed a slight variation in the anthropometric measurements, some with a tendency to increase over time and others with a tendency to decrease.
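A before/after design like this one is typically analysed with a paired test; the abstract does not specify which tests were used, so the following is only a generic sketch with made-up numbers:

```python
# Paired t statistic for repeated measurements on the same subjects
# (hypothetical data; the study's actual statistical tests are unspecified).
from math import sqrt
from statistics import mean, stdev

def paired_t(before, after):
    """t = mean(d) / (sd(d) / sqrt(n)) over per-subject differences d."""
    diffs = [a - b for a, b in zip(after, before)]
    return mean(diffs) / (stdev(diffs) / sqrt(len(diffs)))
```

The resulting t value is compared against the t distribution with n - 1 degrees of freedom.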

  12. Comparing the importance of quality measurement themes in juvenile idiopathic inflammatory myositis between patients and families and healthcare professionals.

    Science.gov (United States)

    Tory, Heather O; Carrasco, Ruy; Griffin, Thomas; Huber, Adam M; Kahn, Philip; Robinson, Angela Byun; Zurakowski, David; Kim, Susan

    2018-04-19

    A standardized set of quality measures for juvenile idiopathic inflammatory myopathies (JIIM) is not in use. Discordance has been shown between the importance ascribed to quality measures by patients and families and by physicians. The objective of this study was to assess and compare the importance of various aspects of high-quality care between patients with JIIM and their families and healthcare providers, to aid in the future development of comprehensive quality measures. Surveys were developed by members of the Childhood Arthritis and Rheumatology Research Alliance (CARRA) Juvenile Dermatomyositis Workgroup through a consensus process and administered to patients and families through the CureJM Foundation and to healthcare professionals through CARRA. The survey asked respondents to rate the importance of 19 items related to aspects of high-quality care, using a Likert scale. Patients and families gave generally higher importance scores to most of the quality measurement themes compared with healthcare professionals, with ratings of 13 of the 19 measures reaching statistical significance (p < 0.05). Themes rated most important by both groups included quality of life, timely diagnosis, access to rheumatology, normalization of functioning/strength, and ability for self-care. Despite overall differences in the rating of importance of quality indicators between patients and families and healthcare professionals, the groups agreed on the most important aspects of care. Recognizing areas of particular importance to patients and families, and overlapping in importance with providers, will promote the development of standardized quality measures with the greatest potential for improving care and outcomes for children with JIIM.

  13. Measuring household food insecurity: why it's so important and yet so difficult to do.

    Science.gov (United States)

    Webb, Patrick; Coates, Jennifer; Frongillo, Edward A; Rogers, Beatrice Lorge; Swindale, Anne; Bilinsky, Paula

    2006-05-01

    Food insecurity is a daily reality for hundreds of millions of people around the world. Although its most extreme manifestations are often obvious, many other households facing constraints in their access to food are less identifiable. Operational agencies lack a method for differentiating households at varying degrees of food insecurity in order to target and evaluate their interventions. This chapter provides an overview of a set of papers associated with a research initiative that seeks to identify more precise, yet simple, measures of household food insecurity. The overview highlights three main conceptual developments associated with practical approaches to measuring constraints in access to food: 1) a shift from using measures of food availability and utilization to measuring "inadequate access"; 2) a shift from a focus on objective to subjective measures; and 3) a growing emphasis on fundamental measurement as opposed to reliance on distal, proxy measures. Further research is needed regarding 1) how well measures of household food insecurity designed for chronically food-insecure contexts capture the processes leading to, and experience of, acute food insecurity, 2) the impact of short-term shocks, such as major floods or earthquakes, on household behaviors that determine responses to food security questions, 3) better measurement of the interaction between severity and frequency of household food insecurity behaviors, and 4) the determination of whether an individual's response to survey questions can be representative of the food insecurity experiences of all members of the household.

  14. Changes in the Complexity of Heart Rate Variability with Exercise Training Measured by Multiscale Entropy-Based Measurements

    Directory of Open Access Journals (Sweden)

    Frederico Sassoli Fazan

    2018-01-01

    Full Text Available Quantifying complexity from heart rate variability (HRV) series is a challenging task, and multiscale entropy (MSE), along with its variants, has been demonstrated to be one of the most robust approaches to achieve this goal. Although physical training is known to be beneficial, there is little information about the long-term complexity changes induced by physical conditioning. The present study aimed to quantify the changes in physiological complexity elicited by physical training through multiscale entropy-based complexity measurements. Rats were subject to a protocol of medium-intensity training (n = 13) or a sedentary protocol (n = 12). One-hour HRV series were obtained from all conscious rats five days after the experimental protocol. We estimated MSE, multiscale dispersion entropy (MDE) and multiscale SDiffq from the HRV series. Multiscale SDiffq is a recent approach that accounts for entropy differences between a given time series and its shuffled dynamics. From SDiffq, three attributes (q-attributes) were derived, namely SDiffqmax, qmax and qzero. MSE, MDE and the multiscale q-attributes presented similar profiles, except for SDiffqmax. qmax showed significant differences between trained and sedentary groups on time scales 6 to 20. Results suggest that physical training increases the system complexity and that multiscale q-attributes provide valuable information about the physiological complexity.
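The core of MSE is coarse-graining the series at each time scale and computing sample entropy on the result. A minimal sketch of those two steps (not the paper's MDE or SDiffq variants) might look like:

```python
# Minimal multiscale-entropy building blocks: coarse-graining plus a plain
# O(n^2) sample entropy. Parameter conventions follow common usage
# (m = 2, r as a fraction of the series SD), not necessarily the paper's.
from math import log

def coarse_grain(x, scale):
    """Non-overlapping window averages used at each MSE time scale."""
    return [sum(x[i:i + scale]) / scale
            for i in range(0, len(x) - scale + 1, scale)]

def sample_entropy(x, m=2, r_frac=0.2):
    """SampEn = -ln(A/B), with B (A) the template matches of length m (m+1)."""
    mu = sum(x) / len(x)
    tol = r_frac * (sum((v - mu) ** 2 for v in x) / len(x)) ** 0.5
    def matches(mm):
        n = len(x) - mm
        return sum(1 for i in range(n) for j in range(i + 1, n)
                   if max(abs(x[i + k] - x[j + k]) for k in range(mm)) <= tol)
    b, a = matches(m), matches(m + 1)
    return -log(a / b) if a and b else float("inf")
```

An MSE curve is then `[sample_entropy(coarse_grain(x, s)) for s in range(1, max_scale + 1)]`.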

  15. Use of count-based image reconstruction to evaluate the variability and repeatability of measured standardised uptake values.

    Directory of Open Access Journals (Sweden)

    Tomohiro Kaneta

    Full Text Available Standardized uptake values (SUVs) are the most widely used quantitative imaging biomarkers in PET. It is important to evaluate the variability and repeatability of measured SUVs. Phantom studies seem to be essential for this purpose; however, repetitive phantom scanning is not recommended due to the decay of radioactivity. In this study, we performed count-based image reconstruction to avoid the influence of decay, using two different PET/CT scanners. By adjusting the ratio of 18F-fluorodeoxyglucose solution to tap water, a NEMA IEC body phantom was set to give SUVs of 4.0 inside its six hot spheres. The PET data were obtained using two scanners (Aquiduo and Celesteion; Toshiba Medical Systems, Tochigi, Japan). We set the start time for image reconstruction when the total radioactivity in the phantom was 2.53 kBq/cc, and employed the counts of the first 2-min acquisition as the standard. To maintain the number of counts for each image, we set the acquisition time for image reconstruction depending on the decay of radioactivity. We obtained 50 images, and calculated the SUVmax and SUVpeak of all six spheres in each image. The average values of the SUVmax were used to calculate the recovery coefficients to compare those measured by the two different scanners. Bland-Altman analyses of the SUVs measured by the two scanners were also performed. The measured SUVs from the two scanners exhibited a 10-30% difference, and the standard deviation (SD) of the measured SUVs was between 0.1 and 0.2. The Celesteion always exhibited higher values than the Aquiduo. The smaller spheres exhibited larger SDs, and the SUVpeak had a smaller SD than the SUVmax. The Bland-Altman analyses showed poor agreement between the SUVs measured by the two scanners. The recovery coefficient curves obtained from the two scanners were considerably different. The Celesteion exhibited higher recovery coefficients than the Aquiduo, especially at diameters of approximately 20 mm. Additionally, the curves
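The SUV quantity underlying the record above is defined as the tissue activity concentration divided by the injected dose per unit body weight. A sketch of that definition, assuming decay-corrected activities and a tissue density of 1 g/mL:

```python
# Body-weight-normalised standardized uptake value; variable names and
# the unit conventions (kBq/mL, MBq, kg) are illustrative assumptions.
def suv(c_tissue_kbq_per_ml, injected_dose_mbq, body_weight_kg):
    """SUV = tissue concentration / (injected activity / body weight)."""
    dose_kbq = injected_dose_mbq * 1000.0
    weight_g = body_weight_kg * 1000.0  # 1 mL of tissue taken as 1 g
    return c_tissue_kbq_per_ml / (dose_kbq / weight_g)
```

With this convention, a phantom filled so that concentration is 4x the whole-body average reads SUV = 4.0, as in the study design.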

  16. Meeting report: batch-to-batch variability in estrogenic activity in commercial animal diets--importance and approaches for laboratory animal research.

    Science.gov (United States)

    Heindel, Jerrold J; vom Saal, Frederick S

    2008-03-01

    We report information from two workshops sponsored by the National Institutes of Health that were held to a) assess whether dietary estrogens could significantly impact end points in experimental animals, and b) involve program participants and feed manufacturers to address the problems associated with measuring and eliminating batch-to-batch variability in rodent diets that may lead to conflicting findings in animal experiments within and between laboratories. Data were presented at the workshops showing that there is significant batch-to-batch variability in the estrogenic content of commercial animal diets, and that this variability results in differences in experimental outcomes. A combination of methods was proposed to determine levels of total estrogenic activity and levels of specific estrogenic constituents in soy-containing, casein-containing, and other soy-free rodent diets. Workshop participants recommended that researchers pay greater attention to the type of diet being used in animal studies and choose a diet whose estrogenic activity (or lack thereof) is appropriate for the experimental model and end points of interest. Information about levels of specific phytoestrogens, as well as estrogenic activity caused by other contaminants and measured by bioassay, should be disclosed in scientific publications. This will require laboratory animal diet manufacturers to provide investigators with information regarding the phytoestrogen content and other estrogenic compounds in commercial diets used in animal research.

  17. Selection of entropy-measure parameters for knowledge discovery in heart rate variability data.

    Science.gov (United States)

    Mayer, Christopher C; Bachler, Martin; Hörtenhuber, Matthias; Stocker, Christof; Holzinger, Andreas; Wassertheurer, Siegfried

    2014-01-01

    Heart rate variability is the variation of the time interval between consecutive heartbeats. Entropy is a commonly used tool to describe the regularity of data sets. Entropy functions are defined using multiple parameters, the selection of which is controversial and depends on the intended purpose. This study describes the results of tests conducted to support parameter selection, towards the goal of enabling further biomarker discovery. This study deals with approximate, sample, fuzzy, and fuzzy measure entropies. All data were obtained from PhysioNet, a free-access, on-line archive of physiological signals, and represent various medical conditions. Five tests were defined and conducted to examine the influence of varying the threshold value r (as multiples of the sample standard deviation σ, or the entropy-maximizing rChon), the data length N, the weighting factors n for fuzzy and fuzzy measure entropies, and the thresholds rF and rL for fuzzy measure entropy. The results were tested for normality using Lilliefors' composite goodness-of-fit test. Consequently, the p-value was calculated with either a two-sample t-test or a Wilcoxon rank sum test. The first test shows a cross-over of entropy values with regard to a change of r. Thus, a clear statement that a higher entropy corresponds to higher irregularity is not possible; entropy is rather an indicator of differences in regularity. N should be at least 200 data points for r = 0.2σ and should even exceed a length of 1000 for r = rChon. The results for the weighting parameters n for the fuzzy membership function show different behavior when coupled with different r values; therefore, the weighting parameters have been chosen independently for the different threshold values. The tests concerning rF and rL showed that there is no optimal choice, but r = rF = rL is reasonable with r = rChon or r = 0.2σ. Some of the tests showed a dependency of the test significance on the data at hand. Nevertheless, as the medical

  18. Measuring stress level of dairy cows during milking using by geometric indices of heart rate variability

    Directory of Open Access Journals (Sweden)

    Levente Kovács

    2013-05-01

    Heart rate (HR) and heart rate variability (HRV) were investigated in cows (n=32, age: 3.86 years, milk production: 35±2.5 kg, DIM: 150±15) milked in a parallel milking parlour. Geometric parameters of HRV (SD1 and SD2) were calculated using Poincaré plots. HRV indices of resting 1 h after midday milking (reference period) were compared to those measured during the different phases of the evening milking (driving; in the holding pen; udder preparation; milking; after milking in the milking stall). There was no difference between the reference period and the different phases of milking in animal welfare terms. During the reference period, SD2 (198.5 ms) was significantly higher (p<0.05) than in every other measured period, suggesting an increasing parasympathetic tone after milking. This parasympathetic predominance decreased with time of the day (1.5 h after milking). SD2 was significantly affected by parity, by the breeding bull (p<0.01) and by milk production (p<0.05). SD2 was notably higher (102.8 ms) in multiparous cows than in primiparous cows (p<0.017; α=0.005) during resting and milking. Results suggested that a conventional milking process is not really stressful for cows. Primiparous cows were more susceptible to the milking process than multiparous ones. SD2 is a good marker of vagal activity and is affected by several independent factors.
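The Poincaré descriptors used in this record can be computed directly from successive RR intervals; a minimal sketch using the standard 45-degree rotation of the (RR_n, RR_n+1) scatter:

```python
# SD1/SD2 of the Poincare plot from an RR-interval series (ms).
# This is the standard geometric definition; input values are illustrative.
from statistics import pstdev

def poincare_sd1_sd2(rr_ms):
    """SD1: spread across the identity line (short-term HRV); SD2: along it."""
    pairs = list(zip(rr_ms[:-1], rr_ms[1:]))
    d1 = [(x - y) / 2 ** 0.5 for x, y in pairs]  # rotated minor axis
    d2 = [(x + y) / 2 ** 0.5 for x, y in pairs]  # rotated major axis
    return pstdev(d1), pstdev(d2)
```

A perfectly alternating series has all its variability across the identity line, so SD2 collapses to zero while SD1 captures the beat-to-beat swing.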

  19. Input variable selection for data-driven models of Coriolis flowmeters for two-phase flow measurement

    International Nuclear Information System (INIS)

    Wang, Lijuan; Yan, Yong; Wang, Xue; Wang, Tao

    2017-01-01

    Input variable selection is an essential step in the development of data-driven models for environmental, biological and industrial applications. By eliminating irrelevant or redundant variables, input variable selection identifies a suitable subset of variables as the input of a model; at the same time, it simplifies the model structure and improves computational efficiency. This paper describes the procedures of input variable selection for data-driven models for the measurement of liquid mass flowrate and gas volume fraction under two-phase flow conditions using Coriolis flowmeters. Three advanced input variable selection methods, including partial mutual information (PMI), genetic algorithm-artificial neural network (GA-ANN) and tree-based iterative input selection (IIS), are applied in this study. Typical data-driven models incorporating support vector machines (SVM) are established individually based on the input candidates resulting from the selection methods. The validity of the selection outcomes is assessed through an output performance comparison of the SVM-based data-driven models and sensitivity analysis. The validation and analysis results suggest that the input variables selected by the PMI algorithm provide more effective information for the models to measure liquid mass flowrate, while the IIS algorithm provides fewer but more effective variables for the models to predict gas volume fraction. (paper)
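The mutual-information idea behind PMI-style selection can be sketched with a plug-in estimate on discretised data; this is a generic MI estimator, not the paper's partial-MI procedure:

```python
# Plug-in mutual information estimate (in nats) between two discretised
# variables; a generic illustration of MI-based relevance ranking.
from collections import Counter
from math import log

def mutual_information(xs, ys):
    """I(X;Y) = sum p(x,y) * log( p(x,y) / (p(x) p(y)) )."""
    n = len(xs)
    px, py = Counter(xs), Counter(ys)
    pxy = Counter(zip(xs, ys))
    return sum((c / n) * log(c * n / (px[x] * py[y]))
               for (x, y), c in pxy.items())
```

Candidate inputs can then be ranked by their MI with the target, with the partial-MI variant additionally discounting information already carried by selected inputs.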

  20. The Importance of Replication in Measurement Research: Using Curriculum-Based Measures with Postsecondary Students with Developmental Disabilities

    Science.gov (United States)

    Hosp, John L.; Ford, Jeremy W.; Huddle, Sally M.; Hensley, Kiersten K.

    2018-01-01

    Replication is a foundation of the development of a knowledge base in an evidence-based field such as education. This study includes two direct replications of Hosp, Hensley, Huddle, and Ford, which found evidence of criterion-related validity of curriculum-based measurement (CBM) for reading and mathematics with postsecondary students with…

  1. Measurement of the 36Cl deposition flux in central Japan: natural background levels and seasonal variability

    International Nuclear Information System (INIS)

    Tosaki, Yuki; Tase, Norio; Sasa, Kimikazu; Takahashi, Tsutomu; Nagashima, Yasuo

    2012-01-01

    Essential parameters for the applications of 36Cl as a tracer in groundwater studies include the initial 36Cl/Cl ratio at the time of recharge and/or the natural background deposition flux of 36Cl in the recharge area. To facilitate the hydrological use of 36Cl in central Japan, this study aimed to obtain a precise estimate of the long-term average local 36Cl flux and to characterize its seasonal variability. The 36Cl in precipitation was continuously monitored in Tsukuba, central Japan over a period of >5 years. The 36Cl flux showed a clear seasonal variation with an annual peak during the spring, which was attributed to the seasonal variability of tropopause height. The long-term average 36Cl flux (32 ± 2 atoms m−2 s−1), estimated from the measured data, was consistent with the prediction from the 36Cl latitudinal fallout model scaled using the global mean production rate of 20 atoms m−2 s−1. The initial 36Cl/Cl ratio was estimated to be (41 ± 6) × 10−15, which is similar to that of pre-bomb groundwater in the Tsukuba Upland. An observation period covering an 11-year solar cycle would yield more accurate estimates of the values, given the increased 36Cl flux during the solar minimum. - Highlights: ► We monitored 36Cl in precipitation in central Japan over a period of >5 years. ► The 36Cl flux varied seasonally, with a peak in spring. ► The long-term average 36Cl flux and the initial 36Cl/Cl ratio were 32 ± 2 atoms m−2 s−1 and (41 ± 6) × 10−15, respectively. ► An observation period covering an 11-year solar cycle would yield more accurate estimates of the values, given the increased 36Cl flux during the solar minimum.

  2. Variability of non-Gaussian diffusion MRI and intravoxel incoherent motion (IVIM) measurements in the breast.

    Science.gov (United States)

    Iima, Mami; Kataoka, Masako; Kanao, Shotaro; Kawai, Makiko; Onishi, Natsuko; Koyasu, Sho; Murata, Katsutoshi; Ohashi, Akane; Sakaguchi, Rena; Togashi, Kaori

    2018-01-01

    We prospectively examined the variability of non-Gaussian diffusion magnetic resonance imaging (MRI) and intravoxel incoherent motion (IVIM) measurements with different numbers of b-values and excitations in normal breast tissue and breast lesions. Thirteen volunteers and fourteen patients with breast lesions (seven malignant, eight benign; one patient had bilateral lesions) were recruited in this prospective study (approved by the Institutional Review Board). Diffusion-weighted MRI was performed with 16 b-values (0-2500 s/mm2 with one number of excitations [NEX]) and five b-values (0-2500 s/mm2, 3 NEX), using a 3T breast MRI. Intravoxel incoherent motion (flowing blood volume fraction [fIVIM] and pseudodiffusion coefficient [D*]) and non-Gaussian diffusion (theoretical apparent diffusion coefficient [ADC] at a b value of 0 s/mm2 [ADC0] and kurtosis [K]) parameters were estimated from the IVIM and kurtosis models using 16 b-values, and synthetic apparent diffusion coefficient (sADC) values were obtained from two key b-values. The variabilities between and within subjects and between different diffusion acquisition methods were estimated. There were no statistical differences in ADC0, K, or sADC values between the different b-values or NEX. A good agreement of diffusion parameters was observed between 16 b-values (one NEX), five b-values (one NEX), and five b-values (three NEX) in normal breast tissue or breast lesions. Insufficient agreement was observed for IVIM parameters. There were no statistical differences in the non-Gaussian diffusion MRI estimated values obtained from a different number of b-values or excitations in normal breast tissue or breast lesions. These data suggest that a limited MRI protocol using a few b-values might be relevant in a clinical setting for the estimation of non-Gaussian diffusion MRI parameters in normal breast tissue and breast lesions.
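The two signal models whose parameters the abstract names are commonly written as a bi-exponential (IVIM) and a kurtosis expansion; the forms below are the widely used textbook versions, not necessarily the exact fitting equations of this study:

```python
# Commonly used IVIM and diffusional-kurtosis signal models; parameter
# names mirror the abstract (f, D*, ADC0, K), values are illustrative.
from math import exp

def ivim_signal(b, s0, f_ivim, d_star, d):
    """Bi-exponential IVIM: perfusion fraction f with pseudo-diffusion D*."""
    return s0 * (f_ivim * exp(-b * d_star) + (1 - f_ivim) * exp(-b * d))

def kurtosis_signal(b, s0, adc0, k):
    """Non-Gaussian model: S = S0 exp(-b*ADC0 + (K/6) b^2 ADC0^2)."""
    return s0 * exp(-b * adc0 + (k / 6.0) * (b ** 2) * (adc0 ** 2))
```

Fitting either model to the measured signal across the acquired b-values yields the parameters (fIVIM, D*, ADC0, K) discussed above; with K = 0 the kurtosis model reduces to a mono-exponential decay.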

  3. Non-Proliferation, the IAEA Safeguards System, and the importance of nuclear material measurements

    Energy Technology Data Exchange (ETDEWEB)

    Stevens, Rebecca S. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-18

    The objective of this project is to explain the contribution of nuclear material measurements to the system of international verification of State declarations and the non-proliferation of nuclear weapons.

  4. Impulsive buying tendency: Measuring important relationships with a new perspective and an indigenous scale

    OpenAIRE

    Anant Jyoti Badgaiyan; Anshul Verma; Saumya Dixit

    2016-01-01

    With the opening up of the economy and the proliferation of mall culture, the economic relevance of impulsive buying behaviour has assumed significance. Impulsive buying behaviour is better understood by examining the impulsive buying tendency that shapes such behaviour, and since consumer behaviour differs across cultures, by incorporating an indigenous perspective in understanding and measuring the tendency. Studies were conducted to develop an Indian scale for measuring impulsive buying te...

  5. Measuring the relative importance of strategic thinking dimensions in relation to counterproductive behavior

    Directory of Open Access Journals (Sweden)

    Afsaneh Zamani Moghaddam

    2013-11-01

    Full Text Available The purpose of this paper is to explore the relative importance of strategic thinking dimensions in the prediction of counterproductive behavior. The research method is descriptive survey research. After collecting questionnaires from 73 top managers and 110 staff members, the correlations between strategic thinking dimensions and counterproductive behavior were calculated. The relative importance method was used to calculate the relative weight of each dimension of strategic thinking in the prediction of counterproductive behaviors. The results show that the strategic thinking of top managers is associated with their counterproductive behavior (correlation coefficient = -0.38). Furthermore, the results of the relative importance method indicate that the relative importance of each dimension of strategic thinking in the prediction of counterproductive behavior is not the same. System perspective has the highest importance (31.1%) and hypothesis-driven thinking the lowest weight (11.7%). Intent focus, thinking in time and intelligent opportunism predict 14.1%, 13.3%, and 29.8% of counterproductive changes, respectively.

  6. LACK OF AGREEMENT BETWEEN GAS EXCHANGE VARIABLES MEASURED BY TWO METABOLIC SYSTEMS

    Directory of Open Access Journals (Sweden)

    Djordje G. Jakovljevic

    2008-03-01

    Full Text Available The purpose of this study was to assess the agreement and consistency between gas exchange variables measured by two online metabolic systems during an incremental exercise test. After obtaining local ethics approval and informed consent, 15 healthy subjects performed an incremental exercise test to volitional fatigue using the Bruce protocol. The Innocor (Innovision, Denmark) and CardiO2 (Medical Graphics, USA) systems were placed in series, with the Innocor mouthpiece attached to the pneumotach of the CardiO2. Metabolic data were analysed during the last 30 seconds of each stage and at peak exercise. There were non-significant differences (p > 0.05) between the two systems in the estimation of oxygen consumption (VO2) and minute ventilation (VE). Mean Cronbach's alpha values for VO2 and VE were 0.88 and 0.92. The Bland-Altman analysis revealed that the limits of agreement were -0.52 to 0.55 l.min-1 for VO2, and -8.74 to 10.66 l.min-1 for VE. Carbon dioxide production (VCO2), and consequently the respiratory exchange ratio (RER), measured by the Innocor were significantly lower (p < 0.05) through all stages. The CardiO2 measured the fraction of expired carbon dioxide (FeCO2) significantly higher (p < 0.05). The limits of agreement for VO2 and VE are wide and unacceptable in cardiopulmonary exercise testing. The Innocor reported VCO2 systematically lower. Therefore, the Innocor and CardiO2 metabolic systems cannot be used interchangeably without affecting the diagnosis of an individual patient. Results from the present study support the previous suggestion that considerable care is needed when comparing metabolic data obtained from different automated metabolic systems.
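The Bland-Altman limits of agreement reported above follow directly from the paired differences between the two systems; a minimal sketch of the computation (hypothetical data):

```python
# Bland-Altman bias and 95% limits of agreement between paired
# measurements from two systems; input values are illustrative.
from statistics import mean, stdev

def bland_altman_limits(system_a, system_b):
    """Returns (lower limit, bias, upper limit) = bias -/+ 1.96 * SD(diffs)."""
    diffs = [a - b for a, b in zip(system_a, system_b)]
    bias, sd = mean(diffs), stdev(diffs)
    return bias - 1.96 * sd, bias, bias + 1.96 * sd
```

Wide limits relative to the clinically tolerable difference, as for VE here, indicate the two systems cannot be used interchangeably.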

  7. CameraHRV: robust measurement of heart rate variability using a camera

    Science.gov (United States)

    Pai, Amruta; Veeraraghavan, Ashok; Sabharwal, Ashutosh

    2018-02-01

    The inter-beat interval (the time period of the cardiac cycle) changes slightly with every heartbeat; this variation is measured as Heart Rate Variability (HRV). HRV is presumed to arise from interactions between the parasympathetic and sympathetic nervous systems. Therefore, it is sometimes used as an indicator of the stress level of an individual. HRV also reveals some clinical information about cardiac health. Currently, HRV is accurately measured using contact devices such as a pulse oximeter. However, recent research in the field of non-contact imaging photoplethysmography (iPPG) has made it possible to measure vital signs using just a video recording of any exposed skin (such as a person's face). The current signal processing methods for extracting HRV using peak detection perform well for contact-based systems but have poor performance for iPPG signals. The main reason for this poor performance is that current methods are sensitive to the large noise sources often present in iPPG data. Further, current methods are not robust to the motion artifacts that are common in iPPG systems. We developed a new algorithm, CameraHRV, for robustly extracting HRV even at the low SNR common in iPPG recordings. CameraHRV combines spatial combining and frequency demodulation to obtain HRV from the instantaneous frequency of the iPPG signal. CameraHRV outperforms other current methods of HRV estimation. Ground truth data were obtained from an FDA-approved pulse oximeter for validation purposes. CameraHRV on iPPG data showed an error of 6 milliseconds for low-motion and varying skin tone scenarios; the improvement in error was 14%. In high-motion scenarios such as reading, watching and talking, the error was 10 milliseconds.
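Once a sequence of inter-beat intervals has been extracted (by whatever means), summary HRV metrics follow mechanically. A sketch of RMSSD, a standard time-domain HRV metric (not CameraHRV's own demodulation algorithm):

```python
# RMSSD: root mean square of successive inter-beat-interval differences,
# a standard short-term HRV metric; input values are illustrative.
def rmssd(ibi_ms):
    """RMSSD over an inter-beat-interval series given in milliseconds."""
    diffs = [b - a for a, b in zip(ibi_ms, ibi_ms[1:])]
    return (sum(d * d for d in diffs) / len(diffs)) ** 0.5
```

A millisecond-scale error in the recovered intervals, as reported for CameraHRV, is small relative to typical RMSSD values of tens of milliseconds.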

  8. Measuring what we manage – the importance of hydrological data to water resources management

    Directory of Open Access Journals (Sweden)

    B. Stewart

    2015-04-01

    Full Text Available Water resources cannot be managed, unless we know where they are, in what quantity and quality, and how variable they are likely to be in the foreseeable future. Data from hydrological networks are used by public and private sectors for a variety of different applications. This paper discusses the value proposition behind the collection, analysis and use of hydrological data in support of these applications. The need for hydrological data and the requirements for the data are outlined, and information is provided on topics such as status of networks and data access and sharing. This paper outlines elements of the contribution by the World Meteorological Organization (WMO to hydrological data collection and covers aspects related to quality management in the collection of hydrological data, especially regarding streamflow gauging, network design and capacity building for services delivery. It should be noted that the applications which make use of hydrological data may also be significantly impacted by climate change.

  9. Measuring what we manage - the importance of hydrological data to water resources management

    Science.gov (United States)

    Stewart, B.

    2015-04-01

    Water resources cannot be managed, unless we know where they are, in what quantity and quality, and how variable they are likely to be in the foreseeable future. Data from hydrological networks are used by public and private sectors for a variety of different applications. This paper discusses the value proposition behind the collection, analysis and use of hydrological data in support of these applications. The need for hydrological data and the requirements for the data are outlined, and information is provided on topics such as status of networks and data access and sharing. This paper outlines elements of the contribution by the World Meteorological Organization (WMO) to hydrological data collection and covers aspects related to quality management in the collection of hydrological data, especially regarding streamflow gauging, network design and capacity building for services delivery. It should be noted that the applications which make use of hydrological data may also be significantly impacted by climate change.

  10. A novel variable baseline visibility detection system and its measurement method

    Science.gov (United States)

    Li, Meng; Jiang, Li-hui; Xiong, Xing-long; Zhang, Guizhong; Yao, JianQuan

    2017-10-01

    As an important meteorological observation instrument, the visibility meter helps ensure the safety of traffic operation. However, due to optical system contamination as well as sampling error, the accuracy and stability of such equipment struggle to meet requirements in low-visibility environments. To address this issue, a novel measurement instrument was designed based upon multiple baselines; it essentially acts as an atmospheric transmission meter with a movable optical receiver, applying a weighted least squares method to process the signal. Theoretical analysis and experiments in a real atmospheric environment support this technique.

  11. [Comparison of self-reported anthropometric variables and real measurement data].

    Science.gov (United States)

    Díaz-García, J; González-Zapata, L I; Estrada-Restrepo, A

    2012-06-01

    The objectives of this study were to evaluate self-reporting of weight, height, and waist circumference, and to compare that perception with the real measurements in college students of the MESPYN cohort--Medellin, Salud Pública y Nutrición--from the University of Antioquia (UdeA), Colombia. A cross-sectional study was conducted starting with the first measurement of the MESPYN Cohort 2009-2010. The sample included volunteer students from different academic areas. Self-perception of weight, height, and waist circumference was recorded before the real measurements were performed. Intraclass correlation coefficients (ICC) were calculated for all the variables, and an alpha of 0.05 was used. The concordance between real measurements and self-reported values was evaluated with the Bland and Altman method. A total of 424 volunteer students were included. The average real weight (kg) in males was 67.4 +/- 10.4 and self-reported: 67.0 +/- 11.0; in females the real value was 55.7 +/- 10.1 and self-reported: 55.0 +/- 9.0. The average real height (m) in males was 1.73 +/- 6.1 and self-reported: 1.73 +/- 6.0; in females the real value was 1.60 +/- 5.9 and self-reported: 1.61 +/- 6.0. In males, the average real waist circumference (cm) was 76.6 +/- 8.0 and self-reported: 75.0 +/- 14.0; in females the real value was 69.9 +/- 8.0 and self-reported: 70.0 +/- 9.0. Weight ICC: 0.956, 95% CI (0.95; 0.97), (p < 0.01); height ICC: 0.953, 95% CI (0.91; 0.97), (p < 0.01); and waist circumference ICC: 0.593, 95% CI (0.55; 0.65), (p < 0.01). In conclusion, anthropometric nutritional evaluation of UdeA students can be performed with self-reported data for weight and height, but the evaluation of abdominal obesity requires direct measurement of waist circumference.
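    The agreement statistics used in this record (ICC, plus the Bland and Altman method) are straightforward to compute. A minimal sketch of the Bland and Altman limits of agreement, using made-up numbers rather than the study's data:

```python
import numpy as np

def bland_altman_limits(real, reported):
    """Mean bias and 95% limits of agreement between two measurement methods."""
    real = np.asarray(real, dtype=float)
    reported = np.asarray(reported, dtype=float)
    diffs = reported - real               # per-subject differences
    bias = diffs.mean()                   # systematic over- or under-reporting
    sd = diffs.std(ddof=1)                # sample SD of the differences
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical (real, self-reported) weights in kg
real_weight = [67.4, 55.7, 70.1, 62.3, 58.9]
self_report = [67.0, 55.0, 69.5, 62.0, 59.5]
bias, lo, hi = bland_altman_limits(real_weight, self_report)
```

    If most per-subject differences fall within (lo, hi) and the bias is small relative to clinical tolerances, the two measurement methods can be considered interchangeable.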

  12. Validity of (Ultra-)Short Recordings for Heart Rate Variability Measurements.

    Directory of Open Access Journals (Sweden)

    M Loretto Munoz

    Full Text Available In order to investigate the applicability of routine 10s electrocardiogram (ECG) recordings for time-domain heart rate variability (HRV) calculation, we explored to what extent these (ultra-)short recordings capture the "actual" HRV. The standard deviation of normal-to-normal intervals (SDNN) and the root mean square of successive differences (RMSSD) were measured in 3,387 adults. SDNN and RMSSD were assessed from (ultra-)short recordings of 10s (3x), 30s, and 120s and compared to 240s-300s (gold standard) measurements. Pearson's correlation coefficients (r), Bland-Altman 95% limits of agreement, and Cohen's d statistics were used as agreement analysis techniques. Agreement between the separate 10s recordings and the 240s-300s recording was already substantial (r = 0.758-0.764/Bias = 0.398-0.416/d = 0.855-0.894 for SDNN; r = 0.853-0.862/Bias = 0.079-0.096/d = 0.150-0.171 for RMSSD), and improved further when three 10s periods were averaged (r = 0.863/Bias = 0.406/d = 0.874 for SDNN; r = 0.941/Bias = 0.088/d = 0.167 for RMSSD). Agreement increased with recording length and reached near-perfect agreement at 120s (r = 0.956/Bias = 0.064/d = 0.137 for SDNN; r = 0.986/Bias = 0.014/d = 0.027 for RMSSD). For all recording lengths and agreement measures, RMSSD outperformed SDNN. Our results confirm that it is unnecessary to use recordings longer than 120s to obtain accurate measures of RMSSD and SDNN in the time domain. Even a single 10s (standard) ECG recording yields a valid RMSSD measurement, although an average over multiple 10s ECGs is preferable. For SDNN we would recommend either 30s or multiple 10s ECGs. Future research projects using time-domain HRV parameters, e.g. genetic epidemiological studies, could calculate HRV from (ultra-)short ECGs, enabling such projects to be performed at a large scale.
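    The two time-domain indices compared in this record have simple closed-form definitions over the normal-to-normal (NN) interval series; a short sketch with illustrative NN values (not data from the study):

```python
import numpy as np

def sdnn(nn_ms):
    """Standard deviation of normal-to-normal (NN) intervals, in ms."""
    return float(np.std(nn_ms, ddof=1))

def rmssd(nn_ms):
    """Root mean square of successive NN-interval differences, in ms."""
    diffs = np.diff(nn_ms)
    return float(np.sqrt(np.mean(diffs ** 2)))

# NN intervals (ms) such as might come from a single 10s ECG strip
nn = np.array([812.0, 830.0, 790.0, 805.0, 821.0, 798.0, 815.0])
short_sdnn, short_rmssd = sdnn(nn), rmssd(nn)
```

    One common explanation for the record's finding that RMSSD tolerates 10s recordings better than SDNN is that RMSSD is built from successive differences, which reflect beat-to-beat variability and therefore stabilize on fewer beats.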

  13. Evaluating measurement models in clinical research: covariance structure analysis of latent variable models of self-conception.

    Science.gov (United States)

    Hoyle, R H

    1991-02-01

    Indirect measures of psychological constructs are vital to clinical research. On occasion, however, the meaning of indirect measures of psychological constructs is obfuscated by statistical procedures that do not account for the complex relations between items and latent variables and among latent variables. Covariance structure analysis (CSA) is a statistical procedure for testing hypotheses about the relations among items that indirectly measure a psychological construct and relations among psychological constructs. This article introduces clinical researchers to the strengths and limitations of CSA as a statistical procedure for conceiving and testing structural hypotheses that are not tested adequately with other statistical procedures. The article is organized around two empirical examples that illustrate the use of CSA for evaluating measurement models with correlated error terms, higher-order factors, and measured and latent variables.

  14. The metabolic syndrome in long-term cancer survivors, an important target for secondary preventive measures

    NARCIS (Netherlands)

    Nuver, J; Smit, AJ; Postma, A; Sleijfer, DT; Gietema, JA

    With increasing numbers of cancer survivors, attention has been drawn to long-term complications of curative cancer treatment, including a range of metabolic disorders. These metabolic disorders often resemble the components of the so-called metabolic syndrome, or syndrome X, which is an important

  15. Health status measurement in COPD : the minimal clinically important difference of the clinical COPD questionnaire

    NARCIS (Netherlands)

    Kocks, J. W. H.; Tuinenga, M. G.; Uil, S. M.; van den Berg, J. W. K.; Stahl, E.; van der Molen, T.

    2006-01-01

    Background: Patient-reported outcomes ( PRO) questionnaires are being increasingly used in COPD clinical studies. The challenge facing investigators is to determine what change is significant, ie what is the minimal clinically important difference (MCID). This study aimed to identify the MCID for

  16. Measuring Anxiety in Children: The Importance of Separate Mother and Father Reports

    Science.gov (United States)

    Jansen, Mélou; Bodden, Denise H. M.; Muris, Peter; van Doorn, Marleen; Granic, Isabela

    2017-01-01

    Background: Previous research suggests that it is important to use parental reports when assessing children's anxiety, but it remains unclear to what extent there are differences between mothers' and fathers' scores and whether these potential differences have any repercussions for the psychometric properties of the scale being used. Objective:…

  17. Measuring School Facility Conditions: An Illustration of the Importance of Purpose

    Science.gov (United States)

    Roberts, Lance W.

    2009-01-01

    Purpose: The purpose of this paper is to argue that taking the educational purposes of schools into account is central to understanding the place and importance of facilities to learning outcomes. The paper begins by observing that the research literature connecting facility conditions to student outcomes is mixed. A closer examination of this…

  18. The Importance of Sensory-Motor Control in Providing Core Stability Implications for Measurement and Training

    NARCIS (Netherlands)

    Borghuis, Jan; Hof, At L.; Lemmink, Koen A. P. M.

    2008-01-01

    Although the hip musculature is found to be very important in connecting the core to the lower extremities and in transferring forces from and to the core, it is proposed to leave the hip musculature out of consideration when talking about the concept of core stability. A low level of co-contraction

  19. Importance of Raman Lidar Aerosol Extinction Measurements for Aerosol-Cloud Interaction Studies

    Directory of Open Access Journals (Sweden)

    Han Zaw

    2016-01-01

    Full Text Available Using a UV Raman Lidar for aerosol extinction, and combining Microwave Radiometer derived Liquid Water Path (LWP) with Multifilter Rotating Shadowband Radiometer derived Cloud Optical Depth to get cloud effective radius (Reff), we observe, under certain specialized conditions, clear signatures of the Twomey Aerosol Indirect effect on cloud droplet properties which are consistent with the theoretical bounds. We also show that the measurement is very sensitive to how far the aerosol layer is from the cloud base and demonstrate that surface PM2.5 is far less useful. Measurements from both the DOE ARM site and new results at CCNY are presented.

  20. Strategic B2B customer experience management: the importance of outcomes-based measures

    OpenAIRE

    Zolkiewski, Judy; Story, Victoria; Burton, Jamie; Chan, Paul; Gomes, Andre; Hunter-Jones, Philippa; O’Malley, Lisa; Peters, Linda D.; Raddats, Chris; Robinson, William

    2017-01-01

    Purpose\\ud \\ud The purpose of this paper is to critique the adequacy of efforts to capture the complexities of customer experience in a business-to-business (B2B) context using input–output measures. The paper introduces a strategic customer experience management framework to capture the complexity of B2B service interactions and discusses the value of outcomes-based measurement.\\ud Design/methodology/approach\\ud \\ud This is a theoretical paper that reviews extant literature related to B2B cu...

  1. Performance evaluation of the short-time objective intelligibility measure with different band importance functions

    DEFF Research Database (Denmark)

    Heidemann Andersen, Asger; de Haan, Jan Mark; Tan, Zheng-Hua

    performance measures: root-mean-squared-error, Pearson correlation, and Kendall rank correlation. The results show substantially improved performance when fitting and evaluating on the same dataset. However, this advantage does not necessarily subsist when fitting and evaluating on different datasets. When...... with a filter bank, 2) envelopes are extracted from each band, 3) the temporal correlation between clean and degraded envelopes is computed in short time segments, and 4) the correlation is averaged across time and frequency bands to obtain the final output. An unusual choice in the design of the STOI measure...
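    Steps 3 and 4 of the STOI pipeline described in this record (short-time correlation of clean and degraded envelopes, then averaging across segments and bands) can be sketched as follows; this is a simplified illustration, not the actual STOI implementation or its band importance weighting:

```python
import numpy as np

def short_time_corr(clean_env, degraded_env, seg_len=30):
    """Correlate clean and degraded envelopes in short segments (step 3),
    then average the correlations across segments and bands (step 4)."""
    n_bands, n_frames = clean_env.shape
    scores = []
    for start in range(0, n_frames - seg_len + 1, seg_len):
        c = clean_env[:, start:start + seg_len]
        d = degraded_env[:, start:start + seg_len]
        c = c - c.mean(axis=1, keepdims=True)   # remove segment mean per band
        d = d - d.mean(axis=1, keepdims=True)
        num = (c * d).sum(axis=1)
        den = np.linalg.norm(c, axis=1) * np.linalg.norm(d, axis=1) + 1e-12
        scores.append(num / den)                # per-band correlation this segment
    return float(np.mean(scores))

rng = np.random.default_rng(2)
env = np.abs(rng.normal(size=(15, 120)))             # 15 bands, 120 envelope frames
noisy = env + 0.3 * np.abs(rng.normal(size=env.shape))
score = short_time_corr(env, noisy)                  # near 1 for mild degradation
```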

  2. Research on the Impacts of Expensive Food and Luxury Goods Import Tariff Adjustment on Chinese Economy and Related Measures

    OpenAIRE

    Qishen Zhou; Mingxing Yang

    2013-01-01

    This study aims to investigate the impacts of expensive food and luxury goods import tariff adjustment on the Chinese economy and related measures. Nowadays, Asia, especially China, has become the world's biggest expensive food and luxury goods market. However, due to the relatively high luxury import tariff in China, most consumers have chosen to purchase expensive food and luxury goods abroad, which leads to a large amount of domestic consumption cash outflow. Therefore, whether to cut the luxury import tari...

  3. Identification of key aromatic compounds in Congou black tea by PLSR with variable importance of projection scores and gas chromatography-mass spectrometry/gas chromatography-olfactometry.

    Science.gov (United States)

    Mao, Shihong; Lu, Changqi; Li, Meifeng; Ye, Yulong; Wei, Xu; Tong, Huarong

    2018-04-13

    Gas chromatography-olfactometry (GC-O) is the most frequently used method to estimate the sensory contribution of a single odorant, but it disregards the interactions between volatiles. In order to select the key volatiles responsible for the aroma attributes of Congou black tea (Camellia sinensis), instrumental, sensory and multivariate statistical approaches were applied. By sensory analysis, nine panelists developed 8 descriptors, namely, floral, sweet, fruity, green, roasted, oil, spicy, and off-odor. Linalool, (E)-furan linalool oxide, (Z)-pyran linalool oxide, methyl salicylate, β-myrcene, and phenylethyl alcohol, identified from the most representative samples by the GC-O procedure, were the essential aroma-active compounds in the formation of basic Congou black tea aroma. In addition, 136 volatiles were identified by gas chromatography-mass spectrometry (GC-MS), among which 55 compounds were determined as the key factors for the six sensory attributes by partial least-square regression (PLSR) with variable importance of projection (VIP) scores. Our results demonstrated that HS-SPME/GC-MS/GC-O is a fast approach for the isolation and quantification of aroma-active compounds. The PLSR method was also considered to be a useful tool in selecting important variables for sensory attributes. These two strategies allowed us to comprehensively evaluate the sensorial contribution of a single volatile from different perspectives, and can be applied to related products for comprehensive quality control. This article is protected by copyright. All rights reserved.

  4. Spatially-Resolved Influence of Temperature and Salinity on Stock and Recruitment Variability of Commercially Important Fishes in the North Sea.

    Directory of Open Access Journals (Sweden)

    Anna Akimova

    Full Text Available Understanding of the processes affecting recruitment of commercially important fish species is one of the major challenges in fisheries science. Towards this aim, we investigated the relation between North Sea hydrography (temperature and salinity) and fish stock variables (recruitment, spawning stock biomass and pre-recruitment survival index) for 9 commercially important fishes using spatially-resolved cross-correlation analysis. We used high-resolution (0.2° × 0.2°) hydrographic data fields matching the maximal temporal extent of the fish population assessments (1948-2013). Our approach allowed for the identification of regions in the North Sea where environmental variables seem to be more influential on the fish stocks, as well as the regions of lesser or nil influence. Our results confirmed previously demonstrated negative correlations between temperature and recruitment of cod and plaice and identified the regions of strongest correlation (German Bight for plaice and north-western North Sea for cod). We also revealed a positive correlation between herring spawning stock biomass and temperature in the Orkney-Shetland area, as well as a negative correlation between sole pre-recruitment survival index and temperature in the German Bight. A strong positive correlation between sprat stock variables and salinity in the central North Sea was also found. To our knowledge, the results concerning correlations between North Sea hydrography and the stock dynamics of herring, sole and sprat are novel. The new information about the spatial distribution of the correlations provides additional help in identifying the mechanisms underlying them. As an illustration of the utility of these results for fishery management, an example is provided that incorporates the identified environmental covariates in stock-recruitment models.
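    Per grid cell, the analysis in this record reduces to correlating a stock time series with the local hydrographic time series. A toy numpy sketch on synthetic data (field shapes and the planted relationship are invented for illustration):

```python
import numpy as np

def gridded_correlation(field, series):
    """Pearson correlation between one time series and every grid cell of a
    (time, lat, lon) field, e.g. recruitment vs. temperature per 0.2 deg cell."""
    f = field - field.mean(axis=0)                 # anomalies per cell
    s = series - series.mean()
    cov = (f * s[:, None, None]).mean(axis=0)
    return cov / (f.std(axis=0) * s.std())

rng = np.random.default_rng(1)
t, ny, nx = 60, 4, 5
temp = rng.normal(size=(t, ny, nx))                # synthetic temperature field
recruit = -0.8 * temp[:, 2, 3] + rng.normal(scale=0.5, size=t)  # negative link at one cell
r_map = gridded_correlation(temp, recruit)         # strongest (negative) at cell (2, 3)
```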

  5. Why quality of life measurement is important in dermatology clinical practice

    DEFF Research Database (Denmark)

    Finlay, A Y; Salek, M S; Abeni, D

    2017-01-01

    The aim of this study was to describe the many ways in which quality of life (QoL) measurement may potentially be advantageous in routine clinical dermatology practice. Thirteen members of the EADV Task Force on Quality of Life, eight dermatologists, three health psychologists, one epidemiologist...

  6. First Equals Most Important? Order Effects in Vignette-Based Measurement

    Science.gov (United States)

    Auspurg, Katrin; Jäckle, Annette

    2017-01-01

    To measure what determines people's attitudes, definitions, or decisions, surveys increasingly ask respondents to judge vignettes. A vignette typically describes a hypothetical situation or object as having various attributes (dimensions). In factorial surveys, the values (levels) of dimensions are experimentally varied, so that their impact on…

  7. Importance of accurate measurements in nutrition research: dietary flavonoids as a case study

    Science.gov (United States)

    Accurate measurements of the secondary metabolites in natural products and plant foods are critical to establishing diet/health relationships. There are as many as 50,000 secondary metabolites which may influence human health. Their structural and chemical diversity present a challenge to analytic...

  8. On the Importance of Reliable Covariate Measurement in Selection Bias Adjustments Using Propensity Scores

    Science.gov (United States)

    Steiner, Peter M.; Cook, Thomas D.; Shadish, William R.

    2011-01-01

    The effect of unreliability of measurement on propensity score (PS) adjusted treatment effects has not been previously studied. The authors report on a study simulating different degrees of unreliability in the multiple covariates that were used to estimate the PS. The simulation uses the same data as two prior studies. Shadish, Clark, and Steiner…

  9. 77 FR 53959 - WTO Dispute Settlement Proceeding Regarding Argentina-Measures Affecting the Importation of Goods

    Science.gov (United States)

    2012-09-04

    ... Agreement Establishing the World Trade Organization (``WTO Agreement'') concerning certain measures imposed... Appellate Body, will also be available on the Web site of the World Trade Organization at www.wto.org... OFFICE OF THE UNITED STATES TRADE REPRESENTATIVE [Dispute No. WT/DS444] WTO Dispute Settlement...

  10. 77 FR 18296 - WTO Dispute Settlement Proceeding Regarding India-Measures Concerning the Importation of Certain...

    Science.gov (United States)

    2012-03-27

    ... Establishing the World Trade Organization (``WTO Agreement'') concerning antidumping measures prohibitions... available on the Web site of the World Trade Organization, www.wto.org . Comments open to public inspection... OFFICE OF THE UNITED STATES TRADE REPRESENTATIVE [Dispute No. WTO/DS430] WTO Dispute Settlement...

  11. Laboratory calibrations of airborne gamma-ray spectrometers. Measurements and discussions of important parameters

    International Nuclear Information System (INIS)

    Korsbech, U.

    1994-02-01

    This report is the fourth in a series of reports from The Department of Electrophysics covering the measurement and interpretation of airborne gamma-spectrometry measurements. It describes different topics concerning the construction of a suitable calibration setup in the laboratory. The goal is to build a simple and cheap laboratory setup that can produce most of the gamma-ray data needed for an interpretation of spectra measured 50 to 120 m above ground level. A simple calibration setup has been built and tested. It can produce gamma-ray spectra similar to those measured in the air - from surface contamination with artificial nuclides and from 'bulk' natural radioactivity. It is possible to investigate the influence of the air above an aircraft carrying the detector (skyshine: scattering of gamma photons in the air above the detector). In order to reduce the influence of non-detected pile-up, the count rates are kept low without reaching levels where the background spectra (to be subtracted) would cause unacceptable counting statistical fluctuations. Sources selected for the calibrations are heavy mineral sand (with thorium and uranium) and potassium nitrate (with 40 K). These sources are 'bulk sources' of natural radioactivity. Cesium-137 has been selected as the basic artificial surface contamination nuclide. The report also discusses methods for comparing two spectra assumed a priori to be equal. Finally, the properties of some materials that could be used as 'air substitutes' in the calibration setup have been tested with respect to stability against moisture sorption. (au)

  12. The importance, measurement and practical implications of worker's expectations for return to work.

    Science.gov (United States)

    Young, Amanda E; Besen, Elyssa; Choi, YoonSun

    2015-01-01

    Workers' own expectations for return to work (RTW) consistently predict work status. To advance the understanding of the relationship between RTW expectations and outcomes, we reviewed existing measures to determine those which we felt were the most likely to capture the construct. A comprehensive search of the work-disability rehabilitation literature was undertaken. The review of the measures was conducted in three steps: first, a review of terminology; second, an examination of whether a time reference was included; third, an evaluation of ease of comprehension and applicability across contexts. A total of 42 different measures were identified. One of the most striking findings was the inconsistency in terminology. Measures were also limited by not including a time reference. Problems were also identified with regard to ease of understanding, utility of response options, and applicability in a wide variety of research and applied settings. Most previously used measures contain elements that potentially limit their utility. However, it would seem that further development can overcome these, resulting in a tool that provides risk-prediction information and an opportunity to start a conversation to help identify problems that might negatively impact a worker's movement through the RTW process and the outcomes achieved. Implications for Rehabilitation: Return to work is an integral part of workplace injury management. The capture of RTW expectations affords a way to identify the potential for less-than-optimal RTW processes and outcomes. A mismatch between an injured worker's expectations and what other stakeholders might expect suggests that efforts could be made to determine what is causing the injured worker's concerns. Once the underlying issues are identified, work can be put into resolving these so that the worker's return to the workplace is not impeded.

  13. The importance of severity of arthrosis for the reliability of bone mineral density measurement in women.

    Science.gov (United States)

    Hayirlioglu, Alper; Gokaslan, Husnu; Cimsit, Canan; Baysal, Begumhan

    2009-02-01

    The objective of this study is to investigate the effect of the severity of degenerative changes on A-P lumbar spine BMD values and to determine the reliability of DEXA measurements of the A-P lumbar spine in relation to the severity of the disease. Measurements using DEXA were taken from the L2-L4 spine and femoral neck of a total of 271 female cases. One hundred and ten of them had mild arthrosis (Group 0), and 69 had severe arthrosis (Group 1). Ninety-two cases without arthrosis were chosen as the control group (Group 2). The cases with arthrosic changes were grouped according to the degree of severity of arthrosis. The groups were compared two by two, and the Tukey multiple comparison test was used for the analysis of the difference of the group means. The mean age of cases was 61.79, 61.84, and 60.47 years, respectively. The average height was 157.26, 155.93, and 15.92 cm, while the average weight was 69.21, 70.78, and 71.45 kg, respectively. The mean body mass index (BMI) was 0.00283, 0.00291, and 0.00293, respectively. L2-L4 A-P spinal BMD values were 0.9870, 0.9848, and 1.0836 g/cm(2), while the femoral neck BMD values were 0.7964, 0.8056, and 0.8223 g/cm(2), respectively. There was no statistical significance between study and control groups in terms of age, weight, height, BMI, or BMD values obtained from the femoral neck. However, lumbar region BMD values of the cases with severe arthrosis were statistically significantly higher when compared with the other two groups. The femoral neck measurement is the prominent alternative method in severe arthrosis, while taking measurements from the lumbar region is still the most appropriate method in cases with mild arthrosis without giant osteophytes.

  14. Importing statistical measures into Artemis enhances gene identification in the Leishmania genome project

    Directory of Open Access Journals (Sweden)

    McDonagh Paul D

    2003-06-01

    Full Text Available Abstract Background Seattle Biomedical Research Institute (SBRI), as part of the Leishmania Genome Network (LGN), is sequencing chromosomes of the trypanosomatid protozoan species Leishmania major. At SBRI, chromosomal sequence is annotated using a combination of trained and untrained non-consensus gene-prediction algorithms with ARTEMIS, an annotation platform with rich and user-friendly interfaces. Results Here we describe a methodology used to import results from three different protein-coding gene-prediction algorithms (GLIMMER, TESTCODE and GENESCAN) into the ARTEMIS sequence viewer and annotation tool. Comparison of these methods, along with the CODONUSAGE algorithm built into ARTEMIS, shows the importance of combining methods to more accurately annotate the L. major genomic sequence. Conclusion An improvised and powerful tool for gene prediction has been developed by importing data from widely-used algorithms into an existing annotation platform. This approach is especially fruitful in the Leishmania genome project, where there is a large proportion of novel genes requiring manual annotation.

  15. Reexamining age, race, site, and thermometer type as variables affecting temperature measurement in adults – A comparison study

    Directory of Open Access Journals (Sweden)

    Smith Linda S

    2003-06-01

    Full Text Available Abstract Background As a result of the recent international vigilance regarding disease assessment, accurate measurement of body temperature has become increasingly important. Yet, trusted low-tech, portable mercury glass thermometers are no longer available. Thus, comparing accuracy of mercury-free thermometers with mercury devices is essential. Study purposes were 1) to examine age, race, and site as variables affecting temperature measurement in adults, and 2) to compare clinical accuracy of a low-tech Galinstan-in-glass device to mercury-in-glass at oral, axillary, groin, and rectal sites in adults. Methods Setting: 176-bed accredited healthcare facility, rural northwest US. Participants: Convenience sample (N = 120) of hospitalized persons ≥ 18 years old. Instruments: Temperatures (°F) measured at oral and skin sites (simultaneous), immediately followed by rectal sites, with four each mercury-glass (BD) and Galinstan-glass (Geratherm) thermometers; 10 minute dwell times. Results Participants averaged 61.6 years (SD 17.9), 188 pounds (SD 55.3); 61% female; race: 85% White, 8.3% Native Am., 4.2% Hispanic, 1.7% Asian, 0.8% Black. For both mercury and Galinstan-glass thermometers, within-subject temperature readings were highest rectally, followed by oral, then skin sites. Galinstan assessments demonstrated rectal sites 0.91°F > oral and ≅ 1.3°F > skin sites. Devices strongly correlated between and across sites. Site difference scores between devices showed greatest variability at skin sites; least at the rectal site. 95% confidence intervals of difference scores by site (°F): oral (0.142 – 0.265), axilla (0.167 – 0.339), groin (0.037 – 0.321), and rectal (-0.111 – 0.111). Race correlated with age and with temperature readings at each site and device. Conclusion Temperature readings varied by age and race. Mercury readings correlated with Galinstan thermometer readings at all sites. Site mean differences between devices were considered clinically insignificant. Still considered

  16. Improving Continuous-Variable Measurement-Device-Independent Multipartite Quantum Communication with Optical Amplifiers*

    Science.gov (United States)

    Guo, Ying; Zhao, Wei; Li, Fei; Huang, Duan; Liao, Qin; Xie, Cai-Lang

    2017-08-01

    The developing tendency of continuous-variable (CV) measurement-device-independent (MDI) quantum cryptography is to cope with the practical issue of implementing scalable quantum networks. Up to now, most theoretical and experimental research on CV-MDI QKD has focused on two-party protocols. Here, however, we suggest a CV-MDI multipartite quantum secret sharing (QSS) protocol using EPR states coupled with optical amplifiers. More remarkably, QSS is the real application of multipartite CV-MDI QKD; in other words, it is the concrete implementation method of multipartite CV-MDI QKD. It can implement a practical quantum network scheme, under which the legal participants create the secret correlations by using EPR states connecting to an untrusted relay via insecure links and applying the multi-entangled Greenberger-Horne-Zeilinger (GHZ) state analysis at the relay station. Even if there is a possibility that the relay may be completely tampered with, the legal participants are still able to extract a secret key from network communication. The numerical simulation indicates that the quantum network communication can be achieved in an asymmetric scenario, fulfilling the demands of a practical quantum network. Additionally, we illustrate that the use of optical amplifiers can compensate for some of the inherent imperfections of detectors and increase the transmission distance of the CV-MDI quantum system.

  17. Continuous-variable Measurement-device-independent Quantum Relay Network with Phase-sensitive Amplifiers

    Science.gov (United States)

    Li, Fei; Zhao, Wei; Guo, Ying

    2018-01-01

    Continuous-variable (CV) measurement-device-independent (MDI) quantum cryptography is now heading towards solving the practical problem of implementing scalable quantum networks. In this paper, we show that a solution can come from deploying an optical amplifier in the CV-MDI system, aiming to establish a high-rate quantum network. We suggest an improved CV-MDI protocol using EPR states coupled with optical amplifiers. It can implement a practical quantum network scheme, where the legal participants create the secret correlations by using EPR states connecting to an untrusted relay via insecure links and applying the multi-entangled Greenberger-Horne-Zeilinger (GHZ) state analysis at the relay station. Despite the possibility that the relay could be completely tampered with and the imperfect links subjected to powerful attacks, the legal participants are still able to extract a secret key from network communication. The numerical simulation indicates that the quantum network communication can be achieved in an asymmetric scenario, fulfilling the demands of a practical quantum network. Furthermore, we show that the use of optical amplifiers can compensate for the inherent imperfections and improve the secret key rate of the CV-MDI system.

  18. A broadband variable-temperature test system for complex permittivity measurements of solid and powder materials

    Science.gov (United States)

    Zhang, Yunpeng; Li, En; Zhang, Jing; Yu, Chengyong; Zheng, Hu; Guo, Gaofeng

    2018-02-01

    A microwave test system to measure the complex permittivity of solid and powder materials as a function of temperature has been developed. The system is based on a TM0n0 multi-mode cylindrical cavity with a slotted structure, which provides purer test modes compared to a traditional cavity. To ensure safety, effectiveness, and longevity, heating and testing are carried out separately, and the sample can move between the two functional areas through an Alundum tube. Induction heating and a pneumatic platform are employed to, respectively, shorten the heating and cooling time of the sample. The single-trigger function of the vector network analyzer is added to the test software to suppress the drift of the resonance peak during testing. Complex permittivity is calculated by the rigorous field-theoretical solution considering multilayer media loading. The variation of the cavity equivalent radius caused by the sample insertion holes is discussed in detail, and its influence on the test result is analyzed. The calibration method for the complex permittivity of the Alundum tube and quartz vial (for loading powder samples), which vary with temperature, is given. The feasibility of the system has been verified by measuring different samples over a wide range of relative permittivity and loss tangent, and variable-temperature test results of fused quartz and SiO2 powder up to 1500 °C are compared with published data. The results indicate that the presented system is reliable and accurate. The stability of the system is verified by repeated and long-term tests, and an error analysis is presented to estimate the error incurred due to uncertainties in different error sources.
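    This record computes permittivity from a rigorous field-theoretical solution for multilayer loading; the first-order cavity-perturbation formula below is a textbook approximation (not the authors' method) that conveys how a resonance shift and Q degradation map to complex permittivity. All numbers are hypothetical:

```python
def cavity_perturbation(f0, fs, q0, qs, vc, vs):
    """First-order cavity-perturbation estimate of complex relative permittivity
    from the resonance shift (f0 -> fs) and quality-factor drop (q0 -> qs)
    caused by inserting a small sample (volume vs) into a cavity (volume vc)."""
    eps_real = 1.0 + (f0 - fs) / fs * vc / (2.0 * vs)   # dielectric constant
    eps_imag = vc / (4.0 * vs) * (1.0 / qs - 1.0 / q0)  # loss term
    return eps_real, eps_imag

# Hypothetical readings: 10 MHz downshift at 5 GHz, Q drops from 8000 to 5000
er, ei = cavity_perturbation(5.000e9, 4.990e9, 8000, 5000, 1.0e-4, 1.0e-7)
tan_delta = ei / er   # loss tangent of the sample
```

    The approximation holds only when the sample perturbs the field weakly (vs much smaller than vc and a small frequency shift), which is why a rigorous multilayer solution is needed for the large tube-and-vial loading described in the record.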

  19. Variable impact of CSF flow suppression on quantitative 3.0T intracranial vessel wall measurements.

    Science.gov (United States)

    Cogswell, Petrice M; Siero, Jeroen C W; Lants, Sarah K; Waddle, Spencer; Davis, L Taylor; Gilbert, Guillaume; Hendrikse, Jeroen; Donahue, Manus J

    2018-03-31

    Flow suppression techniques have been developed for intracranial (IC) vessel wall imaging (VWI) and optimized using simulations; however, simulation results may not translate in vivo. The aim was to evaluate experimentally how IC vessel wall and lumen measurements change in identical subjects when evaluated using the most commonly available blood and cerebrospinal fluid (CSF) flow suppression modules and VWI sequences. The study was prospective. Healthy adults (n = 13; age = 37 ± 15 years) were enrolled. A 3.0T 3D T1/proton density (PD)-weighted turbo-spin-echo (TSE) acquisition with a post-readout anti-driven-equilibrium module, with and without Delay-Alternating-with-Nutation-for-Tailored-Excitation (DANTE), was applied. The DANTE flip angle (8-12°) and TSE refocusing angle (sweep = 40-120° or 50-120°) were varied. Basilar artery and internal carotid artery (ICA) wall thicknesses, CSF signal-to-noise ratio (SNR), contrast-to-noise ratio (CNR), and signal ratio (SR) were assessed. Measurements were made by two readers (a radiology resident and a board-certified neuroradiologist), and a Wilcoxon signed-rank test was applied with corrected two-sided P values. Addition of the DANTE preparation reduced CSF SNR from 17.4 to 6.7, thereby providing significant CSF suppression, and yielded a significant CSF CNR improvement; one comparison showed no significant difference (P = 0.87). There was a trend for a difference in blood SNR with vs. without DANTE (P = 0.05). The outer vessel wall diameter and wall thickness values were significantly lower with DANTE-prepared sequences, consistent with the differing CSF suppression and CNR of the approaches evaluated. However, improvements are heterogeneous, likely owing to intersubject vessel pulsatility and CSF flow variations, which can lead to variable flow suppression efficacy in these velocity-dependent modules. Level of Evidence: 2. Technical Efficacy: Stage 1. J. Magn. Reson. Imaging 2018. © 2018 International Society for Magnetic Resonance in Medicine.

  20. Infiltration and Runoff Measurements on Steep Burned Hillslopes Using a Rainfall Simulator with Variable Rain Intensities

    Science.gov (United States)

    Kinner, David A.; Moody, John A.

    2008-01-01

    Multiple rainfall intensities were used in rainfall-simulation experiments designed to investigate the infiltration and runoff from 1-square-meter plots on burned hillslopes covered by an ash layer of varying thickness. The 1-square-meter plots were on north- and south-facing hillslopes in an area burned by the Overland fire northwest of Boulder near Jamestown on the Front Range of Colorado. A single-nozzle, wide-angle, multi-intensity rain simulator was developed to investigate the infiltration and runoff on steep (30- to 40-percent gradient) burned hillslopes covered with ash. The simulated rainfall was evaluated for spatial variability, drop size, and kinetic energy. Fourteen rainfall simulations, at three intensities (about 20 millimeters per hour [mm/h], 35 mm/h, and 50 mm/h), were conducted on four plots. Measurements during and after the simulations included runoff, rainfall, suspended-sediment concentrations, surface ash layer thickness, soil moisture, soil grain size, soil lost on ignition, and plot topography. Runoff discharge reached a steady state within 7 to 26 minutes. Steady infiltration rates with the 50-mm/h application rainfall intensity approached 20–35 mm/h. If these rates are projected to rainfall application intensities used in many studies of burned area runoff production (about 80 mm/h), the steady discharge rates are on the lower end of measurements from other studies. Experiments using multiple rainfall intensities (three) suggest that runoff begins at rainfall intensities around 20 mm/h at the 1-square-meter scale, an observation consistent with a 10-mm/h rainfall intensity threshold needed for runoff initiation that has been reported in the literature.
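
    The steady-state rates quoted above follow from a simple water balance: once runoff is steady, the infiltration rate is the applied rainfall intensity minus the runoff rate. A minimal sketch with invented values:

```python
# Steady-state water balance on a small plot: infiltration rate equals
# applied rainfall intensity minus the steady runoff rate. Values are
# illustrative, not the paper's data; 1 L over 1 m^2 equals 1 mm of depth.

def steady_infiltration(rain_mm_h, runoff_l_per_h, plot_area_m2=1.0):
    """Infiltration rate in mm/h for a plot of the given area."""
    runoff_mm_h = runoff_l_per_h / plot_area_m2
    return rain_mm_h - runoff_mm_h

# A 50 mm/h application with a hypothetical 20 L/h steady runoff gives
# 30 mm/h infiltration, inside the 20-35 mm/h range reported above.
f = steady_infiltration(50.0, 20.0)
```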

  1. Estimating regional methane surface fluxes: the relative importance of surface and GOSAT mole fraction measurements

    Directory of Open Access Journals (Sweden)

    A. Fraser

    2013-06-01

    Full Text Available We use an ensemble Kalman filter (EnKF), together with the GEOS-Chem chemistry transport model, to estimate regional monthly methane (CH4) fluxes for the period June 2009–December 2010 using proxy dry-air column-averaged mole fractions of methane (XCH4) from GOSAT (Greenhouse gases Observing SATellite) and/or NOAA ESRL (Earth System Research Laboratory) and CSIRO GASLAB (Global Atmospheric Sampling Laboratory) CH4 surface mole fraction measurements. Global posterior estimates using GOSAT and/or surface measurements are between 510–516 Tg yr−1, which is less than, though within the uncertainty of, the prior global flux of 529 ± 25 Tg yr−1. We find larger differences between regional prior and posterior fluxes, with the largest changes in monthly emissions (75 Tg yr−1) occurring in Temperate Eurasia. In non-boreal regions the error reductions for inversions using the GOSAT data are at least three times larger (up to 45%) than if only surface data are assimilated, a reflection of the greater spatial coverage of GOSAT, with the two exceptions of latitudes >60°, associated with a data filter, and Europe, where the surface network adequately describes fluxes on our model spatial and temporal grid. We use CarbonTracker and GEOS-Chem XCO2 model output to investigate model error on quantifying proxy GOSAT XCH4 (involving model XCO2) and inferring methane flux estimates from surface mole fraction data, and show similar resulting fluxes, with differences reflecting initial differences in the proxy value. Using a series of observing system simulation experiments (OSSEs), we characterize the posterior flux error introduced by non-uniform atmospheric sampling by GOSAT. We show that clear-sky measurements can theoretically reproduce fluxes within 10% of true values, with the exception of tropical regions where, due to a large seasonal cycle in the number of measurements because of clouds and aerosols, fluxes are within 15% of true fluxes. We evaluate our
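
    The EnKF inversion described above can be sketched in miniature: a prior flux ensemble is updated against observations, and the shrinkage of the posterior spread gives the error reduction the record quotes. The sensitivity vector, observation values, and error magnitudes below are illustrative assumptions, not values from the study:

```python
import numpy as np

# Minimal stochastic EnKF analysis step for a scalar flux estimated from a
# few XCH4-like observations. All numbers are invented for illustration.

rng = np.random.default_rng(0)
n_ens, n_obs = 50, 3

flux = rng.normal(500.0, 25.0, size=n_ens)   # prior flux ensemble (Tg/yr)
H = np.array([0.002, 0.0021, 0.0019])        # assumed flux->obs sensitivity
y_obs = np.array([1.05, 1.10, 1.00])         # hypothetical observations (ppm)
obs_err = 0.02

sim = np.outer(flux, H)                      # simulated obs, (n_ens, n_obs)
y_pert = y_obs + rng.normal(0.0, obs_err, size=(n_ens, n_obs))

# Sample cross- and observation-space covariances
x_mean, s_mean = flux.mean(), sim.mean(axis=0)
P_xy = (flux - x_mean) @ (sim - s_mean) / (n_ens - 1)
P_yy = (sim - s_mean).T @ (sim - s_mean) / (n_ens - 1) + obs_err**2 * np.eye(n_obs)

K = np.linalg.solve(P_yy, P_xy)              # Kalman gain (symmetric P_yy)
flux_post = flux + (y_pert - sim) @ K        # perturbed-obs analysis step

# Error reduction of the kind reported in the record (posterior vs. prior spread)
error_reduction = 1.0 - flux_post.std() / flux.std()
```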

  2. Importance of Leadership Style towards Quality of Care Measures in Healthcare Settings: A Systematic Review.

    Science.gov (United States)

    Sfantou, Danae F; Laliotis, Aggelos; Patelarou, Athina E; Sifaki-Pistolla, Dimitra; Matalliotakis, Michail; Patelarou, Evridiki

    2017-10-14

    Effective leadership of healthcare professionals is critical for strengthening quality and integration of care. This study aimed to assess whether there exists an association between different leadership styles and healthcare quality measures. The search was performed in the Medline (National Library of Medicine, PubMed interface) and EMBASE databases for the time period 2004-2015. The research question that guided this review was posed as: "Is there any relationship between leadership style in healthcare settings and quality of care?" Eighteen articles were found relevant to our research question. Leadership styles were found to be strongly correlated with quality care and associated measures. Leadership was considered a core element for a well-coordinated and integrated provision of care, from the perspective of both patients and healthcare professionals.

  3. Missing transverse energy measurement in ATLAS detector: first LHC data results and importance for physics study

    CERN Document Server

    Pizio, Caterina

    2010-01-01

    The Large Hadron Collider (LHC) at CERN started its operation at the end of November 2009, first at a centre-of-mass energy of 900 GeV, then, since March 2010, at 7 TeV. During this period the ATLAS experiment has collected a large number of proton-proton collision events, resulting up to now in an integrated luminosity of about 45 pb-1. A very good measurement of the missing transverse energy, ETmiss, is essential for many physics studies in ATLAS, both for Standard Model channels, such as W and Z bosons decaying to tau leptons or top quark decays, and for discovery channels. Events with large ETmiss are expected to be the key signature for new physics such as supersymmetry and extra dimensions. A good ETmiss measurement in terms of linearity and resolution is crucial for the efficient and accurate reconstruction of the Higgs boson mass when the Higgs boson decays to a pair of tau leptons. This thesis describes the first measurement of ETmiss in ATLAS with real data. The performance of the algorithm for ETmiss reco...
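
    ETmiss itself is the magnitude of the negative vector sum of the transverse momenta of all reconstructed objects in the event. A toy calculation for a hypothetical event (a real ATLAS reconstruction also includes calorimeter soft terms and muon corrections):

```python
import math

# Missing transverse energy from a hypothetical list of reconstructed
# objects, each given as (pT in GeV, azimuthal angle phi in radians).

objects = [
    (45.0, 0.3),
    (38.0, 2.9),
    (12.0, -1.2),
    (7.0, 1.7),
]

# Vector sum of transverse momenta
px = sum(pt * math.cos(phi) for pt, phi in objects)
py = sum(pt * math.sin(phi) for pt, phi in objects)

# ETmiss is the magnitude of the negative of that sum; its direction
# points opposite to the visible momentum imbalance.
et_miss = math.hypot(px, py)
phi_miss = math.atan2(-py, -px)
```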

  4. Measurements of activation cross sections for some long-lived nuclides important in fusion reactor technology

    International Nuclear Information System (INIS)

    Blinov, M.V.; Filatenkov, A.A.; Chuvaev, S.V.

    1992-01-01

    The Ag-109(n,2n)Ag-108m, Eu-151(n,2n)Eu-150 and Eu-153(n,2n)Eu-152 cross sections have been measured in the neutron energy interval of 13.7-14.9 MeV. The measurements were performed at the neutron generator NG-400 of the Radium Institute using (D-T) neutrons. At the same facility, an upper limit has been obtained for the W-182(n,n'α)Hf-178m2 cross section. Neutron capture of Mo-98, which leads ultimately to the production of the long-lived Tc-99, has been studied at neutron energies of 0.7-2.0 MeV. For these purposes, the Van de Graaff accelerator (EG-5) was employed, which produced monochromatic neutrons in the (p-T) reaction. In both the EG-5 and NG-400 measurements, special efforts were made to minimize the neutron spectrum impurities which unavoidably arise in irradiation environments. (author). 15 refs, 6 figs, 1 tab

  5. On the importance of measurement error correlations in data assimilation for integrated hydrological models

    Science.gov (United States)

    Camporese, Matteo; Botto, Anna

    2017-04-01

    Data assimilation is becoming increasingly popular in hydrological and earth system modeling, as it allows us to integrate multisource observation data in modeling predictions and, in doing so, to reduce uncertainty. For this reason, data assimilation has recently been the focus of much attention also for physically-based integrated hydrological models, whereby multiple terrestrial compartments (e.g., snow cover, surface water, groundwater) are solved simultaneously, in an attempt to tackle environmental problems in a holistic approach. Recent examples include the joint assimilation of water table, soil moisture, and river discharge measurements in catchment models of coupled surface-subsurface flow using the ensemble Kalman filter (EnKF). One of the typical assumptions in these studies is that the measurement errors are uncorrelated, whereas in certain situations it is reasonable to believe that some degree of correlation occurs, due for example to the fact that a pair of sensors share the same soil type. The goal of this study is to show if and how the measurement error correlations between different observation data play a significant role in assimilation results in a real-world application of an integrated hydrological model. The model CATHY (CATchment HYdrology) is applied to reproduce the hydrological dynamics observed in an experimental hillslope. The physical model, located in the Department of Civil, Environmental and Architectural Engineering of the University of Padova (Italy), consists of a reinforced concrete box containing a soil prism with maximum height of 3.5 m, length of 6 m, and width of 2 m. The hillslope is equipped with sensors to monitor the pressure head and soil moisture responses to a series of generated rainfall events applied onto a 60 cm thick sand layer overlying a sandy clay soil. The measurement network is completed by two tipping bucket flow gages to measure the two components (subsurface and surface) of the outflow. By collecting
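
    Measurement error correlations enter a Kalman-type update through the observation error covariance R in the gain K = P H^T (H P H^T + R)^-1, so off-diagonal terms in R change the weight given to each sensor. A minimal sketch contrasting a diagonal R with a correlated one (the correlation value and all covariances are invented for illustration):

```python
import numpy as np

# Effect of correlated measurement errors on a Kalman-type gain, e.g. two
# soil-moisture sensors sharing the same soil type. Numbers are illustrative.

P = np.array([[1.0, 0.3],
              [0.3, 1.0]])          # state error covariance
H = np.eye(2)                       # both states observed directly

R_uncorr = 0.25 * np.eye(2)         # independent sensor errors
rho = 0.8                           # assumed error correlation between sensors
R_corr = 0.25 * np.array([[1.0, rho],
                          [rho, 1.0]])

def gain(P, H, R):
    """Kalman gain K = P H^T (H P H^T + R)^-1."""
    S = H @ P @ H.T + R
    return P @ H.T @ np.linalg.inv(S)

K_u = gain(P, H, R_uncorr)
K_c = gain(P, H, R_corr)
diff = np.abs(K_u - K_c).max()      # correlated errors change the update
```

Ignoring real off-diagonal terms in R therefore misweights the observations, which is exactly the effect the study investigates for the hillslope sensors.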

  6. On the importance of Task 1 and error performance measures in PRP dual-task studies

    Science.gov (United States)

    Strobach, Tilo; Schütz, Anja; Schubert, Torsten

    2015-01-01

    The psychological refractory period (PRP) paradigm is a dominant research tool in the literature on dual-task performance. In this paradigm a first and second component task (i.e., Task 1 and Task 2) are presented with variable stimulus onset asynchronies (SOAs) and priority to perform Task 1. The main indicator of dual-task impairment in PRP situations is an increasing Task 2-RT with decreasing SOAs. This impairment is typically explained with some task components being processed strictly sequentially in the context of the prominent central bottleneck theory. This assumption could implicitly suggest that processes of Task 1 are unaffected by Task 2 and bottleneck processing, i.e., decreasing SOAs do not increase reaction times (RTs) and error rates of the first task. The aim of the present review is to assess whether PRP dual-task studies included both RT and error data presentations and statistical analyses and whether studies including both data types (i.e., RTs and error rates) show data consistent with this assumption (i.e., decreasing SOAs and unaffected RTs and/or error rates in Task 1). This review demonstrates that, in contrast to RT presentations and analyses, error data is underrepresented in a substantial number of studies. Furthermore, a substantial number of studies with RT and error data showed a statistically significant impairment of Task 1 performance with decreasing SOA. Thus, these studies produced data that is not primarily consistent with the strong assumption that processes of Task 1 are unaffected by Task 2 and bottleneck processing in the context of PRP dual-task situations; this calls for a more careful report and analysis of Task 1 performance in PRP studies and for a more careful consideration of theories proposing additions to the bottleneck assumption, which are sufficiently general to explain Task 1 and Task 2 effects. PMID:25904890
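
    The central bottleneck account summarized above has a simple algebraic form: Task 2's central stage cannot start before Task 1's central stage ends, so RT2 grows as SOA shrinks while RT1 stays flat (the very assumption the review questions). A deterministic sketch with invented stage durations (in ms):

```python
# Deterministic sketch of the central bottleneck model behind the PRP effect.
# Each task has perceptual (P), central (C), and motor (M) stages; Task 2's
# central stage must wait for Task 1's central stage to finish.

P1, C1, M1 = 100, 200, 100   # Task 1 stage durations (ms), illustrative
P2, C2, M2 = 100, 200, 100   # Task 2 stage durations (ms), illustrative

def rts(soa):
    rt1 = P1 + C1 + M1                       # Task 1 unaffected in strict model
    central2_start = max(soa + P2, P1 + C1)  # wait for Task 1's central stage
    rt2 = central2_start + C2 + M2 - soa     # RT measured from Task 2 onset
    return rt1, rt2

short = rts(50)    # short SOA: Task 2 is slowed (the PRP effect)
long_ = rts(600)   # long SOA: no waiting, RT2 at baseline
```

In this strict model RT1 is identical at both SOAs; the review's point is that empirical Task 1 RTs and error rates often do vary with SOA, contradicting this prediction.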

  7. On the importance of Task 1 and error performance measures in PRP dual-task studies.

    Science.gov (United States)

    Strobach, Tilo; Schütz, Anja; Schubert, Torsten

    2015-01-01

    The psychological refractory period (PRP) paradigm is a dominant research tool in the literature on dual-task performance. In this paradigm a first and second component task (i.e., Task 1 and Task 2) are presented with variable stimulus onset asynchronies (SOAs) and priority to perform Task 1. The main indicator of dual-task impairment in PRP situations is an increasing Task 2-RT with decreasing SOAs. This impairment is typically explained with some task components being processed strictly sequentially in the context of the prominent central bottleneck theory. This assumption could implicitly suggest that processes of Task 1 are unaffected by Task 2 and bottleneck processing, i.e., decreasing SOAs do not increase reaction times (RTs) and error rates of the first task. The aim of the present review is to assess whether PRP dual-task studies included both RT and error data presentations and statistical analyses and whether studies including both data types (i.e., RTs and error rates) show data consistent with this assumption (i.e., decreasing SOAs and unaffected RTs and/or error rates in Task 1). This review demonstrates that, in contrast to RT presentations and analyses, error data is underrepresented in a substantial number of studies. Furthermore, a substantial number of studies with RT and error data showed a statistically significant impairment of Task 1 performance with decreasing SOA. Thus, these studies produced data that is not primarily consistent with the strong assumption that processes of Task 1 are unaffected by Task 2 and bottleneck processing in the context of PRP dual-task situations; this calls for a more careful report and analysis of Task 1 performance in PRP studies and for a more careful consideration of theories proposing additions to the bottleneck assumption, which are sufficiently general to explain Task 1 and Task 2 effects.

  8. On the importance of Task 1 and error performance measures in PRP dual-task studies

    Directory of Open Access Journals (Sweden)

    Tilo eStrobach

    2015-04-01

    Full Text Available The Psychological Refractory Period (PRP) paradigm is a dominant research tool in the literature on dual-task performance. In this paradigm a first and second component task (i.e., Task 1 and 2) are presented with variable stimulus onset asynchronies (SOAs) and priority to perform Task 1. The main indicator of dual-task impairment in PRP situations is an increasing Task 2-RT with decreasing SOAs. This impairment is typically explained with some task components being processed strictly sequentially in the context of the prominent central bottleneck theory. This assumption could implicitly suggest that processes of Task 1 are unaffected by Task 2 and bottleneck processing, i.e., decreasing SOAs do not increase RTs and error rates of the first task. The aim of the present review is to assess whether PRP dual-task studies included both RT and error data presentations and statistical analyses and whether studies including both data types (i.e., RTs and error rates) show data consistent with this assumption (i.e., decreasing SOAs and unaffected RTs and/or error rates in Task 1). This review demonstrates that, in contrast to RT presentations and analyses, error data is underrepresented in a substantial number of studies. Furthermore, a substantial number of studies with RT and error data showed a statistically significant impairment of Task 1 performance with decreasing SOA. Thus, these studies produced data that is not primarily consistent with the strong assumption that processes of Task 1 are unaffected by Task 2 and bottleneck processing in the context of PRP dual-task situations; this calls for a more careful report and analysis of Task 1 performance in PRP studies and for a more careful consideration of theories proposing additions to the bottleneck assumption, which are sufficiently general to explain Task 1 and Task 2 effects.

  9. Measuring the importance of oil-related revenues in total fiscal income for Mexico

    Energy Technology Data Exchange (ETDEWEB)

    Reyes-Loya, Manuel Lorenzo; Blanco, Lorenzo [Facultad de Economia, Universidad Autonoma de Nuevo Leon, Loma Redonda 1515 Pte., Col. Loma Larga, C.P. 64710, Monterrey, Nuevo Leon (Mexico)

    2008-09-15

    Revenues from oil exports are an important part of government budgets in Mexico. A time-series analysis is conducted using monthly data from 1990 to 2005 examining three different specifications to determine how international oil price fluctuations and government income generated from oil exports influence fiscal policy in Mexico. The behavior of government spending and taxation is consistent with the spend-tax hypothesis. The results show that there is an inverse relationship between oil-related revenues and tax revenue from non-oil sources. Fiscal policy reform is urgently needed in order to improve tax collection as oil reserves in Mexico become more and more depleted. (author)
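
    The spend-tax question in the record above is typically examined with lag regressions: do past spending values help predict current tax revenue beyond taxes' own history? A minimal Granger-style sketch on synthetic data, constructed so that spending leads taxes purely to illustrate the mechanics (this is not the study's data or specification):

```python
import numpy as np

# Granger-style check of the spend-tax hypothesis on synthetic monthly data:
# compare residuals of tax_t regressed on its own lag with and without
# lagged spending as an extra predictor.

rng = np.random.default_rng(1)
T = 200
spend = rng.normal(size=T).cumsum()             # random-walk "spending"
tax = np.empty(T)
tax[0] = 0.0
tax[1:] = 0.6 * spend[:-1] + rng.normal(scale=0.5, size=T - 1)

y = tax[1:]
X_r = np.column_stack([np.ones(T - 1), tax[:-1]])              # restricted
X_u = np.column_stack([np.ones(T - 1), tax[:-1], spend[:-1]])  # + lagged spending

def rss(X, y):
    """Residual sum of squares of an OLS fit."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return resid @ resid

# Fraction of residual variance explained by adding lagged spending; a large
# value is the qualitative signature of spending "Granger-causing" taxes.
improvement = 1.0 - rss(X_u, y) / rss(X_r, y)
```

A formal test would use an F statistic on the restricted vs. unrestricted fits and more lags; the sketch only shows the mechanics behind the spend-tax comparison.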

  10. Measuring the importance of oil-related revenues in total fiscal income for Mexico

    International Nuclear Information System (INIS)

    Reyes-Loya, Manuel Lorenzo; Blanco, Lorenzo

    2008-01-01

    Revenues from oil exports are an important part of government budgets in Mexico. A time-series analysis is conducted using monthly data from 1990 to 2005 examining three different specifications to determine how international oil price fluctuations and government income generated from oil exports influence fiscal policy in Mexico. The behavior of government spending and taxation is consistent with the spend-tax hypothesis. The results show that there is an inverse relationship between oil-related revenues and tax revenue from non-oil sources. Fiscal policy reform is urgently needed in order to improve tax collection as oil reserves in Mexico become more and more depleted. (author)

  11. Importance of frequency dependent magnetoresistance measurements in analysing the intrinsicality of magnetodielectric effect: A case study

    Science.gov (United States)

    Rai, Hari Mohan; Saxena, Shailendra K.; Mishra, Vikash; Kumar, Rajesh; Sagdeo, P. R.

    2017-08-01

    Magnetodielectric (MD) materials have attracted considerable attention due to their intriguing physics and potential future applications. However, the intrinsicality of the MD effect is always a major concern in such materials, as the MD effect may also arise from the MR (magnetoresistance) effect. In the present case study, we report an experimental approach to analyse and separate the intrinsic and MR-dominated contributions to the MD phenomenon. For this purpose, polycrystalline samples of LaGa1-xAxO3 (A = Mn/Fe) have been prepared by the solid-state reaction method. The purity of their structural phase (orthorhombic) has been validated by refining the X-ray diffraction data. The RTMD (room temperature MD) response has been recorded over a frequency range of 20 Hz to 10 MHz. In order to analyse the intrinsicality of the MD effect, FDMR (frequency-dependent MR) measurements by means of IS (impedance spectroscopy) and dc MR measurements in four-probe geometry have been carried out at RT. A significant RTMD effect has been observed in selected Mn/Fe-doped LaGaO3 (LGO) compositions. The mechanism of the MR-free/intrinsic MD effect observed in Mn/Fe-doped LGO has been understood speculatively in terms of a modified cell volume associated with the reorientation/retransformation of spin-coupled Mn/Fe orbitals under an applied magnetic field. The present analysis suggests that, in order to judge the intrinsic/resistive origin of the MD phenomenon, FDMR measurements are more useful than measuring only dc MR or analysing the trends of the magnetic-field-dependent change in the dielectric constant and tanδ. On the basis of the present case study, we propose that IS (FDMR) alone can be used as an effective experimental tool to detect and analyse the resistive and intrinsic parts contributing to the MD phenomenon.

  12. Dietary supplement use and smoking are important correlates of biomarkers of water-soluble vitamin status after adjusting for sociodemographic and lifestyle variables in a representative sample of US adults

    Science.gov (United States)

    Pfeiffer, Christine M.; Sternberg, Maya R.; Schleicher, Rosemary L.; Rybak, Michael E.

    2016-01-01

    Biochemical indicators of water-soluble vitamin (WSV) status have been measured in a nationally representative sample of the US population in NHANES 2003–2006. To examine whether demographic differentials in nutritional status were related to and confounded by certain variables, we assessed the association of sociodemographic (age, sex, race-ethnicity, education, income) and lifestyle variables (dietary supplement use, smoking, alcohol consumption, BMI, physical activity) with biomarkers of WSV status in adults (≥20 y): serum and RBC folate, serum pyridoxal-5′-phosphate (PLP), serum 4-pyridoxic acid, serum total cobalamin (B-12), plasma total homocysteine (tHcy), plasma methylmalonic acid (MMA), and serum ascorbic acid. Age (except for PLP) and smoking (except for MMA) were generally the strongest significant correlates of these biomarkers (|r| ≤0.43) and together with supplement use explained more of the variability as compared to the other covariates in bivariate analysis. In multiple regression models, sociodemographic and lifestyle variables together explained from 7% (B-12) to 29% (tHcy) of the biomarker variability. We observed significant associations for most biomarkers (≥6 out of 8) with age, sex, race-ethnicity, supplement use, smoking, and BMI; and for some biomarkers with PIR (5/8), education (1/8), alcohol consumption (4/8), and physical activity (5/8). We noted large estimated percent changes in biomarker concentrations between race-ethnic groups (from −24% to 20%), between supplement users and nonusers (from −12% to 104%), and between smokers and nonsmokers (from −28% to 8%). In summary, age, sex, and race-ethnic differentials in biomarker concentrations remained significant after adjusting for sociodemographic and lifestyle variables. Supplement use and smoking were important correlates of biomarkers of WSV status. PMID:23576641

  13. Dietary supplement use and smoking are important correlates of biomarkers of water-soluble vitamin status after adjusting for sociodemographic and lifestyle variables in a representative sample of U.S. adults.

    Science.gov (United States)

    Pfeiffer, Christine M; Sternberg, Maya R; Schleicher, Rosemary L; Rybak, Michael E

    2013-06-01

    Biochemical indicators of water-soluble vitamin (WSV) status were measured in a nationally representative sample of the U.S. population in NHANES 2003-2006. To examine whether demographic differentials in nutritional status were related to and confounded by certain variables, we assessed the association of sociodemographic (age, sex, race-ethnicity, education, income) and lifestyle (dietary supplement use, smoking, alcohol consumption, BMI, physical activity) variables with biomarkers of WSV status in adults (aged ≥ 20 y): serum and RBC folate, serum pyridoxal-5'-phosphate (PLP), serum 4-pyridoxic acid, serum total cobalamin (vitamin B-12), plasma total homocysteine (tHcy), plasma methylmalonic acid (MMA), and serum ascorbic acid. Age (except for PLP) and smoking (except for MMA) were generally the strongest significant correlates of these biomarkers (|r| ≤ 0.43) and together with supplement use explained more of the variability compared with the other covariates in bivariate analysis. In multiple regression models, sociodemographic and lifestyle variables together explained from 7 (vitamin B-12) to 29% (tHcy) of the biomarker variability. We observed significant associations for most biomarkers (≥ 6 of 8) with age, sex, race-ethnicity, supplement use, smoking, and BMI and for some biomarkers with PIR (5 of 8), education (1 of 8), alcohol consumption (4 of 8), and physical activity (5 of 8). We noted large estimated percentage changes in biomarker concentrations between race-ethnic groups (from -24 to 20%), between supplement users and nonusers (from -12 to 104%), and between smokers and nonsmokers (from -28 to 8%). In summary, age, sex, and race-ethnic differentials in biomarker concentrations remained significant after adjusting for sociodemographic and lifestyle variables. Supplement use and smoking were important correlates of biomarkers of WSV status.
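
    The "percent of biomarker variability explained" reported above is the R^2 of a multiple regression of each biomarker on the covariates. A sketch on synthetic data (all variables and coefficients are invented for illustration):

```python
import numpy as np

# R^2 ("percent of variability explained") from a multiple regression of a
# hypothetical biomarker on a few sociodemographic/lifestyle covariates.

rng = np.random.default_rng(2)
n = 500
age = rng.uniform(20, 80, n)
smoker = rng.integers(0, 2, n).astype(float)
supp = rng.integers(0, 2, n).astype(float)

# Invented biomarker: declines with smoking, rises with supplement use
biomarker = 50 + 0.1 * age - 8.0 * smoker + 15.0 * supp + rng.normal(0, 10, n)

# Ordinary least squares via lstsq, then R^2 = 1 - RSS/TSS
X = np.column_stack([np.ones(n), age, smoker, supp])
beta, *_ = np.linalg.lstsq(X, biomarker, rcond=None)
resid = biomarker - X @ beta
r2 = 1.0 - resid @ resid / ((biomarker - biomarker.mean()) ** 2).sum()
```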

  14. Constructing Proxy Variables to Measure Adult Learners' Time Management Strategies in LMS

    Science.gov (United States)

    Jo, Il-Hyun; Kim, Dongho; Yoon, Meehyun

    2015-01-01

    This study describes the process of constructing proxy variables from recorded log data within a Learning Management System (LMS), which represents adult learners' time management strategies in an online course. Based on previous research, three variables of total login time, login frequency, and regularity of login interval were selected as…
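
    The three proxy variables named above can be computed directly from login records. A sketch with hypothetical log data, taking the standard deviation of inter-login intervals as one possible regularity measure (the study's exact operationalization may differ):

```python
import numpy as np

# Constructing three login-based proxy variables from hypothetical LMS logs:
# total login time, login frequency, and regularity of login interval.

logins = np.array([2.0, 26.5, 50.0, 74.2, 99.0])      # login times (hours
                                                      # since course start)
durations = np.array([30.0, 45.0, 20.0, 60.0, 25.0])  # session lengths (min)

total_login_time = durations.sum()      # minutes spent in the LMS
login_frequency = len(logins)           # number of sessions
intervals = np.diff(logins)             # hours between consecutive logins

# One simple regularity proxy: a lower standard deviation of the intervals
# means a more regular login pattern.
regularity = intervals.std()
```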

  15. Importance weighting of local flux measurements to improve reactivity predictions in nuclear systems

    Energy Technology Data Exchange (ETDEWEB)

    Dulla, Sandra; Hoh, Siew Sin; Nervo, Marta; Ravetto, Piero [Politecnico di Torino, Dipt. Energia (Italy)

    2015-07-15

    Reactivity monitoring is a key aspect of the safe operation of nuclear reactors, especially for subcritical source-driven systems. Various methods are available for both off-line and on-line reactivity determination from direct measurements carried out on the reactor. Usually the methods are based on the inverse point kinetics model applied to signals from neutron detectors, and results may be severely affected by space and spectral effects. Such effects need to be compensated, and correction procedures have to be applied. In this work, a new approach is proposed, using the full information from different local measurements to generate a global signal through a proper weighting of the signals provided by single neutron detectors. A weighting technique based on the use of the adjoint flux proves to be efficient in improving the prediction capability of inverse techniques. The idea is applied to the recently developed algorithm, named MAρTA, which can be used in both off-line and on-line modes.
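
    The proposed weighting can be sketched as an adjoint-weighted combination of detector count rates in place of a plain average (detector values and adjoint-flux weights below are invented, and the actual MAρTA algorithm is more involved):

```python
import numpy as np

# Importance (adjoint-flux) weighting of local detector signals: instead of
# averaging count rates, weight each detector by the adjoint flux at its
# position before feeding the global signal to inverse kinetics.

counts = np.array([1.2e4, 0.8e4, 1.5e4])   # hypothetical detector count rates
adjoint = np.array([0.9, 0.5, 1.1])        # hypothetical adjoint flux values
                                           # at the detector positions

plain_signal = counts.mean()                             # unweighted average
weighted_signal = (adjoint * counts).sum() / adjoint.sum()  # importance-weighted
```

The weighted signal emphasizes detectors sitting where neutrons matter most for the chain reaction, which is what reduces the space and spectral distortions mentioned in the record.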

  16. Classical vs. evolved quenching parameters and procedures in scintillation measurements: The importance of ionization quenching

    International Nuclear Information System (INIS)

    Bagan, H.; Tarancon, A.; Rauret, G.; Garcia, J.F.

    2008-01-01

    The quenching parameters used to model detection efficiency variations in scintillation measurements have not evolved since the 1970s. Meanwhile, computer capabilities have increased enormously, and ionization quenching has appeared in practical measurements using plastic scintillation. This study compares the results obtained in activity quantification by plastic scintillation of 14C samples that contain colour and ionization quenchers, using classical (SIS, SCR-limited, SCR-non-limited, SIS(ext), SQP(E)) and evolved (MWA-SCR and WDW) parameters and following three calibration approaches: single step, which does not take into account the quenching mechanism; two steps, which takes into account the quenching phenomena; and multivariate calibration. Two-step calibration (ionization followed by colour) yielded the lowest relative errors, which means that each quenching phenomenon must be specifically modelled. In addition, the sample activity was quantified more accurately when the evolved parameters were used. Multivariate calibration-PLS also yielded better results than those obtained using classical parameters, which confirms that the quenching phenomena must be taken into account. The detection limits for each calibration method and each parameter were close to those obtained theoretically using the Currie approach.
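
    A quench correction of the kind discussed above can be sketched as a lookup of detection efficiency against a quench parameter measured with calibration standards (the calibration points and sample values below are hypothetical):

```python
import numpy as np

# Quench-curve correction sketch: interpolate detection efficiency from a
# calibration curve of efficiency vs. a quench parameter, then recover
# activity as net count rate divided by efficiency.

quench_param = np.array([400.0, 500.0, 600.0, 700.0])  # quench scale, invented
efficiency = np.array([0.55, 0.70, 0.82, 0.90])        # measured with standards

def activity_bq(net_cpm, sample_quench):
    """Activity in Bq from net counts per minute and the sample's quench value."""
    eff = np.interp(sample_quench, quench_param, efficiency)
    return net_cpm / eff / 60.0   # counts/min -> decays/s

a = activity_bq(net_cpm=4200.0, sample_quench=550.0)
```

A two-step scheme as in the record would apply one such curve per quenching mechanism (ionization, then colour) rather than a single combined curve.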

  17. The importance of patient-reported outcome measures in reconstructive urology.

    Science.gov (United States)

    Jackson, Matthew J; N'Dow, James; Pickard, Rob

    2010-11-01

    Patient-reported outcome measures (PROMs) are now recognised as the most appropriate instruments to assess the effectiveness of healthcare interventions from the patient's perspective. The purpose of this review was to identify recent publications describing the use of PROMs following reconstructive urological surgery. A wide systematic search identified only three original articles published in the last 2 years that prospectively assessed effectiveness using a patient-completed condition-specific or generic health-related quality of life (HRQoL) instrument. These publications illustrate the need to administer PROMs at a postoperative interval relevant to the anticipated recovery phase of individual procedures. They also highlight the difference in responsiveness of generic HRQoL instruments to symptomatic improvement between straightforward conditions such as pelviureteric junction obstruction and complex multidimensional conditions such as meningomyelocele. PROM uptake and awareness are increasing in reconstructive urology, but more work is required to demonstrate the effectiveness of surgical procedures for patients and healthcare funders alike. Healthcare policy-makers now rely on these measures to determine whether specific treatments are worth financing and to compare outcomes between institutions.

  18. Risk Mitigation Measures: An Important Aspect of the Environmental Risk Assessment of Pharmaceuticals

    Directory of Open Access Journals (Sweden)

    Markus Liebig

    2014-01-01

    Full Text Available Within EU marketing authorization procedures of human and veterinary medicinal products (HMP and VMP), an environmental risk assessment (ERA) has to be performed. In the event that an unacceptable environmental risk is identified, risk mitigation measures (RMM) shall be applied in order to reduce environmental exposure to the pharmaceutical. Within the authorization procedures of HMP, no RMM have been applied so far, except for specific precautions for the disposal of the unused medicinal product or waste materials. For VMP, a limited number of RMM do exist. The aim of this study was to develop consistent and efficient RMM. Therefore, existing RMM were compiled from a summary of product characteristics of authorized pharmaceuticals, and new RMM were developed and evaluated. Based on the results, appropriate RMM were applied within the authorization procedures of medicinal products. For HMP, except for the existing precautions for disposal, no further reasonable measures could be developed. For VMP, two specific precautions for disposal and 17 specific precautions for use in animals were proposed as RMM.

  19. Local measures in HIV-associated Kaposi's sarcoma - importance of radiation therapy

    International Nuclear Information System (INIS)

    Plettenberg, A.; Meigel, W.; Janik, I.; Kolb, H.

    1991-01-01

    In 23 patients with HIV-associated Kaposi's sarcoma, 53 tumor lesions were treated with fractionated radiotherapy. The indication for radiotherapy was mostly cosmetic in stigmatising tumors, but in several cases also pain, oedema or functional deficits resulting from the tumor lesions. Twenty-one patients received orthovoltage irradiation; the remaining four patients were treated with telecobalt therapy. A complete response was observed in 17% of lesions, a partial response in 76% and unchanged lesions in 4%. In two cases (4%), both treated with telecobalt therapy for large tumor masses, tumor progression continued in spite of the radiotherapy. In ten lesions, all with partial remission, we later observed renewed tumor progression. Important side effects were signs of inflammation such as mucositis and edema, or hyperpigmentation. The occurrence of acute side effects can be reduced by fractionation of the radiotherapy. (orig.) [de

  20. Measurement of some radiologically and nutritionally important trace elements in human milk and commercially available milk

    International Nuclear Information System (INIS)

    Nair, Suma; Sathyapriya, R.S.; Nair, M.G.; Ravi, Prabhat; Bhati, Sharda

    2011-01-01

    Milk is considered to be a complete food and an almost indispensable part of the diets of infants and children. In this paper we present the concentrations of some radiologically and nutritionally important trace elements, such as Th, Cs, Co, Rb, Fe, Ca and Zn, present in human milk and commercially available milk. The trace elements in human and other milk samples were determined using the instrumental neutron activation analysis technique. The results show that higher concentrations of Th, Cs, Ca and Rb were found in ordinary milk samples in comparison with the human milk samples, whereas higher concentrations of Fe and Co were observed in the human milk samples. These data will be useful for the nutritional and biokinetic studies of these elements in infants and children of different age groups. (author)

  1. THE USE OF IMPORTANCE-PERFORMANCE ANALYSIS TO MEASURE THE SATISFACTION OF TRAVEL AGENCY FRANCHISEES

    Directory of Open Access Journals (Sweden)

    José M. Ramírez-Hurtado

    2017-02-01

    Full Text Available This study contributes to the limited literature on the satisfaction of travel agency franchisees. Specifically, it aims to identify strengths and weaknesses of the system from the perspective of the franchisee. This study would enable franchisors to identify areas in which franchisees are less satisfied. If franchisees are satisfied with the numerous aspects that influence the franchisor-franchisee relationship, the latter may have a high degree of loyalty towards their franchisors, and this would benefit the entire network. This article uses a variant of the classic importance-performance model from Martilla and James (1977) and others (Ábalo, Varela, & Rial, 2006; Picón, Varela, & Braña, 2011). The results show that the attributes travel agency franchisees feel most dissatisfied with are: chain advertising, ongoing support from franchisors, initial franchisor support, delivery from the franchisors, and training provided by franchisors.
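    The Martilla and James grid classifies each attribute by comparing its mean importance and mean performance rating against the grand means. A minimal sketch of that classification step follows; the attribute names and 1-5 scores are hypothetical illustrations, not the study's data.

```python
# Importance-performance analysis (IPA) quadrant classification.
# Attribute names and scores below are hypothetical, for illustration only.

attributes = {
    # name: (mean importance, mean performance) on a 1-5 scale
    "chain advertising":   (4.5, 2.0),
    "franchisor training": (4.2, 2.5),
    "brand recognition":   (4.0, 4.2),
    "location support":    (3.0, 4.0),
    "fee structure":       (2.5, 2.8),
}

# Crosshairs placed at the grand means, as in the classic Martilla & James grid.
mean_imp = sum(i for i, _ in attributes.values()) / len(attributes)
mean_perf = sum(p for _, p in attributes.values()) / len(attributes)

def quadrant(importance, performance):
    """Map one attribute to its IPA quadrant label."""
    if importance >= mean_imp:
        return "Concentrate Here" if performance < mean_perf else "Keep Up the Good Work"
    return "Possible Overkill" if performance >= mean_perf else "Low Priority"

ipa = {name: quadrant(i, p) for name, (i, p) in attributes.items()}
for name, q in ipa.items():
    print(f"{name}: {q}")
```

    High-importance, low-performance attributes (here, advertising and training, mirroring the dissatisfaction pattern reported above) land in the "Concentrate Here" quadrant where remedial effort pays off most.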

  2. Using multiple biomarkers and determinants to obtain a better measurement of oxidative stress: a latent variable structural equation model approach.

    Science.gov (United States)

    Eldridge, Ronald C; Flanders, W Dana; Bostick, Roberd M; Fedirko, Veronika; Gross, Myron; Thyagarajan, Bharat; Goodman, Michael

    2017-09-01

    Since oxidative stress involves a variety of cellular changes, no single biomarker can serve as a complete measure of this complex biological process. The analytic technique of structural equation modeling (SEM) provides a possible solution to this problem by modelling a latent (unobserved) variable constructed from the covariance of multiple biomarkers. Using three pooled datasets, we modelled a latent oxidative stress variable from five biomarkers related to oxidative stress: F2-isoprostanes (FIP), fluorescent oxidation products, mitochondrial DNA copy number, γ-tocopherol (Gtoc) and C-reactive protein (CRP, an inflammation marker closely linked to oxidative stress). We validated the latent variable by assessing its relation to pro- and anti-oxidant exposures. FIP, Gtoc and CRP characterized the latent oxidative stress variable. Obesity, smoking, aspirin use and β-carotene were statistically significantly associated with oxidative stress in the theorized directions; the same exposures were weakly and inconsistently associated with the individual biomarkers. Our results suggest that using SEM with latent variables decreases the biomarker-specific variability, and may produce a better measure of oxidative stress than do single variables. This methodology can be applied to similar areas of research in which a single biomarker is not sufficient to fully describe a complex biological phenomenon.
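    The core intuition, that a composite of several noisy indicators tracks an unobserved factor better than any single indicator, can be sketched numerically. This is a simulation using a one-factor model summarised by a first principal component, not the authors' full SEM; all loadings and biomarker names are illustrative assumptions.

```python
import numpy as np

# Sketch of the latent-variable idea: three noisy biomarkers share one
# underlying "oxidative stress" factor. Their first principal component
# correlates with the latent factor more strongly than any single biomarker.
# Loadings and names are illustrative, not the study's estimates.
rng = np.random.default_rng(0)
n = 5000
latent = rng.normal(size=n)  # unobserved oxidative stress level

loadings = {"FIP": 0.8, "Gtoc": 0.6, "CRP": 0.7}
biomarkers = np.column_stack([
    a * latent + rng.normal(scale=np.sqrt(1 - a**2), size=n)
    for a in loadings.values()
])  # each column has unit variance; corr with latent equals its loading

# First principal component of the standardized biomarkers
z = (biomarkers - biomarkers.mean(0)) / biomarkers.std(0)
_, _, vt = np.linalg.svd(z, full_matrices=False)
composite = z @ vt[0]

r_composite = abs(np.corrcoef(composite, latent)[0, 1])
r_single = max(abs(np.corrcoef(biomarkers[:, j], latent)[0, 1]) for j in range(3))
print(f"composite r = {r_composite:.2f}, best single-biomarker r = {r_single:.2f}")
```

    The composite pools independent measurement noise across markers, which is the same mechanism by which the latent SEM variable reduces biomarker-specific variability.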

  3. Variability in results from negative binomial models for Lyme disease measured at different spatial scales.

    Science.gov (United States)

    Tran, Phoebe; Waller, Lance

    2015-01-01

    Lyme disease has been the subject of many studies due to incidence rates that increase year after year and the severe complications that can arise in later stages of the disease. Negative binomial models have been used to model Lyme disease in the past with some success. However, there has been little focus on the reliability and consistency of these models when they are used to study Lyme disease at multiple spatial scales. This study explores how sensitive and consistent negative binomial models are when used to study Lyme disease at different spatial scales (at the regional and sub-regional levels). The study area includes the thirteen states in the Northeastern United States with the highest Lyme disease incidence during the 2002-2006 period. County-level Lyme disease incidence for 2002-2006 was linked with several previously identified key landscape and climatic variables in a negative binomial regression model for the Northeastern region and two smaller sub-regions (the New England sub-region and the Mid-Atlantic sub-region). This study found that negative binomial models were indeed sensitive to spatial scale, producing inconsistent results at different scales. We discuss various plausible explanations for this behavior. Further investigation of the inconsistency and sensitivity of negative binomial models across spatial scales is important not only for future Lyme disease studies and Lyme disease risk assessment and management, but for any study that uses this model type in a spatial context. Copyright © 2014 Elsevier Inc. All rights reserved.
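    The negative binomial model is preferred over Poisson regression for disease counts because county-level incidence is typically overdispersed: the variance exceeds the mean, which the NB2 model captures as Var(Y) = μ + αμ². A minimal numerical sketch of that dispersion parameter follows; the counts are simulated, not the study's Lyme data, and μ = 5, α = 0.5 are arbitrary illustrative values.

```python
import numpy as np

# Overdispersion behind the negative binomial choice. A Poisson model forces
# Var(Y) = mu; the NB2 model allows Var(Y) = mu + alpha * mu**2. We simulate
# NB counts and recover alpha by the method of moments. Values are illustrative.
rng = np.random.default_rng(42)

true_mu, true_alpha = 5.0, 0.5       # NB2 parameterisation
r = 1.0 / true_alpha                 # numpy's negative_binomial uses (n, p)
p = r / (r + true_mu)                # with mean n*(1-p)/p = mu
counts = rng.negative_binomial(r, p, size=200_000)

m, v = counts.mean(), counts.var()
alpha_hat = (v - m) / m**2           # method-of-moments dispersion estimate
print(f"mean={m:.2f} variance={v:.2f} alpha_hat={alpha_hat:.2f}")
```

    With μ = 5 and α = 0.5 the implied variance is 17.5, far above the Poisson value of 5; ignoring this inflation understates standard errors, which is one reason model behavior can shift when counts are re-aggregated to a different spatial scale.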

  4. The importance of measuring macroprolactin in the differential diagnosis of hyperprolactinemic patients

    Directory of Open Access Journals (Sweden)

    Chih-Chin Lu

    2012-02-01

    Full Text Available This study investigated the differences in clinical and laboratory features as well as treatment response in 70 outpatients with macroprolactinemia and monomeric hyperprolactinemia treated with dopamine agonists. After precipitation of the patients’ serum samples with polyethylene glycol (PEG), serum prolactin (PRL) levels were measured. We also measured serum levels of luteinizing hormone (LH), follicle-stimulating hormone (FSH), estradiol for women and testosterone for men. Clinical symptoms and signs were recorded. All patients received brain magnetic resonance imaging (MRI). After excluding patients with macroadenoma, 66 patients were treated with the dopamine agonist cabergoline. After 1 year, the clinical responses to cabergoline were recorded and PRL levels measured. Of the initial 70 patients with hyperprolactinemia, 15 patients (21.4%) were found to have macroprolactinemia, while the rest had monomeric hyperprolactinemia. The two groups did not differ with regard to galactorrhea, menstrual disturbances or impotence. There were no significant group differences in serum LH, FSH, estradiol or testosterone levels. Patients with macroprolactinemia, however, had a significantly lower infertility rate than those with true hyperprolactinemia (6.7% vs. 32.7%, p=0.005). A greater percentage of macroprolactinemic patients had normal MRI pituitary images than those with hyperprolactinemia (73.3% vs. 34.5%, p=0.029). Compared to those with true hyperprolactinemia, patients with macroprolactinemia were found to have no significant changes in clinical features and PRL levels after 1 year of cabergoline therapy (after PEG precipitation, pre- and post-PRL levels: 59.3±100.2 to 13.8±9.5 ng/mL vs. 6.1±5.3 to 5.1±4.3 ng/mL, p=0.002). In conclusion, while macroprolactinemia is a common cause of hyperprolactinemia, many clinical and laboratory features cannot be used reliably to differentiate macroprolactinemia from true hyperprolactinemia. Routine screening

  5. Stress Measured by Allostatic Load in Neurologically Impaired Children: The Importance of Nutritional Status.

    Science.gov (United States)

    Calcaterra, Valeria; Cena, Hellas; de Silvestri, Annalisa; Albertini, Riccardo; De Amici, Mara; Valenza, Mario; Pelizzo, Gloria

    2017-01-01

    Allostatic load (AL) is the cumulative physiological wear and tear that results from repeated efforts to adapt to stressors over time. The life stress response is modified by nutritional status. We estimated AL scores among neurologically impaired (NI) children; the association with malnutrition was also evaluated. Forty-one patients with severe disabilities were included. Data based