WorldWideScience

Sample records for variables included standardized

  1. Variability of consumer impacts from energy efficiency standards

    Energy Technology Data Exchange (ETDEWEB)

    McMahon, James E.; Liu, Xiaomin

    2000-06-15

    A typical prospective analysis of the expected impact of energy efficiency standards on consumers is based on average economic conditions (e.g., energy price) and operating characteristics. In fact, different consumers face different economic conditions and exhibit different behaviors when using an appliance. A method has been developed to characterize the variability among individual households and to calculate the life-cycle cost of appliances taking those differences into account. Using survey data, this method is applied to a distribution of consumers representing the U.S. Examples of clothes washer standards are shown for which 70-90% of the population benefit, compared to 10-30% who are expected to bear increased costs due to new standards. In some cases, sufficient data exist to distinguish demographic subgroups (for example, low-income or elderly households) that are impacted differently from the general population. Rank order correlations between the sampled input distributions and the sampled output distributions are calculated to determine which variability inputs are the main factors. This "importance analysis" identifies the key drivers contributing to the range of results. Conversely, it identifies variables that, while uncertain, make so little difference as to be irrelevant in deciding a particular policy. Examples from an analysis of water heaters illustrate how a few key variables dominate the policy implications.
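
    The "importance analysis" described above can be sketched as a rank-order (Spearman) correlation between each sampled input distribution and the sampled output. The variable names, distributions, and the toy life-cycle-cost formula below are illustrative assumptions, not data from the record:

```python
# Hypothetical sketch of an "importance analysis": rank-order (Spearman)
# correlation between each sampled input and the sampled output
# (here, a toy life-cycle cost).
import random
import statistics

def ranks(xs):
    """Return the 1-based rank of each value in xs (no tie handling)."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0] * len(xs)
    for rank, i in enumerate(order, start=1):
        r[i] = rank
    return r

def spearman(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks."""
    rx, ry = ranks(x), ranks(y)
    mx, my = statistics.fmean(rx), statistics.fmean(ry)
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

random.seed(0)
n = 1000
energy_price = [random.uniform(0.05, 0.25) for _ in range(n)]   # $/kWh
usage = [random.uniform(200, 600) for _ in range(n)]            # cycles/yr
# Toy life-cycle cost: dominated by price * usage, plus noise.
lcc = [p * u * 10 + random.gauss(0, 20) for p, u in zip(energy_price, usage)]

for name, inp in [("energy_price", energy_price), ("usage", usage)]:
    print(name, round(spearman(inp, lcc), 2))
```

With this toy model, both inputs correlate positively with life-cycle cost, and energy price should emerge as the stronger driver, which is the kind of ranking the analysis uses to separate key variables from irrelevant ones.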

  2. An Assessment of the Need for Standard Variable Names for Airborne Field Campaigns

    Science.gov (United States)

    Beach, A. L., III; Chen, G.; Northup, E. A.; Kusterer, J.; Quam, B. M.

    2017-12-01

    The NASA Earth Venture Program has led to a dramatic increase in airborne observations, requiring updated data management practices with clearly defined data standards and protocols for metadata. An airborne field campaign can involve multiple aircraft and a variety of instruments. It is quite common to have different instruments/techniques measure the same parameter on one or more aircraft platforms. This creates a need to allow instrument Principal Investigators (PIs) to name their variables in a way that would distinguish them across various data sets. A lack of standardization of variable names presents a challenge for data search tools in enabling discovery of similar data across airborne studies, aircraft platforms, and instruments. This was also identified by data users as one of the top issues in data use. One effective approach to mitigating this problem is to enforce variable name standardization, which maps the unique PI variable names to fixed standard names. In order to ensure consistency among the standard names, it is necessary to choose them from a controlled list. However, no such list currently exists, despite a number of previous efforts to establish a sufficient list of atmospheric variable names. The Atmospheric Composition Variable Standard Name Working Group was established under the auspices of NASA's Earth Science Data Systems Working Group (ESDSWG) to solicit research community feedback and create a list of standard names that are acceptable to data providers and data users. This presentation will discuss the challenges and recommendations for standard variable names in an effort to demonstrate how airborne metadata curation/management can be improved to streamline data ingest and improve interoperability and discoverability for a broader user community.
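
    A minimal sketch of the mapping such a controlled list enables: PI-chosen variable names resolved to one standard name, so search tools can find equivalent measurements across instruments and platforms. All names below are illustrative assumptions, not the working group's actual list:

```python
# Hypothetical controlled list and PI-name mapping (names are invented
# for illustration; they are not the ESDSWG working group's list).
STANDARD_NAMES = {"CO_ppbv", "O3_ppbv", "NO2_pptv"}   # controlled list (excerpt)

PI_TO_STANDARD = {
    "CO_DACOM": "CO_ppbv",    # CO from one instrument/technique
    "CO_diff": "CO_ppbv",     # CO from a second technique, same campaign
    "Ozone_CL": "O3_ppbv",
}

def standard_name(pi_name):
    """Map a PI variable name to its standard name, validating the result."""
    std = PI_TO_STANDARD[pi_name]
    if std not in STANDARD_NAMES:
        raise ValueError(f"{std} is not in the controlled list")
    return std

print(standard_name("CO_DACOM"))   # CO_ppbv
print(standard_name("CO_diff"))    # CO_ppbv: two instruments, one search key
```

The point of the design is the last two lines: two differently named measurements of the same parameter resolve to a single search key.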

  3. The gait standard deviation, a single measure of kinematic variability.

    Science.gov (United States)

    Sangeux, Morgan; Passmore, Elyse; Graham, H Kerr; Tirosh, Oren

    2016-05-01

    Measurement of gait kinematic variability provides relevant clinical information in certain conditions affecting the neuromotor control of movement. In this article, we present a measure of overall gait kinematic variability, GaitSD, based on the combination of waveform standard deviations. The waveform standard deviation is the common numerator in established indices of variability such as Kadaba's coefficient of multiple correlation or Winter's waveform coefficient of variation. Gait data were collected on typically developing children aged 6-17 years. A large number of strides was captured for each child: on average 45 (SD: 11) for kinematics and 19 (SD: 5) for kinetics. We used a bootstrap procedure to determine the precision of GaitSD as a function of the number of strides processed. We compared the within-subject (stride-to-stride) variability with the between-subject variability of the normative pattern. Finally, we investigated the correlation between age and gait kinematic, kinetic and spatio-temporal variability. In typically developing children, the relative precision of GaitSD was 10% as soon as 6 strides were captured. As a comparison, spatio-temporal parameters required 30 strides to reach the same relative precision. The ratio of stride-to-stride to normative pattern variability was smaller in kinematic variables (the smallest for pelvic tilt, 28%) than in kinetic and spatio-temporal variables (the largest for normalised stride length, 95%). GaitSD had a strong, negative correlation with age. We show that gait consistency may stabilise only at, or after, skeletal maturity. Copyright © 2016 Elsevier B.V. All rights reserved.
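
    A GaitSD-style measure can be sketched from the abstract's definition: the stride-to-stride standard deviation of a kinematic waveform, combined across the gait cycle. The exact aggregation used by the authors is not given in the abstract, so the root-mean-square combination and the toy waveforms below are assumptions:

```python
# Hedged sketch of a GaitSD-style measure: per-time-point stride-to-stride
# SDs of a waveform, combined by root-mean-square across the gait cycle.
# (The paper's exact aggregation is not stated in the abstract.)
import math
import random
import statistics

random.seed(1)
n_strides, n_points = 45, 51        # strides captured; samples per gait cycle
# Toy pelvic-tilt-like waveforms: a mean pattern plus stride-to-stride noise.
mean_pattern = [10 + 3 * math.sin(2 * math.pi * t / n_points)
                for t in range(n_points)]
strides = [[m + random.gauss(0, 1.5) for m in mean_pattern]
           for _ in range(n_strides)]

def gait_sd(strides):
    """RMS across the cycle of the per-time-point stride-to-stride SD."""
    n = len(strides[0])
    sds = [statistics.stdev([s[t] for s in strides]) for t in range(n)]
    return math.sqrt(statistics.fmean(sd * sd for sd in sds))

print(round(gait_sd(strides), 2))   # close to the injected noise level (1.5)
```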

  4. What to use to express the variability of data: Standard deviation or standard error of mean?

    Science.gov (United States)

    Barde, Mohini P; Barde, Prajakt J

    2012-07-01

    Statistics plays a vital role in biomedical research. It helps present data precisely and draw meaningful conclusions. While presenting data, one should be aware of using adequate statistical measures. In biomedical journals, the Standard Error of the Mean (SEM) and Standard Deviation (SD) are used interchangeably to express variability, though they measure different parameters. SEM quantifies uncertainty in the estimate of the mean, whereas SD indicates dispersion of the data from the mean. As readers are generally interested in knowing the variability within the sample, descriptive data should be precisely summarized with SD. Use of SEM should be limited to computing confidence intervals (CI), which measure the precision of the population estimate. Journals can avoid such errors by requiring authors to adhere to their guidelines.
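
    The distinction the authors draw fits in a few lines: SD describes the spread of the sample itself, while SEM = SD/√n describes the precision of the estimated mean and shrinks as n grows. The sample values are hypothetical:

```python
# SD vs. SEM on a hypothetical sample: SD summarizes variability within
# the sample; SEM (= SD / sqrt(n)) quantifies precision of the mean.
import math
import statistics

sample = [4.1, 4.8, 5.0, 5.3, 5.9, 6.2, 4.6, 5.5]   # hypothetical measurements
n = len(sample)
mean = statistics.fmean(sample)
sd = statistics.stdev(sample)          # spread of the data themselves
sem = sd / math.sqrt(n)                # precision of the estimated mean
ci95 = (mean - 1.96 * sem, mean + 1.96 * sem)  # large-sample 95% CI

print(f"mean={mean:.2f} SD={sd:.2f} SEM={sem:.2f} "
      f"95% CI=({ci95[0]:.2f}, {ci95[1]:.2f})")
```

Reporting SEM in place of SD makes data look less variable than they are, which is exactly the interchange the article warns against.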

  5. 2016 Updated American Society of Clinical Oncology/Oncology Nursing Society Chemotherapy Administration Safety Standards, Including Standards for Pediatric Oncology.

    Science.gov (United States)

    Neuss, Michael N; Gilmore, Terry R; Belderson, Kristin M; Billett, Amy L; Conti-Kalchik, Tara; Harvey, Brittany E; Hendricks, Carolyn; LeFebvre, Kristine B; Mangu, Pamela B; McNiff, Kristen; Olsen, MiKaela; Schulmeister, Lisa; Von Gehr, Ann; Polovich, Martha

    2016-12-01

    Purpose To update the ASCO/Oncology Nursing Society (ONS) Chemotherapy Administration Safety Standards and to highlight standards for pediatric oncology. Methods The ASCO/ONS Chemotherapy Administration Safety Standards were first published in 2009 and updated in 2011 to include inpatient settings. A subsequent 2013 revision expanded the standards to include the safe administration and management of oral chemotherapy. A joint ASCO/ONS workshop with stakeholder participation, including that of the Association of Pediatric Hematology Oncology Nurses and American Society of Pediatric Hematology/Oncology, was held on May 12, 2015, to review the 2013 standards. An extensive literature search was subsequently conducted, and public comments on the revised draft standards were solicited. Results The updated 2016 standards presented here include clarification and expansion of existing standards to include pediatric oncology and to introduce new standards: most notably, two-person verification of chemotherapy preparation processes, administration of vinca alkaloids via minibags in facilities in which intrathecal medications are administered, and labeling of medications dispensed from the health care setting to be taken by the patient at home. The standards were reordered and renumbered to align with the sequential processes of chemotherapy prescription, preparation, and administration. Several standards were separated into their respective components for clarity and to facilitate measurement of adherence to a standard. Conclusion As oncology practice has changed, so have chemotherapy administration safety standards. Advances in technology, cancer treatment, and education and training have prompted the need for periodic review and revision of the standards. Additional information is available at http://www.asco.org/chemo-standards .

  6. What to use to express the variability of data: Standard deviation or standard error of mean?

    OpenAIRE

    Barde, Mohini P.; Barde, Prajakt J.

    2012-01-01

    Statistics plays a vital role in biomedical research. It helps present data precisely and draw meaningful conclusions. While presenting data, one should be aware of using adequate statistical measures. In biomedical journals, the Standard Error of the Mean (SEM) and Standard Deviation (SD) are used interchangeably to express variability, though they measure different parameters. SEM quantifies uncertainty in the estimate of the mean, whereas SD indicates dispersion of the data from the mean. As reade...

  7. Individual variability in heart rate recovery after standardized submaximal exercise

    NARCIS (Netherlands)

    van der Does, Hendrike; Brink, Michel; Visscher, Chris; Lemmink, Koen

    2012-01-01

    To optimize performance, coaches and athletes are always looking for the right balance between training load and recovery. Therefore, close monitoring of athletes is important. Heart rate recovery (HRR) after standardized submaximal exercise has been proposed as a useful variable to monitor...

  8. Accounting for human variability and sensitivity in setting standards for electromagnetic fields.

    Science.gov (United States)

    Bailey, William H; Erdreich, Linda S

    2007-06-01

    Biological sensitivity and variability are key issues for risk assessment and standard setting. Variability encompasses general inter-individual variations in population responses, while sensitivity relates to unusual or extreme responses based on genetic, congenital, medical, or environmental conditions. For risk assessment and standard setting, these factors affect estimates of thresholds for effects and dose-response relationships and inform efforts to protect the more sensitive members of the population, not just the typical or average person. While issues of variability and sensitivity can be addressed by experimental and clinical studies of electromagnetic fields, investigators have paid little attention to these important issues. This paper provides examples that illustrate how default assumptions regarding variability can be incorporated into estimates of 60-Hz magnetic field exposures with no risk of cardiac stimulation, and how population thresholds and variability of peripheral nerve stimulation responses at 60 Hz can be estimated from studies of pulsed gradient magnetic fields in magnetic resonance imaging studies. In the setting of standards for radiofrequency exposures, the International Commission on Non-Ionizing Radiation Protection uses inter-individual differences in thermal sensitivity as one of the considerations in the development of "safety factors." However, neither the range of sensitivity nor the sufficiency or excess of the 10-fold and the additional 5-fold safety factors has been assessed quantitatively. Data on the range of responses between median and sensitive individuals regarding heat stress and cognitive function should be evaluated to inform a reassessment of these safety factors and to identify data gaps.

  9. Taylor Series Trajectory Calculations Including Oblateness Effects and Variable Atmospheric Density

    Science.gov (United States)

    Scott, James R.

    2011-01-01

    Taylor series integration is implemented in NASA Glenn's Spacecraft N-body Analysis Program and compared head-to-head with the code's existing 8th-order Runge-Kutta-Fehlberg time integration scheme. This paper focuses on trajectory problems that include oblateness and/or variable atmospheric density. Taylor series is shown to be significantly faster and more accurate for oblateness problems up through a 4x4 field, with speedups ranging from a factor of 2 to 13. For problems with variable atmospheric density, speedups average 24 for atmospheric density alone, and average 1.6 to 8.2 when density and oblateness are combined.
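
    Taylor series time integration can be illustrated on the scalar test equation y' = -y (not the paper's N-body equations): the Taylor coefficients follow from a simple recurrence, so high order is cheap, and an 8th-order step is essentially exact at this step size:

```python
# Hedged sketch of Taylor series integration for y' = -y. The Taylor
# coefficients c_k = y^(k)/k! obey c_k = -c_{k-1}/k, so a step of any
# order is generated recursively.
import math

def taylor_step(y, h, order=8):
    """Advance y' = -y by one step of size h with a Taylor series."""
    coeff = y           # c_0
    total = y
    for k in range(1, order + 1):
        coeff = -coeff / k          # c_k = -c_{k-1}/k
        total += coeff * h ** k
    return total

y, h = 1.0, 0.1
for _ in range(10):                 # integrate from t = 0 to t = 1
    y = taylor_step(y, h)
print(abs(y - math.exp(-1.0)))      # error well below 1e-12
```

The recurrence is problem-specific; for trajectory problems the same idea is applied to the equations of motion, which is where the paper's speedups come from.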

  10. Standard Errors of Estimated Latent Variable Scores with Estimated Structural Parameters

    Science.gov (United States)

    Hoshino, Takahiro; Shigemasu, Kazuo

    2008-01-01

    The authors propose a concise formula to evaluate the standard error of the estimated latent variable score when the true values of the structural parameters are not known and must be estimated. The formula can be applied to factor scores in factor analysis or ability parameters in item response theory, without bootstrap or Markov chain Monte…

  11. Variability of gastric emptying measurements in man employing standardized radiolabeled meals

    International Nuclear Information System (INIS)

    Brophy, C.M.; Moore, J.G.; Christian, P.E.; Egger, M.J.; Taylor, A.T.

    1986-01-01

    Radiolabeled liquid and solid portions of standardized 300-g meals were administered on four different study days to eight healthy subjects in an attempt to define the range of inter- and intrasubject variability in gastric emptying. Meal half emptying times, analysis of variance, and intraclass correlations were computed and compared within and between subjects. The mean solid half emptying time was 58 +/- 17 min (range 29-92), while the mean liquid half emptying time was 24 +/- 8 min (range 12-37). A nested random effects analysis of variance showed moderate intrasubject variability for solid emptying and high intrasubject variability for liquid emptying. The variability of solid and liquid emptying was comparable and relatively large when compared with other reports in the literature. The isotopic method for measuring gastric emptying is a valuable tool for investigating problems in gastric pathophysiology, particularly when differences between groups of subjects are sought. However, meal emptying time is a variable phenomenon in healthy subjects with significant inter- and intraindividual day-to-day differences. These day-to-day variations in gastric emptying must be considered in interpreting individual study results.

  12. Variability of gastric emptying measurements in man employing standardized radiolabeled meals

    Energy Technology Data Exchange (ETDEWEB)

    Brophy, C.M.; Moore, J.G.; Christian, P.E.; Egger, M.J.; Taylor, A.T.

    1986-08-01

    Radiolabeled liquid and solid portions of standardized 300-g meals were administered on four different study days to eight healthy subjects in an attempt to define the range of inter- and intrasubject variability in gastric emptying. Meal half emptying times, analysis of variance, and intraclass correlations were computed and compared within and between subjects. The mean solid half emptying time was 58 +/- 17 min (range 29-92), while the mean liquid half emptying time was 24 +/- 8 min (range 12-37). A nested random effects analysis of variance showed moderate intrasubject variability for solid emptying and high intrasubject variability for liquid emptying. The variability of solid and liquid emptying was comparable and relatively large when compared with other reports in the literature. The isotopic method for measuring gastric emptying is a valuable tool for investigating problems in gastric pathophysiology, particularly when differences between groups of subjects are sought. However, meal emptying time is a variable phenomenon in healthy subjects with significant inter- and intraindividual day-to-day differences. These day-to-day variations in gastric emptying must be considered in interpreting individual study results.

  13. 32 CFR 37.620 - What financial management standards do I include for nonprofit participants?

    Science.gov (United States)

    2010-07-01

    ... 32 National Defense 1 2010-07-01 2010-07-01 false What financial management standards do I include for nonprofit participants? So as not to force system changes..., your expenditure-based TIA's requirements for the financial management system of any nonprofit...

  14. Proposed Standards for Variable Harmonization Documentation and Referencing: A Case Study Using QuickCharmStats 1.1

    Science.gov (United States)

    Winters, Kristi; Netscher, Sebastian

    2016-01-01

    Comparative statistical analyses often require data harmonization, yet the social sciences do not have clear operationalization frameworks that guide and homogenize variable coding decisions across disciplines. When faced with a need to harmonize variables, researchers often look for guidance from various international studies that employ output harmonization, such as the Comparative Survey of Election Studies, which offer recoding structures for the same variable (e.g. marital status). More problematically, there are no agreed documentation standards or journal requirements for reporting variable harmonization to facilitate a transparent replication process. We propose a conceptual and data-driven digital solution that creates harmonization documentation standards for publication and scholarly citation: QuickCharmStats 1.1. It is free and open-source software that allows for the organizing, documenting and publishing of data harmonization projects. QuickCharmStats starts at the conceptual level and its workflow ends with a variable recoding syntax. It is therefore flexible enough to reflect a variety of theoretical justifications for variable harmonization. Using the socio-demographic variable 'marital status', we demonstrate how the CharmStats workflow collates metadata while being guided by the scientific standards of transparency and replication. It encourages researchers to publish their harmonization work by providing those who complete the peer review process with a permanent identifier. Those who contribute original data harmonization work to their discipline can now be credited through citations. Finally, we propose peer-review standards for harmonization documentation, describe a route to online publishing, and provide a referencing format to cite harmonization projects. Although CharmStats products are designed for social scientists, our adherence to the scientific method ensures our products can be used by researchers across the sciences.
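
    Output harmonization of 'marital status' in the spirit described might look like the following sketch. The target scheme and the two source codings are invented for illustration; they are not CharmStats' actual tables:

```python
# Hypothetical output harmonization: two survey-specific codings of
# 'marital status' recoded onto one documented target scheme.
TARGET = {1: "married/partnered", 2: "single", 3: "widowed/divorced"}

# Survey-specific recodes: {source_code: target_code}
RECODE_SURVEY_A = {1: 1, 2: 1, 3: 2, 4: 3, 5: 3}    # A distinguishes 5 states
RECODE_SURVEY_B = {"M": 1, "S": 2, "W": 3, "D": 3}  # B uses letter codes

def harmonize(values, recode):
    """Apply a documented recode map, flagging unmapped source codes."""
    return [recode.get(v, -9) for v in values]       # -9 = unharmonizable

print(harmonize([1, 3, 5, 2], RECODE_SURVEY_A))     # [1, 2, 3, 1]
print(harmonize(["M", "D", "X"], RECODE_SURVEY_B))  # [1, 3, -9]
```

Keeping the recode maps in one documented place is the transparency the proposal is after: the maps themselves are the citable harmonization artifact.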

  15. Impact of including surface currents on simulation of Indian Ocean variability with the POAMA coupled model

    Energy Technology Data Exchange (ETDEWEB)

    Zhao, Mei; Wang, Guomin; Hendon, Harry H.; Alves, Oscar [Bureau of Meteorology, Centre for Australian Weather and Climate Research, Melbourne (Australia)

    2011-04-15

    Impacts on the coupled variability of the Indo-Pacific from including the effects of surface currents on surface stress are explored in four extended integrations of an experimental version of the Bureau of Meteorology's coupled seasonal forecast model POAMA. The first pair of simulations differs only in their treatment of momentum coupling: one version includes the effects of surface currents in the surface stress computation and the other does not. The version that includes the effect of surface currents has less mean-state bias in the equatorial Pacific cold tongue but produces relatively weak coupled variability in the Tropics, especially that related to the Indian Ocean dipole (IOD) and El Nino/Southern Oscillation (ENSO). The version without the effects of surface currents has greater bias in the Pacific cold tongue but stronger IOD and ENSO variability. In order to separate the role of changes in local coupling from that of changes in remote forcing by ENSO in causing changes in IOD variability, a second set of simulations is conducted in which the effects of surface currents are included only in the Indian Ocean or only in the Pacific Ocean. IOD variability is found to be equally reduced by the inclusion of the local effects of surface currents in the Indian Ocean and by the reduction of ENSO variability that results from including the effects of surface currents in the Pacific. Some implications of these results for predictability of the IOD and its dependence on ENSO, and for ocean subsurface data assimilation, are discussed. (orig.)

  16. Multiplicative surrogate standard deviation: a group metric for the glycemic variability of individual hospitalized patients.

    Science.gov (United States)

    Braithwaite, Susan S; Umpierrez, Guillermo E; Chase, J Geoffrey

    2013-09-01

    Group metrics are described to quantify blood glucose (BG) variability of hospitalized patients. The "multiplicative surrogate standard deviation" (MSSD) is the reverse-transformed group mean of the standard deviations (SDs) of the logarithmically transformed BG data set of each patient. The "geometric group mean" (GGM) is the reverse-transformed group mean of the means of the logarithmically transformed BG data set of each patient. Before reverse transformation is performed, the mean of means and mean of SDs each has its own SD, which becomes a multiplicative standard deviation (MSD) after reverse transformation. Statistical predictions and comparisons of parametric or nonparametric tests remain valid after reverse transformation. A subset of a previously published BG data set of 20 critically ill patients from the first 72 h of treatment under the SPRINT protocol was transformed logarithmically. After rank ordering according to the SD of the logarithmically transformed BG data of each patient, the cohort was divided into two equal groups, those having lower or higher variability. For the entire cohort, the GGM was 106 (÷/× 1.07) mg/dl, and MSSD was 1.24 (÷/× 1.07). For the subgroups having lower and higher variability, respectively, the GGM did not differ, 104 (÷/× 1.07) versus 109 (÷/× 1.07) mg/dl, but the MSSD differed, 1.17 (÷/× 1.03) versus 1.31 (÷/× 1.05), p = .00004. By using the MSSD with its MSD, groups can be characterized and compared according to glycemic variability of individual patient members. © 2013 Diabetes Technology Society.
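
    Following the abstract's definitions, the GGM and MSSD can be computed directly: log-transform each patient's BG values, take per-patient means and SDs, average across patients, and reverse-transform with the exponential. The BG series below are hypothetical:

```python
# GGM and MSSD per the abstract's definitions, on hypothetical BG data.
import math
import statistics

patients = [                         # hypothetical BG series, mg/dl
    [95, 110, 102, 120, 98],
    [140, 90, 160, 105, 125],
    [100, 104, 99, 108, 101],
]

# Per-patient mean and SD of the log-transformed BG values.
log_means = [statistics.fmean(math.log(v) for v in p) for p in patients]
log_sds = [statistics.stdev([math.log(v) for v in p]) for p in patients]

ggm = math.exp(statistics.fmean(log_means))    # geometric group mean, mg/dl
mssd = math.exp(statistics.fmean(log_sds))     # multiplicative surrogate SD

print(f"GGM={ggm:.1f} mg/dl  MSSD={mssd:.2f}")
```

Because the averaging happens on the log scale, the reverse-transformed MSSD is a multiplicative (÷/×) factor rather than an additive spread, matching the ÷/× notation in the abstract.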

  17. Including alternative resources in state renewable portfolio standards: Current design and implementation experience

    International Nuclear Information System (INIS)

    Heeter, Jenny; Bird, Lori

    2013-01-01

    As of October 2012, 29 states, the District of Columbia, and Puerto Rico have instituted a renewable portfolio standard (RPS). Each state policy is unique, varying in percentage targets, timetables, and eligible resources. Increasingly, new RPS policies have included alternative resources. Alternative resources have included energy efficiency, thermal resources, and, to a lesser extent, non-renewables. This paper examines state experience with implementing renewable portfolio standards that include energy efficiency, thermal resources, and non-renewable energy and explores compliance experience, costs, and how states evaluate, measure, and verify energy efficiency and convert thermal energy. It aims to gain insights from the experience of states for possible federal clean energy policy as well as to share experience and lessons for state RPS implementation. - Highlights: • Increasingly, new RPS policies have included alternative resources. • Nearly all states provide a separate tier or cap on the quantity of eligible alternative resources. • Where allowed, non-renewables and energy efficiency are being heavily utilized.

  18. Microscopic age determination of human skeletons including an unknown but calculable variable

    DEFF Research Database (Denmark)

    Wallin, Johan Albert; Tkocz, Izabella; Kristensen, Gustav

    1994-01-01

    estimation, which includes the covariance matrix of four single equation residuals, improves the accuracy of age determination. The standard deviation, however, of age prediction remains 12.58 years. An experimental split of the data was made in order to demonstrate that the use of subgroups gives a false...

  19. Developing standard transmission system for radiology reporting including key images

    International Nuclear Information System (INIS)

    Kim, Seon Chil

    2007-01-01

    The development of hospital information systems and Picture Archiving and Communication Systems (PACS) is not new in the medical field, and the development of the internet and information technology is also universal. In the course of such development, however, it is hard to share medical information without a refined standard format. Especially in the department of radiology, the role of PACS has become very important in interchanging information with other disparate hospital information systems. A specific system needs to be developed in which radiological reports are archived into a database efficiently. This includes the sharing of medical images. A model is suggested in this study in which an internal system is developed where radiologists store necessary images and transmit them in the standard international clinical format, Clinical Document Architecture (CDA), and share the information with hospitals. A CDA document generator was made to generate a new file format and separate the existing storage system from the new system. This was to ensure access to the required data in XML documents. The model presented in this study adds a process in which images crucial to the reading are inserted by the CDA radiological report generator. Therefore, this study suggests a storage and transmission model for CDA documents, which is different from the existing DICOM SR. Radiological reports could be better shared once the application function for inserting images and the analysis of standard clinical terms are completed.

  20. Preliminary Safety Information Document for the Standard MHTGR. Volume 1, (includes latest Amendments)

    Energy Technology Data Exchange (ETDEWEB)

    None

    1986-01-01

    With NRC concurrence, the Licensing Plan for the Standard HTGR describes an application program consistent with 10CFR50, Appendix O to support a US Nuclear Regulatory Commission (NRC) review and design certification of an advanced Standard modular High Temperature Gas-Cooled Reactor (MHTGR) design. Consistent with the NRC's Advanced Reactor Policy, the Plan also outlines a series of preapplication activities which have as an objective the early issuance of an NRC Licensability Statement on the Standard MHTGR conceptual design. This Preliminary Safety Information Document (PSID) has been prepared as one of the submittals to the NRC by the US Department of Energy in support of preapplication activities on the Standard MHTGR. Other submittals to be provided include a Probabilistic Risk Assessment, a Regulatory Technology Development Plan, and an Emergency Planning Bases Report.

  1. Outlining precision boundaries among areas with different variability standards using magnetic susceptibility and geomorphic surfaces

    OpenAIRE

    Matias,Sammy S. R.; Marques Júnior,José; Siqueira,Diego S.; Pereira,Gener T.

    2014-01-01

    There is an increasing demand for detailed maps that represent, in a simplified way, the knowledge of the variability of a particular area or region. The objective was to outline precision boundaries among areas with different variability standards using magnetic susceptibility and geomorphic surfaces. The study was conducted in an area of 110 ha, in which three landscape compartments were identified based on the geomorphic surfaces model. To determine pH, organic matter, phosphorus, po...

  2. Delayed Gadolinium-Enhanced MRI of Cartilage (dGEMRIC): Intra- and Interobserver Variability in Standardized Drawing of Regions of Interest

    International Nuclear Information System (INIS)

    Tiderius, C.J.; Tjoernstrand, J.; Aakeson, P.; Soedersten, K.; Dahlberg, L.; Leander, P.

    2004-01-01

    Purpose: To establish the reproducibility of a standardized region of interest (ROI) drawing procedure in delayed gadolinium-enhanced magnetic resonance imaging (MRI) of cartilage (dGEMRIC). Material and Methods: A large ROI in lateral and medial femoral weight-bearing cartilage was drawn in images of 12 healthy male volunteers by 6 investigators with different skills in MRI. The procedure was done twice, with a 1-week interval. Calculated T1-values were evaluated for intra- and interobserver variability. Results: The mean interobserver variability for both compartments ranged between 1.3% and 2.3% for the 6 different investigators without correlation to their experience in MRI. Post-contrast intra-observer variability was low in both the lateral and the medial femoral cartilage, 2.6% and 1.5%, respectively. The larger variability in lateral than in medial cartilage was related to slightly longer and thinner ROIs. Conclusion: Intra-observer variability and interobserver variability are both low when a large standardized ROI is used in dGEMRIC. The experience of the investigator does not affect the variability, which further supports a clinical applicability of the method
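
    The intra- and interobserver variability figures above are coefficients of variation (SD as a percentage of the mean). With hypothetical T1 values, the computation is:

```python
# CV% sketch for ROI reproducibility; T1 values (ms) are hypothetical.
import statistics

# T1 from the same ROI drawn by 6 investigators, two sessions each.
session1 = [452, 460, 448, 455, 463, 450]
session2 = [449, 458, 452, 457, 460, 453]

def cv_percent(values):
    """Coefficient of variation: SD as a percentage of the mean."""
    return 100 * statistics.stdev(values) / statistics.fmean(values)

interobserver = cv_percent(session1)    # spread across investigators
intraobserver = cv_percent([session1[0], session2[0]])  # one investigator, twice
print(f"interobserver CV% = {interobserver:.1f}, "
      f"intraobserver CV% = {intraobserver:.1f}")
```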

  3. Quantification of the islet product: presentation of a standardized current good manufacturing practices compliant system with minimal variability.

    Science.gov (United States)

    Friberg, Andrew S; Brandhorst, Heide; Buchwald, Peter; Goto, Masafumi; Ricordi, Camillo; Brandhorst, Daniel; Korsgren, Olle

    2011-03-27

    Accurate islet quantification has proven difficult to standardize in a good manufacturing practices (GMP)-approved manner. The influence of assessment variables from both manual and computer-assisted digital image analysis (DIA) methods was compared using calibrated, standardized microspheres or islets alone. Additionally, a mixture of microspheres and exocrine tissue was used to evaluate the variability of both the current, internationally recognized, manual method and a novel GMP-friendly purity- and volume-based method (PV) evaluated by DIA in a semiclosed culture-bag system. Computer-assisted DIA recorded known microsphere size distributions and quantities accurately. By using DIA to evaluate islets, the interindividual, manually evaluated percent coefficients of variation (CV%; n=14) were reduced by almost half for both islet equivalents (IEs; 31% vs. 17%, P=0.002) and purity (20% vs. 13%, P=0.033). The microsphere pool mixed with exocrine tissue did not differ from the expected IE with either method. However, manual IE counting resulted in a total CV% of 44.3% and a range spanning 258 k IE, whereas PV resulted in a CV% of 10.7% and a range of 60 k IE. Purity CV% for the two methods were similar, at approximately 10.5%, and differed from expected by +7% for the manual method and +3% for PV. The variability of standard counting methods for islet samples and clinical quantities of microspheres mixed with exocrine tissue was reduced with DIA, and reduced even further by use of a semiclosed bag system compared with standard manual counting, thereby facilitating the standardization of islet evaluation according to GMP standards.

  4. Variability of linezolid concentrations after standard dosing in critically ill patients: a prospective observational study

    Science.gov (United States)

    2014-01-01

    Introduction Severe infections in intensive care patients show high morbidity and mortality rates. Linezolid is an antimicrobial drug frequently used in critically ill patients. Recent data indicate that there might be high variability of linezolid serum concentrations in intensive care patients receiving standard doses. This study aimed to evaluate whether standard dosing of linezolid leads to therapeutic serum concentrations in critically ill patients. Methods In this prospective observational study, 30 critically ill adult patients with suspected infections received standard dosing of 600 mg linezolid intravenously twice a day. Over 4 days, multiple serum samples were obtained from each patient in order to determine the linezolid concentrations by liquid chromatography tandem mass spectrometry. Results A high variability of serum linezolid concentrations was observed (range of the area under the linezolid concentration-time curve over 24 hours (AUC24) 50.1 to 453.9 mg*h/L, median 143.3 mg*h/L). Therapeutic linezolid concentrations over 24 hours and at single time points (defined according to the literature as AUC24 > 400 mg*h/L and Cmin > 10 mg/L) were observed for only 7 of the patients. Conclusions A high variability of linezolid serum concentrations with a substantial percentage of potentially subtherapeutic levels was observed in intensive care patients. The findings suggest that therapeutic drug monitoring of linezolid might be helpful for adequate dosing of linezolid in critically ill patients. Trial registration Clinicaltrials.gov NCT01793012. Registered 24 January 2013. PMID:25011656
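
    AUC24 values like those reported above are typically computed from measured serum concentrations with the linear trapezoidal rule; the sampling times and concentrations below are hypothetical:

```python
# Linear trapezoidal AUC24 from hypothetical serum samples.
times = [0, 2, 4, 8, 12, 16, 20, 24]                 # h after dose
conc = [12.0, 9.5, 7.8, 5.2, 11.5, 8.9, 6.1, 10.8]   # mg/L (600 mg q12h)

def auc_trapezoid(t, c):
    """Linear trapezoidal AUC over the sampled interval (mg*h/L)."""
    return sum((t[i + 1] - t[i]) * (c[i + 1] + c[i]) / 2
               for i in range(len(t) - 1))

auc24 = auc_trapezoid(times, conc)
print(round(auc24, 1), "mg*h/L")   # 202.8 mg*h/L for these values
```

For these made-up samples the result lands inside the study's reported range (50.1 to 453.9 mg*h/L), which is the quantity a therapeutic drug monitoring program would compare against a target.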

  5. Direct-phase-variable model of a synchronous reluctance motor including all slot and winding harmonics

    International Nuclear Information System (INIS)

    Obe, Emeka S.; Binder, A.

    2011-01-01

    A detailed model in direct-phase variables of a synchronous reluctance motor operating at mains voltage and frequency is presented. The model includes the stator and rotor slot openings, the actual winding layout and the reluctance rotor geometry. Hence, all MMF and permeance harmonics are taken into account. It is seen that non-negligible harmonics introduced by slots are present in the inductances computed by the winding function procedure. These harmonics are usually ignored in d-q models. The machine performance is simulated in the stator reference frame to depict the difference between this new direct-phase model including all harmonics and the conventional rotor reference frame d-q model. Saturation is included by using a polynomial fit of the variation of d-axis inductance with stator current, obtained with the finite-element software FEMAG DC®. The detailed phase-variable model can yield torque pulsations comparable to those obtained from finite elements, while the d-q model cannot.

  6. A Case for Including Atmospheric Thermodynamic Variables in Wind Turbine Fatigue Loading Parameter Identification

    International Nuclear Information System (INIS)

    Kelley, Neil D.

    1999-01-01

    This paper makes the case for establishing efficient predictor variables for atmospheric thermodynamics that can be used to statistically correlate the fatigue accumulation seen on wind turbines. Recently, two approaches to this issue have been reported. One uses multiple linear-regression analysis to establish the relative causality between a number of predictors related to the turbulent inflow and turbine loads. The other approach, using many of the same predictors, applies the technique of principal component analysis. An examination of the ensemble of predictor variables revealed that they were all kinematic in nature; i.e., they were only related to the description of the velocity field. Boundary-layer turbulence dynamics, however, depends upon a description of the thermal field and its interaction with the velocity distribution. We used a series of measurements taken within a multi-row wind farm to demonstrate the need to include atmospheric thermodynamic variables as well as velocity-related ones in the search for efficient turbulence loading predictors in various turbine-operating environments. Our results show that a combination of vertical stability and hub-height mean shearing stress variables meets this need over a period of 10 minutes.

  7. The variability of standard artificial soils: Behaviour, extractability and bioavailability of organic pollutants

    International Nuclear Information System (INIS)

    Hofman, Jakub; Hovorková, Ivana; Semple, Kirk T.

    2014-01-01

    Highlights: • Artificial soils from different laboratories revealed different fates, behaviour and bioavailability of lindane and phenanthrene. • Lindane behaviour was related to organic carbon. • Phenanthrene behaviour was significantly affected by degrading microorganisms from peat. • Sterilization of artificial soils might reduce unwanted variability. -- Abstract: Artificial soil is an important standard medium and reference material for soil ecotoxicity bioassays. Recent studies have documented the significant variability of their basic properties among different laboratories. Our study investigated (i) the variability of ten artificial soils from different laboratories by means of the fate, extractability and bioavailability of phenanthrene and lindane, and (ii) the relationships of these results to soil properties and ageing. Soils were spiked with 14C-phenanthrene and 14C-lindane, and the total residues, fractions extractable by hydroxypropyl-β-cyclodextrin, and the fractions of phenanthrene mineralizable by bacteria were determined after 1, 14, 28 and 56 days. Significant temporal changes in total residues and extractable and mineralizable fractions were observed for phenanthrene, resulting in large differences between soils after 56 days. Phenanthrene mineralization by indigenous peat microorganisms was suggested as the main driver of that, outweighing the effects of organic matter. Lindane total residues and extractability displayed much smaller changes over time and smaller differences between soils related to organic matter. Roughly estimated, the variability between the artificial soils was comparable to natural soils. The implications of such variability for the results of toxicity tests and risk assessment decisions should be identified. We also suggested that the sterilization of artificial soils might reduce unwanted variability.

  8. The variability of standard artificial soils: Behaviour, extractability and bioavailability of organic pollutants

    Energy Technology Data Exchange (ETDEWEB)

    Hofman, Jakub, E-mail: hofman@recetox.muni.cz [Research Centre for Toxic Compounds in the Environment (RECETOX), Faculty of Science, Masaryk University, Kamenice 753/5, Brno CZ-62500 (Czech Republic); Hovorková, Ivana [Research Centre for Toxic Compounds in the Environment (RECETOX), Faculty of Science, Masaryk University, Kamenice 753/5, Brno CZ-62500 (Czech Republic); Semple, Kirk T. [Lancaster Environment Centre, Lancaster University, Lancaster LA1 4YQ (United Kingdom)

    2014-01-15

    Highlights: • Artificial soils from different laboratories revealed different fates, behaviour and bioavailability of lindane and phenanthrene. • Lindane behaviour was related to organic carbon. • Phenanthrene behaviour was significantly affected by degrading microorganisms from peat. • Sterilization of artificial soils might reduce unwanted variability. -- Abstract: Artificial soil is an important standard medium and reference material for soil ecotoxicity bioassays. Recent studies have documented the significant variability of their basic properties among different laboratories. Our study investigated (i) the variability of ten artificial soils from different laboratories by means of the fate, extractability and bioavailability of phenanthrene and lindane, and (ii) the relationships of these results to soil properties and ageing. Soils were spiked with 14C-phenanthrene and 14C-lindane, and the total residues, fractions extractable by hydroxypropyl-β-cyclodextrin, and the fractions of phenanthrene mineralizable by bacteria were determined after 1, 14, 28 and 56 days. Significant temporal changes in total residues and extractable and mineralizable fractions were observed for phenanthrene, resulting in large differences between soils after 56 days. Phenanthrene mineralization by indigenous peat microorganisms was suggested as the main driver of that, outweighing the effects of organic matter. Lindane total residues and extractability displayed much smaller changes over time and smaller differences between soils related to organic matter. Roughly estimated, the variability between the artificial soils was comparable to natural soils. The implications of such variability for the results of toxicity tests and risk assessment decisions should be identified. We also suggested that the sterilization of artificial soils might reduce unwanted variability.

  9. A standardized approach to study human variability in isometric thermogenesis during low-intensity physical activity

    Directory of Open Access Journals (Sweden)

    Delphine Sarafian

    2013-07-01

    Limitations of current methods: The assessment of human variability in various compartments of daily energy expenditure (EE) under standardized conditions is well defined at rest (as basal metabolic rate and the thermic effect of feeding), and currently under validation for assessing the energy cost of low-intensity dynamic work. However, because physical activities of daily life consist of a combination of both dynamic and isometric work, there is also a need to develop standardized tests for assessing human variability in the energy cost of low-intensity isometric work. Experimental objectives: Development of an approach to study human variability in isometric thermogenesis by incorporating a protocol of intermittent leg press exercise of varying low-intensity isometric loads with measurements of EE by indirect calorimetry. Results: EE was measured in the seated position with the subject at rest or while intermittently pressing both legs against a press-platform at 5 low-intensity isometric loads (+5, +10, +15, +20 and +25 kg force), each consisting of a succession of 8 cycles of press (30 s) and rest (30 s). EE, integrated over each 8-min period of the intermittent leg press exercise, was found to increase linearly across the 5 isometric loads with a correlation coefficient (r) > 0.9 for each individual. The slope of this EE-Load relationship, which provides the energy cost of this standardized isometric exercise expressed per kg force applied intermittently (30 s in every min), was found to show good repeatability when assessed in subjects who repeated the same experimental protocol on 3 separate days: its low intra-individual coefficient of variation (CV) of ~10% contrasted with its much higher inter-individual CV of 35%, the latter being mass-independent but partly explained by height. Conclusion: This standardized approach to study isometric thermogenesis opens up a new avenue for research in EE phenotyping and metabolic predisposition to obesity.
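
The slope of the EE-Load relationship described above is an ordinary least-squares fit of energy expenditure against applied load, and the slope is the energy cost per kg of force. A sketch with hypothetical EE readings (units and values are assumptions, not taken from the study):

```python
def slope_intercept(x, y):
    """Ordinary least-squares fit of y = a*x + b; returns (a, b)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    a = sxy / sxx
    return a, my - a * mx

# Hypothetical EE (kcal/min) at rest (load 0) and at the 5 isometric loads (kg force)
loads = [0, 5, 10, 15, 20, 25]
ee = [1.20, 1.30, 1.41, 1.49, 1.61, 1.70]
slope, _ = slope_intercept(loads, ee)  # energy cost per kg force, ~0.020 kcal/min/kg
```

Comparing this slope across individuals (its inter-individual CV) is what the protocol uses to phenotype isometric thermogenesis.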

  10. SU-F-R-30: Interscanner Variability of Radiomics Features in Computed Tomography (CT) Using a Standard ACR Phantom

    Energy Technology Data Exchange (ETDEWEB)

    Shafiq ul Hassan, M; Zhang, G; Moros, E [H Lee Moffitt Cancer Center and Research Institute, Tampa, FL (United States); Department of Physics, University of South Florida, Tampa, FL (United States); Budzevich, M; Latifi, K; Hunt, D; Gillies, R [H Lee Moffitt Cancer Center and Research Institute, Tampa, FL (United States)

    2016-06-15

    Purpose: A simple approach to investigate interscanner variability of radiomics features in computed tomography (CT) using a standard ACR phantom. Methods: The standard ACR phantom was scanned on CT scanners from three different manufacturers. Scanning parameters of 120 kVp and 200 mA were used, with a slice thickness of 3.0 mm on two scanners and 3.27 mm on the third. Three spherical regions of interest (ROIs) were contoured on the water, medium-density and high-density inserts. Ninety-four radiomics features were extracted using an in-house program. These features comprise shape (11), intensity (22), GLCM (26), GLZSM (11), RLM (11), NGTDM (5), and 8 fractal-dimension features. To evaluate the interscanner variability across the three scanners, a coefficient of variation (COV) was calculated for each feature group. Each group was further classified by COV, calculating the percentage of features in each of the following categories: COV less than 2%, between 2% and 10%, and greater than 10%. Results: For all feature groups, a similar trend was observed for the three different inserts. Shape features were the most robust across scanners, as expected: 70% of the shape features had COV <2%. For the intensity feature group, the fraction of features with COV <2% varied from 9% to 32% across the three scanners. All features in the four groups GLCM, GLZSM, RLM and NGTDM were found to have interscanner variability ≥2%. The fractal-dimension behavior for the medium- and high-density inserts was similar, while it differed for the water insert. Conclusion: We conclude that even for similar scanning conditions, interscanner variability across different scanners was significant. The texture features based on GLCM, GLZSM, RLM and NGTDM are highly scanner dependent. Since the inserts of the ACR phantom are not heterogeneous in HU values, this suggests that matrix-based second-order features are highly affected by variation in noise. Research partly funded by NIH/NCI R01CA190105-01.
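
The COV-based grouping described in the Methods can be sketched as follows; the feature names and per-scanner values are hypothetical stand-ins, not data from the study:

```python
def cov_percent(values):
    """Percent coefficient of variation of one feature across scanners."""
    mean = sum(values) / len(values)
    var = sum((v - mean) ** 2 for v in values) / (len(values) - 1)
    return 100.0 * var ** 0.5 / abs(mean)

def classify_by_cov(features):
    """features maps feature name -> its value on each scanner.
    Returns the percentage of features with COV <2%, 2-10%, and >10%."""
    covs = [cov_percent(vals) for vals in features.values()]
    n = len(covs)
    return {
        "<2%": 100.0 * sum(c < 2 for c in covs) / n,
        "2-10%": 100.0 * sum(2 <= c <= 10 for c in covs) / n,
        ">10%": 100.0 * sum(c > 10 for c in covs) / n,
    }

# Hypothetical values of four features measured on three scanners
demo = {
    "sphericity": [0.98, 0.97, 0.98],
    "mean_HU": [102.0, 99.0, 110.0],
    "glcm_contrast": [5.1, 7.9, 11.2],
    "rlm_long_run": [3.3, 3.2, 3.3],
}
breakdown = classify_by_cov(demo)
```

Shape-like features (nearly identical across scanners) land in the <2% bin, while noise-sensitive texture features drift into the >10% bin, mirroring the pattern the abstract reports.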

  11. Chromospheric activity of periodic variable stars (including eclipsing binaries) observed in DR2 LAMOST stellar spectral survey

    Science.gov (United States)

    Zhang, Liyun; Lu, Hongpeng; Han, Xianming L.; Jiang, Linyan; Li, Zhongmu; Zhang, Yong; Hou, Yonghui; Wang, Yuefei; Cao, Zihuang

    2018-05-01

    The LAMOST spectral survey provides a rich database for studying stellar spectroscopic properties and chromospheric activity. We cross-matched a total of 105,287 periodic variable stars from several photometric surveys and databases (CSS, LINEAR, Kepler, a recently updated eclipsing-star catalogue, ASAS, NSVS, part of the SuperWASP survey, variable stars from the Tsinghua University-NAOC Transient Survey, and other objects from some new references) with the four million stellar spectra published in LAMOST data release 2 (DR2). We found 15,955 spectra for 11,469 stars (including 5398 eclipsing binaries). We calculated the equivalent widths (EWs) of their Hα, Hβ, Hγ, Hδ and Ca II H lines. Using the Hα line EW, we found 447 spectra with emission above the continuum for a total of 316 stars (178 eclipsing binaries). We identified 86 active stars (including 44 eclipsing binaries) with repeated LAMOST spectra. A total of 68 stars (including 34 eclipsing binaries) show chromospheric activity variability. We also found LAMOST spectra of 12 cataclysmic variables, five of which show chromospheric activity variability. We also made photometric follow-up studies of three short-period targets (DY CVn, HAT-192-0001481, and LAMOST J164933.24+141255.0) using the Xinglong 60-cm telescope and the SARA 90-cm and 1-m telescopes, and obtained new BVRI CCD light curves. We analyzed these light curves and obtained orbital and starspot parameters. We detected the first flare event, with a huge brightness increase of more than about 1.5 magnitudes in the R filter, in LAMOST J164933.24+141255.0.

  12. Including Alternative Resources in State Renewable Portfolio Standards: Current Design and Implementation Experience

    Energy Technology Data Exchange (ETDEWEB)

    Heeter, J.; Bird, L.

    2012-11-01

    Currently, 29 states, the District of Columbia, and Puerto Rico have instituted a renewable portfolio standard (RPS). An RPS sets a minimum threshold for how much renewable energy must be generated in a given year. Each state policy is unique, varying in percentage targets, timetables, and eligible resources. This paper examines state experience with implementing renewable portfolio standards that include energy efficiency, thermal resources, and non-renewable energy and explores compliance experience, costs, and how states evaluate, measure, and verify energy efficiency and convert thermal energy. It aims to gain insights from the experience of states for possible federal clean energy policy as well as to share experience and lessons for state RPS implementation.

  13. The use of a xylosylated plant glycoprotein as an internal standard accounting for N-linked glycan cleavage and sample preparation variability.

    Science.gov (United States)

    Walker, S Hunter; Taylor, Amber D; Muddiman, David C

    2013-06-30

    Traditionally, free oligosaccharide internal standards are used to account for variability in glycan relative quantification experiments by mass spectrometry. However, a more suitable internal standard would be a glycoprotein, which could also control for enzymatic cleavage efficiency, allowing for more accurate quantitative experiments. Hydrophobic, hydrazide N-linked glycan reagents (both native and stable-isotope labeled) are used to derivatize and differentially label N-linked glycan samples for relative quantification, and the samples are analyzed by a reversed-phase liquid chromatography chip system coupled online to a Q-Exactive mass spectrometer. The inclusion of two internal standards, maltoheptaose (previously used) and horseradish peroxidase (HRP) (novel), is studied to demonstrate the effectiveness of using a glycoprotein as an internal standard in glycan relative quantification experiments. HRP is a glycoprotein containing a xylosylated N-linked glycan, which is unique from mammalian N-linked glycans. Thus, the internal standard xylosylated glycan could be detected without interference to the sample. Additionally, it was shown that differences in cleavage efficiency can be detected by monitoring the HRP glycan. In a sample where cleavage efficiency variation is minimal, the HRP glycan performs as well as maltoheptaose. Because the HRP glycan performs as well as maltoheptaose but is also capable of correcting and accounting for cleavage variability, it is a more versatile internal standard and will be used in all subsequent biological studies. Because of the possible lot-to-lot variation of an enzyme, differences in biological matrix, and variable enzyme activity over time, it is a necessity to account for glycan cleavage variability in glycan relative quantification experiments. Copyright © 2013 John Wiley & Sons, Ltd.

  14. Variability of assay methods for total and free PSA after WHO standardization.

    Science.gov (United States)

    Foj, L; Filella, X; Alcover, J; Augé, J M; Escudero, J M; Molina, R

    2014-03-01

    The variability of total PSA (tPSA) and free PSA (fPSA) results among commercial assays has been suggested to be decreased by calibration to World Health Organization (WHO) reference materials. To characterize the current situation, it is necessary to know its impact at the critical cutoffs used in clinical practice. In the present study, we tested 167 samples with tPSA concentrations of 0 to 20 μg/L using seven tPSA and six fPSA commercial assays, including Access, ARCHITECT i2000, ADVIA Centaur XP, IMMULITE 2000, Elecsys, and Lumipulse G1200, on which we measured only tPSA. tPSA and fPSA were measured on Access using both the Hybritech and WHO calibrators. Passing-Bablok analysis was performed for tPSA and percent fPSA, with the Hybritech-calibrated Access as the comparison assay. For tPSA, relative differences were more than 10% at 0.2 μg/L for ARCHITECT i2000, and at critical concentrations of 3, 4, and 10 μg/L the 10% relative difference was exceeded by ADVIA Centaur XP and the WHO-calibrated Access. For percent fPSA, at a critical concentration of 10%, the 10% relative difference limit was exceeded by the IMMULITE 2000 assay. At critical concentrations of 20% and 25%, the ADVIA Centaur XP, ARCHITECT i2000, and IMMULITE 2000 assays exceeded the 10% relative difference limit. We have shown significant discordances between the assays included in this study despite the advances in standardization achieved in recent years. Further harmonization efforts are required in order to obtain complete clinical concordance.

  15. Clinical Implications of Glucose Variability: Chronic Complications of Diabetes

    Directory of Open Access Journals (Sweden)

    Hye Seung Jung

    2015-06-01

    Glucose variability has been identified as a potential risk factor for diabetic complications; oxidative stress is widely regarded as the mechanism by which glycemic variability induces diabetic complications. However, there remains no generally accepted gold standard for assessing glucose variability. Representative indices for measuring intraday variability include calculation of the standard deviation along with the mean amplitude of glycemic excursions (MAGE). MAGE is used to measure major intraday excursions and is easily measured using continuous glucose monitoring systems. Despite a lack of randomized controlled trials, recent clinical data suggest that long-term glycemic variability, as determined by variability in hemoglobin A1c, may contribute to the development of microvascular complications. Intraday glycemic variability is also suggested to accelerate coronary artery disease in high-risk patients.
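
MAGE is conventionally computed by averaging only those peak-to-nadir glucose excursions that exceed one standard deviation of the whole profile. A simplified sketch of that rule (the CGM readings below are hypothetical, and real implementations of Service's method add extra bookkeeping around excursion direction):

```python
import statistics

def mage(glucose):
    """Simplified MAGE: mean of peak-to-nadir excursions that exceed one
    standard deviation of the whole profile."""
    sd = statistics.pstdev(glucose)
    # turning points (local maxima/minima), keeping both endpoints
    turns = [glucose[0]] + [
        g for prev, g, nxt in zip(glucose, glucose[1:], glucose[2:])
        if (g - prev) * (nxt - g) < 0
    ] + [glucose[-1]]
    excursions = [abs(b - a) for a, b in zip(turns, turns[1:])]
    big = [e for e in excursions if e > sd]
    return sum(big) / len(big) if big else 0.0

# Hypothetical CGM readings (mg/dL)
profile = [90, 120, 180, 140, 100, 160, 110]
```

The SD-threshold step is what distinguishes MAGE from the plain standard deviation mentioned above: small oscillations are ignored, and only major excursions contribute.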

  16. Use of CTX-I and PINP as bone turnover markers: National Bone Health Alliance recommendations to standardize sample handling and patient preparation to reduce pre-analytical variability.

    Science.gov (United States)

    Szulc, P; Naylor, K; Hoyle, N R; Eastell, R; Leary, E T

    2017-09-01

    The National Bone Health Alliance (NBHA) recommends standardized sample handling and patient preparation for C-terminal telopeptide of type I collagen (CTX-I) and N-terminal propeptide of type I procollagen (PINP) measurements to reduce pre-analytical variability. Controllable and uncontrollable patient-related factors are reviewed to facilitate interpretation and minimize pre-analytical variability. The IOF and the International Federation of Clinical Chemistry (IFCC) Bone Marker Standards Working Group have identified PINP and CTX-I in blood to be the reference markers of bone turnover for fracture risk prediction and monitoring of osteoporosis treatment. Although used in clinical research for many years, bone turnover markers (BTM) have not been widely adopted in clinical practice, primarily due to their poor within-subject and between-lab reproducibility. The NBHA Bone Turnover Marker Project team aims to reduce the pre-analytical variability of CTX-I and PINP measurements through standardized sample handling and patient preparation. Recommendations for sample handling and patient preparation were made based on a review of available publications and pragmatic considerations to reduce pre-analytical variability. Controllable and uncontrollable patient-related factors were reviewed to facilitate interpretation and sample collection. Samples for CTX-I must be collected consistently in the morning hours in the fasted state. EDTA plasma is preferred for CTX-I for its greater sample stability. Sample collection conditions for PINP are less critical, as PINP has minimal circadian variability and is not affected by food intake. Sample stability limits should be observed. The uncontrollable aspects (age, sex, pregnancy, immobility, recent fracture, co-morbidities, anti-osteoporotic drugs, other medications) should be considered in BTM interpretation. Adopting standardized sample handling and patient preparation procedures will significantly reduce controllable pre-analytical variability.

  17. Standardized Competencies for Parenteral Nutrition Order Review and Parenteral Nutrition Preparation, Including Compounding: The ASPEN Model.

    Science.gov (United States)

    Boullata, Joseph I; Holcombe, Beverly; Sacks, Gordon; Gervasio, Jane; Adams, Stephen C; Christensen, Michael; Durfee, Sharon; Ayers, Phil; Marshall, Neil; Guenter, Peggi

    2016-08-01

    Parenteral nutrition (PN) is a high-alert medication with a complex drug use process. Key steps in the process include the review of each PN prescription followed by the preparation of the formulation. The preparation step includes compounding the PN or activating a standardized commercially available PN product. The verification and review, as well as preparation of this complex therapy, require competency that may be determined by using a standardized process for pharmacists and for pharmacy technicians involved with PN. An American Society for Parenteral and Enteral Nutrition (ASPEN) standardized model for PN order review and PN preparation competencies is proposed based on a competency framework, the ASPEN-published interdisciplinary core competencies, safe practice recommendations, and clinical guidelines, and is intended for institutions and agencies to use with their staff. © 2016 American Society for Parenteral and Enteral Nutrition.

  18. Variability of standard artificial soils: Physico-chemical properties and phenanthrene desorption measured by means of supercritical fluid extraction

    International Nuclear Information System (INIS)

    Bielská, Lucie; Hovorková, Ivana; Komprdová, Klára; Hofman, Jakub

    2012-01-01

    The study is focused on artificial soil, which is supposed to be a standardized “soil-like” medium. We compared the physico-chemical properties and the extractability of phenanthrene (Phe) from 25 artificial soils prepared according to OECD standardized procedures at different laboratories. A substantial range of soil properties was found, also for parameters which should be standardized because they have an important influence on the bioavailability of pollutants (e.g. total organic carbon ranged from 1.4 to 6.1%). The extractability of Phe was measured by supercritical fluid extraction (SFE) at harsh and mild conditions. Highly variable Phe extractability from the different soils (3–89%) was observed. The extractability was strongly related (R² = 0.87) to total organic carbon content, 0.1–2 mm particle size, and humic/fulvic acid ratio in the following multiple regression model: SFE (%) = 1.35 * sand (%) − 0.77 * TOC (%)² + 0.27 * HA/FA. - Highlights: ► We compared properties and extractability of Phe from 25 different artificial soils. ► Substantial range of soil properties was found, also for important parameters. ► Phe extractability was measured by supercritical fluid extraction (SFE) at 2 modes. ► Phe extractability was highly variable from different soils (3–89%). ► Extractability was strongly related to TOC, 0.1–2 mm particles, and HA/FA. - Significant variability in physico-chemical properties exists between artificial soils prepared at different laboratories and affects the behavior of contaminants in these soils.

  19. On the use of Standardized Drought Indices under decadal climate variability: Critical assessment and drought policy implications

    Science.gov (United States)

    Núñez, J.; Rivera, D.; Oyarzún, R.; Arumí, J. L.

    2014-09-01

    Since the recent High Level Meeting on National Drought Policy held in Geneva in 2013, a greater concern about the creation and adaptation of national drought monitoring systems is expected. Consequently, backed by international recommendations, the use of Standardized Drought Indices (SDI), such as the Standardized Precipitation Index (SPI), as an operational basis of drought monitoring systems has been increasing in many parts of the world. Recommendations for the use of the SPI, and consequently, those indices that share its properties, do not take into account the limitations that this type of index can exhibit under the influence of multidecadal climate variability. These limitations are fundamentally related to the lack of consistency among the operational definition expressed by this type of index, the conceptual definition with which it is associated and the political definition it supports. Furthermore, the limitations found are not overcome by the recommendations for their application. This conclusion is supported by the long-term study of the Standardized Streamflow Index (SSI) in the arid north-central region of Chile, under the influence of multidecadal climate variability. The implications of the findings of the study are discussed with regard to their link to aspects of drought policy in the cases of Australia, the United States and Chile.
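
Operational SPI fits a gamma distribution to the precipitation record and maps its cumulative probability through the inverse standard normal CDF, so SPI values are in standard-deviation units around a long-term median of zero. The sketch below uses a simpler non-parametric variant (rank-based plotting positions) to show only the standardization step; the precipitation totals are hypothetical:

```python
from statistics import NormalDist

def spi_empirical(precip):
    """Non-parametric SPI sketch: each precipitation total is ranked,
    converted to a Gringorten plotting position, and mapped through the
    inverse standard normal CDF. (Operational SPI fits a gamma
    distribution instead of using empirical ranks.)"""
    n = len(precip)
    order = sorted(range(n), key=lambda i: precip[i])
    nd = NormalDist()
    spi = [0.0] * n
    for rank, i in enumerate(order, start=1):
        p = (rank - 0.44) / (n + 0.12)  # Gringorten plotting position
        spi[i] = nd.inv_cdf(p)
    return spi

# Hypothetical seasonal precipitation totals (mm); negative SPI = dry
totals = [30, 55, 41, 80, 12, 60, 47, 35, 66, 25]
spi_values = spi_empirical(totals)
```

The critique in the abstract applies directly here: because the standardization is relative to the calibration record, multidecadal climate variability in that record shifts what "normal" means and hence what the index flags as drought.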

  20. Evaluation of Dogs with Border Collie Collapse, Including Response to Two Standardized Strenuous Exercise Protocols.

    Science.gov (United States)

    Taylor, Susan; Shmon, Cindy; Su, Lillian; Epp, Tasha; Minor, Katie; Mickelson, James; Patterson, Edward; Shelton, G Diane

    2016-01-01

    Clinical and metabolic variables were evaluated in 13 dogs with border collie collapse (BCC) before, during, and following completion of standardized strenuous exercise protocols. Six dogs participated in a ball-retrieving protocol, and seven dogs participated in a sheep-herding protocol. Findings were compared with 16 normal border collies participating in the same exercise protocols (11 retrieving, five herding). Twelve dogs with BCC developed abnormal mentation and/or an abnormal gait during evaluation. All dogs had post-exercise elevations in rectal temperature, pulse rate, arterial blood pH, PaO2, and lactate, and decreased PaCO2 and bicarbonate, as expected with strenuous exercise, but there were no significant differences between BCC dogs and normal dogs. Electrocardiography demonstrated sinus tachycardia in all dogs following exercise. Needle electromyography was normal, and evaluation of muscle biopsy cryosections using a standard panel of histochemical stains and reactions did not reveal a reason for collapse in 10 dogs with BCC in which these tests were performed. Genetic testing excluded the dynamin-1 related exercise-induced collapse mutation and the V547A malignant hyperthermia mutation as the cause of BCC. Common reasons for exercise intolerance were eliminated. Although a genetic basis is suspected, the cause of collapse in BCC was not determined.

  1. [Roaming through methodology. XXXVIII. Common misconceptions involving standard deviation and standard error]

    NARCIS (Netherlands)

    Mokkink, H.G.A.

    2002-01-01

    Standard deviation and standard error have a clear mutual relationship, but at the same time they differ strongly in the type of information they supply. This can lead to confusion and misunderstandings. Standard deviation describes the variability in a sample of measures of a variable, for instance
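
The distinction the article draws can be seen in a few lines: the standard deviation describes the spread of the individual measurements, while the standard error (SD divided by the square root of n) describes the precision of the sample mean. A minimal sketch with made-up data:

```python
import statistics

def sd_and_se(sample):
    """Standard deviation: spread of individual observations.
    Standard error: SD / sqrt(n), the precision of the sample mean."""
    sd = statistics.stdev(sample)
    return sd, sd / len(sample) ** 0.5

# Hypothetical sample of four measurements
sd, se = sd_and_se([4, 8, 6, 2])  # SD ~2.58, SE ~1.29
```

Collecting more data at the same spread leaves the SD essentially unchanged but shrinks the SE, which is why quoting one in place of the other is the misconception the article targets.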

  2. Cytomegalovirus sequence variability, amplicon length, and DNase-sensitive non-encapsidated genomes are obstacles to standardization and commutability of plasma viral load results.

    Science.gov (United States)

    Naegele, Klaudia; Lautenschlager, Irmeli; Gosert, Rainer; Loginov, Raisa; Bir, Katia; Helanterä, Ilkka; Schaub, Stefan; Khanna, Nina; Hirsch, Hans H

    2018-04-22

    Cytomegalovirus (CMV) management post-transplantation relies on quantification in blood, but inter-laboratory and inter-assay variability impairs commutability. An international multicenter study demonstrated that variability is mitigated by standardizing plasma volumes, automating DNA extraction and amplification, and calibration to the 1st-CMV-WHO-International-Standard, as in the FDA-approved Roche-CAP/CTM-CMV. However, Roche-CAP/CTM-CMV showed under-quantification and false-negative results in a quality assurance program (UK-NEQAS-2014). This study aimed to evaluate factors contributing to the quantification variability of CMV viral load and to develop optimized CMV-UL54-QNAT assays. The UL54 target of the UK-NEQAS-2014 variant was sequenced and compared to 329 available CMV GenBank sequences. Four Basel-CMV-UL54-QNAT assays of 361 bp, 254 bp, 151 bp, and 95 bp amplicons were developed that differed only in reverse primer positions. The assays were validated using plasmid dilutions and the UK-NEQAS-2014 sample, as well as 107 frozen and 69 prospectively collected plasma samples from transplant patients submitted for CMV QNAT, with and without DNase digestion prior to nucleic acid extraction. Eight of 43 mutations were identified as relevant in the UK-NEQAS-2014 target. All Basel-CMV-UL54 QNATs quantified the UK-NEQAS-2014 sample but revealed 10-fold increasing CMV loads as amplicon size decreased. The inverse correlation of amplicon size and viral load was confirmed using the 1st-WHO-International-Standard and patient samples. DNase pre-treatment reduced plasma CMV loads by >90%, indicating the presence of unprotected CMV genomic DNA. Sequence variability, amplicon length, and non-encapsidated genomes obstruct the standardization and commutability of CMV loads needed to develop thresholds for clinical research and management. Besides regular sequence surveys and matrix and extraction standardization, we propose developing reference calibrators using 100 bp amplicons. Copyright © 2018 Elsevier B.V. All rights reserved.

  3. 32 CFR 37.615 - What standards do I include for financial systems of for-profit firms?

    Science.gov (United States)

    2010-07-01

    ... 32 National Defense 1 2010-07-01 2010-07-01 false What standards do I include for financial... SECRETARY OF DEFENSE DoD GRANT AND AGREEMENT REGULATIONS TECHNOLOGY INVESTMENT AGREEMENTS Award Terms Affecting Participants' Financial, Property, and Purchasing Systems Financial Matters § 37.615 What...

  4. Variability of gastric emptying time using standardized radiolabeled meals

    International Nuclear Information System (INIS)

    Christian, P.E.; Brophy, C.M.; Egger, M.J.; Taylor, A.; Moore, J.G.

    1984-01-01

    To define the range of inter- and intra-subject variability in gastric emptying measurements, eight healthy male subjects (ages 19-40) received meals on four separate occasions. The meal consisted of 150 g of beef stew labeled with Tc-99m SC labeled liver (600 μCi) and 150 g of orange juice containing In-111 DTPA (100 μCi) as the solid- and liquid-phase markers, respectively. Images of the solid and liquid phases were obtained at 20 min intervals immediately after meal ingestion. The stomach region was selected from digital images, and the data were corrected for radionuclide interference and radioactive decay and expressed as the geometric mean of anterior and posterior counts. More absolute variability was seen with the solid than the liquid marker emptying for the group. The mean solid half-emptying time was 58 ± 17 min (range 29-92) while the mean liquid half-emptying time was 24 ± 8 min (range 12-37). A nested random-effects analysis of variance showed moderate intra-subject variability for solid half-emptying times (rho = 0.4594), and high intra-subject variability was implied by a low correlation (rho = 0.2084) for liquid half-emptying. The average inter-subject differences were 58.3% of the total variance for solids (p = 0.0017). For liquids, the inter-subject variability was 69.1% of the total variance, but was only suggestive of statistical significance (p = 0.0666). The normal half-emptying time for gastric emptying of liquids and solids is a variable phenomenon in healthy subjects and has great inter- and intra-individual day-to-day differences.
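
    The count corrections described above (radioactive decay and the geometric mean of opposed views) can be sketched in a few lines. The count values are hypothetical; the Tc-99m half-life is the only physical constant assumed:

    ```python
    import math

    TC99M_HALF_LIFE_MIN = 360.6  # physical half-life of Tc-99m, ~6.01 h

    def decay_correct(counts, minutes_elapsed, half_life=TC99M_HALF_LIFE_MIN):
        """Correct raw counts back to time zero for radioactive decay."""
        return counts * 2 ** (minutes_elapsed / half_life)

    def geometric_mean_counts(anterior, posterior):
        """Geometric mean of opposed anterior/posterior views compensates for
        attenuation varying with source depth in the abdomen."""
        return math.sqrt(anterior * posterior)

    # Hypothetical stomach-ROI counts at t = 60 min post-ingestion
    ant, post = 1800.0, 2600.0
    gm = geometric_mean_counts(ant, post)
    corrected = decay_correct(gm, 60)
    ```

    Applying both steps to each 20-minute frame yields the activity-versus-time curve from which the half-emptying times are read off.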

  5. Variability of gastric emptying time using standardized radiolabeled meals

    Energy Technology Data Exchange (ETDEWEB)

    Christian, P.E.; Brophy, C.M.; Egger, M.J.; Taylor, A.; Moore, J.G.

    1984-01-01

    To define the range of inter- and intra-subject variability in gastric emptying measurements, eight healthy male subjects (ages 19-40) received meals on four separate occasions. The meal consisted of 150 g of beef stew labeled with Tc-99m SC labeled liver (600 μCi) and 150 g of orange juice containing In-111 DTPA (100 μCi) as the solid- and liquid-phase markers, respectively. Images of the solid and liquid phases were obtained at 20 min intervals immediately after meal ingestion. The stomach region was selected from digital images, and the data were corrected for radionuclide interference and radioactive decay and expressed as the geometric mean of anterior and posterior counts. More absolute variability was seen with the solid than the liquid marker emptying for the group. The mean solid half-emptying time was 58 ± 17 min (range 29-92) while the mean liquid half-emptying time was 24 ± 8 min (range 12-37). A nested random-effects analysis of variance showed moderate intra-subject variability for solid half-emptying times (rho = 0.4594), and high intra-subject variability was implied by a low correlation (rho = 0.2084) for liquid half-emptying. The average inter-subject differences were 58.3% of the total variance for solids (p = 0.0017). For liquids, the inter-subject variability was 69.1% of the total variance, but was only suggestive of statistical significance (p = 0.0666). The normal half-emptying time for gastric emptying of liquids and solids is a variable phenomenon in healthy subjects and has great inter- and intra-individual day-to-day differences.

  6. Variability in Measured Space Temperatures in 60 Homes

    Energy Technology Data Exchange (ETDEWEB)

    Roberts, D.; Lay, K.

    2013-03-01

    This report discusses the observed variability in indoor space temperature in a set of 60 homes located in Florida, New York, Oregon, and Washington. Temperature data were collected at 15-minute intervals for an entire year, including living room, master bedroom, and outdoor air temperature (Arena et al.). The data were examined to establish the average living room temperature for the set of homes for the heating and cooling seasons, the variability of living room temperature depending on climate, and the variability of indoor space temperature within the homes. The accuracy of software-based energy analysis depends on the accuracy of input values. Thermostat set point is one of the most influential inputs for building energy simulation. Several industry standards exist that recommend differing default thermostat settings for heating and cooling seasons. These standards were compared to the values calculated for this analysis. The data examined for this report show that there is a definite difference between the climates and that the data do not agree well with any particular standard.

  7. PaCAL: A Python Package for Arithmetic Computations with Random Variables

    Directory of Open Access Journals (Sweden)

    Marcin Korzeń

    2014-05-01

    In this paper we present PaCAL, a Python package for arithmetic computations on random variables. The package is capable of performing the four arithmetic operations: addition, subtraction, multiplication and division, as well as computing many standard functions of random variables. Summary statistics, random number generation, plots, and histograms of the resulting distributions can easily be obtained, and distribution parameter fitting is also available. The operations are performed numerically and their results interpolated, allowing for arbitrary arithmetic operations on random variables following practically any probability distribution encountered in practice. The package is easy to use, as operations on random variables are performed just as they are on standard Python variables. Independence of random variables is, by default, assumed at each step, but some computations on dependent random variables are also possible. We demonstrate on several examples that the results are very accurate, often close to machine precision. Practical applications include statistics, physical measurements, and estimation of error distributions in scientific computations.
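
    PaCAL computes the resulting distributions numerically (by convolution and interpolation) rather than by sampling. As a rough, standard-library-only illustration of what "arithmetic on random variables" means, a Monte Carlo analogue might look like the sketch below; the function names are ours, not PaCAL's:

    ```python
    import random
    import statistics

    def mc_sample(op, dist_a, dist_b, n=200_000, seed=1):
        """Approximate the distribution of op(A, B) for independent random
        variables A and B by Monte Carlo sampling. PaCAL instead computes the
        resulting density numerically, without sampling."""
        rng = random.Random(seed)
        return [op(dist_a(rng), dist_b(rng)) for _ in range(n)]

    # Sum of two independent standard normals: N(0,1) + N(0,1) ~ N(0, sqrt(2))
    std_normal = lambda rng: rng.gauss(0.0, 1.0)
    total = mc_sample(lambda a, b: a + b, std_normal, std_normal)

    mean = statistics.fmean(total)   # close to 0
    sd = statistics.stdev(total)     # close to sqrt(2) ~ 1.414
    ```

    In PaCAL the same computation is written directly on distribution objects (e.g. adding two normal distributions), and the result is an exact density rather than a sample.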

  8. Linear latent variable models: the lava-package

    DEFF Research Database (Denmark)

    Holst, Klaus Kähler; Budtz-Jørgensen, Esben

    2013-01-01

    An R package for specifying and estimating linear latent variable models is presented. The philosophy of the implementation is to separate the model specification from the actual data, which leads to a dynamic and easy way of modeling complex hierarchical structures. Several advanced features are implemented, including robust standard errors for clustered correlated data, multigroup analyses, non-linear parameter constraints, inference with incomplete data, maximum likelihood estimation with censored and binary observations, and instrumental variable estimators. In addition, an extensive simulation…

  9. Progress Report on the Airborne Composition Standard Variable Name and Time Series Working Groups of the 2017 ESDSWG

    Science.gov (United States)

    Evans, K. D.; Early, A. B.; Northup, E. A.; Ames, D. P.; Teng, W. L.; Olding, S. W.; Krotkov, N. A.; Arctur, D. K.; Beach, A. L., III; Silverman, M. L.

    2017-12-01

    The role of NASA's Earth Science Data Systems Working Groups (ESDSWG) is to make recommendations relevant to NASA's Earth science data systems from users' experiences and community insight. Each group works independently, focusing on a unique topic. Progress of two of the 2017 Working Groups will be presented. In a single airborne field campaign, there can be several different instruments and techniques that measure the same parameter on one or more aircraft platforms. Many of these same parameters are measured during different airborne campaigns using similar or different instruments and techniques. The Airborne Composition Standard Variable Name Working Group is working to create a list of variable standard names that can be used across all airborne field campaigns in order to assist in the transition to the ICARTT Version 2.0 file format. The overall goal is to enhance the usability of ICARTT files and the search ability of airborne field campaign data. The Time Series Working Group (TSWG) is a continuation of the 2015 and 2016 Time Series Working Groups. In 2015, we started TSWG with the intention of exploring the new OGC (Open Geospatial Consortium) WaterML 2 standards as a means for encoding point-based time series data from NASA satellites. In this working group, we realized that WaterML 2 might not be the best solution for this type of data, for a number of reasons. Our discussion with experts from other agencies, who have worked on similar issues, identified several challenges that we would need to address. As a result, we made the recommendation to study the new TimeseriesML 1.0 standard of OGC as a potential NASA time series standard. The 2016 TSWG examined closely the TimeseriesML 1.0 and, in coordination with the OGC TimeseriesML Standards Working Group, identified certain gaps in TimeseriesML 1.0 that would need to be addressed for the standard to be applicable to NASA time series data. An engineering report was drafted based on the OGC Engineering

  10. A tool for standardized collector performance calculations including PVT

    DEFF Research Database (Denmark)

    Perers, Bengt; Kovacs, Peter; Olsson, Marcus

    2012-01-01

    A tool for standardized calculation of solar collector performance has been developed in cooperation between SP Technical Research Institute of Sweden, DTU Denmark and SERC Dalarna University. The tool is designed to calculate the annual performance of solar collectors at representative locations… A PVT collector can be tested and modeled as a thermal collector when the PV electric part is active with an MPP tracker in operation. The thermal collector parameters from this operation mode are used for the PVT calculations.

  11. Multiple variables data sets visualization in ROOT

    International Nuclear Information System (INIS)

    Couet, O

    2008-01-01

    The ROOT graphical framework provides support for many different functions including basic graphics, high-level visualization techniques, output to files, 3D viewing, etc. It uses well-known world standards to render graphics on screen, to produce high-quality output files, and to generate images for Web publishing. Many techniques allow visualization of all the basic ROOT data types, but the graphical framework was still a bit weak in the visualization of multiple-variable data sets. This paper presents the latest developments in the ROOT framework for visualizing multiple-variable (>4) data sets.

  12. Standardizing effect size from linear regression models with log-transformed variables for meta-analysis.

    Science.gov (United States)

    Rodríguez-Barranco, Miguel; Tobías, Aurelio; Redondo, Daniel; Molina-Portillo, Elena; Sánchez, María José

    2017-03-17

    Meta-analysis is very useful to summarize the effect of a treatment or a risk factor for a given disease. Often studies report results based on log-transformed variables in order to achieve the principal assumptions of a linear regression model. If this is the case for some, but not all, studies, the effects need to be homogenized. We derived a set of formulae to transform absolute changes into relative ones, and vice versa, to allow all results to be included in a meta-analysis. We applied our procedure to all possible combinations of log-transformed independent or dependent variables. We also evaluated it in a simulation based on two variables either normally or asymmetrically distributed. In all the scenarios, and based on different change criteria, the effect size estimated by the derived set of formulae was equivalent to the real effect size. To avoid biased estimates of the effect, this procedure should be used with caution in the case of independent variables with asymmetric distributions that significantly differ from the normal distribution. We illustrate this procedure with an application to a meta-analysis on the potential effects on neurodevelopment in children exposed to arsenic and manganese. The proposed procedure has been shown to be valid and capable of expressing the effect size of a linear regression model based on different change criteria in the variables. Homogenizing the results from different studies beforehand allows them to be combined in a meta-analysis, independently of whether the transformations had been performed on the dependent and/or independent variables.
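
    The abstract does not reproduce the derived formulae, but the standard textbook conversions for log-transformed regression variables, which such formulae generalize, can be sketched as follows (the coefficients below are hypothetical):

    ```python
    import math

    def pct_change_from_log_y(beta, delta_x=1.0):
        """Log-transformed DEPENDENT variable (ln y = a + b*x): a delta_x-unit
        increase in x multiplies y by exp(b*delta_x), i.e. a
        100*(exp(b*delta_x) - 1) percent change in y."""
        return 100.0 * math.expm1(beta * delta_x)

    def abs_change_from_log_x(beta, pct_increase_x):
        """Log-transformed INDEPENDENT variable (y = a + b*ln x): a
        pct_increase_x percent increase in x changes y by
        b * ln(1 + pct_increase_x/100) units."""
        return beta * math.log1p(pct_increase_x / 100.0)

    # Hypothetical coefficients for illustration
    rel = pct_change_from_log_y(0.05)      # ~5.13% increase in y per unit of x
    absolute = abs_change_from_log_x(2.0, 10)  # change in y for a 10% rise in x
    ```

    Converting every study's coefficient onto a common scale like this is the homogenization step that lets absolute and relative effects be pooled in one meta-analysis.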

  13. The quest to standardize hemodialysis care.

    Science.gov (United States)

    Hegbrant, Jörgen; Gentile, Giorgio; Strippoli, Giovanni F M

    2011-01-01

    A large global dialysis provider's core activities include providing dialysis care with excellent quality, ensuring a low variability across the clinic network and ensuring strong focus on patient safety. In this article, we summarize the pertinent components of the quality assurance and safety program of the Diaverum Renal Services Group. Concerning medical performance, the key components of a successful quality program are setting treatment targets; implementing evidence-based guidelines and clinical protocols; consistently, regularly, prospectively and accurately collecting data from all clinics in the network; processing collected data to provide feedback to clinics in a timely manner, incorporating information on interclinic and intercountry variations; and revising targets, guidelines and clinical protocols based on sound scientific data. The key activities for ensuring patient safety include a standardized approach to education, i.e. a uniform education program including control of theoretical knowledge and clinical competencies; implementation of clinical policies and procedures in the organization in order to reduce variability and potential defects in clinic practice; and auditing of clinical practice on a regular basis. By applying a standardized and systematic continuous quality improvement approach throughout the entire organization, it has been possible for Diaverum to progressively improve medical performance and ensure patient safety. Copyright © 2011 S. Karger AG, Basel.

  14. Deriving Daytime Variables From the AmeriFlux Standard Eddy Covariance Data Set

    Energy Technology Data Exchange (ETDEWEB)

    van Ingen, Catharine [Berkeley Water Center. Berkeley, CA (United States); Microsoft. San Francisco, CA (United States); Agarwal, Deborah A. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Berkeley Water Center. Berkeley, CA (United States); Univ. of California, Berkeley, CA (United States); Humphrey, Marty [Univ. of Virginia, Charlottesville, VA (United States); Li, Jie [Univ. of Virginia, Charlottesville, VA (United States)

    2008-12-06

    A gap-filled, quality-assessed eddy covariance dataset has recently become available for the AmeriFlux network. This dataset uses standard processing and produces commonly used science variables. This shared dataset enables robust comparisons across different analyses. Of course, there are many remaining questions. One of those is how to define 'during the day', which is an important concept for many analyses. Some studies have used local time — for example, 9am to 5pm; others have used thresholds on photosynthetically active radiation (PAR). A related question is how to derive quantities such as the Bowen ratio. Most studies compute the ratio of the averages of the latent heat (LE) and sensible heat (H). In this study, we use different methods of defining 'during the day' for GPP, LE, and H. We evaluate the differences between methods in two ways. First, we look at a number of statistics of GPP. Second, we look at differences in the derived Bowen ratio. Our goal is not science per se, but rather informatics in support of the science.
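
    The two 'during the day' definitions being compared can be illustrated with synthetic flux records; the window and threshold values below are illustrative, not the ones used by the authors:

    ```python
    # Synthetic records: (hour_of_day, PAR in umol m-2 s-1, H, LE in W m-2)
    records = [
        (8.0,   400.0,  60.0, 110.0),
        (9.0,   800.0, 120.0, 200.0),
        (12.0, 1500.0, 250.0, 380.0),
        (15.0,  900.0, 150.0, 240.0),
        (18.0,   60.0,  20.0,  45.0),
        (22.0,    0.0, -15.0,   5.0),
    ]

    def daytime_clock(rec, start=9.0, end=17.0):
        """Clock-based definition: local time within a fixed window."""
        return start <= rec[0] < end

    def daytime_par(rec, threshold=100.0):
        """Radiation-based definition: PAR above a threshold."""
        return rec[1] >= threshold

    def bowen_ratio(rows):
        """Ratio of average sensible heat H to average latent heat LE,
        the ratio-of-averages approach described in the abstract."""
        h = sum(r[2] for r in rows) / len(rows)
        le = sum(r[3] for r in rows) / len(rows)
        return h / le

    clock_day = [r for r in records if daytime_clock(r)]
    par_day = [r for r in records if daytime_par(r)]
    # The sunlit 8:00 record is 'daytime' under the PAR rule but not the clock
    # rule, so the two definitions yield different Bowen ratios.
    ```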

  15. How to include the variability of TMS responses in simulations: a speech mapping case study

    Science.gov (United States)

    De Geeter, N.; Lioumis, P.; Laakso, A.; Crevecoeur, G.; Dupré, L.

    2016-11-01

    When delivered over a specific cortical site, TMS can temporarily disrupt the ongoing process in that area. This allows mapping of speech-related areas for preoperative evaluation purposes. We numerically explore the observed variability of TMS responses during a speech mapping experiment performed with a neuronavigation system. We selected four cases with very small perturbations in coil position and orientation. In one case (E) a naming error occurred, while in the other cases (NEA, NEB, NEC) the subject named the images as smoothly as without TMS. A realistic anisotropic head model of the subject was constructed from T1-weighted and diffusion-weighted MRI. The induced electric field distributions were computed from the coil parameters retrieved from the neuronavigation system. Finally, the membrane potentials along relevant white matter fibre tracts, extracted from DTI-based tractography, were computed using a compartmental cable equation. While only minor differences could be noticed between the induced electric field distributions of the four cases, computing the corresponding membrane potentials revealed that different subsets of tracts were activated. A single tract was activated for all coil positions. Another tract was only triggered for case E. NEA induced action potentials in 13 tracts, while NEB stimulated 11 tracts and NEC one. The calculated results are certainly sensitive to the coil specifications, demonstrating the variability observed in this study. However, even though a tract connecting Broca's with Wernicke's area is only triggered for the error case, further research is needed on other study cases and on refining the neural model with synapses and network connections. Case- and subject-specific modelling that includes both electromagnetic fields and neuronal activity enables demonstration of the variability in TMS experiments and can capture the interaction with complex neural networks.

  16. Generation of gaseous methanol reference standards

    International Nuclear Information System (INIS)

    Geib, R.C.

    1991-01-01

    Methanol has been proposed as an automotive fuel component. Reliable, accurate methanol standards are essential to support widespread monitoring programs. The monitoring programs may include quantification of methanol from tailpipe emissions and evaporative emissions, plus ambient air methanol measurements. This paper presents approaches and results from the authors' investigation to develop high-accuracy methanol standards. The variables on which the authors report results are as follows: (1) stability of methanol gas standards; the studies focus on preparation requirements and stability results from 10 to 1,000 ppmv; (2) cylinder-to-instrument delivery system components and purge technique; these studies dealt with materials in contact with the sample stream plus static versus flow injection; (3) optimization of the gas chromatographic analytical system; (4) gas chromatography and process analyzer results and their utility for methanol analysis; (5) the accuracy of the methanol standards, qualified using data from multiple studies including: (a) gravimetric preparation; (b) linearity studies; (c) independent standards sources such as low-pressure containers and diffusion tubes. The accuracy will be provided as a propagation of error from multiple sources. The methanol target concentrations will be 10 to 500 ppmv.
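
    The "propagation of error from multiple sources" mentioned in item (5) is commonly computed as a root-sum-of-squares of independent relative uncertainties. A minimal sketch, with an entirely hypothetical uncertainty budget:

    ```python
    import math

    def combined_relative_uncertainty(*rel_uncertainties):
        """Root-sum-of-squares combination of independent relative
        (fractional) uncertainties, e.g. from weighing, source purity,
        and analytical verification."""
        return math.sqrt(sum(u * u for u in rel_uncertainties))

    # Hypothetical 1-sigma relative uncertainties for a 100 ppmv standard
    u_gravimetric = 0.002   # 0.2% weighing
    u_purity = 0.005        # 0.5% source-material purity
    u_analysis = 0.010      # 1.0% GC verification
    u_total = combined_relative_uncertainty(u_gravimetric, u_purity, u_analysis)
    # combined relative uncertainty is dominated by the largest term
    ```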

  17. Standardization of biodosimetry operations

    International Nuclear Information System (INIS)

    Dainiak, Nicholas

    2016-01-01

    Methods and procedures for generating, interpreting and scoring the frequency of dicentric chromosomes vary among cytogenetic biodosimetry laboratories (CBLs). This variation adds to the already considerable lack of precision inherent in the dicentric chromosome assay (DCA). Although variability in sample collection, cell preparation, equipment and dicentric frequency scoring can never be eliminated with certainty, it can be substantially minimized, resulting in reduced scatter and improved precision. Use of standard operating procedures (SOPs) and technician exchange may help to mitigate variation. Although the development and adoption of international standards (ISO 21243 and ISO 19238) has helped to reduce variation in SOPs, all CBLs must maintain process improvement, and those with challenges may require additional assistance. Sources of variation that may not be readily apparent in the SOPs for sample collection and processing include variability in ambient laboratory conditions, media, serum lot and quantity, and the use of particular combinations of cytokines. Variability in the maintenance and calibration of Metafer equipment, and in scoring criteria, reader proficiency and personal factors, may need to be addressed. The calibration curve itself is a source of variation that requires control, using the same known-dose samples among CBLs, measurement of central tendency, and generation of common curves with periodic reassessment to detect drifts in dicentric yield. Finally, the dose estimate should be based on common scoring criteria and use of the z-statistic. Although theoretically possible, it is practically impossible to propagate uncertainty over the entire calibration curve due to the many factors contributing to variance. Periodic re-evaluation of the curve is needed, by comparison with newly published curves (using statistical analysis of differences) and determination of the potential causes of any differences. (author)
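
    Dose estimation from a dicentric calibration curve typically inverts a linear-quadratic yield function. A minimal sketch with illustrative (not laboratory-specific) coefficients shows why differently fitted curves among CBLs yield different dose estimates from the same observed dicentric frequency:

    ```python
    import math

    def dose_from_dicentric_yield(y, c=0.001, alpha=0.02, beta=0.06):
        """Invert a linear-quadratic calibration curve
        Y = c + alpha*D + beta*D**2 for the absorbed dose D (Gy).
        The coefficients here are illustrative only; each CBL fits its own
        curve, which is the inter-laboratory variability discussed above."""
        # positive root of beta*D^2 + alpha*D + (c - y) = 0
        disc = alpha * alpha - 4.0 * beta * (c - y)
        return (-alpha + math.sqrt(disc)) / (2.0 * beta)

    # Round-trip check: dicentric yield expected at 2 Gy, then invert
    d = 2.0
    y = 0.001 + 0.02 * d + 0.06 * d * d
    recovered = dose_from_dicentric_yield(y)  # recovers ~2.0 Gy
    ```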

  18. Understanding morphological variability in a taxonomic context in Chilean diplomystids (Teleostei: Siluriformes), including the description of a new species

    Directory of Open Access Journals (Sweden)

    Gloria Arratia

    2017-02-01

    Following study of the external morphology and its unmatched variability throughout ontogeny, and a re-examination of selected morphological characters based on many specimens of diplomystids from Central and South Chile, we revised and emended previous specific diagnoses and consider Diplomystes chilensis, D. nahuelbutaensis, D. camposensis, and Olivaichthys viedmensis (Baker River) to be valid species. Another group, previously identified as Diplomystes sp., D. spec., D. aff. chilensis, and D. cf. chilensis, inhabiting rivers between the Rapel and Itata Basins, is given a new specific name (Diplomystes incognitus) and is diagnosed. An identification key to the Chilean species, including the new species, is presented. All specific diagnoses are based on external morphological characters, such as aspects of the skin, neuromast lines, and main lateral line, and the position of the anus and urogenital pore, as well as certain osteological characters, to facilitate the identification of these species, which previously was based on many internal characters. Diplomystids below 150 mm standard length (SL) share a similar external morphology and body proportions that make identification difficult; however, specimens over 150 mm SL can be diagnosed by the position of the urogenital pore and anus, and a combination of external and internal morphological characters. According to current knowledge, diplomystid species have an allopatric distribution, with each species apparently endemic to particular basins in continental Chile and one species (O. viedmensis) known only from one river in the Chilean Patagonia, but distributed extensively in southern Argentina.

  19. Standard leach tests for nuclear waste materials

    International Nuclear Information System (INIS)

    Strachan, D.M.; Barnes, B.O.; Turcotte, R.P.

    1980-01-01

    Five leach tests were conducted to study time-dependent leaching of waste forms (glass). The first four tests include temperature as a variable and the use of three standard leachants. Three of the tests are static and two are dynamic (flow). This paper discusses the waste-form leach tests and presents some representative data. 4 figures

  20. How novice, skilled and advanced clinical researchers include variables in a case report form for clinical research: a qualitative study.

    Science.gov (United States)

    Chu, Hongling; Zeng, Lin; Fetters, Michael D; Li, Nan; Tao, Liyuan; Shi, Yanyan; Zhang, Hua; Wang, Xiaoxiao; Li, Fengwei; Zhao, Yiming

    2017-09-18

    Despite varying degrees of research training, most academic clinicians are expected to conduct clinical research. The objective of this research was to understand how clinical researchers of different skill levels include variables in a case report form for their clinical research. The setting for this research was a major academic institution in Beijing, China. The target population was clinical researchers with three levels of experience, namely, clinicians with limited clinical research experience, clinicians with rich clinical research experience and clinical research experts. Using a qualitative approach, we conducted 13 individual interviews (face to face) and one group interview (n=4) with clinical researchers from June to September 2016. Maximum variation sampling was used to identify researchers with three levels of research experience: eight clinicians with limited clinical research experience, five clinicians with rich clinical research experience and four clinical research experts. These 17 researchers had diverse hospital-based medical specialties and/or specialisation in clinical research. Our analysis yields a typology of three processes for developing a case report form that vary according to research experience level. Novice clinician researchers often have an incomplete protocol or none at all, and conduct data collection and publication based on a general framework. Experienced clinician researchers include variables in the case report form based on previous experience, with attention to including domains or items at risk of omission and to eliminating unnecessary variables. Expert researchers comprehensively consider data collection and implementation needs in advance and plan accordingly. These results illustrate increasing levels of sophistication in research planning that translate into more sophisticated selection of variables for the case report form. These findings suggest that novice and intermediate-level researchers could benefit by emulating the comprehensive

  1. Optimization of Standard In-House 24-Locus Variable-Number Tandem-Repeat Typing for Mycobacterium tuberculosis and Its Direct Application to Clinical Material

    NARCIS (Netherlands)

    de Beer, Jessica L.; Akkerman, Onno W.; Schurch, Anita C.; Mulder, Arnout; van der Werf, Tjip S.; van der Zanden, Adri G. M.; van Ingen, Jakko; van Soolingen, Dick

    Variable-number tandem-repeat (VNTR) typing with a panel of 24 loci is the current gold standard in the molecular typing of Mycobacterium tuberculosis complex isolates. However, because of technical problems, a part of the loci often cannot be amplified by multiplex PCRs. Therefore, a considerable

  2. 106-17 Telemetry Standards Digitized Audio Telemetry Standard Chapter 5

    Science.gov (United States)

    2017-07-01

    Digitized Audio Telemetry Standard 5.1 General This chapter defines continuously variable slope delta (CVSD) modulation as the standard for digitizing an audio signal. The CVSD modulator is, in essence, a 1-bit analog-to-digital converter. The output of this 1-bit encoder is a serial bit stream, where
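
    A toy CVSD coder/decoder pair illustrates the 1-bit encoding and slope adaptation described above. All parameter values here are illustrative and not taken from the IRIG 106-17 standard:

    ```python
    def cvsd_encode(samples, min_step=1.0, max_step=16.0, decay=0.9, growth=1.5):
        """Toy CVSD encoder: emits one bit per sample (1 = input at or above the
        running estimate, step up; 0 = step down) and grows the step size when
        the last three bits agree (slope overload), otherwise lets it decay."""
        bits, estimates = [], []
        estimate, step, last3 = 0.0, min_step, []
        for s in samples:
            bit = 1 if s >= estimate else 0
            bits.append(bit)
            last3 = (last3 + [bit])[-3:]
            if len(last3) == 3 and len(set(last3)) == 1:
                step = min(step * growth, max_step)   # three equal bits: grow
            else:
                step = max(step * decay, min_step)    # otherwise: decay
            estimate += step if bit else -step
            estimates.append(estimate)
        return bits, estimates

    def cvsd_decode(bits, min_step=1.0, max_step=16.0, decay=0.9, growth=1.5):
        """Decoder mirrors the encoder's step adaptation, so it reconstructs
        the encoder's internal estimate exactly from the bit stream alone."""
        estimates = []
        estimate, step, last3 = 0.0, min_step, []
        for bit in bits:
            last3 = (last3 + [bit])[-3:]
            if len(last3) == 3 and len(set(last3)) == 1:
                step = min(step * growth, max_step)
            else:
                step = max(step * decay, min_step)
            estimate += step if bit else -step
            estimates.append(estimate)
        return estimates

    signal = [0.2 * n for n in range(50)]   # slow ramp
    bits, enc_est = cvsd_encode(signal)
    ```

    Because the decoder applies the identical adaptation rule in the same order, `cvsd_decode(bits)` reproduces the encoder's estimate sequence exactly; real CVSD adds syllabic filtering and fixed bit rates on top of this idea.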

  3. Focus on variability : New tools to study intra-individual variability in developmental data

    NARCIS (Netherlands)

    van Geert, P; van Dijk, M

    2002-01-01

    In accordance with dynamic systems theory, we assume that variability is an important developmental phenomenon. However, the standard methodological toolkit of the developmental psychologist offers few instruments for the study of variability. In this article we will present several new methods that

  4. Analysis of suspension with variable stiffness and variable damping force for automotive applications

    Directory of Open Access Journals (Sweden)

    Lalitkumar Maikulal Jugulkar

    2016-05-01

    Passive shock absorbers are designed for a standard load condition and give good vibration isolation performance only for that condition. If the sprung mass is lower than the standard mass, comfort and road holding ability are affected. It is demonstrated that sprung mass acceleration increases by 50% when the vehicle mass varies by 100 kg. In order to obtain consistent damping performance from the shock absorber, it is essential to vary its stiffness and damping properties. In this article, a variable stiffness system is presented, which comprises two helical springs and a variable fluid damper. Fluid damper intensity is changed in four discrete levels to achieve variable stiffness of the prototype. Numerical simulations have been performed with MATLAB Simscape and Simulink and validated through experimentation on a prototype. Furthermore, the numerical model of the prototype is used in the design of a real-size shock absorber with variable stiffness and damping. Numerical simulation results on the real-size model indicate that the peak acceleration will improve by 15% in comparison to the conventional passive solution, without significant deterioration of road holding ability. The arrangement of sensors and actuators for incorporating the system in a vehicle suspension is also discussed.

  5. FINDING STANDARD DEVIATION OF A FUZZY NUMBER

    OpenAIRE

    Fokrul Alom Mazarbhuiya

    2017-01-01

    Two probability laws can be the root of a possibility law. Considering two probability densities over two disjoint ranges, we can define the fuzzy standard deviation of a fuzzy variable with the help of the standard deviations of two random variables in two disjoint spaces.

  6. Enhancing the efficacy of treatment for temporomandibular patients with muscular diagnosis through cognitive-behavioral intervention, including hypnosis: a randomized study.

    Science.gov (United States)

    Ferrando, Maite; Galdón, María José; Durá, Estrella; Andreu, Yolanda; Jiménez, Yolanda; Poveda, Rafael

    2012-01-01

    This study evaluated the efficacy of a cognitive-behavioral therapy (CBT), including hypnosis, in patients with temporomandibular disorders (TMDs) with muscular diagnosis. Seventy-two patients (65 women and 7 men with an average age of 39 years) were selected according to the Research Diagnostic Criteria for TMD, and assigned to the experimental group (n = 41), receiving the 6-session CBT program, and the control group (n = 31). All patients received conservative standard treatment for TMD. The assessment included pain variables and psychologic distress. There were significant differences between the groups, the experimental group showing a higher improvement in the variables evaluated. Specifically, 90% of the patients under CBT reported a significant reduction in frequency of pain and 70% in emotional distress. The improvement was stable over time, with no significant differences between posttreatment and 9-month follow-up. CBT, including hypnosis, significantly improved conservative standard treatment outcome in TMD patients. Copyright © 2012 Elsevier Inc. All rights reserved.

  7. Implementation of IEC standard models for power system stability studies

    Energy Technology Data Exchange (ETDEWEB)

    Margaris, Ioannis D.; Hansen, Anca D.; Soerensen, Poul [Technical Univ. of Denmark, Roskilde (Denmark). Dept. of Wind Energy; Bech, John; Andresen, Bjoern [Siemens Wind Power A/S, Brande (Denmark)

    2012-07-01

    This paper presents the implementation of the generic wind turbine generator (WTG) electrical simulation models proposed in the IEC 61400-27 standard, which is currently in preparation. A general overview of the different WTG types is given, while the main focus is on the Type 4B WTG standard model, namely a model for a variable-speed wind turbine with a full-scale power converter, including a 2-mass mechanical model. The generic models for fixed- and variable-speed WTGs are suitable for fundamental-frequency positive-sequence response simulations during short events in the power system such as voltage dips. The general configuration of the models is presented and discussed; model implementation in the simulation software platform DIgSILENT PowerFactory is presented in order to illustrate the range of applicability of the generic models under discussion. A typical voltage dip is simulated and results for the basic electrical variables of the WTG are presented and discussed. (orig.)

  8. MRI screening for silicone breast implant rupture: accuracy, inter- and intraobserver variability using explantation results as reference standard

    Energy Technology Data Exchange (ETDEWEB)

    Maijers, M.C.; Ritt, M.J.P.F. [VU University Medical Centre, Department of Plastic, Reconstructive and Hand Surgery, De Boelelaan 1117, PO Box 7057, Amsterdam (Netherlands); Niessen, F.B. [VU University Medical Centre, Department of Plastic, Reconstructive and Hand Surgery, De Boelelaan 1117, PO Box 7057, Amsterdam (Netherlands); Jan van Goyen Clinic, Department of Plastic Surgery, Amsterdam (Netherlands); Veldhuizen, J.F.H. [MRI Centre, Amsterdam (Netherlands); Manoliu, R.A. [MRI Centre, Amsterdam (Netherlands); VU University Medical Centre, Department of Radiology, Amsterdam (Netherlands)

    2014-06-15

    The recall of Poly Implant Prothese (PIP) silicone breast implants in 2010 resulted in large numbers of asymptomatic women with implants who underwent magnetic resonance imaging (MRI) screening. This study's aim was to assess the accuracy and interobserver variability of MRI screening in the detection of rupture and extracapsular silicone leakage. A prospective study included 107 women with 214 PIP implants who underwent explantation preceded by MRI. In 2013, two radiologists blinded for previous MRI findings or outcome at surgery, independently re-evaluated all MRI examinations. A structured protocol described the MRI findings. The ex vivo findings served as reference standard. In 208 of the 214 explanted prostheses, radiologists agreed independently about the condition of the implants. In five of the six cases they disagreed (2.6 %), but subsequently reached consensus. A sensitivity of 93 %, specificity of 93 %, positive predictive value of 77 % and negative predictive value of 98 % was found. The interobserver agreement was excellent (kappa value of 0.92). MRI has a high accuracy in diagnosing rupture in silicone breast implants. Considering the high kappa value of interobserver agreement, MRI appears to be a consistent diagnostic test. A simple, uniform classification, may improve communication between radiologist and plastic surgeon. (orig.)

  9. MRI screening for silicone breast implant rupture: accuracy, inter- and intraobserver variability using explantation results as reference standard

    International Nuclear Information System (INIS)

    Maijers, M.C.; Ritt, M.J.P.F.; Niessen, F.B.; Veldhuizen, J.F.H.; Manoliu, R.A.

    2014-01-01

    The recall of Poly Implant Prothese (PIP) silicone breast implants in 2010 resulted in large numbers of asymptomatic women with implants who underwent magnetic resonance imaging (MRI) screening. This study's aim was to assess the accuracy and interobserver variability of MRI screening in the detection of rupture and extracapsular silicone leakage. A prospective study included 107 women with 214 PIP implants who underwent explantation preceded by MRI. In 2013, two radiologists blinded for previous MRI findings or outcome at surgery, independently re-evaluated all MRI examinations. A structured protocol described the MRI findings. The ex vivo findings served as reference standard. In 208 of the 214 explanted prostheses, radiologists agreed independently about the condition of the implants. In five of the six cases they disagreed (2.6 %), but subsequently reached consensus. A sensitivity of 93 %, specificity of 93 %, positive predictive value of 77 % and negative predictive value of 98 % was found. The interobserver agreement was excellent (kappa value of 0.92). MRI has a high accuracy in diagnosing rupture in silicone breast implants. Considering the high kappa value of interobserver agreement, MRI appears to be a consistent diagnostic test. A simple, uniform classification, may improve communication between radiologist and plastic surgeon. (orig.)

  10. Heart rate variability in healthy population

    International Nuclear Information System (INIS)

    Alamgir, M.; Hussain, M.M.

    2010-01-01

    Background: Heart rate variability has been considered as an indicator of autonomic status. Little work has been done on heart rate variability in normal healthy volunteers. We aimed at evolving the reference values of heart rate variability in our healthy population. Methods: Twenty-four hour holter monitoring of 37 healthy individuals was done using Holter ECG recorder 'Life card CF' from 'Reynolds Medical'. Heart rate variability in both time and frequency domains was analysed with 'Reynolds Medical Pathfinder Digital/700'. Results: The heart rate variability in normal healthy volunteers of our population was found in time domain using standard deviation of R-R intervals (SDNN), standard deviation of average NN intervals (SDANN), and Square root of the mean squared differences of successive NN intervals (RMSSD). Variation in heart rate variability indices was observed between local and foreign volunteers and RMSSD was found significantly increased (p<0.05) in local population. Conclusions: The values of heart rate variability (RMSSD) in healthy Pakistani volunteers were found increased compared to the foreign data reflecting parasympathetic dominance in our population. (author)

  11. LB02.03: EVALUATION OF DAY-BY-DAY BLOOD PRESSURE VARIABILITY IN CLINIC (DO WE STILL NEED STANDARD DEVIATION?).

    Science.gov (United States)

    Ryuzaki, M; Nakamoto, H; Hosoya, K; Komatsu, M; Hibino, Y

    2015-06-01

    Blood pressure (BP) variability correlates with cardio-vascular disease as BP level itself. There is not known easy way to evaluate the BP variability in clinic.To evaluate the usefulness of maximum-minimum difference (MMD) of BP in a month compared to standard deviation (SD), as an index of BP variability. Study-1: Twelve patients (age 65.9 ± 12.1 y/o) were enrolled. Measurements of home systolic (S) BP were required in the morning. The 12 months consecutive data and at least 3 times measurements a month were required for including. (Mean 29.0 ± 4.5 times/month in the morning). We checked the correlation between MMD and SD. Study-2: Six hemodialized patients monitored with i-TECHO system (J of Hypertens 2007: 25: 2353-2358) for longer than one year were analyzed. As in study-1, we analyzed the correlation between SD and MMD of SBP. 17.4 ± 11.9 times per month. Study-3: The data from our previous study (FUJIYAM study Clin. Exp Hypertens 2014: 36:508-16) were extracted. 1524 patient-month morning BP data were calculated as in study-1. Picking up data measuring more than 24 times a month, 517 patient-month BP data were analyzed. We compared the ratio to 25 times measured data of SD and MMD, in the setting 5, 10, 15, 20 times measured data. Study-1: SBP, MMD was correlated very well to SD (p  2 times. If data were extracted (measurements>24 times), correlation was 0.927 (P < 0.0001). The equation of SBPSD = 1.520+ 0.201xMMD. The ratios of SD to 25 times were as follows; 0.956 in 5 times, 0.956 in 10, 0.979 in 15, 0.991 in 20 times. The ratios of MMD to 25 times were as follows; 0.558 in 5, 0.761 in 10, 0.874 in 15, 0.944 in 20. We can assume SD easily by measuring MMD as an index of day-by-day BP variability of a month. The equation formulas were very similar though the patients' groups were different. But we have to be careful how many times patients measure in a month.

  12. Standard Errors for Matrix Correlations.

    Science.gov (United States)

    Ogasawara, Haruhiko

    1999-01-01

    Derives the asymptotic standard errors and intercorrelations for several matrix correlations assuming multivariate normality for manifest variables and derives the asymptotic standard errors of the matrix correlations for two factor-loading matrices. (SLD)

  13. Mechanics of deformations in terms of scalar variables

    Science.gov (United States)

    Ryabov, Valeriy A.

    2017-05-01

    Theory of particle and continuous mechanics is developed which allows a treatment of pure deformation in terms of the set of variables "coordinate-momentum-force" instead of the standard treatment in terms of tensor-valued variables "strain-stress." This approach is quite natural for a microscopic description of atomic system, according to which only pointwise forces caused by the stress act to atoms making a body deform. The new concept starts from affine transformation of spatial to material coordinates in terms of the stretch tensor or its analogs. Thus, three principal stretches and three angles related to their orientation form a set of six scalar variables to describe deformation. Instead of volume-dependent potential used in the standard theory, which requires conditions of equilibrium for surface and body forces acting to a volume element, a potential dependent on scalar variables is introduced. A consistent introduction of generalized force associated with this potential becomes possible if a deformed body is considered to be confined on the surface of torus having six genuine dimensions. Strain, constitutive equations and other fundamental laws of the continuum and particle mechanics may be neatly rewritten in terms of scalar variables. Giving a new presentation for finite deformation new approach provides a full treatment of hyperelasticity including anisotropic case. Derived equations of motion generate a new kind of thermodynamical ensemble in terms of constant tension forces. In this ensemble, six internal deformation forces proportional to the components of Irving-Kirkwood stress are controlled by applied external forces. In thermodynamical limit, instead of the pressure and volume as state variables, this ensemble employs deformation force measured in kelvin unit and stretch ratio.

  14. Ambulatory blood pressure monitoring-derived short-term blood pressure variability in primary hyperparathyroidism.

    Science.gov (United States)

    Concistrè, A; Grillo, A; La Torre, G; Carretta, R; Fabris, B; Petramala, L; Marinelli, C; Rebellato, A; Fallo, F; Letizia, C

    2018-04-01

    Primary hyperparathyroidism is associated with a cluster of cardiovascular manifestations, including hypertension, leading to increased cardiovascular risk. The aim of our study was to investigate the ambulatory blood pressure monitoring-derived short-term blood pressure variability in patients with primary hyperparathyroidism, in comparison with patients with essential hypertension and normotensive controls. Twenty-five patients with primary hyperparathyroidism (7 normotensive,18 hypertensive) underwent ambulatory blood pressure monitoring at diagnosis, and fifteen out of them were re-evaluated after parathyroidectomy. Short-term-blood pressure variability was derived from ambulatory blood pressure monitoring and calculated as the following: 1) Standard Deviation of 24-h, day-time and night-time-BP; 2) the average of day-time and night-time-Standard Deviation, weighted for the duration of the day and night periods (24-h "weighted" Standard Deviation of BP); 3) average real variability, i.e., the average of the absolute differences between all consecutive BP measurements. Baseline data of normotensive and essential hypertension patients were matched for age, sex, BMI and 24-h ambulatory blood pressure monitoring values with normotensive and hypertensive-primary hyperparathyroidism patients, respectively. Normotensive-primary hyperparathyroidism patients showed a 24-h weighted Standard Deviation (P blood pressure higher than that of 12 normotensive controls. 24-h average real variability of systolic BP, as well as serum calcium and parathyroid hormone levels, were reduced in operated patients (P blood pressure variability is increased in normotensive patients with primary hyperparathyroidism and is reduced by parathyroidectomy, and may potentially represent an additional cardiovascular risk factor in this disease.

  15. Eutrophication Modeling Using Variable Chlorophyll Approach

    International Nuclear Information System (INIS)

    Abdolabadi, H.; Sarang, A.; Ardestani, M.; Mahjoobi, E.

    2016-01-01

    In this study, eutrophication was investigated in Lake Ontario to identify the interactions among effective drivers. The complexity of such phenomenon was modeled using a system dynamics approach based on a consideration of constant and variable stoichiometric ratios. The system dynamics approach is a powerful tool for developing object-oriented models to simulate complex phenomena that involve feedback effects. Utilizing stoichiometric ratios is a method for converting the concentrations of state variables. During the physical segmentation of the model, Lake Ontario was divided into two layers, i.e., the epilimnion and hypolimnion, and differential equations were developed for each layer. The model structure included 16 state variables related to phytoplankton, herbivorous zooplankton, carnivorous zooplankton, ammonium, nitrate, dissolved phosphorus, and particulate and dissolved carbon in the epilimnion and hypolimnion during a time horizon of one year. The results of several tests to verify the model, close to 1 Nash-Sutcliff coefficient (0.98), the data correlation coefficient (0.98), and lower standard errors (0.96), have indicated well-suited model’s efficiency. The results revealed that there were significant differences in the concentrations of the state variables in constant and variable stoichiometry simulations. Consequently, the consideration of variable stoichiometric ratios in algae and nutrient concentration simulations may be applied in future modeling studies to enhance the accuracy of the results and reduce the likelihood of inefficient control policies.

  16. Neonatal therapeutic hypothermia outside of standard guidelines: a survey of U.S. neonatologists.

    Science.gov (United States)

    Burnsed, Jennifer; Zanelli, Santina A

    2017-11-01

    Therapeutic hypothermia is standard of care in term infants with moderate-to-severe hypoxic-ischaemic encephalopathy (HIE). The goal of this survey was to explore the attitudes of U.S. neonatologists caring for infants with HIE who fall outside of current guidelines. Case-based survey administered to members of the Section on Neonatal-Perinatal Medicine of the American Academy of Pediatrics. A total of 447 responses were analysed, a response rate of 19%. We found significant variability amongst U.S. neonatologists with regard to the use of therapeutic hypothermia for infants with HIE who fall outside standard inclusion criteria. Scenarios with the most variability included HIE in a late preterm infant and HIE following a postnatal code. Provision of therapeutic hypothermia outside of standard guidelines was not influenced by number of years in practice, neonatal intensive care type (NICU) or NICU size. Significant variability in practice exists when caring for infants with HIE who do not meet standard inclusion criteria, emphasizing the need for continued and rigorous research in this area. Published 2017. This article is a U.S. Government work and is in the public domain in the USA.

  17. Investigation of load reduction for a variable speed, variable pitch, and variable coning wind turbine

    Energy Technology Data Exchange (ETDEWEB)

    Pierce, K. [Univ. of Utah, Salt Lake City, UT (United States)

    1997-12-31

    A two bladed, variable speed and variable pitch wind turbine was modeled using ADAMS{reg_sign} to evaluate load reduction abilities of a variable coning configuration as compared to a teetered rotor, and also to evaluate control methods. The basic dynamic behavior of the variable coning turbine was investigated and compared to the teetered rotor under constant wind conditions as well as turbulent wind conditions. Results indicate the variable coning rotor has larger flap oscillation amplitudes and much lower root flap bending moments than the teetered rotor. Three methods of control were evaluated for turbulent wind simulations. These were a standard IPD control method, a generalized predictive control method, and a bias estimate control method. Each control method was evaluated for both the variable coning configuration and the teetered configuration. The ability of the different control methods to maintain the rotor speed near the desired set point is evaluated from the RMS error of rotor speed. The activity of the control system is evaluated from cycles per second of the blade pitch angle. All three of the methods were found to produce similar results for the variable coning rotor and the teetered rotor, as well as similar results to each other.

  18. Statistical variability of hydro-meteorological variables as indicators ...

    African Journals Online (AJOL)

    Statistical variability of hydro-meteorological variables as indicators of climate change in north-east Sokoto-Rima basin, Nigeria. ... water resources development including water supply project, agriculture and tourism in the study area. Key word: Climate change, Climatic variability, Actual evapotranspiration, Global warming ...

  19. Process variables in organizational stress management intervention evaluation research: a systematic review.

    Science.gov (United States)

    Havermans, Bo M; Schlevis, Roosmarijn Mc; Boot, Cécile Rl; Brouwers, Evelien Pm; Anema, Johannes; van der Beek, Allard J

    2016-09-01

    This systematic review aimed to explore which process variables are used in stress management intervention (SMI) evaluation research. A systematic review was conducted using seven electronic databases. Studies were included if they reported on an SMI aimed at primary or secondary stress prevention, were directed at paid employees, and reported process data. Two independent researchers checked all records and selected the articles for inclusion. Nielsen and Randall's model for process evaluation was used to cluster the process variables. The three main clusters were context, intervention, and mental models. In the 44 articles included, 47 process variables were found, clustered into three main categories: context (two variables), intervention (31 variables), and mental models (14 variables). Half of the articles contained no reference to process evaluation literature. The collection of process evaluation data mostly took place after the intervention and at the level of the employee. The findings suggest that there is great heterogeneity in methods and process variables used in process evaluations of SMI. This, together with the lack of use of a standardized framework for evaluation, hinders the advancement of process evaluation theory development.

  20. Normalization method for metabolomics data using optimal selection of multiple internal standards

    Directory of Open Access Journals (Sweden)

    Yetukuri Laxman

    2007-03-01

    Full Text Available Abstract Background Success of metabolomics as the phenotyping platform largely depends on its ability to detect various sources of biological variability. Removal of platform-specific sources of variability such as systematic error is therefore one of the foremost priorities in data preprocessing. However, chemical diversity of molecular species included in typical metabolic profiling experiments leads to different responses to variations in experimental conditions, making normalization a very demanding task. Results With the aim to remove unwanted systematic variation, we present an approach that utilizes variability information from multiple internal standard compounds to find optimal normalization factor for each individual molecular species detected by metabolomics approach (NOMIS. We demonstrate the method on mouse liver lipidomic profiles using Ultra Performance Liquid Chromatography coupled to high resolution mass spectrometry, and compare its performance to two commonly utilized normalization methods: normalization by l2 norm and by retention time region specific standard compound profiles. The NOMIS method proved superior in its ability to reduce the effect of systematic error across the full spectrum of metabolite peaks. We also demonstrate that the method can be used to select best combinations of standard compounds for normalization. Conclusion Depending on experiment design and biological matrix, the NOMIS method is applicable either as a one-step normalization method or as a two-step method where the normalization parameters, influenced by variabilities of internal standard compounds and their correlation to metabolites, are first calculated from a study conducted in repeatability conditions. The method can also be used in analytical development of metabolomics methods by helping to select best combinations of standard compounds for a particular biological matrix and analytical platform.

  1. The Impact of Approved Accounting Standard AASB 1024 “Consolidated Accounts” on the Information Included in Consolidated Financial Statements

    OpenAIRE

    Pramuka, Bambang Agus

    1995-01-01

    The intent of consolidated financial statements is to provide meaningful, relevant, useful, and reliable information about the operations of a group of companies. In compliance with AASB 1024 'Consolidated Accounts', and AAS 24 Consolidated Financial Reports', a parent entity now has to include in its consolidated financial statements all controlled entities, regardless of their legal form or the ownership interest held. The new Standard also pr...

  2. Extent of, and variables associated with, blood pressure variability among older subjects.

    Science.gov (United States)

    Morano, Arianna; Ravera, Agnese; Agosta, Luca; Sappa, Matteo; Falcone, Yolanda; Fonte, Gianfranco; Isaia, Gianluca; Isaia, Giovanni Carlo; Bo, Mario

    2018-02-23

    Blood pressure variability (BPV) may have prognostic implications for cardiovascular risk and cognitive decline; however, BPV has yet to be studied in old and very old people. Aim of the present study was to evaluate the extent of BPV and to identify variables associated with BPV among older subjects. A retrospective study of patients aged ≥ 65 years who underwent 24-h ambulatory blood pressure monitoring (ABPM) was carried out. Three different BPV indexes were calculated for systolic and diastolic blood pressure (SBP and DBP): standard deviation (SD), coefficient of variation (CV), and average real variability (ARV). Demographic variables and use of antihypertensive medications were considered. The study included 738 patients. Mean age was 74.8 ± 6.8 years. Mean SBP and DBP SD were 20.5 ± 4.4 and 14.6 ± 3.4 mmHg. Mean SBP and DBP CV were 16 ± 3 and 20 ± 5%. Mean SBP and DBP ARV were 15.7 ± 3.9 and 11.8 ± 3.6 mmHg. At multivariate analysis older age, female sex and uncontrolled mean blood pressure were associated with both systolic and diastolic BPV indexes. The use of calcium channel blockers and alpha-adrenergic antagonists was associated with lower systolic and diastolic BPV indexes, respectively. Among elderly subjects undergoing 24-h ABPM, we observed remarkably high indexes of BPV, which were associated with older age, female sex, and uncontrolled blood pressure values.

  3. Linear versus non-linear measures of temporal variability in finger tapping and their relation to performance on open- versus closed-loop motor tasks: comparing standard deviations to Lyapunov exponents.

    Science.gov (United States)

    Christman, Stephen D; Weaver, Ryan

    2008-05-01

    The nature of temporal variability during speeded finger tapping was examined using linear (standard deviation) and non-linear (Lyapunov exponent) measures. Experiment 1 found that right hand tapping was characterised by lower amounts of both linear and non-linear measures of variability than left hand tapping, and that linear and non-linear measures of variability were often negatively correlated with one another. Experiment 2 found that increased non-linear variability was associated with relatively enhanced performance on a closed-loop motor task (mirror tracing) and relatively impaired performance on an open-loop motor task (pointing in a dark room), especially for left hand performance. The potential uses and significance of measures of non-linear variability are discussed.

  4. FACTORS AFFECTING THE COMPLIANCE OF MYANMAR NURSES IN PERFORMING STANDARD PRECAUTION

    Directory of Open Access Journals (Sweden)

    Sa Sa Aung

    2017-06-01

    Full Text Available Introduction: Exposure to pathogens is a serious issue for nurses. The literature explains that standard precaution have not consistently done in nursing. The purpose of this study was to analyze the factors affecting the compliance of nurses in Myanmar in performing standard precautions. Methods: This study used a cross-sectional design. Samples included 34 nurses in Waibagi Specialist Hospital (SHW, Myanmar. The independent variables were the characteristics of nurses, knowledge of standard precaution, and exposure to blood / body fluids and needle puncture wounds. The dependent variable was the performance of standard prevention. Data analyzed using descriptive analysis and logistic regression. Results: The result showed that almost respondents (91.18% had a good knowledge about prevention standards and 73.5% of respondents had good adherence in performing standard precaution. However, in practice nurses have not been consistent in closing the needles that have been used correctly. The results showed that nurse characteristics did not significantly affect adherence to standard precaution with statistical test results as follows: age (p = 0.97, gender (p = 1.00, religion (p = 0.72, education (p = 0.85, work experience at SHW (p = 0, 84, education training program (p = 0.71, knowledge (p = 0.76, and needle stick injury (p = 0,17. But, there was a significant influence between adherence to standard precaution on the incidence of injury due to puncture needle with p value = 0.01. Discussion: The barriers to applying standard precautions by Myanmar nurses can be reduced by providing basic training, supervision and improvement of operational standard procedures.

  5. Classification of decays involving variable decay chains with convolutional architectures

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    Vidyo contribution We present a technique to perform classification of decays that exhibit decay chains involving a variable number of particles, which include a broad class of $B$ meson decays sensitive to new physics. The utility of such decays as a probe of the Standard Model is dependent upon accurate determination of the decay rate, which is challenged by the combinatorial background arising in high-multiplicity decay modes. In our model, each particle in the decay event is represented as a fixed-dimensional vector of feature attributes, forming an $n \\times k$ representation of the event, where $n$ is the number of particles in the event and $k$ is the dimensionality of the feature vector. A convolutional architecture is used to capture dependencies between the embedded particle representations and perform the final classification. The proposed model performs outperforms standard machine learning approaches based on Monte Carlo studies across a range of variable final-state decays with the Belle II det...

  6. Design and validation of a standards-based science teacher efficacy instrument

    Science.gov (United States)

    Kerr, Patricia Reda

    National standards for K--12 science education address all aspects of science education, with their main emphasis on curriculum---both science subject matter and the process involved in doing science. Standards for science teacher education programs have been developing along a parallel plane, as is self-efficacy research involving classroom teachers. Generally, studies about efficacy have been dichotomous---basing the theoretical underpinnings on the work of either Rotter's Locus of Control theory or on Bandura's explanations of efficacy beliefs and outcome expectancy. This study brings all three threads together---K--12 science standards, teacher education standards, and efficacy beliefs---in an instrument designed to measure science teacher efficacy with items based on identified critical attributes of standards-based science teaching and learning. Based on Bandura's explanation of efficacy being task-specific and having outcome expectancy, a developmental, systematic progression from standards-based strategies and activities to tasks to critical attributes was used to craft items for a standards-based science teacher efficacy instrument. Demographic questions related to school characteristics, teacher characteristics, preservice background, science teaching experience, and post-certification professional development were included in the instrument. The instrument was completed by 102 middle level science teachers, with complete data for 87 teachers. A principal components analysis of the science teachers' responses to the instrument resulted in two components: Standards-Based Science Teacher Efficacy: Beliefs About Teaching (BAT, reliability = .92) and Standards-Based Science Teacher Efficacy: Beliefs About Student Achievement (BASA, reliability = .82). Variables that were characteristic of professional development activities, science content preparation, and school environment were identified as members of the sets of variables predicting the BAT and BASA

  7. Logic Learning Machine and standard supervised methods for Hodgkin's lymphoma prognosis using gene expression data and clinical variables.

    Science.gov (United States)

    Parodi, Stefano; Manneschi, Chiara; Verda, Damiano; Ferrari, Enrico; Muselli, Marco

    2018-03-01

    This study evaluates the performance of a set of machine learning techniques in predicting the prognosis of Hodgkin's lymphoma using clinical factors and gene expression data. Analysed samples from 130 Hodgkin's lymphoma patients included a small set of clinical variables and more than 54,000 gene features. Machine learning classifiers included three black-box algorithms ( k-nearest neighbour, Artificial Neural Network, and Support Vector Machine) and two methods based on intelligible rules (Decision Tree and the innovative Logic Learning Machine method). Support Vector Machine clearly outperformed any of the other methods. Among the two rule-based algorithms, Logic Learning Machine performed better and identified a set of simple intelligible rules based on a combination of clinical variables and gene expressions. Decision Tree identified a non-coding gene ( XIST) involved in the early phases of X chromosome inactivation that was overexpressed in females and in non-relapsed patients. XIST expression might be responsible for the better prognosis of female Hodgkin's lymphoma patients.

  8. Control principles of confounders in ecological comparative studies: standardization and regressive modelss

    Directory of Open Access Journals (Sweden)

    Varaksin Anatoly

    2014-03-01

    Full Text Available The methods of the analysis of research data including the concomitant variables (confounders associated with both the response and the current factor are considered. There are two usual ways to take into account such variables: the first, at the stage of planning the experiment and the second, in analyzing the received data. Despite the equal effectiveness of these approaches, there exists strong reason to restrict the usage of regression method to accounting for confounders by ANCOVA. Authors consider the standardization by stratification as a reliable method to account for the effect of confounding factors as opposed to the widely-implemented application of logistic regression and the covariance analysis. The program for the automation of standardization procedure is proposed, it is available at the site of the Institute of Industrial Ecology.

  9. Comparing Standard Deviation Effects across Contexts

    Science.gov (United States)

    Ost, Ben; Gangopadhyaya, Anuj; Schiman, Jeffrey C.

    2017-01-01

    Studies using tests scores as the dependent variable often report point estimates in student standard deviation units. We note that a standard deviation is not a standard unit of measurement since the distribution of test scores can vary across contexts. As such, researchers should be cautious when interpreting differences in the numerical size of…

  10. TEC variability over Havana

    International Nuclear Information System (INIS)

    Lazo, B.; Alazo, K.; Rodriguez, M.; Calzadilla, A.

    2003-01-01

    The variability of total electron content (TEC) measured over Havana using ATS-6, SMS-1 and GOES-3 geosynchronous satellite signals has been investigated for low, middle and high solar activity periods from 1974 to 1982. The obtained results show that the standard deviation is smooth during nighttime hours and maximal at noon or post-noon hours. A strong solar activity dependence of the standard deviation, with maximum values during high solar activity, has been found. (author)

  11. Consistency Across Standards or Standards in a New Business Model

    Science.gov (United States)

    Russo, Dane M.

    2010-01-01

    Presentation topics include: standards in a changing business model, the new National Space Policy is driving change, a new paradigm for human spaceflight, consistency across standards, the purpose of standards, danger of over-prescriptive standards, a balance is needed (between prescriptive and general standards), enabling versus inhibiting, characteristics of success-oriented standards, and conclusions. Additional slides include NASA Procedural Requirements 8705.2B, which identifies human-rating standards and requirements; draft health and medical standards for human rating; what's been done; government oversight models; examples of consistency from anthropometry; examples of inconsistency from air quality; and appendices of government and non-governmental human factors standards.

  12. Statistical methodology for discrete fracture model - including fracture size, orientation uncertainty together with intensity uncertainty and variability

    Energy Technology Data Exchange (ETDEWEB)

    Darcel, C. (Itasca Consultants SAS (France)); Davy, P.; Le Goc, R.; Dreuzy, J.R. de; Bour, O. (Geosciences Rennes, UMR 6118 CNRS, Univ. de Rennes, Rennes (France))

    2009-11-15

    the lineament scale (k{sub t} = 2) on the other, addresses the issue of the nature of the transition. We develop a new 'mechanistic' model that could help in modeling why and where this transition can occur. The transition between both regimes would occur for a fracture length of 1-10 m and even at a smaller scale for the few outcrops that follow the self-similar density model. A consequence for the disposal issue is that the model that is likely to apply in the 'blind' scale window between 10-100 m is the self-similar model as it is defined for large-scale lineaments. The self-similar model, as it is measured for some outcrops and most lineament maps, is definitely worth being investigated as a reference for scales above 1-10 m. In the rest of the report, we develop a methodology for incorporating uncertainty and variability into the DFN modeling. Fracturing properties arise from complex processes which produce an intrinsic variability; characterizing this variability as an admissible variation of model parameter or as the division of the site into subdomains with distinct DFN models is a critical point of the modeling effort. Moreover, the DFN model encompasses a part of uncertainty, due to data inherent uncertainties and sampling limits. Both effects must be quantified and incorporated into the DFN site model definition process. In that context, all available borehole data including recording of fracture intercept positions, pole orientation and relative uncertainties are used as the basis for the methodological development and further site model assessment. An elementary dataset contains a set of discrete fracture intercepts from which a parent orientation/density distribution can be computed. The elementary bricks of the site, from which these initial parent density distributions are computed, rely on the former Single Hole Interpretation division of the boreholes into sections whose local boundaries are expected to reflect - locally - geology

  13. Uncertainty Analysis of Spectral Irradiance Reference Standards Used for NREL Calibrations

    Energy Technology Data Exchange (ETDEWEB)

    Habte, A.; Andreas, A.; Reda, I.; Campanelli, M.; Stoffel, T.

    2013-05-01

    Spectral irradiance produced by lamp standards such as the National Institute of Standards and Technology (NIST) FEL-type tungsten halogen lamps is used to calibrate spectroradiometers at the National Renewable Energy Laboratory. Spectroradiometers are often used to characterize the spectral irradiance of solar simulators, which in turn are used to characterize photovoltaic device performance, e.g., power output and spectral response. Therefore, quantifying the calibration uncertainty of spectroradiometers is critical to understanding photovoltaic system performance. In this study, we attempted to reproduce the NIST-reported input variables, including the calibration uncertainty in spectral irradiance for a standard NIST lamp, and to quantify the uncertainty of the measurement setup at the Optical Metrology Laboratory at the National Renewable Energy Laboratory.

  14. Relationship of suicide rates with climate and economic variables in Europe during 2000-2012

    DEFF Research Database (Denmark)

    Fountoulakis, Konstantinos N; Chatzikosta, Isaia; Pastiadis, Konstantinos

    2016-01-01

    BACKGROUND: It is well known that suicide rates vary considerably among European countries and the reasons for this are unknown, although several theories have been proposed. The effect of economic variables has been extensively studied, but not that of climate. METHODS: Data from 29 European countries covering the years 2000-2012, concerning male and female standardized suicide rates (according to WHO), economic variables (according to the World Bank) and climate variables, were gathered. The statistical analysis included cluster and principal component analysis and categorical regression. RESULTS: The derived models explained 62.4% of the variability of male suicide rates. Economic variables alone explained 26.9% and climate variables 37.6%. For females, the respective figures were 41.7, 11.5 and 28.1%. Male suicides correlated with high unemployment rate in the frame of high growth rate and high…

  15. Standardized nomenclatures: keys to continuity of care, nursing accountability and nursing effectiveness.

    Science.gov (United States)

    Keenan, G; Aquilino, M L

    1998-01-01

    Standardized nursing nomenclatures must be included in clinical documentation systems to generate data that more accurately represent nursing practice than outcomes-related measures currently used to support important policy decisions. NANDA, NIC, and NOC--comprehensive nomenclatures for the needed variables of nursing diagnoses, interventions, and outcomes--are described. Added benefits of using NANDA, NIC, and NOC in everyday practice are outlined, including facilitation of the continuity of care of patients in integrated health systems.

  16. Technical support document: Energy conservation standards for consumer products: Dishwashers, clothes washers, and clothes dryers including: Environmental impacts; regulatory impact analysis

    Energy Technology Data Exchange (ETDEWEB)

    1990-12-01

    The Energy Policy and Conservation Act, as amended (P.L. 94-163), establishes energy conservation standards for 12 of the 13 types of consumer products specifically covered by the Act. The legislation requires the Department of Energy (DOE) to consider new or amended standards for these and other types of products at specified times. This Technical Support Document presents the methodology, data and results from the analysis of the energy and economic impacts of standards on dishwashers, clothes washers, and clothes dryers. The economic impact analysis is performed in several major areas: An Engineering Analysis, which establishes technical feasibility and product attributes including costs of design options to improve appliance efficiency. A Consumer Analysis at two levels: national aggregate impacts, and impacts on individuals. The national aggregate impacts include forecasts of appliance sales, efficiencies, energy use, and consumer expenditures. The individual impacts are analyzed by Life-Cycle Cost (LCC), Payback Periods, and Cost of Conserved Energy (CCE), which evaluate the savings in operating expenses relative to increases in purchase price. A Manufacturer Analysis, which provides an estimate of manufacturers' response to the proposed standards; their response is quantified by changes in several measures of financial performance for a firm. An Industry Impact Analysis, which shows financial and competitive impacts on the appliance industry. A Utility Analysis, which measures the impacts of the altered energy-consumption patterns on electric utilities. An Environmental Effects Analysis, which estimates changes in emissions of carbon dioxide, sulfur oxides, and nitrogen oxides due to reduced energy consumption in the home and at the power plant. A Regulatory Impact Analysis, which collects the results of all the analyses into the net benefits and costs from a national perspective. 47 figs., 171 tabs. (JF)

  17. Decommissioning standards

    International Nuclear Information System (INIS)

    Crofford, W.N.

    1980-01-01

    EPA has agreed to establish a series of environmental standards for the safe disposal of radioactive waste through participation in the Interagency Review Group on Nuclear Waste Management (IRG). One of the standards required under the IRG is the standard for decommissioning of radioactive contaminated sites, facilities, and materials. This standard is to be proposed by December 1980 and promulgated by December 1981. Several considerations are important in establishing these standards. This study includes discussions of some of these considerations and attempts to evaluate their relative importance. Items covered include: the form of the standards, timing for decommissioning, occupational radiation protection, costs and financial provisions. 4 refs

  18. Variability, Predictability, and Race Factors Affecting Performance in Elite Biathlon.

    Science.gov (United States)

    Skattebo, Øyvind; Losnegard, Thomas

    2018-03-01

    To investigate variability, predictability, and smallest worthwhile performance enhancement in elite biathlon sprint events. In addition, the effects of race factors on performance were assessed. Data from 2005 to 2015 including >10,000 and >1000 observations for each sex for all athletes and annual top-10 athletes, respectively, were included. Generalized linear mixed models were constructed based on total race time, skiing time, shooting time, and proportions of targets hit. Within-athlete race-to-race variability was expressed as coefficient of variation of performance times and standard deviation (SD) in proportion units (%) of targets hit. The models were adjusted for random and fixed effects of subject identity, season, event identity, and race factors. The within-athlete variability was independent of sex and performance standard of athletes: 2.5-3.2% for total race time, 1.5-1.8% for skiing time, and 11-15% for shooting times. The SD of the proportion of hits was ∼10% in both shootings combined (meaning ±1 hit in 10 shots). The predictability in total race time was very high to extremely high for all athletes (ICC .78-.84) but trivial for top-10 athletes (ICC .05). Race times during World Championships and Olympics were ∼2-3% faster than in World Cups. Moreover, race time increased by ∼2% per 1000 m of altitude, by ∼5% per 1% of gradient, by 1-2% per 1 m/s of wind speed, and by ∼2-4% on soft vs hard tracks. Researchers and practitioners should focus on strategies that improve biathletes' performance by at least 0.8-0.9%, corresponding to the smallest worthwhile enhancement (0.3 × within-athlete variability).
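The abstract's two key quantities, within-athlete race-to-race variability expressed as a coefficient of variation and the smallest worthwhile enhancement taken as 0.3 times that CV, can be sketched in a few lines. The race times below are fabricated for illustration:

```python
import statistics

# Fabricated sprint race times (seconds) for one hypothetical biathlete
race_times = [1450.0, 1478.0, 1502.0, 1466.0, 1490.0]

mean_t = statistics.mean(race_times)
sd_t = statistics.stdev(race_times)   # race-to-race standard deviation
cv_pct = 100 * sd_t / mean_t          # within-athlete CV, in %

# smallest worthwhile enhancement = 0.3 x within-athlete variability,
# the convention cited in the abstract above
swe_pct = 0.3 * cv_pct

print(f"CV = {cv_pct:.2f}%, smallest worthwhile enhancement = {swe_pct:.2f}%")
```

With a CV in the 2.5-3.2% range reported for total race time, the same arithmetic yields the 0.8-0.9% enhancement threshold quoted in the abstract.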

  19. Performance Standards: Utility for Different Uses of Assessments

    Directory of Open Access Journals (Sweden)

    Robert L. Linn

    2003-09-01

    Full Text Available Performance standards are arguably one of the most controversial topics in educational measurement. There are uses of assessments such as licensure and certification where performance standards are essential. There are many other uses, however, where performance standards have been mandated or become the preferred method of reporting assessment results where the standards are not essential to the use. Distinctions between essential and nonessential uses of performance standards are discussed. It is argued that the insistence on reporting in terms of performance standards in situations where they are not essential has been more harmful than helpful. Variability in the definitions of proficient academic achievement by states for purposes of the No Child Left Behind Act of 2001 is discussed and it is argued that the variability is so great that characterizing achievement is meaningless. Illustrations of the great uncertainty in standards are provided.

  20. The number of subjects per variable required in linear regression analyses.

    Science.gov (United States)

    Austin, Peter C; Steyerberg, Ewout W

    2015-06-01

    To determine the number of independent variables that can be included in a linear regression model. We used a series of Monte Carlo simulations to examine the impact of the number of subjects per variable (SPV) on the accuracy of estimated regression coefficients and standard errors, on the empirical coverage of estimated confidence intervals, and on the accuracy of the estimated R² of the fitted model. A minimum of approximately two SPV tended to result in estimation of regression coefficients with relative bias of less than 10%. Furthermore, with this minimum number of SPV, the standard errors of the regression coefficients were accurately estimated and estimated confidence intervals had approximately the advertised coverage rates. A much higher number of SPV was necessary to minimize bias in estimating the model R², although adjusted R² estimates behaved well. The bias in estimating the model R² statistic was inversely proportional to the magnitude of the proportion of variation explained by the population regression model. Linear regression models require only two SPV for adequate estimation of regression coefficients, standard errors, and confidence intervals. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
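The subjects-per-variable findings above can be reproduced in miniature. The sketch below is our own single-predictor analogue (so SPV equals n), not the authors' code; it shows the two effects the abstract reports: slope estimates that are nearly unbiased even at small n, but an in-sample R² that stays inflated until n grows:

```python
import random
import statistics

random.seed(1)

def simulate(n, beta=1.0, sigma=2.0, reps=2000):
    """Return (mean slope bias, mean in-sample R^2) over `reps` datasets."""
    biases, r2s = [], []
    for _ in range(reps):
        xs = [random.gauss(0, 1) for _ in range(n)]
        ys = [beta * x + random.gauss(0, sigma) for x in xs]
        mx, my = statistics.mean(xs), statistics.mean(ys)
        sxx = sum((x - mx) ** 2 for x in xs)
        sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        b = sxy / sxx                      # OLS slope
        a = my - b * mx                    # OLS intercept
        ss_res = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))
        ss_tot = sum((y - my) ** 2 for y in ys)
        biases.append(b - beta)
        r2s.append(1 - ss_res / ss_tot)
    return statistics.mean(biases), statistics.mean(r2s)

# population R^2 is beta^2 / (beta^2 + sigma^2) = 0.2 with these defaults
for n in (5, 20, 100):
    bias, r2 = simulate(n)
    print(f"n = {n:3d}   mean slope bias = {bias:+.3f}   mean R^2 = {r2:.3f}")
```

At n = 5 the average in-sample R² is roughly double the population value of 0.2, while the slope bias hovers near zero throughout, mirroring the abstract's conclusion that coefficient estimation needs far fewer subjects per variable than R² estimation does.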

  1. Variations in Carabidae assemblages across the farmland habitats in relation to selected environmental variables including soil properties

    Directory of Open Access Journals (Sweden)

    Beáta Baranová

    2018-03-01

    Full Text Available The variations in ground beetle (Coleoptera: Carabidae) assemblages across three types of farmland habitats, arable land, meadows and woody vegetation, were studied in relation to vegetation cover structure, intensity of agrotechnical interventions and selected soil properties. Material was pitfall-trapped in 2010 and 2011 at twelve sites in the agricultural landscape of the Prešov town and its near vicinity, Eastern Slovakia. A total of 14,763 ground beetle individuals were entrapped. The collection yielded 92 Carabidae species, with the following six species dominating: Poecilus cupreus, Pterostichus melanarius, Pseudoophonus rufipes, Brachinus crepitans, Anchomenus dorsalis and Poecilus versicolor. The studied habitats differed significantly in the number of entrapped individuals, activity abundance, and representation of carabids according to their habitat preferences and ability to fly. However, no significant distinction was observed in diversity, evenness or dominance. The environmental variables most significantly affecting species variability of the Carabidae assemblages were soil moisture and the herb layer at 0-20 cm. Other variables selected by forward selection were intensity of agrotechnical interventions, humus content and shrub vegetation. The remaining soil properties appear to be of only secondary importance for adult carabids. Environmental variables have the strongest effect on habitat specialists, whereas ground beetles without special requirements for habitat quality appear to be only slightly affected by the studied environmental variables.

  2. Technical support document: Energy efficiency standards for consumer products: Refrigerators, refrigerator-freezers, and freezers including draft environmental assessment, regulatory impact analysis

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-07-01

    The Energy Policy and Conservation Act (P.L. 94-163), as amended by the National Appliance Energy Conservation Act of 1987 (P.L. 100-12) and by the National Appliance Energy Conservation Amendments of 1988 (P.L. 100-357), and by the Energy Policy Act of 1992 (P.L. 102-486), provides energy conservation standards for 12 of the 13 types of consumer products covered by the Act, and authorizes the Secretary of Energy to prescribe amended or new energy standards for each type (or class) of covered product. The assessment of the proposed standards for refrigerators, refrigerator-freezers, and freezers presented in this document is designed to evaluate their economic impacts according to the criteria in the Act. It includes an engineering analysis of the cost and performance of design options to improve the efficiency of the products; forecasts of the number and average efficiency of products sold, the amount of energy the products will consume, and their prices and operating expenses; a determination of change in investment, revenues, and costs to manufacturers of the products; a calculation of the costs and benefits to consumers, electric utilities, and the nation as a whole; and an assessment of the environmental impacts of the proposed standards.

  3. Impact of Subsurface Temperature Variability on Meteorological Variability: An AGCM Study

    Science.gov (United States)

    Mahanama, S. P.; Koster, R. D.; Liu, P.

    2006-05-01

    Anomalous atmospheric conditions can lead to surface temperature anomalies, which in turn can lead to temperature anomalies deep in the soil. The deep soil temperature (and the associated ground heat content) has significant memory -- the dissipation of a temperature anomaly may take weeks to months -- and thus deep soil temperature may contribute to the low frequency variability of energy and water variables elsewhere in the system. The memory may even provide some skill to subseasonal and seasonal forecasts. This study uses two long-term AGCM experiments to isolate the contribution of deep soil temperature variability to variability elsewhere in the climate system. The first experiment consists of a standard ensemble of AMIP-type simulations, simulations in which the deep soil temperature variable is allowed to interact with the rest of the system. In the second experiment, the coupling of the deep soil temperature to the rest of the climate system is disabled -- at each grid cell, the local climatological seasonal cycle of deep soil temperature (as determined from the first experiment) is prescribed. By comparing the variability of various atmospheric quantities as generated in the two experiments, we isolate the contribution of interactive deep soil temperature to that variability. The results show that interactive deep soil temperature contributes significantly to surface temperature variability. Interactive deep soil temperature, however, reduces the variability of the hydrological cycle (evaporation and precipitation), largely because it allows for a negative feedback between evaporation and temperature.

  4. Efficiency of a new internal combustion engine concept with variable piston motion

    Directory of Open Access Journals (Sweden)

    Dorić Jovan Ž.

    2014-01-01

    Full Text Available This paper presents a simulation of the working process in a new IC engine concept. The main feature of this new IC engine concept is the realization of variable movement of the piston. With this unconventional piston movement it is easy to provide variable compression ratio, variable displacement and combustion at constant volume. These advantages over the standard piston mechanism are achieved through synthesis of two pairs of non-circular gears. The presented mechanism is designed to obtain a specific motion law which provides better fuel consumption in IC engines. For this paper the Ricardo/WAVE software was used, which provides a fully integrated treatment of time-dependent fluid dynamics and thermodynamics by means of a one-dimensional formulation. The results obtained herein include the efficiency characteristic of this new heat engine concept. The results show that combustion at constant volume, variable compression ratio and variable displacement have a significant impact on the improvement of fuel consumption.

  5. Technical standards and guidelines: prenatal screening for Down syndrome that includes first-trimester biochemistry and/or ultrasound measurements.

    Science.gov (United States)

    Palomaki, Glenn E; Lee, Jo Ellen S; Canick, Jacob A; McDowell, Geraldine A; Donnenfeld, Alan E

    2009-09-01

    This statement is intended to augment the current general ACMG Standards and Guidelines for Clinical Genetics Laboratories and to address guidelines specific to first-trimester screening for Down syndrome. The aim is to provide the laboratory the necessary information to ensure accurate and reliable Down syndrome screening results given a screening protocol (e.g., combined first trimester and integrated testing). Information about various test combinations and their expected performance are provided, but other issues such as availability of reagents, patient interest in early test results, access to open neural tube defect screening, and availability of chorionic villus sampling are all contextual factors in deciding which screening protocol(s) will be selected by individual health care providers. Individual laboratories are responsible for meeting the quality assurance standards described by the Clinical Laboratory Improvement Act, the College of American Pathologists, and other regulatory agencies, with respect to appropriate sample documentation, assay validation, general proficiency, and quality control measures. These guidelines address first-trimester screening that includes ultrasound measurement and interpretation of nuchal translucency thickness and protocols that combine markers from both the first and second trimesters. Laboratories can use their professional judgment to make modification or additions.

  6. Acoustic response variability in automotive vehicles

    Science.gov (United States)

    Hills, E.; Mace, B. R.; Ferguson, N. S.

    2009-03-01

    A statistical analysis of a series of measurements of the audio-frequency response of a large set of automotive vehicles is presented: a small hatchback model with both a three-door (411 vehicles) and five-door (403 vehicles) derivative and a mid-sized family five-door car (316 vehicles). The sets included vehicles of various specifications, engines, gearboxes, interior trim, wheels and tyres. The tests were performed in a hemianechoic chamber with the temperature and humidity recorded. Two tests were performed on each vehicle and the interior cabin noise measured. In the first, the excitation was acoustically induced by sets of external loudspeakers. In the second test, predominantly structure-borne noise was induced by running the vehicle at a steady speed on a rough roller. For both types of excitation, it is seen that the effects of temperature are small, indicating that manufacturing variability is larger than that due to temperature for the tests conducted. It is also observed that there are no significant outlying vehicles, i.e. there are at most only a few vehicles that consistently have the lowest or highest noise levels over the whole spectrum. For the acoustically excited tests, measured 1/3-octave noise reduction levels typically have a spread of 5 dB or so and the normalised standard deviation of the linear data is typically 0.1 or higher. Regarding the statistical distribution of the linear data, a lognormal distribution is a somewhat better fit than a Gaussian distribution for lower 1/3-octave bands, while the reverse is true at higher frequencies. For the distribution of the overall linear levels, a Gaussian distribution is generally the most representative. As a simple description of the response variability, it is sufficient for this series of measurements to assume that the acoustically induced airborne cabin noise is best described by a Gaussian distribution with a normalised standard deviation between 0.09 and 0.145. There is generally

  7. Fatigue Behavior under Multiaxial Stress States Including Notch Effects and Variable Amplitude Loading

    Science.gov (United States)

    Gates, Nicholas R.

    The central objective of the research performed in this study was to be able to better understand and predict fatigue crack initiation and growth from stress concentrations subjected to complex service loading histories. As such, major areas of focus were related to the understanding and modeling of material deformation behavior, fatigue damage quantification, notch effects, cycle counting, damage accumulation, and crack growth behavior under multiaxial nominal loading conditions. To support the analytical work, a wide variety of deformation and fatigue tests were also performed using tubular and plate specimens made from 2024-T3 aluminum alloy, with and without the inclusion of a circular through-thickness hole. However, the analysis procedures implemented were meant to be general in nature, and applicable to a wide variety of materials and component geometries. As a result, experimental data from literature were also used, when appropriate, to supplement the findings of various analyses. Popular approaches currently used for multiaxial fatigue life analysis are based on the idea of computing an equivalent stress/strain quantity through the extension of static yield criteria. This equivalent stress/strain is then considered to be equal, in terms of fatigue damage, to a uniaxial loading of the same magnitude. However, it has often been shown, and was shown again in this study, that although equivalent stress- and strain-based analysis approaches may work well in certain situations, they lack a general robustness and offer little room for improvement. More advanced analysis techniques, on the other hand, provide an opportunity to more accurately account for various aspects of the fatigue failure process under both constant and variable amplitude loading conditions. As a result, such techniques were of primary interest in the investigations performed. 
By implementing more advanced life prediction methodologies, both the overall accuracy and the correlation of fatigue

  8. hmF2 variability over Havana

    International Nuclear Information System (INIS)

    Lazo, B.; Alazo, K.; Rodriguez, M.; Calzadilla, A.

    2003-01-01

    The hmF2 variability over the Havana station (Geo. Latitude 23 deg. N, Geo. Longitude 278 deg. E; Dip 54.6 deg. N; Modip 44.8 deg. N) is presented. In this study different solar and seasonal conditions are considered. The results show that, in general, the standard deviation of hmF2 is quite irregular and reaches its maximum values at nighttime hours. The variability of the lower and upper quartiles behaves similarly to the IQ variability, also showing its higher values at nighttime. (author)

  9. The WFCAM multiwavelength Variable Star Catalog

    Science.gov (United States)

    Ferreira Lopes, C. E.; Dékány, I.; Catelan, M.; Cross, N. J. G.; Angeloni, R.; Leão, I. C.; De Medeiros, J. R.

    2015-01-01

    Context. Stellar variability in the near-infrared (NIR) remains largely unexplored. The exploitation of public science archives with data-mining methods offers a perspective for a time-domain exploration of the NIR sky. Aims: We perform a comprehensive search for stellar variability using the optical-NIR multiband photometric data in the public Calibration Database of the WFCAM Science Archive (WSA), with the aim of contributing to the general census of variable stars and of extending the current scarce inventory of accurate NIR light curves for a number of variable star classes. Methods: Standard data-mining methods were applied to extract and fine-tune time-series data from the WSA. We introduced new variability indices designed for multiband data with correlated sampling, and applied them for preselecting variable star candidates, i.e., light curves that are dominated by correlated variations, from noise-dominated ones. Preselection criteria were established by robust numerical tests for evaluating the response of variability indices to the colored noise characteristic of the data. We performed a period search using the string-length minimization method on an initial catalog of 6551 variable star candidates preselected by variability indices. Further frequency analysis was performed on positive candidates using three additional methods in combination, in order to cope with aliasing. Results: We find 275 periodic variable stars and an additional 44 objects with suspected variability with uncertain periods or apparently aperiodic variation. Only 44 of these objects had been previously known, including 11 RR Lyrae stars on the outskirts of the globular cluster M 3 (NGC 5272). We provide a preliminary classification of the new variable stars that have well-measured light curves, but the variability types of a large number of objects remain ambiguous. 
We classify most of the new variables as contact binary stars, but we also find several pulsating stars, among which

  10. Partial summations of stationary sequences of non-Gaussian random variables

    DEFF Research Database (Denmark)

    Mohr, Gunnar; Ditlevsen, Ove Dalager

    1996-01-01

    The distribution of the sum of a finite number of identically distributed random variables is in many cases easily determined given that the variables are independent. The moments of any order of the sum can always be expressed by the moments of the single term without computational problems … of convergence of the distribution of a sum (or an integral) of mutually dependent random variables to the Gaussian distribution. The paper is closely related to the work in Ditlevsen et al. [Ditlevsen, O., Mohr, G. & Hoffmeyer, P. Integration of non-Gaussian fields. Prob. Engng Mech 11 (1996) 15-23]. … lognormal variables or polynomials of standard Gaussian variables. The dependency structure is induced by specifying the autocorrelation structure of the sequence of standard Gaussian variables. Particularly useful polynomials are the Winterstein approximations that distributionally fit with non-…
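The kind of construction the abstract describes can be mimicked in a few lines: a stationary sequence of dependent lognormal variables is obtained by exponentiating a correlated standard Gaussian AR(1) sequence, and the mean of the partial sum is checked against its exact value. The parameters (lag-1 correlation 0.5, 20 terms) are our own illustrative choices, and the simple exp transform stands in for the more general Winterstein-type polynomials mentioned above:

```python
import math
import random
import statistics

random.seed(7)

RHO = 0.5        # lag-1 autocorrelation of the underlying Gaussian sequence
N_TERMS = 20     # number of summed lognormal terms
REPS = 5000      # Monte Carlo replications

def partial_sum():
    g = random.gauss(0, 1)
    total = math.exp(g)
    for _ in range(N_TERMS - 1):
        # AR(1) update: each g stays marginally standard Gaussian, so each
        # exp(g) is a standard lognormal, but successive terms are dependent
        g = RHO * g + math.sqrt(1 - RHO ** 2) * random.gauss(0, 1)
        total += math.exp(g)
    return total

sums = [partial_sum() for _ in range(REPS)]

# the dependence leaves the mean untouched: E[sum] = N_TERMS * exp(1/2)
exact_mean = N_TERMS * math.exp(0.5)
print(f"simulated mean {statistics.mean(sums):.2f} vs exact {exact_mean:.2f}")
```

The dependence does not change the mean but it does inflate the variance of the partial sum relative to the independent case, which is one reason convergence of such sums toward the Gaussian distribution is slower than the independent theory suggests.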

  11. Introducing a true internal standard for the Comet assay to minimize intra- and inter-experiment variability in measures of DNA damage and repair

    Science.gov (United States)

    Zainol, Murizal; Stoute, Julia; Almeida, Gabriela M.; Rapp, Alexander; Bowman, Karen J.; Jones, George D. D.

    2009-01-01

    The Comet assay (CA) is a sensitive/simple measure of genotoxicity. However, many features of CA contribute variability. To minimize these, we have introduced internal standard materials consisting of ‘reference’ cells which have their DNA substituted with BrdU. Using a fluorescent anti-BrdU antibody, plus an additional barrier filter, comets derived from these cells could be readily distinguished from the ‘test’-cell comets, present in the same gel. In experiments to evaluate the reference cell comets as external and internal standards, the reference and test cells were present in separate gels on the same slide or mixed together in the same gel, respectively, before their co-exposure to X-irradiation. Using the reference cell comets as internal standards led to substantial reductions in the coefficient of variation (CoV) for intra- and inter-experimental measures of comet formation and DNA damage repair; only minor reductions in CoV were noted when the reference and test cell comets were in separate gels. These studies indicate that differences between individual gels appreciably contribute to CA variation. Further studies using the reference cells as internal standards allowed greater significance to be obtained between groups of replicate samples. Ultimately, we anticipate that development will deliver robust quality assurance materials for CA. PMID:19828597
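The variance-reduction logic of the internal standard can be sketched numerically. Everything below is invented (the per-gel factors, noise levels, and number of gels); the point is only that dividing each test-comet measurement by the co-embedded reference-comet measurement from the same gel cancels the shared gel-to-gel factor:

```python
import random
import statistics

random.seed(3)

TRUE_DAMAGE = 10.0   # hypothetical "true" test-cell comet signal
REF_LEVEL = 8.0      # hypothetical reference-cell comet signal
N_GELS = 50

gels = []
for _ in range(N_GELS):
    gel_factor = random.uniform(0.7, 1.3)   # per-gel systematic variation
    test = TRUE_DAMAGE * gel_factor * random.gauss(1, 0.05)
    reference = REF_LEVEL * gel_factor * random.gauss(1, 0.05)
    gels.append((test, reference))

def cov(values):
    """Coefficient of variation."""
    return statistics.stdev(values) / statistics.mean(values)

raw = [t for t, _ in gels]
normalized = [t / r for t, r in gels]   # internal-standard normalization
print(f"CoV raw: {cov(raw):.1%}   CoV normalized: {cov(normalized):.1%}")
```

Because the gel factor multiplies both measurements, the ratio retains only the two small independent noise terms, so the normalized coefficient of variation drops to a fraction of the raw one, mirroring the CoV reductions the abstract reports for gels containing both cell types.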

  12. Auxiliary variables in multiple imputation in regression with missing X: a warning against including too many in small sample research

    Directory of Open Access Journals (Sweden)

    Hardt Jochen

    2012-12-01

    Full Text Available Abstract Background Multiple imputation is becoming increasingly popular. Theoretical considerations as well as simulation studies have shown that the inclusion of auxiliary variables is generally of benefit. Methods A simulation study of a linear regression with a response Y and two predictors X1 and X2 was performed on data with n = 50, 100 and 200 using complete cases or multiple imputation with 0, 10, 20, 40 and 80 auxiliary variables. Mechanisms of missingness were either 100% MCAR or 50% MAR + 50% MCAR. Auxiliary variables had low (r = .10) vs. moderate (r = .50) correlations with the X's and Y. Results The inclusion of auxiliary variables can improve a multiple imputation model. However, inclusion of too many variables leads to downward bias of regression coefficients and decreases precision. When the correlations are low, inclusion of auxiliary variables is not useful. Conclusion More research on auxiliary variables in multiple imputation should be performed. A preliminary rule of thumb could be that the ratio of variables to cases with complete data should not go below 1 : 3.
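    The closing rule of thumb lends itself to a quick sanity check. A minimal sketch, assuming invented data and function names (the MCAR simulation and the `imputation_model_ok` helper are illustrations, not the paper's code):

```python
import numpy as np

rng = np.random.default_rng(0)

n = 50                                          # small-sample scenario from the study
data = rng.normal(size=(n, 3))                  # Y, X1, X2
data[rng.random(size=(n, 3)) < 0.2] = np.nan    # ~20% MCAR missingness per cell

# Cases with no missing value on any variable
complete_cases = int(np.sum(~np.isnan(data).any(axis=1)))

def imputation_model_ok(n_analysis_vars, n_auxiliary, n_complete):
    """True if (analysis + auxiliary variables) : complete cases stays at or above 1:3."""
    return (n_analysis_vars + n_auxiliary) * 3 <= n_complete

print(complete_cases)
print(imputation_model_ok(3, 5, complete_cases))
print(imputation_model_ok(3, 80, complete_cases))  # 80 auxiliaries with n = 50: clearly too many
```

The check simply operationalizes "not below 1 : 3"; in practice the complete-case count, not the total n, is the binding quantity.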

  13. A comparison of important international and national standards for limiting exposure to EMF including the scientific rationale.

    Science.gov (United States)

    Roy, Colin R; Martin, Lindsay J

    2007-06-01

    A comparison of Eastern (from Russia, Hungary, Bulgaria, Poland, and the Czech Republic) and Western (represented by the International Commission on Non-Ionizing Radiation Protection guidelines and the Institute of Electrical and Electronic Engineers standards) radiofrequency standards reveals key differences. The Eastern approach is to protect against non-thermal effects caused by chronic, low-level exposure, and the occupational basic restriction is power load (the product of intensity and exposure duration). In contrast, the Western approach is to protect against established acute biological effects that could signal an adverse health effect, and the principal basic restriction is the specific absorption rate, to protect against thermal effects. All of the standards are science-based, but a fundamental difference arises from a lack of agreement on the composition of the reference scientific database and on which adverse effect needs to be protected against. However, differences also exist between the ICNIRP and IEEE standards. An additional complication arises when standards are derived or modified using a precautionary approach. For ELF the differences between ICNIRP and IEEE are more fundamental: differences in the basic restriction used (induced current vs. in-situ electric field) and in the location of breakpoints in the strength-frequency curves lead to large numerical discrepancies. In 2006, ICNIRP will initiate the review of their ELF and radiofrequency guidelines, and this will provide an opportunity to address differences in standards and the move towards harmonization of EMF standards and guidelines.

  14. [Standard algorithm of molecular typing of Yersinia pestis strains].

    Science.gov (United States)

    Eroshenko, G A; Odinokov, G N; Kukleva, L M; Pavlova, A I; Krasnov, Ia M; Shavina, N Iu; Guseva, N P; Vinogradova, N A; Kutyrev, V V

    2012-01-01

    The aim was to develop a standard algorithm of molecular typing of Yersinia pestis that establishes the subspecies, biovar and focus membership of a studied isolate, and to determine the characteristic genotypes of plague agent strains of the main and nonmain subspecies from various natural plague foci of the Russian Federation and the near abroad. Genotyping of 192 natural Y. pestis strains of the main and nonmain subspecies was performed using PCR, multilocus sequence typing and multilocus analysis of variable tandem repeat number. A standard algorithm of molecular typing of the plague agent was developed, comprising several stages of Y. pestis differentiation: into main and nonmain subspecies; into the biovars of the main subspecies; into specific subspecies; and by natural focus and geographic territory. The algorithm is based on 3 typing methods (PCR, multilocus sequence typing and multilocus analysis of variable tandem repeat number) using standard DNA targets: life-support genes (terC, ilvN, inv, glpD, napA, rhaS and araC) and 7 loci of variable tandem repeats (ms01, ms04, ms06, ms07, ms46, ms62, ms70). The effectiveness of the developed algorithm is shown on a large number of natural Y. pestis strains. Characteristic sequence types of Y. pestis strains of various subspecies and biovars, as well as MLVA7 genotypes of strains from natural plague foci of the Russian Federation and the near abroad, were established. Application of the developed algorithm will increase the effectiveness of epidemiologic monitoring of the plague agent and of the analysis of plague epidemics and outbreaks, establishing the source of origin of a strain and the routes of introduction of the infection.

  15. Variability of the Wind Turbine Power Curve

    Directory of Open Access Journals (Sweden)

    Mahesh M. Bandi

    2016-09-01

    Full Text Available Wind turbine power curves are calibrated by turbine manufacturers under requirements stipulated by the International Electrotechnical Commission to provide a functional mapping between the mean wind speed v̄ and the mean turbine power output P̄. Wind plant operators employ these power curves to estimate or forecast wind power generation under given wind conditions. However, it is general knowledge that wide variability exists in these mean calibration values. We first analyse how the standard deviation in wind speed σv affects the mean P̄ and the standard deviation σP of wind power. We find that the magnitude of wind power fluctuations scales as the square of the mean wind speed. Using data from three planetary locations, we find that the wind speed standard deviation σv systematically varies with mean wind speed v̄, and in some instances follows a scaling of the form σv = C × v̄^α; C being a constant and α a fractional power. We show that, when applicable, this scaling form provides a minimal parameter description of the power curve in terms of v̄ alone. Wind data from different locations establish that (in instances when this scaling exists) the exponent α varies with location, owing to the influence of local environmental conditions on wind speed variability. Since manufacturer-calibrated power curves cannot account for variability influenced by local conditions, this variability translates to forecast uncertainty in power generation. We close with a proposal for operators to perform post-installation recalibration of their turbine power curves to account for the influence of local environmental factors on wind speed variability, in order to reduce the uncertainty of wind power forecasts. Understanding the relationship between wind speed and its variability is likely to lead to lower costs for the integration of wind power into the electric grid.
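    A scaling of the form σv = C × v̄^α is linear in log-log space, so C and α can be recovered by a straight-line fit. A minimal sketch on synthetic data (the values of C and α here are invented, not those reported in the paper):

```python
import numpy as np

C_true, alpha_true = 0.8, 0.5

vbar = np.linspace(2.0, 18.0, 60)        # mean wind speeds (m/s)
sigma_v = C_true * vbar**alpha_true      # noise-free synthetic scaling

# log sigma_v = log C + alpha * log vbar, so a degree-1 fit gives (alpha, log C)
alpha_hat, logC_hat = np.polyfit(np.log(vbar), np.log(sigma_v), 1)
C_hat = np.exp(logC_hat)

print(round(C_hat, 3), round(alpha_hat, 3))  # -> 0.8 0.5
```

With real measurements the fit would carry scatter, and checking that a power law is appropriate at all (the paper finds it holds only at some locations) comes before estimating its exponent.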

  16. Groundwater level responses to precipitation variability in Mediterranean insular aquifers

    Science.gov (United States)

    Lorenzo-Lacruz, Jorge; Garcia, Celso; Morán-Tejeda, Enrique

    2017-09-01

    Groundwater is one of the largest and most important sources of fresh water in many regions under Mediterranean climate conditions, which are exposed to large precipitation variability that includes frequent meteorological drought episodes, and present high evapotranspiration rates and water demand during the dry season. The dependence on groundwater increases in those areas with predominantly permeable lithologies, which contribute to aquifer recharge and the abundance of ephemeral streams. The increasing pressure of tourism on water resources in many Mediterranean coastal areas, and uncertainty related to future precipitation and water availability, make it urgent to understand the spatio-temporal response of groundwater bodies to precipitation variability, if sustainable use of the resource is to be achieved. We present an assessment of the response of aquifers to precipitation variability based on correlations between the Standardized Precipitation Index (SPI) at various time scales and the Standardized Groundwater Index (SGI) across a Mediterranean island. We detected three main responses of aquifers to accumulated precipitation anomalies: (i) at short time scales of the SPI (24 months). The differing responses were mainly explained by differences in lithology and the percentage of highly permeable rock strata in the aquifer recharge areas. We also identified differences in the months and seasons when aquifer storages are more dependent on precipitation; these were related to climate seasonality and the degree of aquifer exploitation or underground water extraction. The recharge of some aquifers, especially in mountainous areas, is related to precipitation variability within a limited spatial extent, whereas for aquifers located in the plains, precipitation variability influences much larger areas; the topography and geological structure of the island explain these differences. 
Results indicate large spatial variability in the response of aquifers to precipitation in
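    The SPI/SGI correlation analysis described above can be sketched in simplified form. Real SPI and SGI values come from fitting probability distributions to the accumulated series before standardizing; plain z-scores are used here as a stand-in, and all data are synthetic:

```python
import numpy as np

rng = np.random.default_rng(42)

months = 600
precip = rng.gamma(shape=2.0, scale=40.0, size=months)   # monthly totals (mm)

def accumulate(x, window):
    """Rolling sum of the last `window` months (valid part only)."""
    return np.convolve(x, np.ones(window), mode="valid")

def standardize(x):
    return (x - x.mean()) / x.std()

# Synthetic aquifer whose storage integrates roughly 24 months of precipitation
gw = standardize(accumulate(precip, 24)) + 0.1 * rng.normal(size=months - 23)

# Correlate the groundwater index with SPI-like series at several time scales;
# the 24-month window should dominate for this slow-responding aquifer.
for window in (3, 12, 24):
    spi_like = standardize(accumulate(precip, window))
    m = min(len(spi_like), len(gw))            # align both series to the most recent months
    r = np.corrcoef(spi_like[-m:], gw[-m:])[0, 1]
    print(window, round(r, 2))
```

The SPI time scale with the highest correlation is what characterizes an aquifer as fast- or slow-responding in studies of this kind.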

  17. A method to standardize gait and balance variables for gait velocity.

    NARCIS (Netherlands)

    Iersel, M.B. van; Olde Rikkert, M.G.M.; Borm, G.F.

    2007-01-01

    Many gait and balance variables depend on gait velocity, which seriously hinders the interpretation of gait and balance data derived from walks at different velocities. However, as far as we know there is no widely accepted method to correct for effects of gait velocity on other gait and balance variables.

  18. Spontaneous temporal changes and variability of peripheral nerve conduction analyzed using a random effects model

    DEFF Research Database (Denmark)

    Krøigård, Thomas; Gaist, David; Otto, Marit

    2014-01-01

    SUMMARY: The reproducibility of variables commonly included in studies of peripheral nerve conduction in healthy individuals has not previously been analyzed using a random effects regression model. We examined the temporal changes and variability of standard nerve conduction measures in the leg...... reexamined after 2 and 26 weeks. There was no change in the variables except for a minor decrease in sural nerve sensory action potential amplitude and a minor increase in tibial nerve minimal F-wave latency. Reproducibility was best for peroneal nerve distal motor latency and motor conduction velocity......, sural nerve sensory conduction velocity, and tibial nerve minimal F-wave latency. Between-subject variability was greater than within-subject variability. Sample sizes ranging from 21 to 128 would be required to show changes twice the magnitude of the spontaneous changes observed in this study. Nerve...
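    The sample sizes quoted (21 to 128) come from a power calculation; the abstract does not state its formula, but a standard paired-comparison normal approximation can be sketched as follows (α = 0.05, power = 0.80; the σ/Δ inputs below are illustrative, not the study's values):

```python
import math
from statistics import NormalDist

def paired_sample_size(sd_change, delta, alpha=0.05, power=0.80):
    """Subjects needed to detect a mean change `delta`, given the SD of
    within-subject change: n = ((z_{1-alpha/2} + z_{power}) * sd / delta)**2."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)
    z_b = NormalDist().inv_cdf(power)
    return math.ceil(((z_a + z_b) * sd_change / delta) ** 2)

print(paired_sample_size(sd_change=1.0, delta=1.0))   # -> 8
print(paired_sample_size(sd_change=2.0, delta=1.0))   # -> 32
```

The quadratic dependence on σ/Δ is why the study's required n spans such a wide range across nerve conduction variables: within-subject variability differs substantially between measures.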

  19. Inlet-engine matching for SCAR including application of a bicone variable geometry inlet

    Science.gov (United States)

    Wasserbauer, J. F.; Gerstenmaier, W. H.

    1978-01-01

    Airflow characteristics of variable cycle engines (VCE) designed for Mach 2.32 can have transonic airflow requirements as high as 1.6 times the cruise airflow. This is a formidable requirement for conventional, high performance, axisymmetric, translating-centerbody mixed compression inlets. An alternate inlet is defined in which the second cone of a two-cone centerbody collapses to the initial cone angle to provide a large off-design airflow capability, and incorporates modest centerbody translation to minimize spillage drag. Estimates of transonic spillage drag are competitive with those of conventional translating-centerbody inlets. The inlet's cruise performance exhibits very low bleed requirements with good recovery and high angle of attack capability.

  20. Statistical methodology for discrete fracture model - including fracture size, orientation uncertainty together with intensity uncertainty and variability

    International Nuclear Information System (INIS)

    Darcel, C.; Davy, P.; Le Goc, R.; Dreuzy, J.R. de; Bour, O.

    2009-11-01

    the other, addresses the issue of the nature of the transition. We develop a new 'mechanistic' model that could help in modeling why and where this transition can occur. The transition between both regimes would occur for a fracture length of 1-10 m, and even at a smaller scale for the few outcrops that follow the self-similar density model. A consequence for the disposal issue is that the model likely to apply in the 'blind' scale window between 10-100 m is the self-similar model as it is defined for large-scale lineaments. The self-similar model, as it is measured for some outcrops and most lineament maps, is definitely worth investigating as a reference for scales above 1-10 m. In the rest of the report, we develop a methodology for incorporating uncertainty and variability into the DFN modeling. Fracturing properties arise from complex processes which produce an intrinsic variability; characterizing this variability, either as an admissible variation of model parameters or as the division of the site into subdomains with distinct DFN models, is a critical point of the modeling effort. Moreover, the DFN model carries uncertainty of its own, due to inherent data uncertainties and sampling limits. Both effects must be quantified and incorporated into the DFN site model definition process. In that context, all available borehole data, including recordings of fracture intercept positions, pole orientations and relative uncertainties, are used as the basis for the methodological development and further site model assessment. An elementary dataset contains a set of discrete fracture intercepts from which a parent orientation/density distribution can be computed. The elementary bricks of the site, from which these initial parent density distributions are computed, rely on the former Single Hole Interpretation division of the boreholes into sections whose local boundaries are expected to reflect, locally, the main characteristics of the geology and fracturing properties. 
From that

  1. Treatise on water hammer in hydropower standards and guidelines

    International Nuclear Information System (INIS)

    Bergant, A; Mazij, J; Karney, B; Pejović, S

    2014-01-01

    This paper reviews critical water hammer parameters as they are presented in official hydropower standards and guidelines. Particular emphasis is given to a number of IEC standards and guidelines that are used worldwide. The paper critically assesses water hammer control strategies including operational scenarios (closing and opening laws), surge control devices (surge tank, pressure regulating valve, flywheel, etc.), redesign of the water conveyance system components (tunnel, penstock), or limitation of operating conditions (limited operating range), which are variably covered in standards and guidelines. Little information is given on industrial water hammer models and solutions elsewhere. These are briefly introduced and discussed in the light of capability (simple versus complex systems), availability of expertise (in-house and/or commercial) and uncertainty. The paper concludes with an interesting water hammer case study referencing the rules and recommendations from existing hydropower standards and guidelines, with a view to effective water hammer control. Recommendations are given for further work on the development of a dedicated guideline on water hammer (hydraulic transients) in hydropower plants.

  2. Treatise on water hammer in hydropower standards and guidelines

    Science.gov (United States)

    Bergant, A.; Karney, B.; Pejović, S.; Mazij, J.

    2014-03-01

    This paper reviews critical water hammer parameters as they are presented in official hydropower standards and guidelines. Particular emphasis is given to a number of IEC standards and guidelines that are used worldwide. The paper critically assesses water hammer control strategies including operational scenarios (closing and opening laws), surge control devices (surge tank, pressure regulating valve, flywheel, etc.), redesign of the water conveyance system components (tunnel, penstock), or limitation of operating conditions (limited operating range), which are variably covered in standards and guidelines. Little information is given on industrial water hammer models and solutions elsewhere. These are briefly introduced and discussed in the light of capability (simple versus complex systems), availability of expertise (in-house and/or commercial) and uncertainty. The paper concludes with an interesting water hammer case study referencing the rules and recommendations from existing hydropower standards and guidelines, with a view to effective water hammer control. Recommendations are given for further work on the development of a dedicated guideline on water hammer (hydraulic transients) in hydropower plants.

  3. A Framework for Categorizing Important Project Variables

    Science.gov (United States)

    Parsons, Vickie S.

    2003-01-01

    While substantial research has led to theories concerning the variables that affect project success, no universal set of such variables has been acknowledged as the standard. The identification of a specific set of controllable variables is needed to minimize project failure. Much has been hypothesized about the need to match project controls and management processes to individual projects in order to increase the chance for success. However, an accepted taxonomy for facilitating this matching process does not exist. This paper surveyed existing literature on classification of project variables. After an analysis of those proposals, a simplified categorization is offered to encourage further research.

  4. Using Copulas in the Estimation of the Economic Project Value in the Mining Industry, Including Geological Variability

    Science.gov (United States)

    Krysa, Zbigniew; Pactwa, Katarzyna; Wozniak, Justyna; Dudek, Michal

    2017-12-01

    Geological variability is one of the main factors influencing the viability of mining investment projects and the technical risk of geology projects. To date, analyses of the economic viability of new extraction fields have been performed for the KGHM Polska Miedź S.A. underground copper mine at the Fore Sudetic Monocline under the assumption of a constant, averaged content of useful elements. The research presented in this article verifies the value of production from copper and silver ore for the same economic background using variable cash flows resulting from the local variability of useful elements. Furthermore, the ore economic model is investigated for a significant difference between the model value estimated using a linear correlation between useful element content and the height of the mine face, and the approach in which the correlation of model parameters is based on the copula best matched to an information capacity criterion. The use of a copula allows the simulation to account for multivariable dependencies simultaneously, reflecting the dependency structure better than linear correlation can. Calculation results of the economic model used for deposit value estimation indicate that the correlation between copper and silver estimated with the use of a copula generates higher variation of possible project value than modelling the dependency with linear correlation. The average deposit value remains unchanged.
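    The copula mechanics can be illustrated with a minimal Gaussian-copula sketch: correlate two standard normals, map them to uniforms through the normal CDF, then push the uniforms through arbitrary marginal inverse CDFs (lognormal grades here). The correlation and marginal parameters are invented for illustration, and the paper's information-capacity copula selection is not reproduced:

```python
import numpy as np
from scipy.stats import norm, spearmanr

rng = np.random.default_rng(7)
rho = 0.7
n = 5000

# 1) correlated standard normals
z = rng.multivariate_normal(mean=[0.0, 0.0],
                            cov=[[1.0, rho], [rho, 1.0]], size=n)
# 2) map to uniforms through the normal CDF (this is the copula itself)
u = norm.cdf(z)
# 3) push the uniforms through each marginal's inverse CDF
cu_grade = np.exp(0.5 + 0.4 * norm.ppf(u[:, 0]))   # hypothetical lognormal Cu (%)
ag_grade = np.exp(3.0 + 0.6 * norm.ppf(u[:, 1]))   # hypothetical lognormal Ag (g/t)

# Rank correlation survives the marginal transforms
print(round(spearmanr(cu_grade, ag_grade).correlation, 2))
```

Swapping step 1 for a non-Gaussian copula (e.g. with tail dependence) changes the joint behaviour of the grades while leaving both marginals untouched, which is exactly the extra freedom linear correlation lacks.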

  5. Regional regression models of percentile flows for the contiguous United States: Expert versus data-driven independent variable selection

    Directory of Open Access Journals (Sweden)

    Geoffrey Fouad

    2018-06-01

    New hydrological insights for the region: A set of three variables selected based on an expert assessment of factors that influence percentile flows performed similarly to larger sets of variables selected using a data-driven method. Expert assessment variables included mean annual precipitation, potential evapotranspiration, and baseflow index. Larger sets of up to 37 variables contributed little, if any, additional predictive information. Variables used to describe the distribution of basin data (e.g. standard deviation) were not useful, and average values were sufficient to characterize physical and climatic basin conditions. The effectiveness of the expert assessment variables may be due to the high degree of multicollinearity (i.e. cross-correlation) among additional variables. A tool is provided in the Supplementary material to predict percentile flows based on the three expert assessment variables. Future work should develop new variables with a strong understanding of the processes related to percentile flows.

  6. BioInfra.Prot: A comprehensive proteomics workflow including data standardization, protein inference, expression analysis and data publication.

    Science.gov (United States)

    Turewicz, Michael; Kohl, Michael; Ahrens, Maike; Mayer, Gerhard; Uszkoreit, Julian; Naboulsi, Wael; Bracht, Thilo; Megger, Dominik A; Sitek, Barbara; Marcus, Katrin; Eisenacher, Martin

    2017-11-10

    The analysis of high-throughput mass spectrometry-based proteomics data must address the specific challenges of this technology. To this end, the comprehensive proteomics workflow offered by the de.NBI service center BioInfra.Prot provides indispensable components for the computational and statistical analysis of this kind of data. These components include tools and methods for spectrum identification and protein inference, protein quantification, expression analysis as well as data standardization and data publication. All particular methods of the workflow which address these tasks are state-of-the-art or cutting edge. As has been shown in previous publications, each of these methods is adequate to solve its specific task and gives competitive results. However, the methods included in the workflow are continuously reviewed, updated and improved to adapt to new scientific developments. All of these particular components and methods are available as stand-alone BioInfra.Prot services or as a complete workflow. Since BioInfra.Prot provides manifold fast communication channels to get access to all components of the workflow (e.g., via the BioInfra.Prot ticket system: bioinfraprot@rub.de) users can easily benefit from this service and get support by experts. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.

  7. Toward the rational use of standardized infection ratios to benchmark surgical site infections.

    Science.gov (United States)

    Fukuda, Haruhisa; Morikane, Keita; Kuroki, Manabu; Taniguchi, Shinichiro; Shinzato, Takashi; Sakamoto, Fumie; Okada, Kunihiko; Matsukawa, Hiroshi; Ieiri, Yuko; Hayashi, Kouji; Kawai, Shin

    2013-09-01

    The National Healthcare Safety Network transitioned from surgical site infection (SSI) rates to the standardized infection ratio (SIR), calculated by statistical models that included perioperative factors (surgical approach and surgery duration). Rationally, however, only patient-related variables should be included in the SIR model. Logistic regression was performed to predict the expected SSI rate in 2 models that included or excluded perioperative factors. Observed and expected SSI rates were used to calculate the SIR for each participating hospital. The difference in SIR between the models was then evaluated. Surveillance data were collected from a total of 1,530 colon surgery patients and 185 SSIs. The C-index in the model with perioperative factors was statistically greater than that in the model including patient-related factors only (0.701 vs 0.621, respectively). Because perioperative factors reflect the operative process or the competence of surgical teams, these factors should not be considered predictive variables. Copyright © 2013 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Mosby, Inc. All rights reserved.
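    The SIR itself is a simple ratio: observed infections divided by the expected count, where "expected" is the sum of each patient's model-predicted infection probability. A sketch with invented risk probabilities (not data from the study):

```python
# Hypothetical per-patient SSI probabilities from a patient-level risk model
predicted_risk = [0.02, 0.05, 0.10, 0.03, 0.08, 0.12, 0.04, 0.06]
observed_ssi = 1                       # infections actually recorded at the hospital

expected_ssi = sum(predicted_risk)     # 0.50 expected infections for this case mix
sir = observed_ssi / expected_ssi

print(round(sir, 2))                   # -> 2.0: twice as many SSIs as the model expected
```

The paper's point is about which variables feed the risk model that produces `predicted_risk`: including perioperative factors improves the C-index but adjusts away exactly the care-quality signal the SIR is meant to expose.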

  8. The nebular variables

    CERN Document Server

    Glasby, John S

    1974-01-01

    The Nebular Variables focuses on the nebular variables and their characteristics. Discussions are organized by type of nebular variable, namely, RW Aurigae stars, T Orionis stars, T Tauri stars, and peculiar nebular objects. Topics range from light variations of the stars to their spectroscopic and physical characteristics, spatial distribution, interaction with nebulosity, and evolutionary features. This volume is divided into four sections and consists of 25 chapters, the first of which provides general information on nebular variables, including their stellar associations and their classification.

  9. EXTraS: Exploring the X-ray Transient and variable Sky

    Science.gov (United States)

    De Luca, A.; Salvaterra, R.; Tiengo, A.; D'Agostino, D.; Watson, M.; Haberl, F.; Wilms, J.

    2017-10-01

    The EXTraS project extracted all temporal domain information buried in the whole database collected by the EPIC cameras onboard the XMM-Newton mission. This included a search and characterisation of variability, both periodic and aperiodic, in hundreds of thousands of sources spanning more than eight orders of magnitude in time scale and six orders of magnitude in flux, as well as a search for fast transients, missed by standard image analysis. Phenomenological classification of variable sources, based on X-ray and multiwavelength information, has also been performed. All results and products of EXTraS are made available to the scientific community through a web public data archive. A dedicated science gateway will allow scientists to apply EXTraS pipelines on new observations. EXTraS is the most comprehensive analysis of variability, on the largest ever sample of soft X-ray sources. The resulting archive and tools disclose an enormous scientific discovery space to the community, with applications ranging from the search for rare events to population studies, with impact on the study of virtually all astrophysical source classes. EXTraS, funded within the EU/FP7 framework, is carried out by a collaboration including INAF (Italy), IUSS (Italy), CNR/IMATI (Italy), University of Leicester (UK), MPE (Germany) and ECAP (Germany).

  10. 14 CFR 35.21 - Variable and reversible pitch propellers.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 1 2010-01-01 2010-01-01 false Variable and reversible pitch propellers. 35.21 Section 35.21 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION AIRCRAFT AIRWORTHINESS STANDARDS: PROPELLERS Design and Construction § 35.21 Variable and...

  11. Including indigestible carbohydrates in the evening meal of healthy subjects improves glucose tolerance, lowers inflammatory markers, and increases satiety after a subsequent standardized breakfast

    DEFF Research Database (Denmark)

    Nilsson, A.C.; Ostman, E.M.; Holst, Jens Juul

    2008-01-01

    Low-glycemic index (GI) foods and foods rich in whole grain are associated with reduced risk of type 2 diabetes and cardiovascular disease. We studied the effect of cereal-based bread evening meals (50 g available starch), varying in GI and content of indigestible carbohydrates, on glucose...... tolerance and related variables after a subsequent standardized breakfast in healthy subjects (n = 15). At breakfast, blood was sampled for 3 h for analysis of blood glucose, serum insulin, serum FFA, serum triacylglycerides, plasma glucagon, plasma gastric-inhibitory peptide, plasma glucagon-like peptide-1...... based bread (ordinary, high-amylose- or beta-glucan-rich genotypes) or an evening meal with white wheat flour bread (WWB) enriched with a mixture of barley fiber and resistant starch improved glucose tolerance at the subsequent breakfast compared with unsupplemented WWB (P

  12. Outpatient versus inpatient mixed meal tolerance and arginine stimulation testing yields comparable measures of variability for assessment of beta cell function

    Directory of Open Access Journals (Sweden)

    Sudha S. Shankar

    2018-06-01

    Full Text Available Standard practice to minimize variability in beta cell function (BCF) measurement is to test in inpatient (IP) settings. IP testing strains trial subjects, investigators, and budgets. Outpatient (OP) testing may be a solution, although there are few reports on OP BCF testing variability. We compared variability metrics between OP and IP from a standardized mixed meal tolerance test (MMTT) and arginine stimulation test (AST) in two separate type 2 diabetes (T2DM) cohorts (OP, n = 20; IP, n = 22) in a test-retest design. MMTT variables included: insulin sensitivity (Si); beta cell responsivity (Φtot); and disposition index (DItot = Si * Φtot) following a 470 kCal meal. AST variables included: acute insulin response to arginine (AIRarg) and during hyperglycemia (AIRargMAX). Results: Baseline characteristics were well matched. Between- and within-subject variance for each parameter across cohorts, and intraclass correlation coefficients (ICC, a measure of reproducibility) across parameters, were generally comparable between OP and IP. The table summarizes the ICC results for each key parameter and cohort.
    
    Test/Parameter    | Outpatient (95% CI) | Inpatient (95% CI)
    MMTT: Si          | 0.49 (0, 0.69)      | 0.28 (0, 0.60)
    MMTT: Φtot        | 0.65 (0.16, 0.89)   | 0.81 (0.44, 0.93)
    MMTT: DI          | 0.67 (0, 0.83)      | 0.36 (0, 0.69)
    AST: AIRarg       | 0.96 (0.88, 0.98)   | 0.84 (0.59, 0.94)
    AST: AIRargMAX    | 0.97 (0.90, 0.99)   | 0.95 (0.86, 0.97)
    AST: ISR          | 0.93 (0.77, 0.97)   | 0.93 (0.82, 0.96)
    
    In conclusion, the variability (reproducibility) of BCF measures from standardized MMTT and AST is comparable between OP and IP settings. These observations have significant implications for the complexity and cost of metabolic studies.
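    The ICC reported in such test-retest comparisons can be computed from a one-way random-effects ANOVA. A sketch of ICC(1,1) on synthetic two-visit data (the data and variance parameters are invented; this is not the estimator settings or data used in the study):

```python
import numpy as np

rng = np.random.default_rng(3)

n, k = 20, 2                                   # subjects, repeat visits
true_value = rng.normal(loc=100.0, scale=15.0, size=n)          # stable subject level
visits = true_value[:, None] + rng.normal(scale=3.0, size=(n, k))  # measurement noise

subject_means = visits.mean(axis=1)
grand_mean = visits.mean()

# One-way ANOVA mean squares: between subjects and within subjects
msb = k * np.sum((subject_means - grand_mean) ** 2) / (n - 1)
msw = np.sum((visits - subject_means[:, None]) ** 2) / (n * (k - 1))

icc = (msb - msw) / (msb + (k - 1) * msw)
print(round(icc, 2))
```

With between-subject spread large relative to visit-to-visit noise, the ICC approaches 1, matching the intuition behind the high AST values in the table above.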

  13. HEART RATE VARIABILITY AND BODY COMPOSITION AS VO2MAX DETERMINANTS

    Directory of Open Access Journals (Sweden)

    Henry Humberto León-Ariza

    Full Text Available ABSTRACT Introduction: The maximum oxygen consumption (VO2max) is the gold standard in cardiorespiratory endurance assessment. Objective: This study aimed to develop a mathematical model that contains variables to determine the VO2max of sedentary people. Methods: Twenty participants (10 men and 10 women) with a mean age of 19.8±1.77 years were included. For each participant, body composition (percentage of fat and muscle), heart rate variability (HRV) at rest (supine and standing), and VO2max were evaluated through an indirect test on a cycle ergometer. A multivariate linear regression model was developed from the data obtained, and the model assumptions were verified. Results: Using the data obtained, including percentage of fat (F), percentage of muscle (M), percentage of power at very low frequency (VLF), α-value of the detrended fluctuation analysis (DFAα1), heart rate (HR) in the resting standing position, and age of the participants, a model was established for men, expressed as VO2max = 4.216 + (Age*0.153) + (F*0.110) - (M*0.053) - (VLF*0.649) - (DFAα1*2.441) - (HR*0.014), with R2 = 0.965 and standard error = 0.146 L/min. For women, the model was expressed as VO2max = 1.947 - (Age*0.047) + (F*0.024) + (M*0.054) + (VLF*1.949) - (DFAα1*0.424) - (HR*0.019), with R2 = 0.987 and standard error = 0.077 L/min. Conclusion: The model obtained demonstrates the influence exerted by body composition, the autonomic nervous system, and age in the prediction of VO2max.
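    The men's equation can be transcribed directly into a function. The coefficients follow the abstract; the input values below are purely illustrative, not subject data from the study:

```python
def vo2max_men(age, fat_pct, muscle_pct, vlf_pct, dfa_alpha1, hr_stand):
    """Men's VO2max prediction (L/min), coefficients as given in the abstract."""
    return (4.216 + age * 0.153 + fat_pct * 0.110 - muscle_pct * 0.053
            - vlf_pct * 0.649 - dfa_alpha1 * 2.441 - hr_stand * 0.014)

# Hypothetical inputs: 20-year-old, 20% fat, 40% muscle, VLF 0.3, DFAα1 1.0, standing HR 70
v = vo2max_men(age=20, fat_pct=20, muscle_pct=40,
               vlf_pct=0.3, dfa_alpha1=1.0, hr_stand=70)
print(round(v, 3))   # -> 3.74
```

Note that with n = 20 and six predictors, the very high R² values are to be expected in-sample; applying the equations outside this cohort would need external validation.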

  14. Effects of variable transformations on errors in FORM results

    International Nuclear Information System (INIS)

    Qin Quan; Lin Daojin; Mei Gang; Chen Hao

    2006-01-01

    On the basis of studies of the second partial derivatives of the variable transformation functions for nine different non-normal variables, the paper comprehensively discusses the effects of the transformation on FORM results and shows that the signs and magnitudes of the errors in FORM results depend on the distributions of the basic variables, on whether the basic variables represent resistances or actions, and on the design point locations in standard normal space. The transformations of exponential or Gamma resistance variables can generate +24% errors in the FORM failure probability, and the transformation of Frechet action variables can generate -31% errors.
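    The transformation in question maps a standard normal variable u to a non-normal variable x via x = F⁻¹(Φ(u)), the marginal inverse CDF applied to the normal CDF; the curvature (second derivative) of this map is what drives the FORM errors discussed above. A sketch for a unit-rate exponential marginal, using only the standard library:

```python
import math
from statistics import NormalDist

phi = NormalDist()   # standard normal

def exponential_from_normal(u, lam=1.0):
    """x = -ln(1 - Phi(u)) / lam, i.e. the exponential inverse CDF at Phi(u)."""
    return -math.log(1.0 - phi.cdf(u)) / lam

# At u = 0, Phi(u) = 0.5, so x is the exponential median ln(2)/lam
print(round(exponential_from_normal(0.0), 4))   # -> 0.6931
```

Because this map is strongly nonlinear away from the median, a linearization at the design point (as FORM performs) over- or under-estimates the failure probability depending on which tail the design point sits in.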

  15. Identifying individuality and variability in team tactics by means of statistical shape analysis and multilayer perceptrons.

    Science.gov (United States)

    Jäger, Jörg M; Schöllhorn, Wolfgang I

    2012-04-01

    Offensive and defensive systems of play represent important aspects of team sports. They include the players' positions at certain situations during a match, i.e., when players have to be in specific positions on the court. Patterns of play emerge based on the formations of the players on the court. Recognition of these patterns is important in order to react adequately and to adjust one's own strategies to the opponent. Furthermore, the ability to apply variable patterns of play seems promising, since variability makes it harder for the opponent to adjust. The purpose of this study is to identify different team tactical patterns in volleyball and to analyze differences in variability. Overall, 120 standard situations of six national teams in women's volleyball are analyzed during a world championship tournament. Twenty situations from each national team are chosen, including the base defence position (start configuration) and the two-player block with middle back deep (end configuration). The shapes of the defence formations at the start and end configurations during the defence of each national team, as well as the variability of these defence formations, are statistically analyzed. Furthermore, these shape data are used to train multilayer perceptrons in order to test whether artificial neural networks can recognize the teams by their tactical patterns. Results show significant differences between the national teams in both the base defence position at the start and the two-player block with middle back deep at the end of the standard defence situation. Furthermore, the national teams show significant differences in variability of the defence systems, and start-positions are more variable than the end-positions. Multilayer perceptrons are able to recognize the teams at an average of 98.5%. It is concluded that defence systems in team sports are highly individual at a competitive level and variable even in standard situations. Artificial neural networks can be used to recognize teams by their tactical patterns.

  16. Quality Improvement Initiative to Decrease Variability of Emergency Physician Opioid Analgesic Prescribing

    Directory of Open Access Journals (Sweden)

    John H. Burton

    2016-05-01

    Introduction: Addressing pain is a crucial aspect of emergency medicine. Prescription opioids are commonly prescribed for moderate to severe pain in the emergency department (ED); unfortunately, prescribing practices are variable. High variability of opioid prescribing decisions suggests a lack of consensus and an opportunity to improve care. This quality improvement (QI) initiative aimed to reduce variability in ED opioid analgesic prescribing. Methods: We evaluated the impact of a three-part QI initiative on ED opioid prescribing by physicians at seven sites. Stage 1: Retrospective baseline period (nine months). Stage 2: Physicians were informed that opioid prescribing information would be prospectively collected and that feedback on their prescribing and that of the group would be shared at the end of the stage (three months). Stage 3: After physicians received their individual opioid prescribing data with blinded comparison to the group means (from Stage 2), they were informed that individual prescribing data would be unblinded and shared with the group after three months. The primary outcome was variability, measured as the standard error of the mean and the standard deviation of the opioid prescribing rate (defined as the number of patients discharged with an opioid divided by the total number of discharges for each provider). Secondary observations included the mean quantity of pills per opioid prescription and the overall frequency of opioid prescribing. Results: The study group included 47 physicians with 149,884 ED patient encounters. The variability in prescribing decreased through each stage of the initiative, as represented by the distributions for the opioid prescribing rate: Stage 1 mean 20%; Stage 2 mean 13% (46% reduction, p<0.01); and Stage 3 mean 8% (60% reduction, p<0.01). The mean quantity of pills prescribed per prescription was 16 pills in Stage 1, 14 pills in Stage 2 (18% reduction, p<0.01), and 13 pills in Stage 3 (18% reduction, p<0.01). The group mean
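The primary outcome metric can be reproduced directly from discharge counts. A minimal Python sketch, with invented per-provider counts (not study data):

```python
import statistics
from math import sqrt

def prescribing_rate(opioid_discharges, total_discharges):
    """Patients discharged with an opioid divided by total discharges."""
    return opioid_discharges / total_discharges

# Hypothetical (opioid discharges, total discharges) per provider:
counts = [(20, 100), (10, 100), (30, 100)]
rates = [prescribing_rate(o, t) for o, t in counts]

mean_rate = statistics.mean(rates)   # group mean prescribing rate
sd = statistics.stdev(rates)         # variability across providers
sem = sd / sqrt(len(rates))          # standard error of the mean
print(mean_rate, sd, sem)
```

A shrinking `sd`/`sem` across stages is exactly the reduction in variability the initiative targeted.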

  17. Quantum interference of probabilities and hidden variable theories

    International Nuclear Information System (INIS)

    Srinivas, M.D.

    1984-01-01

    One of the fundamental contributions of Louis de Broglie, which does not get cited often, has been his analysis of the basic difference between the calculus of the probabilities as predicted by quantum theory and the usual calculus of probabilities - the one employed by most mathematicians, in its standard axiomatised version due to Kolmogorov. This paper is basically devoted to a discussion of the 'quantum interference of probabilities', discovered by de Broglie. In particular, it is shown that it is this feature of the quantum theoretic probabilities which leads to some serious constraints on the possible 'hidden-variable formulations' of quantum mechanics, including the celebrated theorem of Bell. (Auth.)

  18. Genotypic variability enhances the reproducibility of an ecological study.

    Science.gov (United States)

    Milcu, Alexandru; Puga-Freitas, Ruben; Ellison, Aaron M; Blouin, Manuel; Scheu, Stefan; Freschet, Grégoire T; Rose, Laura; Barot, Sebastien; Cesarz, Simone; Eisenhauer, Nico; Girin, Thomas; Assandri, Davide; Bonkowski, Michael; Buchmann, Nina; Butenschoen, Olaf; Devidal, Sebastien; Gleixner, Gerd; Gessler, Arthur; Gigon, Agnès; Greiner, Anna; Grignani, Carlo; Hansart, Amandine; Kayler, Zachary; Lange, Markus; Lata, Jean-Christophe; Le Galliard, Jean-François; Lukac, Martin; Mannerheim, Neringa; Müller, Marina E H; Pando, Anne; Rotter, Paula; Scherer-Lorenzen, Michael; Seyhun, Rahme; Urban-Mead, Katherine; Weigelt, Alexandra; Zavattaro, Laura; Roy, Jacques

    2018-02-01

    Many scientific disciplines are currently experiencing a 'reproducibility crisis' because numerous scientific findings cannot be repeated consistently. A novel but controversial hypothesis postulates that stringent levels of environmental and biotic standardization in experimental studies reduce reproducibility by amplifying the impacts of laboratory-specific environmental factors not accounted for in study designs. A corollary to this hypothesis is that a deliberate introduction of controlled systematic variability (CSV) in experimental designs may lead to increased reproducibility. To test this hypothesis, we had 14 European laboratories run a simple microcosm experiment using grass (Brachypodium distachyon L.) monocultures and grass and legume (Medicago truncatula Gaertn.) mixtures. Each laboratory introduced environmental and genotypic CSV within and among replicated microcosms established in either growth chambers (with stringent control of environmental conditions) or glasshouses (with more variable environmental conditions). The introduction of genotypic CSV led to 18% lower among-laboratory variability in growth chambers, indicating increased reproducibility, but had no significant effect in glasshouses where reproducibility was generally lower. Environmental CSV had little effect on reproducibility. Although there are multiple causes for the 'reproducibility crisis', deliberately including genetic variability may be a simple solution for increasing the reproducibility of ecological studies performed under stringently controlled environmental conditions.

  19. Normalization of High Dimensional Genomics Data Where the Distribution of the Altered Variables Is Skewed

    Science.gov (United States)

    Landfors, Mattias; Philip, Philge; Rydén, Patrik; Stenberg, Per

    2011-01-01

    Genome-wide analysis of gene expression or protein binding patterns using different array or sequencing based technologies is now routinely performed to compare different populations, such as treatment and reference groups. It is often necessary to normalize the data obtained to remove technical variation introduced in the course of conducting experimental work, but standard normalization techniques are not capable of eliminating technical bias in cases where the distribution of the truly altered variables is skewed, i.e. when a large fraction of the variables are either positively or negatively affected by the treatment. However, several experiments are likely to generate such skewed distributions, including ChIP-chip experiments for the study of chromatin, gene expression experiments for the study of apoptosis, and SNP-studies of copy number variation in normal and tumour tissues. A preliminary study using spike-in array data established that the capacity of an experiment to identify altered variables and generate unbiased estimates of the fold change decreases as the fraction of altered variables and the skewness increases. We propose the following work-flow for analyzing high-dimensional experiments with regions of altered variables: (1) Pre-process raw data using one of the standard normalization techniques. (2) Investigate if the distribution of the altered variables is skewed. (3) If the distribution is not believed to be skewed, no additional normalization is needed. Otherwise, re-normalize the data using a novel HMM-assisted normalization procedure. (4) Perform downstream analysis. Here, ChIP-chip data and simulated data were used to evaluate the performance of the work-flow. It was found that skewed distributions can be detected by using the novel DSE-test (Detection of Skewed Experiments). Furthermore, applying the HMM-assisted normalization to experiments where the distribution of the truly altered variables is skewed results in considerably higher
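Step (2) of the work-flow, deciding whether the distribution of changes is skewed, can be illustrated with a simple moment-based skewness check. This is a stand-in sketch, not the DSE-test itself; the function names and the 0.5 threshold are assumptions for illustration:

```python
def sample_skewness(xs):
    """Third standardized moment g1 of a sample (population moments)."""
    n = len(xs)
    mean = sum(xs) / n
    m2 = sum((x - mean) ** 2 for x in xs) / n
    m3 = sum((x - mean) ** 3 for x in xs) / n
    return m3 / m2 ** 1.5 if m2 > 0 else 0.0

def looks_skewed(fold_changes, threshold=0.5):
    """Flag an experiment whose per-variable changes are asymmetric."""
    return abs(sample_skewness(fold_changes)) > threshold

print(looks_skewed([1.0, 2.0, 3.0]))        # symmetric -> False
print(looks_skewed([1.0, 1.0, 1.0, 10.0]))  # heavy right tail -> True
```

In practice the actual DSE-test, rather than raw sample skewness, should drive the decision to re-normalize with the HMM-assisted procedure.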

  20. Imaging Variable Stars with HST

    Science.gov (United States)

    Karovska, M.

    2012-06-01

    (Abstract only) The Hubble Space Telescope (HST) observations of astronomical sources, ranging from objects in our solar system to objects in the early Universe, have revolutionized our knowledge of the Universe its origins and contents. I highlight results from HST observations of variable stars obtained during the past twenty or so years. Multiwavelength observations of numerous variable stars and stellar systems were obtained using the superb HST imaging capabilities and its unprecedented angular resolution, especially in the UV and optical. The HST provided the first detailed images probing the structure of variable stars including their atmospheres and circumstellar environments. AAVSO observations and light curves have been critical for scheduling of many of these observations and provided important information and context for understanding of the imaging results of many variable sources. I describe the scientific results from the imaging observations of variable stars including AGBs, Miras, Cepheids, semiregular variables (including supergiants and giants), YSOs and interacting stellar systems with a variable stellar components. These results have led to an unprecedented understanding of the spatial and temporal characteristics of these objects and their place in the stellar evolutionary chains, and in the larger context of the dynamic evolving Universe.

  1. Double standards: a cross-European study on differences in norms on voluntary childlessness for men and women. Paper presentation

    NARCIS (Netherlands)

    Rijken, A.J.; Merz, E.-M.

    2011-01-01

    We examine double standards in norms on voluntary childlessness. Whether choosing childlessness is more accepted for men or for women is not a priori clear; we formulate arguments in both directions. Multilevel analyses are conducted, including individual and societal-level variables. Our sample

  2. Biological Sampling Variability Study

    Energy Technology Data Exchange (ETDEWEB)

    Amidan, Brett G. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hutchison, Janine R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-11-08

    There are many sources of variability in the sample collection and analysis process. This paper addresses many, but not all, of them. The main focus of this paper was to better understand and estimate variability due to differences between samplers. Variability between days was also studied, as well as random variability within each sampler. Experiments were performed using multiple surface materials (ceramic and stainless steel), multiple contaminant concentrations (10 spores and 100 spores), and with and without the presence of interfering material. All testing was done with sponge sticks using 10-inch by 10-inch coupons. Bacillus atrophaeus was used as the BA surrogate. Spores were deposited using wet deposition. Grime was coated onto the coupons designated to include the interfering material (Section 3.3). Samples were prepared and analyzed at PNNL using the CDC protocol (Section 3.4) and then cultured and counted. Five samplers were trained so that samples were taken using the same protocol. Each sampler randomly sampled eight coupons each day: four coupons with 10 spores deposited and four coupons with 100 spores deposited. Each day consisted of one material being tested. The clean samples (no interfering materials) were run first, followed by the dirty samples (coated with interfering material). There was a significant difference in recovery efficiency between the coupons with 10 spores deposited (mean of 48.9%) and those with 100 spores deposited (mean of 59.8%). There was no general significant difference between the clean and dirty (containing interfering material) coupons or between the two surface materials; however, there was a significant interaction between concentration amount and presence of interfering material. The recovery efficiency was close to the same for coupons with 10 spores deposited, but for the coupons with 100 spores deposited, the recovery efficiency for the dirty samples was significantly larger (65
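The recovery-efficiency comparison reduces to simple ratios averaged per group. A minimal sketch; the coupon counts below are invented for illustration, not the PNNL data:

```python
import statistics

def recovery_efficiency(cfu_recovered, spores_deposited):
    """Percent of deposited spores recovered from a coupon."""
    return 100.0 * cfu_recovered / spores_deposited

# Hypothetical recovered counts for 10-spore and 100-spore coupons:
low  = [recovery_efficiency(r, 10)  for r in (5, 4, 6)]
high = [recovery_efficiency(r, 100) for r in (58, 62, 60)]

print(statistics.mean(low), statistics.mean(high))  # 50.0 60.0
```

Comparing such group means (and their interaction with interfering material) is what the paper's significance testing formalizes.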

  3. Iwamoto-Harada coalescence/pickup model for cluster emission: state density approach including angular momentum variables

    Directory of Open Access Journals (Sweden)

    Běták Emil

    2014-04-01

    For low-energy nuclear reactions well above the resonance region, but still below the pion threshold, statistical pre-equilibrium models (e.g., the exciton and the hybrid ones) are a frequent tool for analysis of energy spectra and of the cross sections of cluster emission. For α's, two essentially distinct approaches are popular, namely the preformed one and the different versions of coalescence approaches, whereas only the latter group of models can be used for other types of cluster ejectiles. The original Iwamoto-Harada model of pre-equilibrium cluster emission was formulated using the overlap of the cluster and its constituent nucleons in momentum space. Transforming it into level or state densities is not a straightforward task; however, physically the same model was presented at a conference on reaction models five years earlier. At that time, only the densities without spin were used. The introduction of spin variables into the exciton model enabled detailed calculation of the γ emission and its competition with nucleon channels, and – at the same time – it stimulated further developments of the model. However – to the best of our knowledge – no spin formulation had been presented for cluster emission until recently, when the first attempts were reported, but restricted to the first emission only. We have now updated this effort and are able to handle (using the same simplifications as in our previous work) pre-equilibrium cluster emission with spin, including all nuclei in the reaction chain.

  4. A Geometrical Framework for Covariance Matrices of Continuous and Categorical Variables

    Science.gov (United States)

    Vernizzi, Graziano; Nakai, Miki

    2015-01-01

    It is well known that a categorical random variable can be represented geometrically by a simplex. Accordingly, several measures of association between categorical variables have been proposed and discussed in the literature. Moreover, the standard definitions of covariance and correlation coefficient for continuous random variables have been…

  5. First among Others? Cohen's "d" vs. Alternative Standardized Mean Group Difference Measures

    Science.gov (United States)

    Cahan, Sorel; Gamliel, Eyal

    2011-01-01

    Standardized effect size measures typically employed in behavioral and social sciences research in the multi-group case (e.g., [eta][superscript 2], f[superscript 2]) evaluate between-group variability in terms of either total or within-group variability, such as variance or standard deviation--that is, measures of dispersion about the mean. In…
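The contrast between d-type measures (difference over a dispersion) and η²-type measures (variance explained) can be made concrete. A hedged sketch of the textbook two-group formulas; the data are illustrative:

```python
from math import sqrt

def cohens_d(a, b):
    """Standardized mean difference using the pooled sample SD."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    pooled_sd = sqrt(((na - 1) * va + (nb - 1) * vb) / (na + nb - 2))
    return (ma - mb) / pooled_sd

def eta_squared(groups):
    """Between-group sum of squares over total sum of squares."""
    all_x = [x for g in groups for x in g]
    grand = sum(all_x) / len(all_x)
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_total = sum((x - grand) ** 2 for x in all_x)
    return ss_between / ss_total

g1, g2 = [1.0, 2.0, 3.0], [3.0, 4.0, 5.0]
print(cohens_d(g1, g2))       # -2.0
print(eta_squared([g1, g2]))  # 0.6
```

Both numbers describe the same two groups, but one is scaled by within-group dispersion and the other by total variability, which is exactly the distinction the article examines.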

  6. The temporal variability of species densities

    International Nuclear Information System (INIS)

    Redfearn, A.; Pimm, S.L.

    1993-01-01

    Ecologists use the term 'stability' to mean a number of different things (Pimm 1984a). One use is to equate stability with low variability in population density over time (henceforth, temporal variability). Temporal variability varies greatly from species to species, so what affects it? There are at least three sets of factors: the variability of extrinsic abiotic factors, food web structure, and the intrinsic features of the species themselves. We can measure temporal variability using at least three statistics: the coefficient of variation of density (CV); the standard deviation of the logarithms of density (SDL); and the variance of the differences between logarithms of density for pairs of consecutive years (called annual variability, hence AV, by Wolda 1978). There are advantages and disadvantages to each measure (Williamson 1984), though in our experience the measures are strongly correlated across sets of taxonomically related species. The increasing availability of long-term data sets allows one to calculate these statistics for many species and so to begin to understand the various causes of species differences in temporal variability.
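The three statistics can be computed from a yearly density series. A minimal sketch following the definitions above; the series is invented for illustration:

```python
import statistics
from math import log

def cv(densities):
    """Coefficient of variation of density: SD / mean."""
    return statistics.stdev(densities) / statistics.mean(densities)

def sdl(densities):
    """Standard deviation of the logarithms of density."""
    return statistics.stdev([log(d) for d in densities])

def av(densities):
    """Wolda's annual variability: variance of consecutive-year log differences."""
    diffs = [log(b) - log(a) for a, b in zip(densities, densities[1:])]
    return statistics.pvariance(diffs)

series = [10.0, 20.0, 40.0]   # density doubles each year
print(round(sdl(series), 4))  # 0.6931 (= ln 2)
print(av(series))             # ~0: growth is perfectly regular year to year
```

The example shows why the measures can disagree: a steadily doubling population has substantial CV and SDL but essentially zero AV, since its year-to-year log changes never vary.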

  7. Understanding diagnostic variability in breast pathology: lessons learned from an expert consensus review panel

    Science.gov (United States)

    Allison, Kimberly H; Reisch, Lisa M; Carney, Patricia A; Weaver, Donald L; Schnitt, Stuart J; O’Malley, Frances P; Geller, Berta M; Elmore, Joann G

    2015-01-01

    Aims To gain a better understanding of the reasons for diagnostic variability, with the aim of reducing the phenomenon. Methods and results In preparation for a study on the interpretation of breast specimens (B-PATH), a panel of three experienced breast pathologists reviewed 336 cases to develop consensus reference diagnoses. After independent assessment, cases coded as diagnostically discordant were discussed at consensus meetings. By the use of qualitative data analysis techniques, transcripts of 16 h of consensus meetings for a subset of 201 cases were analysed. Diagnostic variability could be attributed to three overall root causes: (i) pathologist-related; (ii) diagnostic coding/study methodology-related; and (iii) specimen-related. Most pathologist-related root causes were attributable to professional differences in pathologists’ opinions about whether the diagnostic criteria for a specific diagnosis were met, most frequently in cases of atypia. Diagnostic coding/study methodology-related root causes were primarily miscategorizations of descriptive text diagnoses, which led to the development of a standardized electronic diagnostic form (BPATH-Dx). Specimen-related root causes included artefacts, limited diagnostic material, and poor slide quality. After re-review and discussion, a consensus diagnosis could be assigned in all cases. Conclusions Diagnostic variability is related to multiple factors, but consensus conferences, standardized electronic reporting formats and comments on suboptimal specimen quality can be used to reduce diagnostic variability. PMID:24511905

  8. Instrumented Impact Testing: Influence of Machine Variables and Specimen Position

    International Nuclear Information System (INIS)

    Lucon, E.; McCowan, C. N.; Santoyo, R. A.

    2008-01-01

    An investigation has been conducted on the influence of impact machine variables and specimen positioning on characteristic forces and absorbed energies from instrumented Charpy tests. Brittle and ductile fracture behavior has been investigated by testing NIST reference samples of low, high and super-high energy levels. Test machine variables included tightness of foundation, anvil and striker bolts, and the position of the center of percussion with respect to the center of strike. For specimen positioning, we tested samples which had been moved away or sideways with respect to the anvils. In order to assess the influence of the various factors, we compared mean values in the reference (unaltered) and altered conditions; for machine variables, t-test analyses were also performed in order to evaluate the statistical significance of the observed differences. Our results indicate that the only circumstance which resulted in variations larger than 5 percent for both brittle and ductile specimens is when the sample is not in contact with the anvils. These findings should be taken into account in future revisions of instrumented Charpy test standards.

  9. Instrumented Impact Testing: Influence of Machine Variables and Specimen Position

    Energy Technology Data Exchange (ETDEWEB)

    Lucon, E.; McCowan, C. N.; Santoyo, R. A.

    2008-09-15

    An investigation has been conducted on the influence of impact machine variables and specimen positioning on characteristic forces and absorbed energies from instrumented Charpy tests. Brittle and ductile fracture behavior has been investigated by testing NIST reference samples of low, high and super-high energy levels. Test machine variables included tightness of foundation, anvil and striker bolts, and the position of the center of percussion with respect to the center of strike. For specimen positioning, we tested samples which had been moved away or sideways with respect to the anvils. In order to assess the influence of the various factors, we compared mean values in the reference (unaltered) and altered conditions; for machine variables, t-test analyses were also performed in order to evaluate the statistical significance of the observed differences. Our results indicate that the only circumstance which resulted in variations larger than 5 percent for both brittle and ductile specimens is when the sample is not in contact with the anvils. These findings should be taken into account in future revisions of instrumented Charpy test standards.

  10. Standard guide for the determination of technetium-99 in Soil

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2008-01-01

    1.1 This guide is intended to serve as a reference for laboratories wishing to perform Tc-99 analyses in soil. Several options are given for selection of a tracer and for the method of extracting the Tc from the soil matrix. Separation of Tc from the sample matrix is performed using an extraction chromatography resin. Options are then given for the determination of the Tc-99 activity in the original sample. It is up to the user to determine which options are appropriate for use, and to generate acceptance data to support the chosen procedure. 1.2 Due to the various extraction methods available, various tracers used, variable detection methods used, and lack of certified reference materials for Tc-99 in soil, there is insufficient data to support a single method written as a standard method. 1.3 The values stated in SI units are to be regarded as standard. No other units of measurement are included in this standard.

  11. Variable volume combustor

    Science.gov (United States)

    Ostebee, Heath Michael; Ziminsky, Willy Steve; Johnson, Thomas Edward; Keener, Christopher Paul

    2017-01-17

    The present application provides a variable volume combustor for use with a gas turbine engine. The variable volume combustor may include a liner, a number of micro-mixer fuel nozzles positioned within the liner, and a linear actuator so as to maneuver the micro-mixer fuel nozzles axially along the liner.

  12. Cryotherapy, Sensation, and Isometric-Force Variability

    Science.gov (United States)

    Denegar, Craig R.; Buckley, William E.; Newell, Karl M.

    2003-01-01

    Objective: To determine the changes in sensation of pressure, 2-point discrimination, and submaximal isometric-force production variability due to cryotherapy. Design and Setting: Sensation was assessed using a 2 × 2 × 2 × 3 repeated-measures factorial design, with treatment (ice immersion or control), limb (right or left), digit (finger or thumb), and sensation test time (baseline, posttreatment, or postisometric-force trials) as independent variables. Dependent variables were changes in sensation of pressure and 2-point discrimination. Isometric-force variability was tested with a 2 × 2 × 3 repeated-measures factorial design. Treatment condition (ice immersion or control), limb (right or left), and percentage (10, 25, or 40) of maximal voluntary isometric contraction (MVIC) were the independent variables. The dependent variables were the precision or variability (the standard deviation of mean isometric force) and the accuracy or targeting error (the root mean square error) of the isometric force for each percentage of MVIC. Subjects: Fifteen volunteer college students (8 men, 7 women; age = 22 ± 3 years; mass = 72 ± 21.9 kg; height = 183.4 ± 11.6 cm). Measurements: We measured sensation in the distal palmar aspect of the index finger and thumb. Sensation of pressure and 2-point discrimination were measured before treatment (baseline), after treatment (15 minutes of ice immersion or control), and at the completion of isometric testing (final). Variability (standard deviation of mean isometric force) of the submaximal isometric finger forces was measured by having the subjects exert a pinching force with the thumb and index finger for 30 seconds. Subjects performed the pinching task at the 3 submaximal levels of MVIC (10%, 25%, and 40%), with the order of trials assigned randomly. The subjects were given a target representing the submaximal percentage of MVIC and visual feedback of the force produced as they pinched the testing device. The force exerted
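The two dependent variables, precision (SD of the produced force) and targeting error (RMSE relative to the target), can be sketched directly. The force samples below are invented, not study measurements:

```python
import statistics
from math import sqrt

def precision(forces):
    """Variability: sample standard deviation of the produced force."""
    return statistics.stdev(forces)

def targeting_error(forces, target):
    """Accuracy: root mean square error relative to the target force."""
    return sqrt(sum((f - target) ** 2 for f in forces) / len(forces))

forces = [9.0, 11.0]  # hypothetical pinch-force samples over a 30-s trial
target = 10.0         # e.g., the visual target at 25% of MVIC
print(round(precision(forces), 4))        # 1.4142
print(targeting_error(forces, target))    # 1.0
```

Separating the two captures the study's distinction: a subject can be precise (low SD) yet inaccurate (high RMSE) if the force is steady but off-target.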

  13. Variable geometry Darrieus wind machine

    Science.gov (United States)

    Pytlinski, J. T.; Serrano, D.

    1983-08-01

    A variable geometry Darrieus wind machine is proposed. The lower attachment of the blades to the rotor can move freely up and down the axle, allowing the blades to change shape during rotation. Experimental data for a 17 m diameter Darrieus rotor and a theoretical model for multiple streamtube performance prediction were used to develop a computer simulation program for studying parameters that affect the machine's performance. This new variable geometry concept is described and interrelated with multiple streamtube theory through aerodynamic parameters. The computer simulation study shows that governor behavior of a Darrieus turbine cannot be attained by a standard turbine operating within normally occurring rotational velocity limits. A second generation variable geometry Darrieus wind turbine which uses a telescopic blade is proposed as a potential improvement on the studied concept.

  14. Influence of attrition variables on iron ore flotation

    Directory of Open Access Journals (Sweden)

    Fabiana Fonseca Fortes

    Abstract: The presence of slimes is harmful to the flotation process: the performance and consumption of reagents are negatively affected. Traditionally, the desliming stage has been responsible for removing slimes. However, depending on the porosity of the mineral particles, desliming may not be sufficient to maximize the concentration results. An attrition process before the desliming operation can improve the removal of slime, especially when slimes cover the surface and/or are confined to the cavities/pores of the mineral particles. Attrition is present in the flowcharts of the beneficiation process of phosphate and industrial sand (silica sand). Research has been undertaken on its application to produce pre-concentrates of zircon and iron ore. However, there is still little knowledge of the influence of the attrition variables on the beneficiation process of iron ore. This study presents a factorial design and an analysis of the effects of these variables on the reverse flotation of iron ore. The standard experimental procedure for all tests included attrition of the pulp under dispersion conditions, followed by desliming and flotation. The parameter analysed (the response variable) was the metallurgical recovery in reverse flotation tests. The planning and analysis of the full factorial experiment indicated, with 95% reliability, that the rotation speed of the attrition cell impeller was the main variable in the attrition process of the iron ore. The percentage of solids in the pulp and the attrition time, as well as their interactions, were not indicated to be significant.

  15. Evaluation of standardized and applied variables in predicting treatment outcomes of polytrauma patients.

    Science.gov (United States)

    Aksamija, Goran; Mulabdic, Adi; Rasic, Ismar; Muhovic, Samir; Gavric, Igor

    2011-01-01

    Polytrauma is defined as an injury affecting at least two different organ systems or body regions, with at least one life-threatening injury. Given the multilevel model of care for polytrauma patients within KCUS, weaknesses in the management of this category of patients are inevitable. The aims were to determine the dynamics of existing procedures in the treatment of polytrauma patients on admission to KCUS and, based on statistical analysis of the applied variables, to determine and define the factors that influence the final outcome of treatment and their mutual relationships, which may make it possible to eliminate the flaws in the approach to the problem. The study was based on 263 polytrauma patients. Parametric and non-parametric statistical methods were used. Basic statistics were calculated; based on the calculated parameters, multicorrelation analysis, image analysis, discriminant analysis and multifactorial analysis were used to achieve the research objectives. From the universe of variables, we selected a sample of n = 25 variables, of which the first two are modular, while the others belong to the common measurement space (n = 23) and are defined in this paper as a system of variables for methods, procedures and assessments of polytrauma patients. After the multicorrelation analysis, and since the image analysis gave reliable measurement results, we proceeded to the analysis of eigenvalues, that is, to defining the factors that provide information about the existing model and its correlation with treatment outcome. The study singled out the essential factors that determine the current organizational model of care, which may affect the treatment and improve the outcome of polytrauma patients. This analysis demonstrated the maximum correlative relationships between these practices and contributed to the development of guidelines defined by the isolated factors.

  16. Standardization versus customization of glucose reporting.

    Science.gov (United States)

    Rodbard, David

    2013-05-01

    Bergenstal et al. (Diabetes Technol Ther 2013;15:198-211) described an important approach toward standardization of reporting and analysis of continuous glucose monitoring and self-monitoring of blood glucose (SMBG) data. The ambulatory glucose profile (AGP), a composite display of glucose by time of day that superimposes data from multiple days, is perhaps the most informative and useful of the many graphical approaches to display glucose data. However, the AGP has limitations; some variations are desirable and useful. Synchronization with respect to meals, traditionally used in glucose profiles for SMBG data, can improve characterization of postprandial glucose excursions. Several other types of graphical display are available, and recently developed ones can augment the information provided by the AGP. There is a need to standardize the parameters describing glycemic variability and cross-validate the available computer programs that calculate glycemic variability. Clinical decision support software can identify and prioritize clinical problems, make recommendations for modifications of therapy, and explain its justification for those recommendations. The goal of standardization is challenging in view of the diversity of clinical situations and of computing and display platforms and software. Standardization is desirable but must be done in a manner that permits flexibility and fosters innovation.
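Two of the variability parameters most glucose-analysis programs report, the SD and the coefficient of variation of glucose, are straightforward to standardize. A minimal sketch; the readings are invented SMBG values for illustration:

```python
import statistics

def glucose_sd(readings_mg_dl):
    """Standard deviation of glucose, a basic glycemic-variability metric."""
    return statistics.stdev(readings_mg_dl)

def glucose_cv(readings_mg_dl):
    """Coefficient of variation (%), which scales variability by the mean."""
    return 100.0 * statistics.stdev(readings_mg_dl) / statistics.mean(readings_mg_dl)

readings = [90.0, 110.0, 130.0, 150.0, 120.0]  # mg/dL, hypothetical SMBG values
print(round(glucose_sd(readings), 1))  # 22.4
print(round(glucose_cv(readings), 1))  # 18.6
```

Cross-validating even such simple metrics across programs requires agreeing on details like sample-vs-population SD, which is part of the standardization challenge the commentary describes.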

  17. Variable mechanical ventilation.

    Science.gov (United States)

    Fontela, Paula Caitano; Prestes, Renata Bernardy; Forgiarini, Luiz Alberto; Friedman, Gilberto

    2017-01-01

    To review the literature on the use of variable mechanical ventilation and the main outcomes of this technique. Search, selection, and analysis of all original articles on variable ventilation, without restriction on the period of publication and language, available in the electronic databases LILACS, MEDLINE®, and PubMed, by searching the terms "variable ventilation" OR "noisy ventilation" OR "biologically variable ventilation". A total of 36 studies were selected. Of these, 24 were original studies, including 21 experimental studies and three clinical studies. Several experimental studies reported the beneficial effects of distinct variable ventilation strategies on lung function using different models of lung injury and healthy lungs. Variable ventilation seems to be a viable strategy for improving gas exchange and respiratory mechanics and preventing lung injury associated with mechanical ventilation. However, further clinical studies are necessary to assess the potential of variable ventilation strategies for the clinical improvement of patients undergoing mechanical ventilation.

  18. A sizing method for stand-alone PV installations with variable demand

    Energy Technology Data Exchange (ETDEWEB)

    Posadillo, R. [Grupo de Investigacion en Energias y Recursos Renovables, Dpto. de Fisica Aplicada, E.P.S., Universidad de Cordoba, Avda. Menendez Pidal s/n, 14004 Cordoba (Spain); Lopez Luque, R. [Grupo de Investigacion de Fisica Para las Energias y Recursos Renovables, Dpto. de Fisica Aplicada, Edificio C2 Campus de Rabanales, 14071 Cordoba (Spain)

    2008-05-15

    The practical applicability of the considerations made in a previous paper to characterize energy balances in stand-alone photovoltaic systems (SAPV) is presented. Given that energy balances were characterized based on monthly estimations, the method is appropriate for sizing installations with variable monthly demands and variable monthly panel tilt (for seasonal estimations). The method presented is original in that it is the only method proposed for this type of demand. The method is based on the rational utilization of daily solar radiation distribution functions. When exact mathematical expressions are not available, approximate empirical expressions can be used. The more precise the statistical characterization of the solar radiation on the receiver module, the more precise the sizing method, given that the characterization depends solely on the distribution function of the daily global irradiation on the tilted surface, H_gβi. This method, like previous ones, uses the concept of loss of load probability (LLP) as a parameter to characterize system design and includes information on the standard deviation of this parameter (σ_LLP) as well as two new parameters: the annual number of system failures (f) and the standard deviation of the annual number of system failures (σ_f). This paper therefore provides an analytical method for evaluating and sizing stand-alone PV systems with variable monthly demand and panel inclination. The sizing method has also been applied in a practical manner. (author)
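As an illustration of the loss-of-load-probability concept used by such sizing methods (not the paper's analytical method), the following is a minimal Monte Carlo sketch. All numbers, the normal irradiation distribution, and the simple battery model are hypothetical assumptions of this example.

```python
import random

def simulate_year(pv_area_m2, batt_capacity_wh, daily_demand_wh,
                  mean_irr_wh_m2=4500.0, sd_irr_wh_m2=1500.0,
                  efficiency=0.12, rng=None):
    """Simulate one year of daily energy balances in a stand-alone PV system.

    Daily global irradiation on the tilted surface is drawn from a truncated
    normal distribution, a crude stand-in for the empirical distribution
    functions used by sizing methods. Returns the unmet demand (Wh), the
    total demand (Wh) and the number of failure days.
    """
    rng = rng or random.Random()
    soc = batt_capacity_wh          # battery state of charge, start full
    unmet, failures = 0.0, 0
    for _ in range(365):
        irradiation = max(0.0, rng.gauss(mean_irr_wh_m2, sd_irr_wh_m2))
        soc = min(batt_capacity_wh, soc + irradiation * pv_area_m2 * efficiency)
        if soc >= daily_demand_wh:
            soc -= daily_demand_wh
        else:
            unmet += daily_demand_wh - soc
            soc = 0.0
            failures += 1
    return unmet, 365 * daily_demand_wh, failures

def estimate_llp(pv_area_m2, batt_capacity_wh, daily_demand_wh,
                 years=200, seed=1):
    """Estimate LLP = unmet demand / total demand, plus the mean (f) and
    standard deviation (sigma_f) of the annual number of failures."""
    rng = random.Random(seed)
    unmet_sum = demand_sum = 0.0
    fails = []
    for _ in range(years):
        u, d, f = simulate_year(pv_area_m2, batt_capacity_wh,
                                daily_demand_wh, rng=rng)
        unmet_sum += u
        demand_sum += d
        fails.append(f)
    mean_f = sum(fails) / years
    sigma_f = (sum((f - mean_f) ** 2 for f in fails) / years) ** 0.5
    return unmet_sum / demand_sum, mean_f, sigma_f

llp, mean_f, sigma_f = estimate_llp(pv_area_m2=2.0,
                                    batt_capacity_wh=3000.0,
                                    daily_demand_wh=1000.0)
```

In a real sizing exercise the PV area and battery capacity would be varied until the estimated LLP falls below a design target.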

  19. Protecting chips against hold time violations due to variability

    CERN Document Server

    Neuberger, Gustavo; Reis, Ricardo

    2013-01-01

    With the development of Very-Deep Sub-Micron technologies, process variability is becoming increasingly important and is a crucial issue in the design of complex circuits. Process variability is the statistical variation of process parameters: these parameters no longer always take the same value, but become random variables with a given mean value and standard deviation. This effect can lead to several issues in digital circuit design. The logical consequence of this parameter variation is that circuit characteristics, such as delay and power, also become random variables.
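Treating delay as a random variable, as the summary describes, allows a quick Monte Carlo estimate of how often a timing constraint is missed. The sketch below is illustrative only; the parameter values and the simple "delay shorter than hold plus skew" violation model are assumptions of this example, not taken from the book.

```python
import random

def hold_violation_prob(mean_delay_ps=120.0, sigma_delay_ps=15.0,
                        hold_time_ps=80.0, clock_skew_ps=20.0,
                        trials=100_000, seed=7):
    """Monte Carlo estimate of the hold-time violation probability when the
    data-path delay is treated as a normal random variable (mean, sigma).

    A violation is counted whenever the sampled delay is shorter than the
    hold requirement plus the clock skew at the capturing register. All
    numbers are hypothetical.
    """
    rng = random.Random(seed)
    threshold = hold_time_ps + clock_skew_ps
    violations = sum(1 for _ in range(trials)
                     if rng.gauss(mean_delay_ps, sigma_delay_ps) < threshold)
    return violations / trials

p_violation = hold_violation_prob()
```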

  20. First among Others? Cohen's "d" vs. Alternative Standardized Mean Group Difference Measures

    Directory of Open Access Journals (Sweden)

    Sorel Cahan

    2011-06-01

    Full Text Available Standardized effect size measures typically employed in behavioral and social sciences research in the multi-group case (e.g., η2, f2) evaluate between-group variability in terms of either total or within-group variability, such as variance or standard deviation, that is, measures of dispersion about the mean. In contrast, the definition of Cohen's d, the effect size measure typically computed in the two-group case, is incongruent due to a conceptual difference between the numerator, which measures between-group variability by the intuitive and straightforward raw difference between the two group means, and the denominator, which measures within-group variability in terms of the difference between all observations and the group mean (i.e., the pooled within-groups standard deviation, SW). Two congruent alternatives to d, in which the root-mean-square or mean absolute difference between all observation pairs is substituted for SW as the variability measure in the denominator of d, are suggested, and their conceptual and statistical advantages and disadvantages are discussed.
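The contrast between the classic denominator and a pairwise-difference denominator can be sketched as follows. This is an illustration, not the authors' exact definitions: in particular, pooling the within-group pairwise differences across both groups is this sketch's own choice.

```python
import statistics
from itertools import combinations

def cohens_d(g1, g2):
    """Classic Cohen's d: raw mean difference over the pooled
    within-groups standard deviation S_W."""
    n1, n2 = len(g1), len(g2)
    pooled_var = ((n1 - 1) * statistics.variance(g1) +
                  (n2 - 1) * statistics.variance(g2)) / (n1 + n2 - 2)
    return (statistics.mean(g1) - statistics.mean(g2)) / pooled_var ** 0.5

def d_pairwise(g1, g2):
    """Congruent variant in the spirit of the abstract: same numerator, but
    the denominator is the root-mean-square difference over all within-group
    observation pairs, pooled across the two groups."""
    sq = [(x - y) ** 2 for g in (g1, g2) for x, y in combinations(g, 2)]
    rms = (sum(sq) / len(sq)) ** 0.5
    return (statistics.mean(g1) - statistics.mean(g2)) / rms

g1 = [1.0, 2.0, 3.0, 4.0, 5.0]
g2 = [3.0, 4.0, 5.0, 6.0, 7.0]
```

For equal-sized groups, this pooling makes the RMS pairwise difference exactly √2 times the pooled standard deviation, so the variant equals d/√2; the mean-absolute-difference variant behaves analogously but with a different scale factor.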

  1. The effects of auditory stimulation with music on heart rate variability in healthy women

    Directory of Open Access Journals (Sweden)

    Adriano L. Roque

    2013-07-01

    Full Text Available OBJECTIVES: There are no data in the literature with regard to the acute effects of different styles of music on the geometric indices of heart rate variability. In this study, we evaluated the acute effects of relaxant baroque and excitatory heavy metal music on the geometric indices of heart rate variability in women. METHODS: We conducted this study in 21 healthy women ranging in age from 18 to 35 years. We excluded persons with previous experience with musical instruments and persons who had an affinity for the song styles. We evaluated two groups: Group 1 (n = 21), who were exposed to relaxant classical baroque musical and excitatory heavy metal auditory stimulation; and Group 2 (n = 19), who were exposed to both styles of music and white noise auditory stimulation. Using earphones, the volunteers were exposed to baroque or heavy metal music for five minutes. After the first music exposure to baroque or heavy metal music, they remained at rest for five minutes; subsequently, they were re-exposed to the opposite music (70-80 dB). A different group of women were exposed to the same music styles plus white noise auditory stimulation (90 dB). The sequence of the songs was randomized for each individual. We analyzed the following indices: triangular index, triangular interpolation of RR intervals and Poincaré plot (standard deviation of instantaneous beat-by-beat variability, standard deviation of the long-term RR interval, standard deviation of instantaneous beat-by-beat variability and standard deviation of the long-term RR interval ratio), low frequency, high frequency, low frequency/high frequency ratio, standard deviation of all the normal RR intervals, root-mean square of differences between the adjacent normal RR intervals and the percentage of adjacent RR intervals with a difference of duration greater than 50 ms. Heart rate variability was recorded at rest for 10 minutes. RESULTS: The triangular index and the standard deviation of the long-term RR interval indices were reduced

  2. The effects of auditory stimulation with music on heart rate variability in healthy women.

    Science.gov (United States)

    Roque, Adriano L; Valenti, Vitor E; Guida, Heraldo L; Campos, Mônica F; Knap, André; Vanderlei, Luiz Carlos M; Ferreira, Lucas L; Ferreira, Celso; Abreu, Luiz Carlos de

    2013-07-01

    There are no data in the literature with regard to the acute effects of different styles of music on the geometric indices of heart rate variability. In this study, we evaluated the acute effects of relaxant baroque and excitatory heavy metal music on the geometric indices of heart rate variability in women. We conducted this study in 21 healthy women ranging in age from 18 to 35 years. We excluded persons with previous experience with musical instruments and persons who had an affinity for the song styles. We evaluated two groups: Group 1 (n = 21), who were exposed to relaxant classical baroque musical and excitatory heavy metal auditory stimulation; and Group 2 (n = 19), who were exposed to both styles of music and white noise auditory stimulation. Using earphones, the volunteers were exposed to baroque or heavy metal music for five minutes. After the first music exposure to baroque or heavy metal music, they remained at rest for five minutes; subsequently, they were re-exposed to the opposite music (70-80 dB). A different group of women were exposed to the same music styles plus white noise auditory stimulation (90 dB). The sequence of the songs was randomized for each individual. We analyzed the following indices: triangular index, triangular interpolation of RR intervals and Poincaré plot (standard deviation of instantaneous beat-by-beat variability, standard deviation of the long-term RR interval, standard deviation of instantaneous beat-by-beat variability and standard deviation of the long-term RR interval ratio), low frequency, high frequency, low frequency/high frequency ratio, standard deviation of all the normal RR intervals, root-mean square of differences between the adjacent normal RR intervals and the percentage of adjacent RR intervals with a difference of duration greater than 50 ms. Heart rate variability was recorded at rest for 10 minutes. The triangular index and the standard deviation of the long-term RR interval indices were reduced
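The time-domain indices listed in the abstract (SDNN, RMSSD, pNN50) have standard definitions and are straightforward to compute from a sequence of RR intervals. The RR series below is a short hypothetical example, not data from the study.

```python
import statistics

def hrv_time_domain(rr_ms):
    """Time-domain HRV indices from a list of normal RR intervals (ms):
    SDNN  - standard deviation of all normal RR intervals,
    RMSSD - root-mean-square of successive RR differences,
    pNN50 - percentage of successive differences larger than 50 ms."""
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    return {
        "SDNN": statistics.stdev(rr_ms),
        "RMSSD": (sum(d * d for d in diffs) / len(diffs)) ** 0.5,
        "pNN50": 100.0 * sum(abs(d) > 50 for d in diffs) / len(diffs),
    }

# Hypothetical 8-beat RR series (ms); a real 10-minute record has hundreds.
indices = hrv_time_domain([800, 810, 790, 850, 795, 805, 870, 780])
```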

  3. Genetic Programming and Standardization in Water Temperature Modelling

    Directory of Open Access Journals (Sweden)

    Maritza Arganis

    2009-01-01

    Full Text Available An application of Genetic Programming (an evolutionary computational tool), with and without data standardization, is presented with the aim of modeling the behavior of water temperature in a river in terms of meteorological variables that are easily measured, to explore their explanatory power and to emphasize the utility of standardizing variables in order to reduce the effect of those with large variance. Recorded data on water temperature in the Ebro River, Spain, are used as the analysis case, showing an improvement in the performance of the developed model when the data are standardized. This improvement is reflected in a reduction of the mean square error. Finally, the models obtained in this work were applied to estimate the water temperature in 2004, in order to provide evidence of their applicability for forecasting purposes.
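The standardization step the abstract credits with reducing the mean square error is ordinary z-scoring. A minimal sketch (the temperature values are hypothetical, not the Ebro River data):

```python
import statistics

def standardize(values):
    """Z-score standardization: subtract the mean and divide by the sample
    standard deviation, so inputs with large variance do not dominate
    the error measure during model fitting."""
    mu, sd = statistics.mean(values), statistics.stdev(values)
    return [(v - mu) / sd for v in values], mu, sd

def destandardize(z_values, mu, sd):
    """Map model outputs back to the original units."""
    return [z * sd + mu for z in z_values]

air_temp_c = [12.0, 15.5, 18.3, 21.0, 9.7]   # hypothetical meteorological input
z, mu, sd = standardize(air_temp_c)
```

Each input variable is standardized with its own (mu, sd), and predictions in standardized units are mapped back with `destandardize` before computing errors in physical units.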

  4. Assessment of small-scale integrated water vapour variability during HOPE

    Science.gov (United States)

    Steinke, S.; Eikenberg, S.; Löhnert, U.; Dick, G.; Klocke, D.; Di Girolamo, P.; Crewell, S.

    2015-03-01

    The spatio-temporal variability of integrated water vapour (IWV) on small scales of less than 10 km and hours is assessed with data from the 2 months of the High Definition Clouds and Precipitation for advancing Climate Prediction (HD(CP)2) Observational Prototype Experiment (HOPE). The statistical intercomparison of the unique set of observations during HOPE (microwave radiometer (MWR), Global Positioning System (GPS), sun photometer, radiosondes, Raman lidar, infrared and near-infrared Moderate Resolution Imaging Spectroradiometer (MODIS) on the satellites Aqua and Terra) measuring close together reveals a good agreement in terms of random differences (standard deviation ≤1 kg m-2) and correlation coefficient (≥ 0.98). The exception is MODIS, which appears to suffer from insufficient cloud filtering. For a case study during HOPE featuring a typical boundary layer development, the IWV variability in time and space on scales of less than 10 km and less than 1 h is investigated in detail. For this purpose, the measurements are complemented by simulations with the novel ICOsahedral Nonhydrostatic modelling framework (ICON), which for this study has a horizontal resolution of 156 m. These runs show that differences in space of 3-4 km or time of 10-15 min induce IWV variabilities on the order of 0.4 kg m-2. This model finding is confirmed by observed time series from two MWRs approximately 3 km apart with a comparable temporal resolution of a few seconds. Standard deviations of IWV derived from MWR measurements reveal a high variability (> 1 kg m-2) even at very short time scales of a few minutes. These cannot be captured by the temporally lower-resolved instruments and by operational numerical weather prediction models such as COSMO-DE (an application of the Consortium for Small-scale Modelling covering Germany) of Deutscher Wetterdienst, which is included in the comparison. However, for time scales larger than 1 h, a sampling resolution of 15 min is

  5. Complex variables

    CERN Document Server

    Flanigan, Francis J

    2010-01-01

    A caution to mathematics professors: Complex Variables does not follow conventional outlines of course material. One reviewer noting its originality wrote: "A standard text is often preferred [to a superior text like this] because the professor knows the order of topics and the problems, and doesn't really have to pay attention to the text. He can go to class without preparation." Not so here: Dr. Flanigan treats this most important field of contemporary mathematics in a most unusual way. While all the material for an advanced undergraduate or first-year graduate course is covered, discussion

  6. A SEARCH FOR L/T TRANSITION DWARFS WITH Pan-STARRS1 AND WISE: DISCOVERY OF SEVEN NEARBY OBJECTS INCLUDING TWO CANDIDATE SPECTROSCOPIC VARIABLES

    International Nuclear Information System (INIS)

    Best, William M. J.; Liu, Michael C.; Magnier, Eugene A.; Aller, Kimberly M.; Burgett, W. S.; Chambers, K. C.; Hodapp, K. W.; Kaiser, N.; Kudritzki, R.-P.; Morgan, J. S.; Tonry, J. L.; Wainscoat, R. J.; Deacon, Niall R.; Dupuy, Trent J.; Redstone, Joshua; Price, P. A.

    2013-01-01

    We present initial results from a wide-field (30,000 deg²) search for L/T transition brown dwarfs within 25 pc using the Pan-STARRS1 and Wide-field Infrared Survey Explorer (WISE) surveys. Previous large-area searches have been incomplete for L/T transition dwarfs, because these objects are faint in optical bands and have near-infrared (near-IR) colors that are difficult to distinguish from background stars. To overcome these obstacles, we have cross-matched the Pan-STARRS1 (optical) and WISE (mid-IR) catalogs to produce a unique multi-wavelength database for finding ultracool dwarfs. As part of our initial discoveries, we have identified seven brown dwarfs in the L/T transition within 9-15 pc of the Sun. The L9.5 dwarf PSO J140.2308+45.6487 and the T1.5 dwarf PSO J307.6784+07.8263 (both independently discovered by Mace et al.) show possible spectroscopic variability at the Y and J bands. Two more objects in our sample show evidence of photometric J-band variability, and two others are candidate unresolved binaries based on their spectra. We expect our full search to yield a well-defined, volume-limited sample of L/T transition dwarfs that will include many new targets for study of this complex regime. PSO J307.6784+07.8263 in particular may be an excellent candidate for in-depth study of variability, given its brightness (J = 14.2 mag) and proximity (11 pc)

  7. Control system architecture: The standard and non-standard models

    International Nuclear Information System (INIS)

    Thuot, M.E.; Dalesio, L.R.

    1993-01-01

    Control system architecture development has followed the advances in computer technology through mainframes to minicomputers to micros and workstations. This technology advance and increasingly challenging accelerator data acquisition and automation requirements have driven control system architecture development. In summarizing the progress of control system architecture at the last International Conference on Accelerator and Large Experimental Physics Control Systems (ICALEPCS), B. Kuiper asserted that the system architecture issue was resolved and presented a "standard model". The "standard model" consists of a local area network (Ethernet or FDDI) providing communication between front-end microcomputers, connected to the accelerator, and workstations, providing the operator interface and computational support. Although this model represents many present designs, there are exceptions, including reflected-memory and hierarchical architectures driven by requirements for widely dispersed, large channel count or tightly coupled systems. This paper describes the performance characteristics and features of the "standard model" to determine whether the requirements of "non-standard" architectures can be met. Several possible extensions to the "standard model" are suggested, including software as well as hardware architectural features

  8. Wavelength standards in the infrared

    CERN Document Server

    Rao, KN

    2012-01-01

    Wavelength Standards in the Infrared is a compilation of wavelength standards suitable for use with high-resolution infrared spectrographs, including both emission and absorption standards. The book presents atomic line emission standards of argon, krypton, neon, and xenon. These atomic line emission standards are from the deliberations of Commission 14 of the International Astronomical Union, which is the recognized authority for such standards. The text also explains the techniques employed in determining spectral positions in the infrared. One of the techniques used includes the grating con

  9. Physical attraction to reliable, low variability nervous systems: Reaction time variability predicts attractiveness.

    Science.gov (United States)

    Butler, Emily E; Saville, Christopher W N; Ward, Robert; Ramsey, Richard

    2017-01-01

    The human face cues a range of important fitness information, which guides mate selection towards desirable others. Given humans' high investment in the central nervous system (CNS), cues to CNS function should be especially important in social selection. We tested if facial attractiveness preferences are sensitive to the reliability of human nervous system function. Several decades of research suggest an operational measure for CNS reliability is reaction time variability, which is measured by standard deviation of reaction times across trials. Across two experiments, we show that low reaction time variability is associated with facial attractiveness. Moreover, variability in performance made a unique contribution to attractiveness judgements above and beyond both physical health and sex-typicality judgements, which have previously been associated with perceptions of attractiveness. In a third experiment, we empirically estimated the distribution of attractiveness preferences expected by chance and show that the size and direction of our results in Experiments 1 and 2 are statistically unlikely without reference to reaction time variability. We conclude that an operating characteristic of the human nervous system, reliability of information processing, is signalled to others through facial appearance. Copyright © 2016 Elsevier B.V. All rights reserved.

  10. Towards identifying dyslexia in Standard Indonesian: the development of a reading assessment battery.

    Science.gov (United States)

    Jap, Bernard A J; Borleffs, Elisabeth; Maassen, Ben A M

    2017-01-01

    With its transparent orthography, Standard Indonesian is spoken by over 160 million inhabitants and is the primary language of instruction in education and the government in Indonesia. An assessment battery of reading and reading-related skills was developed as a starting point for the diagnosis of dyslexia in beginner learners. Founded on the International Dyslexia Association's definition of dyslexia, the test battery comprises nine empirically motivated reading and reading-related tasks assessing word reading, pseudoword reading, arithmetic, rapid automatized naming, phoneme deletion, forward and backward digit span, verbal fluency, orthographic choice (spelling), and writing. The test was validated by computing the relationships between the outcomes on the reading-skills and reading-related measures by means of correlation and factor analyses. External variables, i.e., school grades and teacher ratings of the reading and learning abilities of individual students, were also utilized to provide evidence of its construct validity. Four variables were found to be significantly related with reading-skill measures: phonological awareness, rapid naming, spelling, and digit span. The current study on reading development in Standard Indonesian confirms findings from other languages with transparent orthographies and suggests a test battery including preliminary norm scores for screening and assessment of elementary school children learning to read Standard Indonesian.

  11. Variability of indication criteria in knee and hip replacement: an observational study.

    Science.gov (United States)

    Cobos, Raquel; Latorre, Amaia; Aizpuru, Felipe; Guenaga, Jose I; Sarasqueta, Cristina; Escobar, Antonio; García, Lidia; Herrera-Espiñeira, Carmen

    2010-10-26

    Total knee (TKR) and hip (THR) replacement (arthroplasty) are effective surgical procedures that relieve pain, improve patients' quality of life and increase functional capacity. Studies on variations in medical practice usually find the indications for performing these procedures to be highly variable, because surgeons appear to follow different criteria when recommending surgery to patients with different severity levels. We therefore proposed a study to evaluate inter-hospital variability in arthroplasty indication. The pre-surgical condition of the 1603 patients included was compared by their personal characteristics, clinical situation and self-perceived health status. Patients were asked to complete two health-related quality-of-life questionnaires: the generic SF-12 (Short Form) and the specific WOMAC (Western Ontario and McMaster Universities) scale. The type of patient undergoing primary arthroplasty was similar in the 15 different hospitals evaluated. The variability in baseline WOMAC score between hospitals in THR and TKR indication was described by range, mean and standard deviation (SD), mean and standard deviation weighted by the number of procedures at each hospital, high/low ratio or extremal quotient (EQ5-95), variation coefficient (CV5-95) and weighted variation coefficient (WCV5-95) for the 5-95 percentile range. The variability in subjective and objective signs was evaluated using the median, range and WCV5-95. The appropriateness of the procedures performed was calculated using a specific threshold proposed by Quintana et al. for assessing pain and functional capacity. The variability expressed as WCV5-95 was very low, between 0.05 and 0.11, for all three dimensions of the WOMAC scale for both types of procedure in all participating hospitals. The variability in the physical and mental SF-12 components was very low for both types of procedure (0.08 and 0.07 for hip and 0.03 and 0.07 for knee surgery patients). However, a moderate-high variability was detected in
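The variability indices named in the abstract (extremal quotient, coefficient of variation, weighted coefficient of variation) can be sketched as below. This is an illustration with made-up hospital scores; the exact 5-95 percentile trimming of the published EQ5-95/CV5-95/WCV5-95 definitions is omitted.

```python
import statistics

def variability_stats(mean_scores, weights=None):
    """Inter-hospital variability of a mean baseline score.

    mean_scores: mean WOMAC-style score per hospital;
    weights: e.g. number of procedures per hospital. Returns the extremal
    quotient (high/low ratio), the coefficient of variation, and a weighted
    coefficient of variation (equal to the unweighted one when no weights
    are given). Percentile trimming is omitted in this sketch.
    """
    eq = max(mean_scores) / min(mean_scores)
    cv = statistics.pstdev(mean_scores) / statistics.mean(mean_scores)
    if weights is None:
        return eq, cv, cv
    total = sum(weights)
    w_mean = sum(w * x for w, x in zip(weights, mean_scores)) / total
    w_var = sum(w * (x - w_mean) ** 2
                for w, x in zip(weights, mean_scores)) / total
    return eq, cv, (w_var ** 0.5) / w_mean

# Hypothetical per-hospital mean scores and procedure counts.
eq, cv, wcv = variability_stats([42.0, 45.5, 47.0, 44.2, 46.1],
                                weights=[120, 80, 60, 150, 90])
```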

  12. Loop quantum cosmology with self-dual variables

    Science.gov (United States)

    Wilson-Ewing, Edward

    2015-12-01

    Using the complex-valued self-dual connection variables, the loop quantum cosmology of a closed Friedmann space-time coupled to a massless scalar field is studied. It is shown how the reality conditions can be imposed in the quantum theory by choosing a particular inner product for the kinematical Hilbert space. While holonomies of the self-dual Ashtekar connection are not well defined in the kinematical Hilbert space, it is possible to introduce a family of generalized holonomylike operators of which some are well defined; these operators in turn are used in the definition of the Hamiltonian constraint operator where the scalar field can be used as a relational clock. The resulting quantum theory is closely related, although not identical, to standard loop quantum cosmology constructed from the Ashtekar-Barbero variables with a real Immirzi parameter. Effective Friedmann equations are derived which provide a good approximation to the full quantum dynamics for sharply peaked states whose volume remains much larger than the Planck volume, and they show that for these states quantum gravity effects resolve the big-bang and big-crunch singularities and replace them by a nonsingular bounce. Finally, the loop quantization in self-dual variables of a flat Friedmann space-time is recovered in the limit of zero spatial curvature and is identical to the standard loop quantization in terms of the real-valued Ashtekar-Barbero variables.

  13. Variable Work Hours--The MONY Experience

    Science.gov (United States)

    Fields, Cynthia J.

    1974-01-01

    An experiment with variable work hours in one department of a large company was so successful that it has become standard procedure in various corporate areas, both staff and line. The result? Increased production, fewer errors, improved employee morale, and a significant reduction in lateness and absenteeism. (Author)

  14. The Frontlines of Medicine Project: a proposal for the standardized communication of emergency department data for public health uses including syndromic surveillance for biological and chemical terrorism.

    Science.gov (United States)

    Barthell, Edward N; Cordell, William H; Moorhead, John C; Handler, Jonathan; Feied, Craig; Smith, Mark S; Cochrane, Dennis G; Felton, Christopher W; Collins, Michael A

    2002-04-01

    The Frontlines of Medicine Project is a collaborative effort of emergency medicine (including emergency medical services and clinical toxicology), public health, emergency government, law enforcement, and informatics. This collaboration proposes to develop a nonproprietary, "open systems" approach for reporting emergency department patient data. The common element is a standard approach to sending messages from individual EDs to regional oversight entities that could then analyze the data received. ED encounter data could be used for various public health initiatives, including syndromic surveillance for chemical and biological terrorism. The interlinking of these regional systems could also permit public health surveillance at a national level based on ED patient encounter data. Advancements in the Internet and Web-based technologies could allow the deployment of these standardized tools in a rapid time frame.

  15. The minimal non-minimal standard model

    International Nuclear Information System (INIS)

    Bij, J.J. van der

    2006-01-01

    In this Letter I discuss a class of extensions of the standard model that have a minimal number of possible parameters, but can in principle explain dark matter and inflation. It is pointed out that the so-called new minimal standard model contains a large number of parameters that can be put to zero, without affecting the renormalizability of the model. With the extra restrictions one might call it the minimal (new) non-minimal standard model (MNMSM). A few hidden discrete variables are present. It is argued that the inflaton should be higher-dimensional. Experimental consequences for the LHC and the ILC are discussed

  16. Evaluation of a draft standard on performance specifications for health physics instrumentation: results for environmental tests

    International Nuclear Information System (INIS)

    Kenoyer, J.L.; Swinth, K.L.; Mashburn, K.R.; Selby, J.M.

    1984-06-01

    Draft ANSI Standard N42.17 on performance specifications for health physics instrumentation is currently being evaluated by the Pacific Northwest Laboratory. Evaluation is performed by testing a cross-section of currently available instruments with testing procedures based on specifications of the standard and then determining the degree of conformance to the various elements of the proposed standard. Data will be presented on the performance of a cross-section of beta-gamma survey instruments under various environmental tests. Test results that will be presented include temperature effects, humidity effects, radio frequency (r.f.) susceptibility, ambient pressure effects, vibration effects, and shock effects. Tests performed to date show that most instruments will meet the temperature, humidity, and ambient pressure tests. A large variability is noted among instruments from the same or different vendors. Preliminary r.f. susceptibility tests have shown large artificial responses at some frequencies for specific instruments. The presentation will also include a discussion of procedures used in the testing and weaknesses identified in the proposed standard

  17. Lyral has been included in the patch test standard series in Germany.

    Science.gov (United States)

    Geier, Johannes; Brasch, Jochen; Schnuch, Axel; Lessmann, Holger; Pirker, Claudia; Frosch, Peter J

    2002-05-01

    Lyral 5% pet. was tested in 3245 consecutive patch test patients in 20 departments of dermatology in order (i) to check the diagnostic quality of this patch test preparation, (ii) to examine concomitant reactions to Lyral and fragrance mix (FM), and (iii) to assess the frequency of contact allergy to Lyral in an unselected patch test population of German dermatological clinics. 62 patients reacted to Lyral, i.e. 1.9%. One third of the positive reactions were + + and + + +. The reaction index was 0.27. Thus, the test preparation can be regarded a good diagnostic tool. Lyral and fragrance mix (FM) were tested in parallel in 3185 patients. Of these, 300 (9.4%) reacted to FM, and 59 (1.9%) to Lyral. In 40 patients, positive reactions to both occurred, which is 13.3% of those reacting to FM, and 67.8% of those reacting to Lyral. So the concordance of positive test reactions to Lyral and FM was only slight. Based on these results, the German Contact Dermatitis Research Group (DKG) decided to add Lyral 5% pet. to the standard series.

  18. A SEARCH FOR L/T TRANSITION DWARFS WITH Pan-STARRS1 AND WISE: DISCOVERY OF SEVEN NEARBY OBJECTS INCLUDING TWO CANDIDATE SPECTROSCOPIC VARIABLES

    Energy Technology Data Exchange (ETDEWEB)

    Best, William M. J.; Liu, Michael C.; Magnier, Eugene A.; Aller, Kimberly M.; Burgett, W. S.; Chambers, K. C.; Hodapp, K. W.; Kaiser, N.; Kudritzki, R.-P.; Morgan, J. S.; Tonry, J. L.; Wainscoat, R. J. [Institute for Astronomy, University of Hawaii at Manoa, Honolulu, HI 96822 (United States); Deacon, Niall R. [Max Planck Institute for Astronomy, Koenigstuhl 17, D-69117 Heidelberg (Germany); Dupuy, Trent J. [Harvard-Smithsonian Center for Astrophysics, 60 Garden Street, Cambridge, MA 02138 (United States); Redstone, Joshua [Facebook, 335 Madison Ave, New York, NY 10017-4677 (United States); Price, P. A., E-mail: wbest@ifa.hawaii.edu [Department of Astrophysical Sciences, Princeton University, Princeton, NJ 08544 (United States)

    2013-11-10

    We present initial results from a wide-field (30,000 deg²) search for L/T transition brown dwarfs within 25 pc using the Pan-STARRS1 and Wide-field Infrared Survey Explorer (WISE) surveys. Previous large-area searches have been incomplete for L/T transition dwarfs, because these objects are faint in optical bands and have near-infrared (near-IR) colors that are difficult to distinguish from background stars. To overcome these obstacles, we have cross-matched the Pan-STARRS1 (optical) and WISE (mid-IR) catalogs to produce a unique multi-wavelength database for finding ultracool dwarfs. As part of our initial discoveries, we have identified seven brown dwarfs in the L/T transition within 9-15 pc of the Sun. The L9.5 dwarf PSO J140.2308+45.6487 and the T1.5 dwarf PSO J307.6784+07.8263 (both independently discovered by Mace et al.) show possible spectroscopic variability at the Y and J bands. Two more objects in our sample show evidence of photometric J-band variability, and two others are candidate unresolved binaries based on their spectra. We expect our full search to yield a well-defined, volume-limited sample of L/T transition dwarfs that will include many new targets for study of this complex regime. PSO J307.6784+07.8263 in particular may be an excellent candidate for in-depth study of variability, given its brightness (J = 14.2 mag) and proximity (11 pc)

  19. ASSESSMENT OF THE CHANGES IN BLOOD PRESSURE CIRCADIAN PROFILE AND VARIABILITY IN PATIENTS WITH CHRONIC HEART FAILURE AND ARTERIAL HYPERTENSION DURING COMBINED THERAPY INCLUDING IVABRADINE

    Directory of Open Access Journals (Sweden)

    M. V. Surovtseva

    2012-01-01

    Full Text Available Aim. To assess the changes in blood pressure (BP) circadian profile and variability in patients with chronic heart failure (CHF) of ischemic etiology and arterial hypertension (HT) due to the complex therapy including ivabradine. Material and methods. Patients (n=90) with CHF class II–III NYHA associated with stable angina class II–III and HT were examined. The patients were randomized into 3 groups depending on received drugs: perindopril and ivabradine - group 1; perindopril, bisoprolol and ivabradine - group 2; perindopril and bisoprolol - group 3. The duration of therapy was 6 months. Ambulatory BP monitoring (ABPM) was assessed at baseline and after treatment. Results. More significant reduction in average 24-hour systolic BP was found in groups 1 and 2 compared to group 3 (Δ%: -19.4±0.4; -21.1±0.4 and -11.8±0.6, respectively), as well as in diastolic BP (Δ%: -10.6±0.6; -12.9±0.4 and -4.3±0.3, respectively) and other ABPM indicators. Improvement of the BP circadian rhythm was found due to an increase in the number of «Dipper» patients (p=0.016). More significant reduction in average daily and night systolic and diastolic BP (p=0.001), as well as in daily and night BP variability (p=0.001), was also found in patients of group 2 compared to those of group 1. Conclusion. A moderate antihypertensive effect (in respect of both diastolic and systolic BP) was shown when ivabradine was included in the complex therapy of patients with ischemic CHF and HT. The effect was more pronounced when ivabradine was combined with perindopril and bisoprolol. This was accompanied by a reduction in high daily BP variability and improvement of the BP circadian rhythm.

  20. ASSESSMENT OF THE CHANGES IN BLOOD PRESSURE CIRCADIAN PROFILE AND VARIABILITY IN PATIENTS WITH CHRONIC HEART FAILURE AND ARTERIAL HYPERTENSION DURING COMBINED THERAPY INCLUDING IVABRADINE

    Directory of Open Access Journals (Sweden)

    M. V. Surovtseva

    2015-12-01

    Full Text Available Aim. To assess the changes in blood pressure (BP) circadian profile and variability in patients with chronic heart failure (CHF) of ischemic etiology and arterial hypertension (HT) due to the complex therapy including ivabradine. Material and methods. Patients (n=90) with CHF class II–III NYHA associated with stable angina class II–III and HT were examined. The patients were randomized into 3 groups depending on received drugs: perindopril and ivabradine - group 1; perindopril, bisoprolol and ivabradine - group 2; perindopril and bisoprolol - group 3. The duration of therapy was 6 months. Ambulatory BP monitoring (ABPM) was assessed at baseline and after treatment. Results. More significant reduction in average 24-hour systolic BP was found in groups 1 and 2 compared to group 3 (Δ%: -19.4±0.4; -21.1±0.4 and -11.8±0.6, respectively), as well as in diastolic BP (Δ%: -10.6±0.6; -12.9±0.4 and -4.3±0.3, respectively) and other ABPM indicators. Improvement of the BP circadian rhythm was found due to an increase in the number of «Dipper» patients (p=0.016). More significant reduction in average daily and night systolic and diastolic BP (p=0.001), as well as in daily and night BP variability (p=0.001), was also found in patients of group 2 compared to those of group 1. Conclusion. A moderate antihypertensive effect (in respect of both diastolic and systolic BP) was shown when ivabradine was included in the complex therapy of patients with ischemic CHF and HT. The effect was more pronounced when ivabradine was combined with perindopril and bisoprolol. This was accompanied by a reduction in high daily BP variability and improvement of the BP circadian rhythm.

  1. Events per variable for risk differences and relative risks using pseudo-observations

    DEFF Research Database (Denmark)

    Hansen, Stefan Nygaard; Andersen, Per Kragh; Parner, Erik Thorlund

    2014-01-01

    A method based on pseudo-observations has been proposed for direct regression modeling of functionals of interest with right-censored data, including the survival function, the restricted mean and the cumulative incidence function in competing risks. The models, once the pseudo-observations have...... been computed, can be fitted using standard generalized estimating equation software. Regression models can however yield problematic results if the number of covariates is large in relation to the number of events observed. Guidelines of events per variable are often used in practice. These rules...
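
    The pseudo-observation construction referred to above can be sketched as a jackknife around the Kaplan-Meier estimator. A minimal illustration for the survival function (function names are mine, not the paper's):

```python
def km_survival(data, t):
    """Kaplan-Meier estimate of S(t) from (time, event) pairs; event=1 death, 0 censored."""
    s = 1.0
    for et in sorted({ti for ti, e in data if e == 1 and ti <= t}):
        at_risk = sum(1 for ti, _ in data if ti >= et)
        deaths = sum(1 for ti, e in data if ti == et and e == 1)
        s *= 1.0 - deaths / at_risk
    return s

def pseudo_observations(data, t):
    """Jackknife pseudo-observations for S(t): theta_i = n*S(t) - (n-1)*S_minus_i(t)."""
    n = len(data)
    full = km_survival(data, t)
    return [n * full - (n - 1) * km_survival(data[:i] + data[i + 1:], t)
            for i in range(n)]
```

    With fully uncensored data the pseudo-observation for subject i reduces exactly to the indicator 1{T_i > t}; the pseudo-observations can then be passed to standard GEE software as if they were ordinary responses.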

  2. The Selection, Use, and Reporting of Control Variables in International Business Research

    DEFF Research Database (Denmark)

    Nielsen, Bo Bernhard; Raswant, Arpit

    2018-01-01

    This study explores the selection, use, and reporting of control variables in studies published in the leading international business (IB) research journals. We review a sample of 246 empirical studies published in the top five IB journals over the period 2012–2015 with particular emphasis...... on selection, use, and reporting of controls. Approximately 83% of studies included only half of what we consider Minimum Standard of Practice with regards to controls, whereas only 38% of the studies met the 75% threshold. We provide recommendations on how to effectively identify, use and report controls...

  3. Blood Pressure Variability and Cognitive Function Among Older African Americans: Introducing a New Blood Pressure Variability Measure.

    Science.gov (United States)

    Tsang, Siny; Sperling, Scott A; Park, Moon Ho; Helenius, Ira M; Williams, Ishan C; Manning, Carol

    2017-09-01

    Although blood pressure (BP) variability has been reported to be associated with cognitive impairment, whether this relationship affects African Americans has been unclear. We sought correlations between systolic and diastolic BP variability and cognitive function in community-dwelling older African Americans, and introduced a new BP variability measure that can be applied to BP data collected in clinical practice. We assessed cognitive function in 94 cognitively normal older African Americans using the Mini-Mental State Examination (MMSE) and the Computer Assessment of Mild Cognitive Impairment (CAMCI). We used BP measurements taken at the patients' three most recent primary care clinic visits to generate three traditional BP variability indices, range, standard deviation, and coefficient of variation, plus a new index, random slope, which accounts for unequal BP measurement intervals within and across patients. MMSE scores did not correlate with any of the BP variability indices. Patients with greater diastolic BP variability were less accurate on the CAMCI verbal memory and incidental memory tasks. Results were similar across the four BP variability indices. In a sample of cognitively intact older African American adults, BP variability did not correlate with global cognitive function, as measured by the MMSE. However, higher diastolic BP variability correlated with poorer verbal and incidental memory. By accounting for differences in BP measurement intervals, our new BP variability index may help alert primary care physicians to patients at particular risk for cognitive decline.
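
    The three traditional visit-to-visit indices named above are direct to compute; a minimal sketch (the new random-slope index additionally requires a mixed-effects model over timestamped readings and is omitted here):

```python
from statistics import mean, stdev

def bp_variability(readings):
    """Traditional BP variability indices from a list of visit readings (mmHg)."""
    m = mean(readings)
    sd = stdev(readings)                      # sample standard deviation
    return {
        "range": max(readings) - min(readings),
        "sd": sd,
        "cv": sd / m,                         # coefficient of variation
    }
```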

  4. Damaris: Addressing performance variability in data management for post-petascale simulations

    International Nuclear Information System (INIS)

    Dorier, Matthieu; Antoniu, Gabriel; Cappello, Franck; Snir, Marc; Sisneros, Robert

    2016-01-01

    With exascale computing on the horizon, reducing performance variability in data management tasks (storage, visualization, analysis, etc.) is becoming a key challenge in sustaining high performance. This variability significantly impacts the overall application performance at scale and its predictability over time. In this article, we present Damaris, a system that leverages dedicated cores in multicore nodes to offload data management tasks, including I/O, data compression, scheduling of data movements, in situ analysis, and visualization. We evaluate Damaris with the CM1 atmospheric simulation and the Nek5000 computational fluid dynamics simulation on four platforms, including NICS’s Kraken and NCSA’s Blue Waters. Our results show that (1) Damaris fully hides the I/O variability as well as all I/O-related costs, thus making simulation performance predictable; (2) it increases the sustained write throughput by a factor of up to 15 compared with standard I/O approaches; (3) it allows almost perfect scalability of the simulation up to over 9,000 cores, as opposed to state-of-the-art approaches that fail to scale; and (4) it enables a seamless connection to the VisIt visualization software to perform in situ analysis and visualization in a way that impacts neither the performance of the simulation nor its variability. In addition, we extended our implementation of Damaris to also support the use of dedicated nodes and conducted a thorough comparison of the two approaches—dedicated cores and dedicated nodes—for I/O tasks with the aforementioned applications.
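
    The dedicated-resource idea can be illustrated, in loose analogy, by handing each step's data to a separate worker through a queue so the compute loop never blocks on output. A toy sketch of the decoupling pattern only, not the Damaris implementation (which uses dedicated cores/nodes and shared memory, not Python threads):

```python
import queue
import threading

def run_simulation(steps=5):
    """Toy offload pattern: the compute loop enqueues each step's data and moves on,
    while a dedicated worker drains the queue (standing in for I/O/visualization)."""
    q = queue.Queue()
    written = []

    def io_worker():
        while True:
            item = q.get()
            if item is None:               # sentinel: shut down
                break
            written.append(len(item))      # stand-in for writing to disk

    t = threading.Thread(target=io_worker)
    t.start()
    for step in range(steps):
        data = [float(step)] * 1000        # field computed at this step
        q.put(data)                        # hand off; returns immediately
    q.put(None)
    t.join()
    return written
```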

  5. Derivation and Validation of a Risk Standardization Model for Benchmarking Hospital Performance for Health-Related Quality of Life Outcomes after Acute Myocardial Infarction

    Science.gov (United States)

    Arnold, Suzanne V.; Masoudi, Frederick A.; Rumsfeld, John S.; Li, Yan; Jones, Philip G.; Spertus, John A.

    2014-01-01

    Background Before outcomes-based measures of quality can be used to compare and improve care, they must be risk-standardized to account for variations in patient characteristics. Despite the importance of health-related quality of life (HRQL) outcomes among patients with acute myocardial infarction (AMI), no risk-standardized models have been developed. Methods and Results We assessed disease-specific HRQL using the Seattle Angina Questionnaire at baseline and 1 year later in 2693 unselected AMI patients from 24 hospitals enrolled in the TRIUMPH registry. Using 57 candidate sociodemographic, economic, and clinical variables present on admission, we developed a parsimonious, hierarchical linear regression model to predict HRQL. Eleven variables were independently associated with poor HRQL after AMI, including younger age, prior CABG, depressive symptoms, and financial difficulties (R2=20%). The model demonstrated excellent internal calibration and reasonable calibration in an independent sample of 1890 AMI patients in a separate registry, although the model slightly over-predicted HRQL scores in the higher deciles. Among the 24 TRIUMPH hospitals, 1-year unadjusted HRQL scores ranged from 67 to 89. After risk-standardization, the variability of HRQL scores narrowed substantially (range=79–83), and the hospital performance group (bottom 20%/middle 60%/top 20%) changed for 14 of the 24 hospitals (58% reclassification with risk-standardization). Conclusions In this predictive model for HRQL after AMI, we identified risk factors, including economic and psychological characteristics, associated with HRQL outcomes. Adjusting for these factors substantially altered the rankings of hospitals as compared with unadjusted comparisons. Using this model to compare risk-standardized HRQL outcomes across hospitals may identify processes of care that maximize this important patient-centered outcome. PMID:24163068
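
    The risk-standardization step can be illustrated in its simplest linear form: a hospital's standardized score is the overall mean plus its observed-minus-expected gap, where "expected" comes from the patient-level risk model. A simplified sketch with illustrative data (the study itself used a hierarchical model):

```python
from statistics import mean

def risk_standardized_score(observed, expected, grand_mean):
    """Risk-standardized hospital outcome for a linear score such as HRQL:
    overall mean plus the hospital's observed-minus-expected difference,
    where `expected` holds model predictions for that hospital's patients."""
    return grand_mean + (mean(observed) - mean(expected))
```

    A hospital whose patients score 10 points above what their admission characteristics predict ends up 10 points above the grand mean, regardless of how sick its case mix was.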

  6. Resting heart rate variability is associated with ex-Gaussian metrics of intra-individual reaction time variability.

    Science.gov (United States)

    Spangler, Derek P; Williams, DeWayne P; Speller, Lassiter F; Brooks, Justin R; Thayer, Julian F

    2018-03-01

    The relationships between vagally mediated heart rate variability (vmHRV) and the cognitive mechanisms underlying performance can be elucidated with ex-Gaussian modeling, an approach that quantifies two different forms of intra-individual variability (IIV) in reaction time (RT). To this end, the current study examined relations of resting vmHRV to whole-distribution and ex-Gaussian IIV. Subjects (N = 83) completed a 5-minute baseline while vmHRV (root mean square of successive differences; RMSSD) was measured. Ex-Gaussian (sigma, tau) and whole-distribution (standard deviation) estimates of IIV were derived from reaction times on a Stroop task. Resting vmHRV was found to be inversely related to tau (exponential IIV) but not to sigma (Gaussian IIV) or the whole-distribution standard deviation of RTs. Findings suggest that individuals with high vmHRV can better prevent attentional lapses but not difficulties with motor control. These findings inform the differential relationships of cardiac vagal control to the cognitive processes underlying human performance.
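
    Sigma and tau can be estimated from an RT sample via the classical method of moments for the ex-Gaussian (mean = mu + tau, variance = sigma² + tau², skewness = 2·tau³/variance^{3/2}). A sketch with synthetic data, not the fitting routine used in the study:

```python
import random
from statistics import mean

def exgaussian_moments(rts):
    """Method-of-moments estimates (mu, sigma, tau) for an ex-Gaussian RT sample."""
    n = len(rts)
    m = mean(rts)
    m2 = sum((x - m) ** 2 for x in rts) / n          # central moments
    m3 = sum((x - m) ** 3 for x in rts) / n
    skew = max(m3 / m2 ** 1.5, 1e-12)                # guard against non-positive skew
    tau = m2 ** 0.5 * (skew / 2.0) ** (1.0 / 3.0)
    sigma = max(m2 - tau * tau, 0.0) ** 0.5
    return m - tau, sigma, tau

# demo: recover known parameters (mu=400 ms, sigma=50 ms, tau=100 ms)
random.seed(1)
rts = [random.gauss(400, 50) + random.expovariate(1 / 100) for _ in range(20000)]
mu, sigma, tau = exgaussian_moments(rts)
```

    In practice maximum-likelihood fitting is preferred for small samples, but the moment estimator makes the sigma/tau decomposition transparent.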

  7. Metronome cueing of walking reduces gait variability after a cerebellar stroke

    Directory of Open Access Journals (Sweden)

    Rachel Lindsey Wright

    2016-06-01

    Full Text Available Cerebellar stroke typically results in increased variability during walking. Previous research has suggested that auditory cueing reduces excessive variability in conditions such as Parkinson’s disease and post-stroke hemiparesis. The aim of this case report was to investigate whether the use of a metronome cue during walking could reduce excessive variability in gait parameters after a cerebellar stroke. An elderly female with a history of cerebellar stroke and recurrent falling undertook 3 standard gait trials and 3 gait trials with an auditory metronome. A Vicon system was used to collect 3-D marker trajectory data. The coefficient of variation was calculated for temporal and spatial gait parameters. Standard deviations of the joint angles were calculated and used to give a measure of joint kinematic variability. Step time, stance time and double support time variability were reduced with metronome cueing. Variability in the sagittal hip, knee and ankle angles was reduced to normal values when walking to the metronome. In summary, metronome cueing resulted in a decrease in variability for step, stance and double support times and joint kinematics. Further research is needed to establish whether a metronome may be useful in gait rehabilitation after cerebellar stroke, and whether this leads to a decreased risk of falling.

  8. Variability of inter-team distances associated with match events in elite-standard soccer

    NARCIS (Netherlands)

    Frencken, Wouter; De Poel, Harjo; Visscher, Chris; Lemmink, Koen

    2012-01-01

    In soccer, critical match events like goal attempts can be preceded by periods of instability in the balance between the two teams' behaviours. Therefore, we determined periods of high variability in the distance between the teams' centroid positions longitudinally and laterally in an

  9. Size-dependent standard deviation for growth rates: empirical results and theoretical modeling.

    Science.gov (United States)

    Podobnik, Boris; Horvatic, Davor; Pammolli, Fabio; Wang, Fengzhong; Stanley, H Eugene; Grosse, I

    2008-05-01

    We study annual logarithmic growth rates R of various economic variables such as exports, imports, and foreign debt. For each of these variables we find that the distributions of R can be approximated by double exponential (Laplace) distributions in the central parts and power-law distributions in the tails. For each of these variables we further find a power-law dependence of the standard deviation sigma(R) on the average size of the economic variable with a scaling exponent surprisingly close to that found for the gross domestic product (GDP) [Phys. Rev. Lett. 81, 3275 (1998)]. By analyzing annual logarithmic growth rates R of wages of 161 different occupations, we find a power-law dependence of the standard deviation sigma(R) on the average value of the wages with a scaling exponent beta approximately 0.14, close to those found for the growth of exports, imports, debt, and the growth of the GDP. In contrast to these findings, we observe for payroll data collected from 50 states of the USA that the standard deviation sigma(R) of the annual logarithmic growth rate R increases monotonically with the average value of payroll. However, also in this case we observe a power-law dependence of sigma(R) on the average payroll with a scaling exponent beta approximately -0.08. Based on these observations we propose a stochastic process for multiple cross-correlated variables where for each variable (i) the distribution of logarithmic growth rates decays exponentially in the central part, (ii) the distribution of the logarithmic growth rate decays algebraically in the far tails, and (iii) the standard deviation of the logarithmic growth rate depends algebraically on the average size of the stochastic variable.

  10. Size-dependent standard deviation for growth rates: Empirical results and theoretical modeling

    Science.gov (United States)

    Podobnik, Boris; Horvatic, Davor; Pammolli, Fabio; Wang, Fengzhong; Stanley, H. Eugene; Grosse, I.

    2008-05-01

    We study annual logarithmic growth rates R of various economic variables such as exports, imports, and foreign debt. For each of these variables we find that the distributions of R can be approximated by double exponential (Laplace) distributions in the central parts and power-law distributions in the tails. For each of these variables we further find a power-law dependence of the standard deviation σ(R) on the average size of the economic variable with a scaling exponent surprisingly close to that found for the gross domestic product (GDP) [Phys. Rev. Lett. 81, 3275 (1998)]. By analyzing annual logarithmic growth rates R of wages of 161 different occupations, we find a power-law dependence of the standard deviation σ(R) on the average value of the wages with a scaling exponent β≈0.14, close to those found for the growth of exports, imports, debt, and the growth of the GDP. In contrast to these findings, we observe for payroll data collected from 50 states of the USA that the standard deviation σ(R) of the annual logarithmic growth rate R increases monotonically with the average value of payroll. However, also in this case we observe a power-law dependence of σ(R) on the average payroll with a scaling exponent β≈-0.08. Based on these observations we propose a stochastic process for multiple cross-correlated variables where for each variable (i) the distribution of logarithmic growth rates decays exponentially in the central part, (ii) the distribution of the logarithmic growth rate decays algebraically in the far tails, and (iii) the standard deviation of the logarithmic growth rate depends algebraically on the average size of the stochastic variable.
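
    The scaling exponent β in σ(R) ∝ S^β is conventionally estimated as the slope of a log-log regression of σ(R) on the average size S; a minimal sketch:

```python
import math

def powerlaw_exponent(sizes, sigmas):
    """OLS slope of log(sigma) on log(size), i.e. beta in sigma(R) ~ size**beta."""
    xs = [math.log(s) for s in sizes]
    ys = [math.log(v) for v in sigmas]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))
```

    On data that follow an exact power law, e.g. sigma = 2·size^(-0.14), the fitted slope recovers the exponent to machine precision.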

  11. Variable Saline Concentrations for Initial Resuscitation Following Polytrauma

    Science.gov (United States)

    2017-02-22

    AFRL-SA-WP-TR-2017-0008. Variable Saline Concentrations for Initial Resuscitation Following Polytrauma. Dr. Michael Goodman (contract FA8650-10-2-6140; grant FA8650-14-2-6B29). We investigated the utility of standard variable saline concentrations (0.9%, 3%, 23.4%) in a murine polytrauma model of traumatic brain injury

  12. Health impact assessment in the United States: Has practice followed standards?

    International Nuclear Information System (INIS)

    Schuchter, Joseph; Bhatia, Rajiv; Corburn, Jason; Seto, Edmund

    2014-01-01

    As an emerging practice, Health Impact Assessment is heterogeneous in purpose, form, and scope and applied in a wide range of decision contexts. This heterogeneity challenges efforts to evaluate the quality and impact of practice. We examined whether information in completed HIA reports reflected objectively-evaluable criteria proposed by the North American HIA Practice Standards Working Group in 2009. From publicly available reports of HIAs conducted in the U.S. and published from 2009 to 2011, we excluded those that were components of, or comment letters on, Environmental Impact Assessments (5) or were demonstration projects or student exercises (8). For the remaining 23 reports, we used practice standards as a template to abstract data on the steps of HIA, including details on the rationale, authorship, funding, decision and decision-makers, participation, pathways and methods, quality of evidence, and recommendations. Most reports described screening, scoping, and assessment processes, but there was substantial variation in the extent of these processes and the degree of stakeholder participation. Community stakeholders participated in screening or scoping in just two-thirds of the HIAs (16). On average, these HIAs analyzed 5.5 determinants related to 10.6 health impacts. Most HIA reports did not include evaluation or monitoring plans. This study identifies issues for field development and improvement. The standards might be adapted to better account for variability in resources, produce fit-for-purpose HIAs, and facilitate innovation guided by the principles. - Highlights: • Our study examined reported HIAs in the U.S. against published practice standards. • Most HIAs used some screening, scoping and assessment elements from the standards. • The extent of these processes and stakeholder participation varied widely. • The average HIA considered multiple health determinants and impacts. • Evaluation or monitoring plans were generally not included in

  13. Health impact assessment in the United States: Has practice followed standards?

    Energy Technology Data Exchange (ETDEWEB)

    Schuchter, Joseph, E-mail: jws@berkeley.edu [University of California, Berkeley, School of Public Health, Department of Environmental Health Sciences, 50 University Hall, Berkeley, CA 94720-7360 (United States); Bhatia, Rajiv [University of California, Berkeley, Institute of Urban and Regional Development (United States); Corburn, Jason [University of California, Berkeley, College of Environmental Design, Department of City and Regional Planning (United States); Seto, Edmund [University of Washington, School of Public Health, Department of Environmental and Occupational Health (United States)

    2014-07-01

    As an emerging practice, Health Impact Assessment is heterogeneous in purpose, form, and scope and applied in a wide range of decision contexts. This heterogeneity challenges efforts to evaluate the quality and impact of practice. We examined whether information in completed HIA reports reflected objectively-evaluable criteria proposed by the North American HIA Practice Standards Working Group in 2009. From publicly available reports of HIAs conducted in the U.S. and published from 2009 to 2011, we excluded those that were components of, or comment letters on, Environmental Impact Assessments (5) or were demonstration projects or student exercises (8). For the remaining 23 reports, we used practice standards as a template to abstract data on the steps of HIA, including details on the rationale, authorship, funding, decision and decision-makers, participation, pathways and methods, quality of evidence, and recommendations. Most reports described screening, scoping, and assessment processes, but there was substantial variation in the extent of these processes and the degree of stakeholder participation. Community stakeholders participated in screening or scoping in just two-thirds of the HIAs (16). On average, these HIAs analyzed 5.5 determinants related to 10.6 health impacts. Most HIA reports did not include evaluation or monitoring plans. This study identifies issues for field development and improvement. The standards might be adapted to better account for variability in resources, produce fit-for-purpose HIAs, and facilitate innovation guided by the principles. - Highlights: • Our study examined reported HIAs in the U.S. against published practice standards. • Most HIAs used some screening, scoping and assessment elements from the standards. • The extent of these processes and stakeholder participation varied widely. • The average HIA considered multiple health determinants and impacts. • Evaluation or monitoring plans were generally not included in

  14. Textural features and SUV-based variables assessed by dual time point 18F-FDG PET/CT in locally advanced breast cancer.

    Science.gov (United States)

    Garcia-Vicente, Ana María; Molina, David; Pérez-Beteta, Julián; Amo-Salas, Mariano; Martínez-González, Alicia; Bueno, Gloria; Tello-Galán, María Jesús; Soriano-Castrejón, Ángel

    2017-12-01

    To study the influence of dual time point 18F-FDG PET/CT on textural features and SUV-based variables and the relations among them. Fifty-six patients with locally advanced breast cancer (LABC) were prospectively included. All of them underwent a standard 18F-FDG PET/CT (PET-1) and a delayed acquisition (PET-2). After segmentation, SUV variables (SUVmax, SUVmean, and SUVpeak), metabolic tumor volume (MTV), and total lesion glycolysis (TLG) were obtained. Eighteen three-dimensional (3D) textural measures were computed, including run-length matrices (RLM) features, co-occurrence matrices (CM) features, and energies. Differences between all PET-derived variables obtained in PET-1 and PET-2 were studied. Significant differences were found between the SUV-based parameters and MTV obtained in the dual time point PET/CT, with higher values of SUV-based variables and lower MTV in PET-2 with respect to PET-1. Regarding the textural parameters obtained in the dual time point acquisition, significant differences were found for the short run emphasis, low gray-level run emphasis, short run high gray-level emphasis, run percentage, long run emphasis, gray-level non-uniformity, homogeneity, and dissimilarity. Textural variables showed relations with MTV and TLG. Significant differences in textural features were found in dual time point 18F-FDG PET/CT. Thus, a dynamic behavior of metabolic characteristics should be expected, with higher heterogeneity in the delayed PET acquisition compared with the standard PET. A greater heterogeneity was found in bigger tumors.

  15. Does a Threshold Inflation Rate Exist? Quantile Inferences for Inflation and Its Variability

    OpenAIRE

    WenShwo Fang; Stephen M. Miller; Chih-Chuan Yeh

    2009-01-01

    Using quantile regressions and cross-sectional data from 152 countries, we examine the relationship between inflation and its variability. We consider two measures of inflation – the mean and median – and three different measures of inflation variability – the standard deviation, relative variation, and median deviation. All results from the mean and standard deviation, the mean and relative variation, or the median and the median deviation support both the hypothesis that higher inflation cr...

  16. Nuclear Data Verification and Standardization

    Energy Technology Data Exchange (ETDEWEB)

    Karam, Lisa R.; Arif, Muhammad; Thompson, Alan K.

    2011-10-01

    The objective of this interagency program is to provide accurate neutron interaction verification and standardization data for the U.S. Department of Energy Division of Nuclear Physics programs which include astrophysics, radioactive beam studies, and heavy-ion reactions. The measurements made in this program are also useful to other programs that indirectly use the unique properties of the neutron for diagnostic and analytical purposes. These include homeland security, personnel health and safety, nuclear waste disposal, treaty verification, national defense, and nuclear based energy production. The work includes the verification of reference standard cross sections and related neutron data employing the unique facilities and capabilities at NIST and other laboratories as required; leadership and participation in international intercomparisons and collaborations; and the preservation of standard reference deposits. An essential element of the program is the critical evaluation of neutron interaction data standards, including international coordination. Data testing of critical data for important applications is included. The program is jointly supported by the Department of Energy and the National Institute of Standards and Technology.

  17. Size and Topology Optimization for Trusses with Discrete Design Variables by Improved Firefly Algorithm

    Directory of Open Access Journals (Sweden)

    Yue Wu

    2017-01-01

    Full Text Available Firefly Algorithm (FA, for short) is inspired by the social behavior of fireflies and their phenomenon of bioluminescent communication. Based on the fundamentals of FA, two improved strategies are proposed to conduct size and topology optimization for trusses with discrete design variables. Firstly, the development of structural topology optimization methods and the basic principle of the standard FA are introduced in detail. Then, in order to apply the algorithm to optimization problems with discrete variables, the initial positions of fireflies and the position updating formula are discretized. By embedding random weights and enhancing the attractiveness, the performance of this algorithm is improved, and thus an Improved Firefly Algorithm (IFA, for short) is proposed. Furthermore, using size variables that are capable of encoding topology variables, size and topology optimization for trusses with discrete variables is formulated based on the Ground Structure Approach. The essential techniques of variable elastic modulus technology and geometric construction analysis are applied in the structural analysis process. Subsequently, an optimization method for the size and topological design of trusses based on the IFA is introduced. Finally, two numerical examples are shown to verify the feasibility and efficiency of the proposed method by comparing with different deterministic methods.
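
    For reference, the standard continuous FA that the improvements build on can be sketched as follows. Parameter values are illustrative, and the paper's IFA additionally discretizes the positions and update formula:

```python
import math
import random

def firefly_minimize(f, dim, n=20, iters=100, alpha=0.2, beta0=1.0, gamma=0.01,
                     lo=-5.0, hi=5.0):
    """Standard continuous Firefly Algorithm: brighter (lower-cost) fireflies
    attract dimmer ones; gamma should scale inversely with the squared search range."""
    xs = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
    costs = [f(x) for x in xs]
    for _ in range(iters):
        for i in range(n):
            for j in range(n):
                if costs[j] < costs[i]:                    # j is brighter than i
                    r2 = sum((a - b) ** 2 for a, b in zip(xs[i], xs[j]))
                    beta = beta0 * math.exp(-gamma * r2)   # attractiveness decays with distance
                    xs[i] = [a + beta * (b - a) + alpha * (random.random() - 0.5)
                             for a, b in zip(xs[i], xs[j])]
                    costs[i] = f(xs[i])
        alpha *= 0.97                                      # anneal the random step
    best = min(range(n), key=lambda k: costs[k])
    return xs[best], costs[best]
```

    Running it on the 2-D sphere function drives the best cost toward zero; the incumbent best firefly never moves, so the best cost is non-increasing across sweeps.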

  18. A composite model including visfatin, tissue polypeptide-specific antigen, hyaluronic acid, and hematological variables for the diagnosis of moderate-to-severe fibrosis in nonalcoholic fatty liver disease: a preliminary study.

    Science.gov (United States)

    Chwist, Alina; Hartleb, Marek; Lekstan, Andrzej; Kukla, Michał; Gutkowski, Krzysztof; Kajor, Maciej

    2014-01-01

    Histopathological risk factors for end-stage liver failure in patients with nonalcoholic fatty liver disease (NAFLD) include nonalcoholic steatohepatitis (NASH) and advanced liver fibrosis. There is a need for noninvasive diagnostic methods for these 2 conditions. The aim of this study was to investigate new laboratory variables with a predictive potential to detect advanced fibrosis (stages 2 and 3) in NAFLD. The study involved 70 patients with histologically proven NAFLD of varied severity. Additional laboratory variables included zonulin, haptoglobin, visfatin, adiponectin, leptin, tissue polypeptide-specific antigen (TPSA), hyaluronic acid, and interleukin 6. Patients with NASH (NAFLD activity score of ≥5) had significantly higher HOMA-IR values and serum levels of visfatin, haptoglobin, and zonulin as compared with those without NASH on histological examination. Advanced fibrosis was found in 16 patients (22.9%) and the risk factors associated with its prevalence were age, the ratio of erythrocyte count to red blood cell distribution width, platelet count, and serum levels of visfatin and TPSA. Based on these variables, we constructed a scoring system that differentiated between NAFLD patients with and without advanced fibrosis with a sensitivity of 75% and specificity of 100% (area under the receiver operating characteristic curve, 0.93). The scoring system based on the above variables allows to predict advanced fibrosis with high sensitivity and specificity. However, its clinical utility should be verified in further studies involving a larger number of patients.
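
    The reported sensitivity, specificity, and area under the ROC curve have simple rank-based definitions; a minimal sketch of how such a composite score would be evaluated (the data here are illustrative, not the study's):

```python
def roc_auc(scores_pos, scores_neg):
    """Rank-based AUC: probability that a case with the condition scores above one without."""
    wins = 0.0
    for p in scores_pos:
        for q in scores_neg:
            if p > q:
                wins += 1.0
            elif p == q:
                wins += 0.5          # ties count half
    return wins / (len(scores_pos) * len(scores_neg))

def sens_spec(scores_pos, scores_neg, threshold):
    """Sensitivity and specificity of the score at a given cut-off."""
    sens = sum(s >= threshold for s in scores_pos) / len(scores_pos)
    spec = sum(s < threshold for s in scores_neg) / len(scores_neg)
    return sens, spec
```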

  19. Analytical and Experimental Performance Evaluation of BLE Neighbor Discovery Process Including Non-Idealities of Real Chipsets

    Directory of Open Access Journals (Sweden)

    David Perez-Diaz de Cerio

    2017-03-01

    Full Text Available The purpose of this paper is to evaluate from a real perspective the performance of Bluetooth Low Energy (BLE) as a technology that enables fast and reliable discovery of a large number of users/devices in a short period of time. The BLE standard specifies a wide range of configurable parameter values that determine the discovery process and need to be set according to the particular application requirements. Many previous works have investigated the discovery process through analytical and simulation models, according to the ideal specification of the standard. However, measurements show that additional scanning gaps appear in the scanning process, which reduce the discovery capabilities. These gaps have been identified in all of the analyzed devices and respond to both regular patterns and variable events associated with the decoding process. We have demonstrated that these non-idealities, which are not taken into account in other studies, have a severe impact on the discovery process performance. Extensive performance evaluation for a varying number of devices and feasible parameter combinations has been done by comparing simulations and experimental measurements. This work also includes a simple mathematical model that closely matches both the standard implementation and the different chipset peculiarities for any possible parameter value specified in the standard and for any number of simultaneous advertising devices under scanner coverage.

  20. Quality of semantic standards

    NARCIS (Netherlands)

    Folmer, Erwin Johan Albert

    2012-01-01

    Little scientific literature addresses the issue of quality of semantic standards, albeit a problem with high economic and social impact. Our problem survey, including 34 semantic Standard Setting Organizations (SSOs), gives evidence that quality of standards can be improved, but for improvement a

  1. The use of personal values in living standards measures | Ungerer ...

    African Journals Online (AJOL)

    The Living Standards Measure (LSM), a South African marketing segmentation method, is a multivariate wealth measure based on standard of living. This article reports on whether a rationale can be found for the inclusion of psychological variables, particularly personal values, in this type of multivariate segmentation.

  2. The EPICS process variable Gateway Version 2

    International Nuclear Information System (INIS)

    Evans, K.

    2005-01-01

    The EPICS Process Variable Gateway is both a Channel Access Server and Channel Access Client that provides a means for many clients, typically on different subnets, to access a process variable while making only one connection to the server that owns the process variable. It also provides additional access security beyond that implemented on the server. It thus protects critical servers while providing suitably restricted access to needed process variables. The original version of the Gateway worked with EPICS Base 3.13 but required a special version, since the changes necessary for its operation were never incorporated into EPICS Base. Version 2 works with any standard EPICS Base 3.14.6 or later and has many improvements in both performance and features over the older version. The Gateway is now used at many institutions and has become a stable, high-performance application. It is capable of handling tens of thousands of process variables with hundreds of thousands of events per second. It has run for over three months in a production environment without having to be restarted. It has many internal process variables that can be used to monitor its state using standard EPICS client tools, such as MEDM and StripTool. Other internal process variables can be used to stop the Gateway, make several kinds of reports, or change the access security without stopping the Gateway. It can even be started on remote workstations from MEDM by using a Secure Shell script. This paper will describe the new Gateway and how it is used. The Gateway is both a server (like an EPICS Input/Output Controller (IOC)) and a client (like the EPICS Motif Editor and Display Manager (MEDM), StripTool, and others). Clients connect to the server side, and the client side connects to IOCs and other servers, possibly other Gateways. See Fig. 1. There are perhaps three principal reasons for using the Gateway: (1) it allows many clients to access a process variable while making only one connection to

  3. Grid impact of variable-speed wind turbines

    Energy Technology Data Exchange (ETDEWEB)

    Larsson, Aa [Chalmers Univ. of Technology, Dept. of Electric Power Engineering, Goeteborg (Sweden); Soerensen, P [Risoe National Lab., Roskilde (Denmark); Santjer, F [German Wind Energy Inst., DEWI, Wilhelmshaven (Germany)

    1999-03-01

    In this paper the power quality of variable-speed wind turbines equipped with forced-commutated inverters is investigated. Measurements have been taken on the same type of variable-speed wind turbine in Germany and Sweden. The measurements have been analysed according to existing IEC standards. Special attention has been paid to the aggregation of several wind turbines with respect to flicker emission and harmonics. The aggregation has been compared with the summation laws used in the draft IEC 61400-21 "Power Quality Requirements for Grid Connected Wind Turbines". The methods for calculating and summing flicker proposed by the IEC standards are reliable. Harmonics and inter-harmonics are treated in IEC 61000-4-7 and IEC 61000-3-6. The methods for summing harmonics and inter-harmonics in IEC 61000-3-6 are applicable to wind turbines. In order to obtain a correct magnitude of the frequency components, the use of a well-defined window width according to IEC 61000-4-7 Amendment 1 is of great importance. (au)
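The summation laws mentioned here have the generalized form Pst,sum = (Σi Pst,i^α)^(1/α). A minimal sketch follows; the function name is hypothetical and α = 3 is only an assumed default, since the IEC documents tabulate the exponent according to the characteristics of the aggregated sources.

```python
def summed_flicker(pst_values, alpha=3.0):
    """Generalized summation law for flicker severity from several turbines:
    Pst_sum = (sum_i Pst_i ** alpha) ** (1 / alpha).

    alpha=3 is an illustrative assumption; the IEC standards tabulate the
    exponent by source type (alpha=1 gives a plain sum, larger alpha means
    less coincident emission).
    """
    return sum(p ** alpha for p in pst_values) ** (1.0 / alpha)

# Two identical sources with Pst = 1.0 aggregate to 2**(1/3) ~ 1.26,
# far less than the arithmetic sum of 2.0.
total = summed_flicker([1.0, 1.0])
```

Note how a larger exponent models sources whose flicker peaks rarely coincide, so the aggregate grows much more slowly than a plain sum.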

  4. An empirical assessment of the impact of technical standards on the export of meat in Nigeria

    Directory of Open Access Journals (Sweden)

    Queeneth Odichi Ekeocha

    2017-10-01

    Full Text Available The study is an assessment of the impact of technical standards on meat export in Nigeria. A range of literature was reviewed in relation to meat standards, issues associated with standards compliance, the effects of SPS standards on food exports in developing countries, and the causes of the non-export of meat in Nigeria, among others. A survey method was used and a cross-tabulation analysis was made to ascertain the relationship among various variables and how significant they were in relation to food product standards. The findings of the study include, among others: sanitary conditions for meat processing are a significant factor for meat export; standards compliance is a step in the right direction towards agricultural export diversification; and food standard compliance can create market access for meat exports. The study concluded that technical standards are very significant to meat exports in Nigeria. Therefore, the study recommends, among others, that the government should invest in the productive capacity of SPS requirements for meat export, that standard abattoirs should be built and maintained, and that policymakers should re-think a flexible export diversification policy that could attract foreign investors and meat companies to Nigeria.

  5. Impact of a standardized nurse observation protocol including MEWS after Intensive Care Unit discharge.

    Science.gov (United States)

    De Meester, K; Das, T; Hellemans, K; Verbrugghe, W; Jorens, P G; Verpooten, G A; Van Bogaert, P

    2013-02-01

    Analysis of in-hospital mortality after serious adverse events (SAE's) in our hospital showed the need for more frequent observation in medical and surgical wards. We hypothesized that the incidence of SAE's could be decreased by introducing a standard nurse observation protocol. To investigate the effect of a standard nurse observation protocol implementing the Modified Early Warning Score (MEWS) and a color graphic observation chart. Pre- and post-intervention study by analysis of patient records for a 5-day period after Intensive Care Unit (ICU) discharge to 14 medical and surgical wards before (n=530) and after (n=509) the intervention. For the total study population the mean Patient Observation Frequency Per Nursing Shift (POFPNS) during the 5-day period after ICU discharge increased from .9993 (95% C.I. .9637-1.0350) in the pre-intervention period to 1.0732 (95% C.I. 1.0362-1.1101) (p=.005) in the post-intervention period. There was an increased risk of an SAE in patients with a MEWS of 4 or higher in the present nursing shift (HR 8.25; 95% C.I. 2.88-23.62) and the previous nursing shift (HR 12.83; 95% C.I. 4.45-36.99). There was an absolute risk reduction for SAE's within 120h after ICU discharge of 2.2% (95% C.I. -0.4-4.67%) from 5.7% to 3.5%. The intervention had a positive impact on the observation frequency. MEWS had a predictive value for SAE's in patients after ICU discharge. The drop in SAE's was substantial but did not reach statistical significance. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
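For reference, a MEWS total is computed from routine vital signs. The sketch below uses the commonly cited Subbe et al. (2001) cut-offs as an assumption; individual hospital protocols, including the one in this study, may use slightly different thresholds.

```python
def mews(sbp, hr, rr, temp_c, avpu):
    """Modified Early Warning Score from routine vitals.

    Cut-offs follow the commonly cited Subbe et al. (2001) scheme;
    institutional variants differ slightly, so treat this as illustrative.
    """
    score = 0
    # Systolic blood pressure (mmHg)
    if sbp <= 70:
        score += 3
    elif sbp <= 80:
        score += 2
    elif sbp <= 100:
        score += 1
    elif sbp >= 200:
        score += 2
    # Heart rate (beats/min)
    if hr < 40:
        score += 2
    elif hr <= 50:
        score += 1
    elif hr <= 100:
        pass  # normal range scores 0
    elif hr <= 110:
        score += 1
    elif hr < 130:
        score += 2
    else:
        score += 3
    # Respiratory rate (breaths/min)
    if rr < 9:
        score += 2
    elif rr <= 14:
        pass  # normal range scores 0
    elif rr <= 20:
        score += 1
    elif rr <= 29:
        score += 2
    else:
        score += 3
    # Temperature (degrees Celsius)
    if temp_c < 35.0 or temp_c >= 38.5:
        score += 2
    # Level of consciousness on the AVPU scale
    score += {"alert": 0, "voice": 1, "pain": 2, "unresponsive": 3}[avpu]
    return score
```

In the study above, a MEWS of 4 or higher in a nursing shift was the threshold associated with a markedly increased hazard of a serious adverse event.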

  6. Food Standards are Good– for Middle-class Farmers

    DEFF Research Database (Denmark)

    Hansen, Henrik; Trifkovic, Neda

    We estimate the causal effect of food standards on Vietnamese pangasius farmers’ wellbeing measured by per capita consumption expenditure. We estimate both the average effects and the local average treatment effects on poorer and richer farmers by instrumental variable quantile regression. Our results indicate that large returns can be accrued from food standards, but only for the upper middle-class farmers, i.e., those between the 50% and 85% quantiles of the expenditure distribution. Overall, our result points to an exclusionary impact of standards for the poorest farmers while the richest do...

  7. Individualized Anemia Management Reduces Hemoglobin Variability in Hemodialysis Patients

    OpenAIRE

    Gaweda, Adam E.; Aronoff, George R.; Jacobs, Alfred A.; Rai, Shesh N.; Brier, Michael E.

    2013-01-01

    One-size-fits-all protocol-based approaches to anemia management with erythropoiesis-stimulating agents (ESAs) may result in undesired patterns of hemoglobin variability. In this single-center, double-blind, randomized controlled trial, we tested the hypothesis that individualized dosing of ESA improves hemoglobin variability over a standard population-based approach. We enrolled 62 hemodialysis patients and followed them over a 12-month period. Patients were randomly assigned to receive ESA ...

  8. Review of the Commission program for standardization of nuclear power plants and recommendations to improve standardization concepts

    International Nuclear Information System (INIS)

    1978-02-01

    This is a report of a staff study describing the need and utility of specific changes to the Commission's standardization program. The various matters considered in the study include: (1) A discussion of industry use to date of the standardization program. (2) A discussion of the experience to date with each of the standardization concepts. (3) A review of public comments on the standardization program and the staff response to each principal comment. (4) A review of the need for standardization considering the likely number of license applications to be submitted in the coming years. (5) A discussion of the reference system concept, including review of applicable experience and recommended changes to the concept. (6) A discussion of the duplicate plant concept, including review of applicable experience and recommended changes to the concept. (7) A discussion of the manufacturing license concept, including review of applicable experience and recommended changes to the concept. (8) A discussion of the replicate plant concept, including review of applicable experience and recommended changes to the concept. (9) A discussion of the effective periods for approved designs under all four standardization concepts. (10) A description of continuing staff activities related to the standardization program

  9. Variability of a "force signature" during windmill softball pitching and relationship between discrete force variables and pitch velocity.

    Science.gov (United States)

    Nimphius, Sophia; McGuigan, Michael R; Suchomel, Timothy J; Newton, Robert U

    2016-06-01

    This study assessed the reliability of discrete ground reaction force (GRF) variables over multiple pitching trials, investigated the relationships between discrete GRF variables and pitch velocity (PV), and assessed the variability of the "force signature", or continuous force-time curve, during the pitching motion of windmill softball pitchers. The intraclass correlation coefficient (ICC) for all discrete variables was high (0.86-0.99) while the coefficient of variation (CV) was low (1.4-5.2%). Two discrete variables were significantly correlated with PV: the second vertical peak force (r(5)=0.81, p=0.03) and the time between peak forces (r(5)=-0.79; p=0.03). High ICCs and low CVs support the reliability of discrete GRF and PV variables over multiple trials, and the significant correlations indicate that both the ability to produce force and the timing of that force production are related to PV. The mean of all pitchers' curve-average standard deviation of their continuous force-time curves demonstrated low variability (CV=4.4%), indicating a repeatable and identifiable "force signature" pattern during this motion. As such, the continuous force-time curve, in addition to discrete GRF variables, should be examined in future research as a potential method to monitor or explain changes in pitching performance. Copyright © 2016 Elsevier B.V. All rights reserved.
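The reliability statistics reported here can be illustrated with a small sketch: a one-way ICC(1,1) from a one-way ANOVA decomposition, and a percent coefficient of variation across repeated trials. The paper does not state which ICC form it used, so both function names and the ICC form are assumptions for illustration.

```python
def icc_oneway(data):
    """ICC(1,1) from a one-way ANOVA: rows = subjects, columns = repeated trials.

    An illustrative sketch only; the abstract does not specify the ICC form.
    """
    n = len(data)      # number of subjects
    k = len(data[0])   # trials per subject
    grand = sum(sum(row) for row in data) / (n * k)
    row_means = [sum(row) / k for row in data]
    # Between-subject and within-subject mean squares.
    msb = k * sum((m - grand) ** 2 for m in row_means) / (n - 1)
    msw = sum((x - m) ** 2
              for row, m in zip(data, row_means) for x in row) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

def cv_percent(values):
    """Coefficient of variation (%) across repeated trials, using the sample SD."""
    m = sum(values) / len(values)
    sd = (sum((v - m) ** 2 for v in values) / (len(values) - 1)) ** 0.5
    return 100.0 * sd / m

# Subjects who differ greatly while their repeated trials barely vary
# produce an ICC near 1 and a small CV -- the pattern reported above.
icc = icc_oneway([[10.0, 10.2], [20.0, 19.8], [30.0, 30.1]])
cv = cv_percent([100.0, 102.0, 98.0])
```

High ICC with low CV is exactly the combination the abstract uses to argue that the discrete GRF variables are reliable across trials.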

  10. Bayesian Group Bridge for Bi-level Variable Selection.

    Science.gov (United States)

    Mallick, Himel; Yi, Nengjun

    2017-06-01

    A Bayesian bi-level variable selection method (BAGB: Bayesian Analysis of Group Bridge) is developed for regularized regression and classification. This new development is motivated by grouped data, where generic variables can be divided into multiple groups, with variables in the same group being mechanistically related or statistically correlated. As an alternative to frequentist group variable selection methods, BAGB incorporates structural information among predictors through a group-wise shrinkage prior. Posterior computation proceeds via an efficient MCMC algorithm. In addition to the usual ease-of-interpretation of hierarchical linear models, the Bayesian formulation produces valid standard errors, a feature that is notably absent in the frequentist framework. Empirical evidence of the attractiveness of the method is illustrated by extensive Monte Carlo simulations and real data analysis. Finally, several extensions of this new approach are presented, providing a unified framework for bi-level variable selection in general models with flexible penalties.

  11. A Proposed Framework for Applying the National Standards of Quality Assurance in Higher Education in Sudan from the Teaching Staff’s Perspective - Faculties of Business Administration

    Directory of Open Access Journals (Sweden)

    Alfatih Alamin Elfaki

    2017-08-01

    Full Text Available This study aimed to clarify the importance of having national standards and their role in achieving quality, as well as establishing a framework for the actual application of national standards in quality assurance so as to achieve quality in higher education institutions. The researchers followed a descriptive analytical method to achieve the objectives of the study and developed a questionnaire covering primary and secondary variables that have a role in the design of specific models to help in applying the national standards by the Sudanese universities. The questionnaire included one dependent variable, the effective application of national standards of quality assurance in higher education institutions, and four main independent variables: the national standards of quality assurance in higher education in Sudan, the standard of quality assurance, the standard of teaching and learning, and the standard of scientific research and publication. The study revealed a number of conclusions: there were statistically significant differences in the extent of familiarity with the national quality assurance standards in Sudan according to the academic rank of the faculty members; there were also significant differences in the extent of compliance with the national quality assurance standards in Sudan according to the academic rank of the faculty members; there was full agreement between the national standards for quality assurance in Sudan and the international standards for quality assurance; and there were statistically significant differences indicating that the absence of specific models would have a negative impact on the effective application of national standards of quality assurance in higher education in Sudan, according to the academic rank of the faculty members. Keywords: Quality, The program, Standards, University, Total quality management.

  12. Derivation and validation of a risk standardization model for benchmarking hospital performance for health-related quality of life outcomes after acute myocardial infarction.

    Science.gov (United States)

    Arnold, Suzanne V; Masoudi, Frederick A; Rumsfeld, John S; Li, Yan; Jones, Philip G; Spertus, John A

    2014-01-21

    Before outcomes-based measures of quality can be used to compare and improve care, they must be risk-standardized to account for variations in patient characteristics. Despite the importance of health-related quality of life (HRQL) outcomes among patients with acute myocardial infarction (AMI), no risk-standardized models have been developed. We assessed disease-specific HRQL using the Seattle Angina Questionnaire at baseline and 1 year later in 2693 unselected AMI patients from 24 hospitals enrolled in the Translational Research Investigating Underlying disparities in acute Myocardial infarction Patients' Health status (TRIUMPH) registry. Using 57 candidate sociodemographic, economic, and clinical variables present on admission, we developed a parsimonious, hierarchical linear regression model to predict HRQL. Eleven variables were independently associated with poor HRQL after AMI, including younger age, previous coronary artery bypass graft surgery, depressive symptoms, and financial difficulties (R(2)=20%). The model demonstrated excellent internal calibration and reasonable calibration in an independent sample of 1890 AMI patients in a separate registry, although the model slightly overpredicted HRQL scores in the higher deciles. Among the 24 TRIUMPH hospitals, 1-year unadjusted HRQL scores ranged from 67-89. After risk-standardization, HRQL score variability narrowed substantially (range=79-83), and the hospital performance grouping (bottom 20%/middle 60%/top 20%) changed in 14 of the 24 hospitals (58% reclassification with risk-standardization). In this predictive model for HRQL after AMI, we identified risk factors, including economic and psychological characteristics, associated with HRQL outcomes. Adjusting for these factors substantially altered the rankings of hospitals as compared with unadjusted comparisons. Using this model to compare risk-standardized HRQL outcomes across hospitals may identify processes of care that maximize this important patient

  13. Variability of indication criteria in knee and hip replacement: an observational study

    Directory of Open Access Journals (Sweden)

    Sarasqueta Cristina

    2010-10-01

    Full Text Available Abstract Background Total knee (TKR) and hip (THR) replacement (arthroplasty) are effective surgical procedures that relieve pain, improve patients' quality of life and increase functional capacity. Studies on variations in medical practice usually find the indications for performing these procedures to be highly variable, because surgeons appear to follow different criteria when recommending surgery in patients with different severity levels. We therefore proposed a study to evaluate inter-hospital variability in arthroplasty indication. Methods The pre-surgical condition of the 1603 patients included was compared by their personal characteristics, clinical situation and self-perceived health status. Patients were asked to complete two health-related quality of life questionnaires: the generic SF-12 (Short Form) and the specific WOMAC (Western Ontario and McMaster Universities) scale. The type of patient undergoing primary arthroplasty was similar in the 15 different hospitals evaluated. The variability in baseline WOMAC score between hospitals in THR and TKR indication was described by range, mean and standard deviation (SD), mean and standard deviation weighted by the number of procedures at each hospital, high/low ratio or extremal quotient (EQ5-95), variation coefficient (CV5-95) and weighted variation coefficient (WCV5-95) for the 5-95 percentile range. The variability in subjective and objective signs was evaluated using median, range and WCV5-95. The appropriateness of the procedures performed was calculated using a specific threshold proposed by Quintana et al for assessing pain and functional capacity. Results The variability expressed as WCV5-95 was very low, between 0.05 and 0.11 for all three dimensions on the WOMAC scale for both types of procedure in all participating hospitals. The variability in the physical and mental SF-12 components was very low for both types of procedure (0.08 and 0.07 for hip and 0.03 and 0.07 for knee surgery patients

  14. European standards for composite construction

    NARCIS (Netherlands)

    Stark, J.W.B.

    2000-01-01

    The European Standards Organisation (CEN) has planned to develop a complete set of harmonized European building standards. This set includes standards for composite steel and concrete buildings and bridges. The Eurocodes, being the design standards, form part of this total system of European

  15. Analytic function theory of several variables elements of Oka’s coherence

    CERN Document Server

    Noguchi, Junjiro

    2016-01-01

    The purpose of this book is to present the classical analytic function theory of several variables as a standard subject in a course of mathematics after learning the elementary materials (sets, general topology, algebra, one complex variable). This includes the essential parts of Grauert–Remmert's two volumes, GL227(236) (Theory of Stein spaces) and GL265 (Coherent analytic sheaves) with a lowering of the level for novice graduate students (here, Grauert's direct image theorem is limited to the case of finite maps). The core of the theory is "Oka's Coherence", found and proved by Kiyoshi Oka. It is indispensable, not only in the study of complex analysis and complex geometry, but also in a large area of modern mathematics. In this book, just after an introductory chapter on holomorphic functions (Chap. 1), we prove Oka's First Coherence Theorem for holomorphic functions in Chap. 2. This defines a unique character of the book compared with other books on this subject, in which the notion of coherence appear...

  16. Updating OSHA standards based on national consensus standards. Direct final rule.

    Science.gov (United States)

    2007-12-14

    In this direct final rule, the Agency is removing several references to consensus standards that have requirements that duplicate, or are comparable to, other OSHA rules; this action includes correcting a paragraph citation in one of these OSHA rules. The Agency also is removing a reference to American Welding Society standard A3.0-1969 ("Terms and Definitions") in its general-industry welding standards. This rulemaking is a continuation of OSHA's ongoing effort to update references to consensus and industry standards used throughout its rules.

  17. Analysis of the in vivo confocal Raman spectral variability in human skin

    Science.gov (United States)

    Mogilevych, Borys; dos Santos, Laurita; Rangel, Joao L.; Grancianinov, Karen J. S.; Sousa, Mariane P.; Martin, Airton A.

    2015-06-01

    Biochemical composition of the skin changes in each layer and, therefore, the skin spectral profile varies with depth. In this work, in vivo Confocal Raman spectroscopy studies were performed at different skin regions and depth profiles (from the surface down to 10 μm) of the stratum corneum, to verify the variability and reproducibility of the intra- and interindividual Raman data. The Raman spectra were collected from seven healthy female study participants using a confocal Raman system from Rivers Diagnostic, with a 785 nm excitation line and a CCD detector. Measurements were performed in the volar forearm region, at three different points and at different depths, in steps of 2 μm. For each depth point, three spectra were acquired. Data analysis included descriptive statistics (mean, standard deviation and residual) and Pearson's correlation coefficient calculation. Our results show that inter-individual variability is higher than intraindividual variability, and variability inside the SC is higher than on the skin surface. In all these cases we obtained r values higher than 0.94, which correspond to a high correlation between Raman spectra. This reinforces the reproducibility of the data and the possibility of directly comparing in vivo results obtained from different study participants of the same age group and phototype.
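The correlation step described above amounts to computing Pearson's r between pairs of spectra sampled on the same wavenumber axis; an r above 0.94 then indicates near-identical spectral shape. A minimal sketch (the function name is illustrative, not from the paper):

```python
import math

def pearson_r(x, y):
    """Pearson correlation between two spectra sampled on the same axis.

    x and y are equal-length sequences of intensities; r close to 1 means
    the two spectral profiles are nearly proportional.
    """
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))  # covariance term
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

# Two spectra that differ only by a scale factor correlate perfectly.
r = pearson_r([1.0, 2.0, 3.0], [2.0, 4.0, 6.0])
```

Because r is invariant to overall intensity scaling, it compares spectral shape rather than absolute signal level, which is what matters when instruments or focus depths differ between acquisitions.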

  18. Variability in the Initial Costs of Care and One-Year Outcomes of Observation Services

    Directory of Open Access Journals (Sweden)

    Abbass, Ibrahim

    2015-05-01

    Full Text Available Introduction: The use of observation units (OUs) following emergency department (ED) visits as a model of care has increased exponentially in the last decade. About one-third of U.S. hospitals now have OUs within their facilities. While their use is associated with lower costs and a comparable level of care compared to inpatient units, there is wide variation in OU characteristics and operational procedures. The objective of this research was to explore the variability in the initial costs of care of placing patients with non-specific chest pain in observation units (OUs) and the one-year outcomes. Methods: The author retrospectively investigated medical insurance claims of 22,962 privately insured patients (2009-2011) admitted to 41 OUs. Outcomes included the one-year chest pain/cardiovascular-related costs and primary and secondary outcomes. Primary outcomes included myocardial infarction, congestive heart failure, stroke or cardiac arrest, while secondary outcomes included revascularization procedures, ED revisits for angina pectoris or chest pain and hospitalization due to cardiovascular diseases. The author aggregated the adjusted costs and prevalence rates of outcomes for patients over OUs, and computed the weighted coefficients of variation (WCV) to compare variations across OUs. Results: There was minimal variability in the initial costs of care (WCV=2.2%), while greater variability was noticed in the outcomes. The greatest variability was associated with the adjusted cardiovascular-related costs of medical services (WCV=17.6%), followed by the adjusted prevalence odds ratio of patients experiencing primary outcomes (WCV=16.3%) and secondary outcomes (WCV=10%). Conclusion: Higher variability in the outcomes suggests the need for more standardization of the observation services for chest pain patients. [West J Emerg Med. 2015;16(3):395–400.]
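A weighted coefficient of variation like the WCV reported here can be computed by weighting each unit's mean outcome by its patient volume. The sketch below is one common formulation and an assumption on our part; the abstract does not spell out the exact formula used.

```python
def weighted_cv(unit_means, weights):
    """Weighted coefficient of variation across units (e.g., observation units).

    unit_means: per-unit average outcome (e.g., adjusted cost per OU);
    weights: per-unit volume (e.g., number of patients). This particular
    formulation (weighted SD over weighted mean) is an illustrative assumption.
    """
    total = sum(weights)
    mean = sum(v * w for v, w in zip(unit_means, weights)) / total
    var = sum(w * (v - mean) ** 2 for v, w in zip(unit_means, weights)) / total
    return var ** 0.5 / mean  # multiply by 100 for a percentage

# With equal weights this reduces to the ordinary (population) CV.
cv = weighted_cv([90.0, 100.0, 110.0], [1.0, 1.0, 1.0])
```

Weighting by volume keeps a tiny OU with an extreme average from dominating the dispersion estimate, which is why the WCV is preferred over a plain CV when units differ widely in size.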

  19. The satisfactory growth and development at 2 years of age of the INTERGROWTH-21st Fetal Growth Standards cohort support its appropriateness for constructing international standards.

    Science.gov (United States)

    Villar, José; Cheikh Ismail, Leila; Staines Urias, Eleonora; Giuliani, Francesca; Ohuma, Eric O; Victora, Cesar G; Papageorghiou, Aris T; Altman, Douglas G; Garza, Cutberto; Barros, Fernando C; Puglia, Fabien; Ochieng, Roseline; Jaffer, Yasmin A; Noble, Julia A; Bertino, Enrico; Purwar, Manorama; Pang, Ruyan; Lambert, Ann; Chumlea, Cameron; Stein, Alan; Fernandes, Michelle; Bhutta, Zulfiqar A; Kennedy, Stephen H

    2018-02-01

    used to estimate the percentage variability among individuals within a study site compared with that among study sites. There were 3711 eligible singleton live births; 3042 children (82%) were evaluated at 2 years of age. There were no substantive differences between the included group and the lost-to-follow-up group. Infant mortality rate was 3 per 1000; neonatal mortality rate was 1.6 per 1000. At the 2-year visit, the children included in the INTERGROWTH-21st Fetal Growth Standards were at the 49th percentile for length, 50th percentile for head circumference, and 58th percentile for weight of the World Health Organization Child Growth Standards. Similar results were seen for the preterm subgroup that was included in the INTERGROWTH-21st Preterm Postnatal Growth Standards. The cohort overlapped between the 3rd and 97th percentiles of the World Health Organization motor development milestones. We estimated that the variance among study sites explains only 5.5% of the total variability in the length of the children between birth and 2 years of age, although the variance among individuals within a study site explains 42.9% (ie, 8 times the amount explained by the variation among sites). An increase of 8.9 cm in adult height over mean parental height is estimated to occur in the cohort from low-middle income countries, provided that children continue to have adequate health, environmental, and nutritional conditions. The cohort enrolled in the INTERGROWTH-21st standards remained healthy with adequate growth and motor development up to 2 years of age, which supports its appropriateness for the construction of international fetal and preterm postnatal growth standards. Copyright © 2017 The Author(s). Published by Elsevier Inc. All rights reserved.

  20. Control system architecture: The standard and non-standard models

    International Nuclear Information System (INIS)

    Thuot, M.E.; Dalesio, L.R.

    1993-01-01

    Control system architecture development has followed the advances in computer technology through mainframes to minicomputers to micros and workstations. This technology advance and increasingly challenging accelerator data acquisition and automation requirements have driven control system architecture development. In summarizing the progress of control system architecture at the last International Conference on Accelerator and Large Experimental Physics Control Systems (ICALEPCS), B. Kuiper asserted that the system architecture issue was resolved and presented a "standard model". The "standard model" consists of a local area network (Ethernet or FDDI) providing communication between front-end microcomputers, connected to the accelerator, and workstations, providing the operator interface and computational support. Although this model represents many present designs, there are exceptions, including reflected-memory and hierarchical architectures driven by requirements for widely dispersed, large channel count or tightly coupled systems. This paper describes the performance characteristics and features of the "standard model" to determine if the requirements of "non-standard" architectures can be met. Several possible extensions to the "standard model" are suggested, including software as well as hardware architectural features

  1. Inter-Trial Gait Variability Reduction Using Continous Curve Registration

    National Research Council Canada - National Science Library

    Sadeghi, H

    2001-01-01

    Timing in peak gait values shifts slightly between gait trials. When gait data are averaged, some of the standard deviation can be attributed to this inter-trial variability unless normalization is carried out beforehand...

  2. Platelet function, anthropometric and metabolic variables in Nigerian ...

    African Journals Online (AJOL)

    Platelet function, anthropometric and metabolic variables in Nigerian Type 2 Diabetic patients. ... (BSA) were assessed as indices of anthropometry, fasting blood sugar (FBS), plasma cholesterol and triglycerides (TAG) were determined using standard method and platelet aggregation test was done on the whole blood.

  3. An evaluation of FIA's stand age variable

    Science.gov (United States)

    John D. Shaw

    2015-01-01

    The Forest Inventory and Analysis Database (FIADB) includes a large number of measured and computed variables. The definitions of measured variables are usually well-documented in FIA field and database manuals. Some computed variables, such as live basal area of the condition, are equally straightforward. Other computed variables, such as individual tree volume,...

  4. SpecDB: The AAVSO’s Public Repository for Spectra of Variable Stars

    Science.gov (United States)

    Kafka, Stella; Weaver, John; Silvis, George; Beck, Sara

    2018-01-01

    SpecDB is the American Association of Variable Star Observers (AAVSO) spectral database. Accessible to any astronomer with the capability to perform spectroscopy, SpecDB provides an unprecedented scientific opportunity for amateur and professional astronomers around the globe. Backed by the Variable Star Index, one of the most utilized variable star catalogs, SpecDB is expected to become one of the world's leading databases of its kind. Once verified by a team of expert spectroscopists, an observer can upload spectra of variable star targets easily and efficiently. Uploaded spectra can then be searched for, previewed, and downloaded for inclusion in publications. Close community development and involvement will ensure a user-friendly and versatile database, compatible with the needs of 21st century astrophysics. Observations of 1D spectra are submitted as FITS files. All spectra are required to be preprocessed with wavelength calibration and dark subtraction; bias and flat-field corrections are strongly recommended. First-time observers are required to submit a spectrum of a standard (non-variable) star to be checked for errors in technique or equipment. Regardless of user validation, FITS headers must include several value cards detailing the observation, as well as information regarding the observer, equipment, and observing site in accordance with existing AAVSO records. This enforces consistency and provides necessary details for follow-up analysis. Requirements are provided to users in a comprehensive guidebook and accompanying technical manual. Upon submission, FITS headers are automatically checked for errors and any anomalies are immediately fed back to the user. Successful candidates can then submit at will, including multiple simultaneous submissions. All published observations can be searched and interactively previewed. Community involvement will be enhanced by an associated forum where users can discuss observation techniques and suggest improvements to the database.
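    The automated header check described above can be sketched as follows. The required card names are plausible FITS keywords chosen for illustration; they are not the actual SpecDB specification.

```python
# Hypothetical required-metadata check for a spectral upload. The header
# is represented as a plain dict of card name -> value.
REQUIRED_CARDS = ("OBJECT", "DATE-OBS", "OBSERVER", "INSTRUME", "SITENAME")

def missing_cards(header):
    """Return the required cards that are absent or empty in the header,
    in the order they are listed in REQUIRED_CARDS."""
    return [card for card in REQUIRED_CARDS
            if header.get(card) in (None, "")]

upload = {"OBJECT": "SS Cyg", "DATE-OBS": "2024-03-01T02:10:00", "OBSERVER": ""}
print(missing_cards(upload))  # ['OBSERVER', 'INSTRUME', 'SITENAME']
```

    A non-empty result would be fed back to the submitter immediately, which is the kind of automated validation the database performs on upload.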

  5. 24-Hour Blood Pressure Variability Assessed by Average Real Variability: A Systematic Review and Meta-Analysis.

    Science.gov (United States)

    Mena, Luis J; Felix, Vanessa G; Melgarejo, Jesus D; Maestre, Gladys E

    2017-10-19

    Although 24-hour blood pressure (BP) variability (BPV) is predictive of cardiovascular outcomes independent of absolute BP levels, it is not regularly assessed in clinical practice. One possible limitation to routine BPV assessment is the lack of standardized methods for accurately estimating 24-hour BPV. We conducted a systematic review to assess the predictive power of reported BPV indexes and to address appropriate quantification of 24-hour BPV, including the average real variability (ARV) index. Studies chosen for review were those that presented data for 24-hour BPV in adults from meta-analyses, longitudinal, or cross-sectional designs, and examined BPV in terms of the following issues: (1) methods used to calculate and evaluate ARV; (2) assessment of 24-hour BPV determined using noninvasive ambulatory BP monitoring; (3) multivariate analysis adjusted for covariates, including some measure of BP; (4) association of 24-hour BPV with subclinical organ damage; and (5) the predictive value of 24-hour BPV on target organ damage and rate of cardiovascular events. Of the 19 assessed studies, 17 reported significant associations between high ARV and the presence and progression of subclinical organ damage, as well as the incidence of hard end points, such as cardiovascular events. In all these cases, ARV remained a significant independent predictor (P < 0.05) after adjustment for BP and other clinical factors. In addition, increased ARV in systolic BP was associated with risk of all cardiovascular events (hazard ratio, 1.18; 95% confidence interval, 1.09-1.27). Only 2 cross-sectional studies did not find that high ARV was a significant risk factor. Current evidence suggests that the ARV index adds significant prognostic information to 24-hour ambulatory BP monitoring and is a useful approach for studying the clinical value of BPV. © 2017 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley.
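    ARV itself is simple to compute: it is the mean absolute difference between successive readings, so unlike the standard deviation it is sensitive to the order of measurements. A minimal sketch with invented readings:

```python
import numpy as np

def average_real_variability(bp_readings):
    """ARV: mean absolute difference between successive BP readings."""
    bp = np.asarray(bp_readings, dtype=float)
    return float(np.mean(np.abs(np.diff(bp))))

# The same six systolic values in two orders: identical SD, very different ARV.
steady_drift = [110, 115, 120, 125, 130, 135]
oscillating = [110, 135, 115, 130, 120, 125]
print(average_real_variability(steady_drift))  # 5.0
print(average_real_variability(oscillating))   # 15.0
```

    Because ARV weights reading-to-reading swings, it distinguishes a slow drift from a rapidly oscillating profile with the same overall spread, which is the property the reviewed studies exploit.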

  6. Comparison of air-standard rectangular cycles with different specific heat models

    International Nuclear Information System (INIS)

    Wang, Chao; Chen, Lingen; Ge, Yanlin; Sun, Fengrui

    2016-01-01

    Highlights: • Air-standard rectangular cycle models are built and investigated. • Finite-time thermodynamics is applied. • Different dissipation models and variable specific-heat models are adopted. • Performance characteristics of different cycle models are compared. Abstract: In this paper, performance comparisons of air-standard rectangular cycles with constant specific heat (SH), linear variable SH and non-linear variable SH are conducted using finite-time thermodynamics. The power output and efficiency of each cycle model and the characteristic curves of power output versus compression ratio, efficiency versus compression ratio, and power output versus efficiency are obtained by taking heat transfer loss (HTL) and friction loss (FL) into account. The influences of HTL, FL and SH on cycle performance are analyzed through detailed numerical examples.

  7. Randomized Trial of a Lifestyle Physical Activity Intervention for Breast Cancer Survivors: Effects on Transtheoretical Model Variables.

    Science.gov (United States)

    Scruggs, Stacie; Mama, Scherezade K; Carmack, Cindy L; Douglas, Tommy; Diamond, Pamela; Basen-Engquist, Karen

    2018-01-01

    This study examined whether a physical activity intervention affects transtheoretical model (TTM) variables that facilitate exercise adoption in breast cancer survivors. Sixty sedentary breast cancer survivors were randomized to a 6-month lifestyle physical activity intervention or standard care. TTM variables that have been shown to facilitate exercise adoption and progress through the stages of change, including self-efficacy, decisional balance, and processes of change, were measured at baseline, 3 months, and 6 months. Differences in TTM variables between groups were tested using repeated measures analysis of variance. The intervention group had significantly higher self-efficacy ( F = 9.55, p = .003) and perceived significantly fewer cons of exercise ( F = 5.416, p = .025) at 3 and 6 months compared with the standard care group. Self-liberation, counterconditioning, and reinforcement management processes of change increased significantly from baseline to 6 months in the intervention group, and self-efficacy and reinforcement management were significantly associated with improvement in stage of change. The stage-based physical activity intervention increased use of select processes of change, improved self-efficacy, decreased perceptions of the cons of exercise, and helped participants advance in stage of change. These results point to the importance of using a theory-based approach in interventions to increase physical activity in cancer survivors.

  8. Maintenance Staffing Standards for Zero-Based Budgeting.

    Science.gov (United States)

    Adams, Matthew C.; And Others

    1998-01-01

    Discusses school preventive maintenance and the variables associated with maintenance staffing standards that address a zero-based budgeting environment. Explores preventive-maintenance measurement for staffing requirements, defines staffing levels and job descriptions, and outlines the factors to consider when creating a maintenance program and…

  9. Scaling prediction errors to reward variability benefits error-driven learning in humans.

    Science.gov (United States)

    Diederen, Kelly M J; Schultz, Wolfram

    2015-09-01

    Effective error-driven learning requires individuals to adapt learning to environmental reward variability. The adaptive mechanism may involve decays in learning rate across subsequent trials, as shown previously, and rescaling of reward prediction errors. The present study investigated the influence of prediction error scaling and, in particular, the consequences for learning performance. Participants explicitly predicted reward magnitudes that were drawn from different probability distributions with specific standard deviations. By fitting the data with reinforcement learning models, we found scaling of prediction errors, in addition to the learning rate decay shown previously. Importantly, the prediction error scaling was closely related to learning performance, defined as accuracy in predicting the mean of reward distributions, across individual participants. In addition, participants who scaled prediction errors relative to standard deviation also presented with more similar performance for different standard deviations, indicating that increases in standard deviation did not substantially decrease "adapters'" accuracy in predicting the means of reward distributions. However, exaggerated scaling beyond the standard deviation resulted in impaired performance. Thus efficient adaptation makes learning more robust to changing variability. Copyright © 2015 the American Physiological Society.
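    One simple way to realize the adaptation the authors describe is to divide the prediction error by the reward standard deviation before the update, alongside a trial-wise learning-rate decay. This is a sketch of the general idea, not the paper's fitted model; all parameter values are invented.

```python
def learn_mean(rewards, sigma, alpha0=0.8, decay=0.95):
    """Error-driven estimate of the mean reward with SD-scaled
    prediction errors and a decaying learning rate (illustrative)."""
    v, alpha = 0.0, alpha0
    for r in rewards:
        pe = (r - v) / sigma  # prediction error rescaled by reward variability
        v += alpha * pe       # update step comparable across narrow and wide distributions
        alpha *= decay        # learning-rate decay across trials
    return v
```

    Dividing by sigma shrinks the effective step for wide distributions, so single outlier rewards perturb the running estimate less, making accuracy more robust to increases in reward spread.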

  10. Variability in large-scale wind power generation: Variability in large-scale wind power generation

    Energy Technology Data Exchange (ETDEWEB)

    Kiviluoma, Juha [VTT Technical Research Centre of Finland, Espoo Finland; Holttinen, Hannele [VTT Technical Research Centre of Finland, Espoo Finland; Weir, David [Energy Department, Norwegian Water Resources and Energy Directorate, Oslo Norway; Scharff, Richard [KTH Royal Institute of Technology, Electric Power Systems, Stockholm Sweden; Söder, Lennart [Royal Institute of Technology, Electric Power Systems, Stockholm Sweden; Menemenlis, Nickie [Institut de recherche Hydro-Québec, Montreal Canada; Cutululis, Nicolaos A. [DTU, Wind Energy, Roskilde Denmark; Danti Lopez, Irene [Electricity Research Centre, University College Dublin, Dublin Ireland; Lannoye, Eamonn [Electric Power Research Institute, Palo Alto California USA; Estanqueiro, Ana [LNEG, Laboratorio Nacional de Energia e Geologia, UESEO, Lisbon Spain; Gomez-Lazaro, Emilio [Renewable Energy Research Institute and DIEEAC/EDII-AB, Castilla-La Mancha University, Albacete Spain; Zhang, Qin [State Grid Corporation of China, Beijing China; Bai, Jianhua [State Grid Energy Research Institute Beijing, Beijing China; Wan, Yih-Huei [National Renewable Energy Laboratory, Transmission and Grid Integration Group, Golden Colorado USA; Milligan, Michael [National Renewable Energy Laboratory, Transmission and Grid Integration Group, Golden Colorado USA

    2015-10-25

    The paper demonstrates the characteristics of wind power variability and net load variability in multiple power systems based on real data from multiple years. Demonstrated characteristics include probability distributions for different ramp durations, seasonal and diurnal variability, and low net load events. The comparison shows regions with low variability (Sweden, Spain and Germany), medium variability (Portugal, Ireland, Finland and Denmark) and regions with higher variability (Quebec, Bonneville Power Administration and Electric Reliability Council of Texas in North America; Gansu, Jilin and Liaoning in China; and Norway and offshore wind power in Denmark). For regions with low variability, the maximum 1 h wind ramps are below 10% of nominal capacity, and for regions with high variability, they may be close to 30%. Wind power variability is mainly explained by the extent of geographical spread, but a higher capacity factor also causes higher variability. It was also shown that wind power ramps are autocorrelated and dependent on the operating output level. When wind power was concentrated in a smaller area, there were outliers with high changes in wind output that were not present in large areas with well-dispersed wind power.
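    The ramp statistics quoted above (e.g., maximum 1 h ramps as a share of nominal capacity) can be computed directly from a generation time series. A minimal sketch with invented hourly data:

```python
import numpy as np

def ramp_stats(generation_mw, capacity_mw, horizon_h=1):
    """Ramps over a given horizon as a fraction of nominal capacity.
    Returns (max up-ramp, max down-ramp, 99th percentile of |ramp|).
    Real studies use multi-year data at 10-min to hourly resolution."""
    g = np.asarray(generation_mw, dtype=float)
    ramps = (g[horizon_h:] - g[:-horizon_h]) / capacity_mw
    return float(ramps.max()), float(ramps.min()), float(np.percentile(np.abs(ramps), 99))

hourly_mw = [10, 20, 35, 30, 15, 12]  # toy hourly output for a 100 MW region
up, down, p99 = ramp_stats(hourly_mw, capacity_mw=100)
print(up, down)  # 0.15 -0.15
```

    Applied to a full year of data, the distribution of these ramp fractions is what separates the "low variability" regions (1 h ramps under 10% of capacity) from the "high variability" ones (close to 30%).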

  11. Measuring Variability in the Presence of Noise

    Science.gov (United States)

    Welsh, W. F.

    Quantitative measurement of a variable signal in the presence of noise requires very careful attention to subtle effects which can easily bias the measurements. This is not limited to the low-count-rate regime, nor is the bias error necessarily small. In this talk I will mention some of the dangers in applying standard techniques which are appropriate for high signal-to-noise data but fail when the S/N is low. I will discuss methods for correcting the bias in these cases, both for periodic and non-periodic variability, and will introduce the concept of the ``filtered de-biased RMS''. I will also illustrate some common abuses of power spectrum interpretation. All of these points will be illustrated with examples from recent work on CV and AGN variability.

  12. Correlation between muscle electrical impedance data and standard neurophysiologic parameters after experimental neurogenic injury

    International Nuclear Information System (INIS)

    Ahad, M; Rutkove, S B

    2010-01-01

    Previous work has shown that electrical impedance measurements of muscle can assist in quantifying the degree of muscle atrophy resulting from neuronal injury, with impedance values correlating strongly with standard clinical parameters. However, the relationship between such data and neurophysiologic measurements is unexplored. In this study, 24 Wistar rats underwent sciatic crush; the 2–1000 kHz impedance spectrum and standard electrophysiological measures, including nerve conduction studies, needle electromyography, and motor unit number estimation (MUNE), were obtained before sciatic crush and weekly for 4 weeks post-injury. All electrical impedance values, including a group of 'collapsed' variables in which the spectral characteristics were reduced to single values, showed reductions as high as 47.2% after sciatic crush, paralleling and correlating with changes in compound motor action potential amplitude and conduction velocity, and most closely with MUNE, but not with the presence of fibrillation potentials observed on needle electromyography. These results support the concept that localized impedance measurements can serve as surrogate markers of nerve injury; these measurements may be especially useful in assessing nerve injury impacting proximal or axial muscles where standard quantitative neurophysiologic methods such as nerve conduction or MUNE cannot be readily performed

  13. Environmental lead exposure is associated with visit-to-visit systolic blood pressure variability in the US adults.

    Science.gov (United States)

    Faramawi, Mohammed F; Delongchamp, Robert; Lin, Yu-Sheng; Liu, Youcheng; Abouelenien, Saly; Fischbach, Lori; Jadhav, Supriya

    2015-04-01

    The association between environmental lead exposure and blood pressure variability, an important risk factor for cardiovascular disease, is unexplored and unknown. The objective of the study was to test the hypothesis that lead exposure is associated with blood pressure variability. American participants 17 years of age or older from National Health and Nutrition Examination Survey III were included in the analysis. Participants' blood lead concentrations, expressed as micrograms per deciliter, were determined. The standard deviations of visit-to-visit systolic and diastolic blood pressure were calculated to determine blood pressure variability. Multivariable regression analyses adjusted for age, gender, race, smoking and socioeconomic status were employed. The participants' mean age and mean blood lead concentration were 42.72 years and 3.44 mcg/dl, respectively. Systolic blood pressure variability was significantly associated with environmental lead exposure after adjusting for the effect of the confounders (unadjusted and adjusted mean blood lead 3.44 and 3.33 mcg/dl; β coefficient of lead exposure = 0.07, P < 0.05 for systolic blood pressure variability). Screening adults with fluctuating blood pressure for lead exposure could be warranted.

  14. Intraindividual variability in reaction time before and after neoadjuvant chemotherapy in women diagnosed with breast cancer.

    Science.gov (United States)

    Yao, Christie; Rich, Jill B; Tirona, Kattleya; Bernstein, Lori J

    2017-12-01

    Women treated with chemotherapy for breast cancer experience subtle cognitive deficits. Research has focused on mean performance level, yet recent work suggests that within-person variability in reaction time performance may underlie cognitive symptoms. We examined intraindividual variability (IIV) in women diagnosed with breast cancer and treated with neoadjuvant chemotherapy. Patients (n = 28) were assessed at baseline before chemotherapy (T1), approximately 1 month after chemotherapy but prior to surgery (T2), and after surgery about 9 months post chemotherapy (T3). Healthy women of similar age and education (n = 20) were assessed at comparable time intervals. Using a standardized regression-based approach, we examined changes in mean performance level and IIV (e.g., intraindividual standard deviation) on a Stroop task and self-report measures of cognitive function from T1 to T2 and T1 to T3. At T1, women with breast cancer were more variable than controls as task complexity increased. Change scores from T1 to T2 were similar between groups on all Stroop performance measures. From T1 to T3, controls improved more than women with breast cancer. IIV was more sensitive than mean reaction time in capturing group differences. Additional analyses showed increased cognitive symptoms reported by women with breast cancer from T1 to T3. Specifically, change in language symptoms was positively correlated with change in variability. Women with breast cancer declined in attention and inhibitory control relative to pretreatment performance. Future studies should include measures of variability, because they are a sensitive indicator of change in cognitive function. Copyright © 2016 John Wiley & Sons, Ltd.
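    The IIV measure used in such studies is typically the within-person standard deviation of reaction times across trials, sometimes divided by the person's mean to remove overall-speed differences. A minimal sketch with invented reaction times:

```python
import statistics

def intraindividual_variability(rts_ms):
    """Within-person RT variability: the intraindividual SD (ISD) and its
    mean-scaled version, the coefficient of variation (ICV)."""
    isd = statistics.stdev(rts_ms)       # sample SD across trials
    icv = isd / statistics.mean(rts_ms)  # removes overall-speed differences
    return isd, icv

patient = [520, 610, 480, 700, 530]  # hypothetical Stroop RTs (ms)
control = [545, 560, 555, 570, 550]  # similar mean, far less trial-to-trial spread
```

    Two people with near-identical mean RTs can differ several-fold in ISD, which is why IIV can capture group differences that mean reaction time misses.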

  15. Variability in the Anthropometric Status of Four South African ...

    African Journals Online (AJOL)

    1974-03-30

    'optimal' nutrition and undernutrition. It is shown that confidence limits based on a central value of the standard deviation do not take into account the increasing variability with age noted in most parameters in populations.

  16. The Value of Data and Metadata Standardization for Interoperability in Giovanni

    Science.gov (United States)

    Smit, C.; Hegde, M.; Strub, R. F.; Bryant, K.; Li, A.; Petrenko, M.

    2017-12-01

    Giovanni (https://giovanni.gsfc.nasa.gov/giovanni/) is a data exploration and visualization tool at the NASA Goddard Earth Sciences Data Information Services Center (GES DISC). It has been around in one form or another for more than 15 years. Giovanni calculates simple statistics and produces 22 different visualizations for more than 1600 geophysical parameters from more than 90 satellite and model products. Giovanni relies on external data format standards to ensure interoperability, including the NetCDF CF Metadata Conventions. Unfortunately, these standards were insufficient to make Giovanni's internal data representation truly simple to use. Finding and working with dimensions can be convoluted with the CF Conventions. Furthermore, the CF Conventions are silent on machine-friendly descriptive metadata such as the parameter's source product and product version. In order to simplify analyzing disparate earth science data parameters in a unified way, we developed Giovanni's internal standard. First, the format standardizes parameter dimensions and variables so they can be easily found. Second, the format adds all the machine-friendly metadata Giovanni needs to present our parameters to users in a consistent and clear manner. At a glance, users can grasp all the pertinent information about parameters both during parameter selection and after visualization. This poster gives examples of how our metadata and data standards, both external and internal, have both simplified our code base and improved our users' experiences.

  17. Use of a Deuterated Internal Standard with Pyrolysis-GC/MS Dimeric Marker Analysis to Quantify Tire Tread Particles in the Environment

    Directory of Open Access Journals (Sweden)

    Julie M. Panko

    2012-11-01

    Pyrolysis (pyr)-GC/MS analysis of characteristic thermal decomposition fragments has been previously used for qualitative fingerprinting of organic sources in environmental samples. A quantitative pyr-GC/MS method based on characteristic tire polymer pyrolysis products was developed for tread particle quantification in environmental matrices including soil, sediment, and air. The feasibility of quantitative pyr-GC/MS analysis of tread was confirmed in a method evaluation study using artificial soil spiked with known amounts of cryogenically generated tread. Tread concentration determined by blinded analyses was highly correlated (r2 ≥ 0.88) with the known tread spike concentration. Two critical refinements to the initial pyrolysis protocol were identified, including use of an internal standard and quantification by the dimeric markers vinylcyclohexene and dipentene, which have good specificity for rubber polymer with no other appreciable environmental sources. A novel use of deuterated internal standards of similar polymeric structure was developed to correct the variable analyte recovery caused by sample size, matrix effects, and ion source variability. The resultant quantitative pyr-GC/MS protocol is reliable and transferable between laboratories.

  18. Use of a deuterated internal standard with pyrolysis-GC/MS dimeric marker analysis to quantify tire tread particles in the environment.

    Science.gov (United States)

    Unice, Kenneth M; Kreider, Marisa L; Panko, Julie M

    2012-11-08

    Pyrolysis (pyr)-GC/MS analysis of characteristic thermal decomposition fragments has been previously used for qualitative fingerprinting of organic sources in environmental samples. A quantitative pyr-GC/MS method based on characteristic tire polymer pyrolysis products was developed for tread particle quantification in environmental matrices including soil, sediment, and air. The feasibility of quantitative pyr-GC/MS analysis of tread was confirmed in a method evaluation study using artificial soil spiked with known amounts of cryogenically generated tread. Tread concentration determined by blinded analyses was highly correlated (r2 ≥ 0.88) with the known tread spike concentration. Two critical refinements to the initial pyrolysis protocol were identified, including use of an internal standard and quantification by the dimeric markers vinylcyclohexene and dipentene, which have good specificity for rubber polymer with no other appreciable environmental sources. A novel use of deuterated internal standards of similar polymeric structure was developed to correct the variable analyte recovery caused by sample size, matrix effects, and ion source variability. The resultant quantitative pyr-GC/MS protocol is reliable and transferable between laboratories.
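    Internal-standard quantification of the kind described works by ratioing the analyte marker's peak area to that of the spiked deuterated standard, so recovery and matrix losses cancel to first order. A sketch under assumed numbers; the response factor would come from calibration, and nothing here is taken from the paper:

```python
def tread_marker_amount_ug(area_analyte, area_istd, istd_spike_ug, response_factor=1.0):
    """Amount of dimeric marker (e.g., dipentene) in the sample, corrected
    for recovery via a deuterated internal standard of known spike amount."""
    return (area_analyte / area_istd) * istd_spike_ug / response_factor

# If half the material is lost in workup, both peak areas halve and the
# area ratio -- and hence the estimate -- is unchanged:
full = tread_marker_amount_ug(2000, 1000, istd_spike_ug=5.0)
lossy = tread_marker_amount_ug(1000, 500, istd_spike_ug=5.0)
print(full, lossy)  # 10.0 10.0
```

    This cancellation is exactly why a deuterated standard of similar polymeric structure corrects for sample size, matrix effects, and ion-source variability: the standard suffers the same losses as the analyte but is resolved by mass.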

  19. Several real variables

    CERN Document Server

    Kantorovitz, Shmuel

    2016-01-01

    This undergraduate textbook is based on lectures given by the author on the differential and integral calculus of functions of several real variables. The book has a modern approach and includes topics such as: • The p-norms on vector spaces and their equivalence • The Weierstrass and Stone-Weierstrass approximation theorems • The differential as a linear functional; Jacobians, Hessians, and Taylor's theorem in several variables • The Implicit Function Theorem for a system of equations, proved via Banach’s Fixed Point Theorem • Applications to Ordinary Differential Equations • Line integrals and an introduction to surface integrals. This book features numerous examples, detailed proofs, as well as exercises at the end of sections. Many of the exercises have detailed solutions, making the book suitable for self-study. Several Real Variables will be useful for undergraduate students in mathematics who have completed first courses in linear algebra and analysis of one real variable.

  20. Several complex variables

    International Nuclear Information System (INIS)

    Field, M.J.

    1976-01-01

    Topics discussed include the elementary theory of holomorphic functions of several complex variables; the Weierstrass preparation theorem; meromorphic functions, holomorphic line bundles and divisors; elliptic operators on compact manifolds; hermitian connections; and the Hodge decomposition theorem. (author)

  1. EU-US standards harmonization task group report : status of ITS communication standards.

    Science.gov (United States)

    2012-11-01

    Harmonization Task Groups 1 and 3 (HTG1 and 3) were established by the EU-US International Standards Harmonization Working Group to attempt to harmonize standards (including ISO, CEN, ETSI, IEEE) on security (HTG1) and communications protocols (HTG3)...

  2. Comparison of different standards used in radioimmunoassay for atrial natriuretic factor (ANF)

    DEFF Research Database (Denmark)

    Rasmussen, Peter Have; Nielsen, M. Damkjær; Giese, J.

    1991-01-01

    Six different standards for determination of atrial natriuretic factor (ANF) in human plasma samples have been compared using our radioimmunoassay for ANF: International standard 85/669, National Biological Standards Board, UK; Bachem standard, Torrance, USA; Bachem standard, Bubendorf, Switzerland; Bissendorf standard, Wedemark, Germany; Peninsula standard, Belmont, USA; UCB-Bioproducts standard, Brussels, Belgium. Standard curves obtained with the different preparations were parallel but showed considerable quantitative differences. Estimates of the ANF content in human plasma samples with different standard preparations as the reference showed considerable variability. With the international standard as the gold reference (plasma ANF concentration 100%), the apparent plasma ANF concentrations measured with the other reference... Standard curves referring to the Bissendorf standard...

  3. Spatial modelling of marine organisms in Forsmark and Oskarshamn. Including calculation of physical predictor variables

    Energy Technology Data Exchange (ETDEWEB)

    Carlen, Ida; Nikolopoulos, Anna; Isaeus, Martin (AquaBiota Water Research, Stockholm (SE))

    2007-06-15

    GIS grids (maps) of marine parameters were created using point data from previous site investigations in the Forsmark and Oskarshamn areas. The proportion of global radiation reaching the sea bottom in Forsmark and Oskarshamn was calculated in ArcView, using Secchi depth measurements and the digital elevation models for the respective area. The number of days per year when the incoming light exceeds 5 MJ/m2 at the bottom was then calculated using the result of the previous calculations together with measured global radiation. Existing modelled grid-point data on bottom and pelagic temperature for Forsmark were interpolated to create surface covering grids. Bottom and pelagic temperature grids for Oskarshamn were calculated using point measurements to achieve yearly averages for a few points and then using regressions with existing grids to create new maps. Phytoplankton primary production in Forsmark was calculated using point measurements of chlorophyll and irradiance, and a regression with a modelled grid of Secchi depth. Distribution of biomass of macrophyte communities in Forsmark and Oskarshamn was calculated using spatial modelling in GRASP, based on field data from previous surveys. Physical parameters such as those described above were used as predictor variables. Distribution of biomass of different functional groups of fish in Forsmark was calculated using spatial modelling based on previous surveys and with predictor variables such as physical parameters and results from macrophyte modelling. All results are presented as maps in the report. The quality of the modelled predictions varies as a consequence of the quality and amount of the input data, the ecology and knowledge of the predicted phenomena, and by the modelling technique used. A substantial part of the variation is not described by the models, which should be expected for biological modelling. Therefore, the resulting grids should be used with caution and with this uncertainty kept in mind. 
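    The light-at-bottom step described above is commonly done by converting Secchi depth to a diffuse attenuation coefficient and applying exponential (Beer–Lambert) decay. A sketch; the 1.7 conversion factor is a widely used textbook approximation, not necessarily the value used in this report:

```python
import math

def fraction_of_surface_light(bottom_depth_m, secchi_depth_m, k_factor=1.7):
    """Fraction of surface global radiation reaching the bottom, with the
    attenuation coefficient approximated as k ~ k_factor / Secchi depth."""
    k = k_factor / secchi_depth_m
    return math.exp(-k * bottom_depth_m)

# At a depth equal to the Secchi depth, roughly 18% of surface light remains:
print(round(fraction_of_surface_light(5.0, 5.0), 3))  # 0.183
```

    Multiplying this fraction by measured daily global radiation gives the bottom irradiance used to count days exceeding a threshold such as 5 MJ/m2.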

  4. Spatial modelling of marine organisms in Forsmark and Oskarshamn. Including calculation of physical predictor variables

    International Nuclear Information System (INIS)

    Carlen, Ida; Nikolopoulos, Anna; Isaeus, Martin

    2007-06-01

    GIS grids (maps) of marine parameters were created using point data from previous site investigations in the Forsmark and Oskarshamn areas. The proportion of global radiation reaching the sea bottom in Forsmark and Oskarshamn was calculated in ArcView, using Secchi depth measurements and the digital elevation models for the respective area. The number of days per year when the incoming light exceeds 5 MJ/m2 at the bottom was then calculated using the result of the previous calculations together with measured global radiation. Existing modelled grid-point data on bottom and pelagic temperature for Forsmark were interpolated to create surface covering grids. Bottom and pelagic temperature grids for Oskarshamn were calculated using point measurements to achieve yearly averages for a few points and then using regressions with existing grids to create new maps. Phytoplankton primary production in Forsmark was calculated using point measurements of chlorophyll and irradiance, and a regression with a modelled grid of Secchi depth. Distribution of biomass of macrophyte communities in Forsmark and Oskarshamn was calculated using spatial modelling in GRASP, based on field data from previous surveys. Physical parameters such as those described above were used as predictor variables. Distribution of biomass of different functional groups of fish in Forsmark was calculated using spatial modelling based on previous surveys and with predictor variables such as physical parameters and results from macrophyte modelling. All results are presented as maps in the report. The quality of the modelled predictions varies as a consequence of the quality and amount of the input data, the ecology and knowledge of the predicted phenomena, and by the modelling technique used. A substantial part of the variation is not described by the models, which should be expected for biological modelling. Therefore, the resulting grids should be used with caution and with this uncertainty kept in mind. 
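
The light-at-the-bottom calculation described in the record above can be sketched in a few lines (an illustrative reconstruction only: it assumes Beer-Lambert attenuation with an extinction coefficient from the empirical Poole-Atkins relation k ≈ 1.7 / Secchi depth; the function names, parameter values, and toy radiation series are invented, not taken from the report):

```python
import math

def extinction_from_secchi(secchi_depth_m, poole_atkins_const=1.7):
    """Empirical Poole-Atkins relation: k = c / Secchi depth (assumption)."""
    return poole_atkins_const / secchi_depth_m

def bottom_light_fraction(depth_m, k):
    """Beer-Lambert attenuation: fraction of surface light reaching the bottom."""
    return math.exp(-k * depth_m)

def days_above_threshold(daily_global_radiation, depth_m, secchi_depth_m,
                         threshold_mj_per_m2=5.0):
    """Count days when light reaching the bottom exceeds the threshold."""
    k = extinction_from_secchi(secchi_depth_m)
    frac = bottom_light_fraction(depth_m, k)
    return sum(1 for r in daily_global_radiation if r * frac > threshold_mj_per_m2)

# Toy year of daily global radiation: sinusoid between 1 and 25 MJ/m2/day
radiation = [13 + 12 * math.sin(2 * math.pi * (d - 80) / 365) for d in range(365)]
n_days = days_above_threshold(radiation, depth_m=4.0, secchi_depth_m=6.0)
```

In a GIS setting the same per-cell calculation would be applied to every cell of the elevation and Secchi-depth rasters.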

  5. Evaluating Living Standard Indicators

    Directory of Open Access Journals (Sweden)

    Birčiaková Naďa

    2015-09-01

    Full Text Available This paper deals with the evaluation of selected available indicators of living standards, divided into three groups: economic, environmental, and social. We have selected six European Union countries for analysis: Bulgaria, the Czech Republic, Hungary, Luxembourg, France, and Great Britain. The aim of this paper is to evaluate indicators measuring living standards and to suggest the most important factors that should be included in the final measurement. We have tried to determine which factors influence each indicator and which factors affect living standards. We have chosen regression analysis as our main method. From the study of factors, we can deduce their impact on living standards, and thus on the value of living-standard indicators. Indicators with a high degree of reliability include the following factors: size and density of population, health care, and spending on education. Carbon dioxide emissions into the atmosphere are also included, although with a lower degree of reliability.
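
A regression step of the kind the paper describes can be illustrated with a minimal ordinary-least-squares fit (the data, variable names, and coefficients below are hypothetical, not the paper's indicators or estimates):

```python
def ols_simple(x, y):
    """Slope and intercept of y ~ a + b*x by least squares."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    a = my - b * mx
    return a, b

def r_squared(x, y, a, b):
    """Share of variance in y explained by the fitted line."""
    ss_res = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - sum(y) / len(y)) ** 2 for yi in y)
    return 1 - ss_res / ss_tot

# Toy data: living-standard index vs. education spending (% of GDP, invented)
spending = [3.5, 4.0, 4.8, 5.5, 6.1, 6.4]
index = [52.0, 55.0, 61.0, 66.0, 70.0, 73.0]
a, b = ols_simple(spending, index)
fit = r_squared(spending, index, a, b)
```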

  6. Standard recommended practice for examination of fuel element cladding including the determination of the mechanical properties

    International Nuclear Information System (INIS)

    Anon.

    1975-01-01

    Guidelines are provided for the post-irradiation examination of fuel cladding and for achieving better correlation and interpretation of data in the field of radiation effects. The recommended practice is applicable to metal cladding of all types of fuel elements. The tests cited are suitable for determining the mechanical properties of fuel element cladding. Various ASTM standards and test methods are cited.

  7. Enhancing translation: guidelines for standard pre-clinical experiments in mdx mice.

    Science.gov (United States)

    Willmann, Raffaella; De Luca, Annamaria; Benatar, Michael; Grounds, Miranda; Dubach, Judith; Raymackers, Jean-Marc; Nagaraju, Kanneboyina

    2012-01-01

    Duchenne Muscular Dystrophy is an X-linked disorder that affects boys and leads to muscle wasting and death due to cardiac involvement and respiratory complications. The cause is the absence of dystrophin, a large structural protein indispensable for muscle cell function and viability. The mdx mouse has become the standard animal model for pre-clinical evaluation of potential therapeutic treatments. Recent years have seen a rapid increase in the number of experimental compounds being evaluated in the mdx mouse. There is, however, much variability in the design of these pre-clinical experimental studies. This has made it difficult to interpret and compare published data from different laboratories and to evaluate the potential of a treatment for application to patients. The authors therefore propose the introduction of a standard study design for the mdx mouse model. Several aspects, including animal care, sampling times and choice of tissues, as well as recommended endpoints and methodologies are addressed and, for each aspect, a standard procedure is proposed. Testing of all new molecules/drugs using a widely accepted and agreed upon standard experimental protocol would greatly improve the power of pre-clinical experimentation and help identify promising therapies for translation into clinical trials for boys with Duchenne Muscular Dystrophy. Copyright © 2011 Elsevier B.V. All rights reserved.

  8. WKB wave function for many-variable systems

    International Nuclear Information System (INIS)

    Sakita, B.; Tzani, R.

    1986-01-01

    The WKB method is a non-perturbative semi-classical method in quantum mechanics. The method for a system of one degree of freedom is well known and described in standard textbooks. The method for a system with many degrees of freedom, especially for quantum fields, is more involved. There exist two methods: Feynman path integral and Schrödinger wave function. The Feynman path integral WKB method is essentially a stationary phase approximation for Feynman path integrals. The WKB Schrödinger wave function method is, on the other hand, an extension of the standard WKB to many-variable systems.

  9. Surfing wave climate variability

    Science.gov (United States)

    Espejo, Antonio; Losada, Iñigo J.; Méndez, Fernando J.

    2014-10-01

    International surfing destinations are highly dependent on specific combinations of wind-wave formation, thermal conditions and local bathymetry. Surf quality depends on a vast number of geophysical variables, and analyses of surf quality require the consideration of the seasonal, interannual and long-term variability of surf conditions on a global scale. A multivariable standardized index based on expert judgment is proposed for this purpose. This index makes it possible to analyze surf conditions objectively over a global domain. A summary of global surf resources based on a new index integrating existing wave, wind, tides and sea surface temperature databases is presented. According to general atmospheric circulation and swell propagation patterns, results show that west-facing low to middle-latitude coasts are more suitable for surfing, especially those in the Southern Hemisphere. Month-to-month analysis reveals strong seasonal variations in the occurrence of surfable events, enhancing the frequency of such events in the North Atlantic and the North Pacific. Interannual variability was investigated by comparing occurrence values with global and regional modes of low-frequency climate variability such as El Niño and the North Atlantic Oscillation, revealing their strong influence at both the global and the regional scale. Results of the long-term trends demonstrate an increase in the probability of surfable events on west-facing coasts around the world in recent years. The resulting maps provide useful information for surfers, the surf tourism industry and surf-related coastal planners and stakeholders.
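
A multivariable standardized index of the general shape described above can be sketched as a weighted sum of variables rescaled to [0, 1] (the thresholds and weights below are placeholders standing in for expert judgment, not the values used in the study):

```python
def ramp(x, lo, hi):
    """Piecewise-linear standardization: 0 at or below lo, 1 at or above hi."""
    if x <= lo:
        return 0.0
    if x >= hi:
        return 1.0
    return (x - lo) / (hi - lo)

def surf_index(wave_height_m, wave_period_s, wind_speed_ms,
               weights=(0.4, 0.4, 0.2)):
    """Weighted combination of standardized scores (illustrative thresholds)."""
    h = ramp(wave_height_m, 0.5, 2.0)         # bigger swell scores higher
    p = ramp(wave_period_s, 6.0, 14.0)        # long-period swell preferred
    w = 1.0 - ramp(wind_speed_ms, 3.0, 12.0)  # strong wind degrades conditions
    wh, wp, ww = weights
    return wh * h + wp * p + ww * w

good = surf_index(1.8, 13.0, 2.0)   # clean long-period swell, light wind
poor = surf_index(0.4, 5.0, 15.0)   # small wind chop, strong wind
```

Counting days on which the index exceeds a cutoff, cell by cell over a wave-reanalysis grid, yields occurrence maps of "surfable events" of the kind the study presents.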

  10. Gold-standard for computer-assisted morphological sperm analysis.

    Science.gov (United States)

    Chang, Violeta; Garcia, Alejandra; Hitschfeld, Nancy; Härtel, Steffen

    2017-04-01

    Published algorithms for classification of human sperm heads are based on relatively small image databases that are not open to the public, and thus no direct comparison is available for competing methods. We describe a gold-standard for morphological sperm analysis (SCIAN-MorphoSpermGS), a dataset of sperm head images with expert-classification labels in one of the following classes: normal, tapered, pyriform, small or amorphous. This gold-standard is for evaluating and comparing known techniques and future improvements to present approaches for classification of human sperm heads for semen analysis. Although this paper does not provide a computational tool for morphological sperm analysis, we present a set of experiments for comparing common sperm head description and classification techniques. This classification baseline is intended to be used as a reference for future improvements to present approaches for human sperm head classification. The gold-standard provides a label for each sperm head, which is achieved by majority voting among experts. The classification baseline compares four supervised learning methods (1-Nearest Neighbor, naive Bayes, decision trees and Support Vector Machine (SVM)) and three shape-based descriptors (Hu moments, Zernike moments and Fourier descriptors), reporting the accuracy and the true positive rate for each experiment. We used Fleiss' Kappa Coefficient to evaluate the inter-expert agreement and Fisher's exact test for inter-expert variability and statistically significant differences between descriptors and learning techniques. Our results confirm the high degree of inter-expert variability in the morphological sperm analysis. Regarding the classification baseline, we show that none of the standard descriptors or classification approaches is best suited for tackling the problem of sperm head classification. We discovered that the correct classification rate was highly variable when trying to discriminate among non-normal sperm
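
The inter-expert agreement step can be illustrated with a small Fleiss' kappa computation over per-subject expert labels (the class names follow the record; the votes and helper names are invented toy data):

```python
from collections import Counter

def fleiss_kappa(ratings, categories):
    """Fleiss' kappa for n subjects, each rated by the same number of raters.

    `ratings` is a list of per-subject label lists (one label per rater).
    """
    n_subjects = len(ratings)
    n_raters = len(ratings[0])
    p_j = {c: 0.0 for c in categories}  # per-category label totals
    p_bar = 0.0                         # mean per-subject agreement
    for labels in ratings:
        counts = Counter(labels)
        for c, k in counts.items():
            p_j[c] += k
        p_bar += (sum(k * k for k in counts.values()) - n_raters) \
                 / (n_raters * (n_raters - 1))
    p_bar /= n_subjects
    total = n_subjects * n_raters
    p_e = sum((v / total) ** 2 for v in p_j.values())  # chance agreement
    return (p_bar - p_e) / (1 - p_e)

classes = ["normal", "tapered", "pyriform", "small", "amorphous"]
# Three experts labelling four sperm heads (toy data)
votes = [
    ["normal", "normal", "normal"],
    ["tapered", "tapered", "pyriform"],
    ["amorphous", "amorphous", "amorphous"],
    ["small", "amorphous", "small"],
]
kappa = fleiss_kappa(votes, classes)
```

A majority vote over the same `votes` lists gives the gold-standard label per head.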

  11. IRAS variables as galactic structure tracers - Classification of the bright variables

    Science.gov (United States)

    Allen, L. E.; Kleinmann, S. G.; Weinberg, M. D.

    1993-01-01

    The characteristics of the 'bright infrared variables' (BIRVs), a sample consisting of the 300 brightest stars in the IRAS Point Source Catalog with an IRAS variability index VAR of 98 or greater, are investigated with the purpose of establishing which of the IRAS variables are AGB stars (e.g., oxygen-rich Miras and carbon stars, as was assumed by Weinberg (1992)). Results of the analysis of optical, infrared, and microwave spectroscopy of these stars indicate that, out of 88 stars in the BIRV sample identified with cataloged variables, 86 can be classified as Miras. Results of a similar analysis performed for a color-selected sample of stars, using the color limits employed by Habing (1988) to select AGB stars, showed that, of the 52 percent of stars that could be classified, 38 percent are non-AGB stars, including H II regions, planetary nebulae, supergiants, and young stellar objects, indicating that studies using color-selected samples are subject to misinterpretation.

  12. Extent of Implementing Accreditation and Quality Assurance Standards in Azal University of Human Development from the Faculty Members’ Perspective

    Directory of Open Access Journals (Sweden)

    Mohammed Zain Saleh ALSadi

    2017-10-01

    Full Text Available The research aimed to find out how far accreditation and quality assurance standards are implemented in Azal University for Human Development from the perspective of faculty members. To achieve this objective, the researchers adopted the descriptive analytical approach. The research population was all the teaching staff at the university; the sample consisted of (94) faculty members, (48.45%) of the total population. A questionnaire was designed to collect data relevant to the research objectives. The questionnaire consisted of two parts: the first included the personal data, while the second part included the standards of accreditation and quality assurance. The study revealed the following results: The mean of implementing the standards as a whole was (3.44), the standard deviation (0.76), and the extent of using the standards was (high). There were no significant differences between the research participants' responses about the extent of using the standards due to the variables (gender, qualification, college type, years of teaching experience). In light of the results of the study, a set of recommendations was presented, including the need to provide the necessary requirements for implementing accreditation and quality assurance standards, whether material, human or financial resources, and to create a positive learning environment suitable and ready for a complete implementation of quality standards. One of the main suggestions made by the research was to conduct a similar study on government and private universities and community colleges in Yemen. Keywords: Quality Assurance and accreditation; Azal University of Human Development.

  13. Neoclassical transport including collisional nonlinearity.

    Science.gov (United States)

    Candy, J; Belli, E A

    2011-06-10

    In the standard δf theory of neoclassical transport, the zeroth-order (Maxwellian) solution is obtained analytically via the solution of a nonlinear equation. The first-order correction δf is subsequently computed as the solution of a linear, inhomogeneous equation that includes the linearized Fokker-Planck collision operator. This equation admits analytic solutions only in extreme asymptotic limits (banana, plateau, Pfirsch-Schlüter), and so must be solved numerically for realistic plasma parameters. Recently, numerical codes have appeared which attempt to compute the total distribution f more accurately than in the standard ordering by retaining some nonlinear terms related to finite-orbit width, while simultaneously reusing some form of the linearized collision operator. In this work we show that higher-order corrections to the distribution function may be unphysical if collisional nonlinearities are ignored.

  14. Food Standards are Good – for Middle-Class Farmers

    DEFF Research Database (Denmark)

    Hansen, Henrik; Trifkovic, Neda

    2014-01-01

    We estimate the causal effect of food standards on Vietnamese pangasius farmers' wellbeing measured by per capita consumption expenditure. We estimate both the average effects and the local average treatment effects on poorer and richer farmers by instrumental variable quantile regression. Our results indicate that large returns can be accrued from food standards, but only for the upper middle-class farmers, i.e., those between the 50% and 85% quantiles of the expenditure distribution. Overall, our result points to an exclusionary impact of standards for the poorest farmers while the richest do...

  15. Generalized Network Psychometrics : Combining Network and Latent Variable Models

    NARCIS (Netherlands)

    Epskamp, S.; Rhemtulla, M.; Borsboom, D.

    2017-01-01

    We introduce the network model as a formal psychometric model, conceptualizing the covariance between psychometric indicators as resulting from pairwise interactions between observable variables in a network structure. This contrasts with standard psychometric models, in which the covariance between

  16. Influence of sleep apnea severity on blood pressure variability of patients with hypertension.

    Science.gov (United States)

    Steinhorst, Ana P; Gonçalves, Sandro C; Oliveira, Ana T; Massierer, Daniela; Gus, Miguel; Fuchs, Sandra C; Moreira, Leila B; Martinez, Denis; Fuchs, Flávio D

    2014-05-01

    Obstructive sleep apnea (OSA) is a risk factor for the development of hypertension and cardiovascular disease. Apnea overloads the autonomic cardiovascular control system and may influence blood pressure variability, a risk for vascular damage independent of blood pressure levels. This study investigates the hypothesis that blood pressure variability is associated with OSA. In a cross-sectional study, 107 patients with hypertension underwent 24-h ambulatory blood pressure monitoring and level III polysomnography to detect sleep apnea. Pressure variability was assessed by the first derivative of blood pressure over time, the time rate index, and by the standard deviation of blood pressure measurements. The association between the apnea-hypopnea index and blood pressure variability was tested by univariate and multivariate methods. The 57 patients with apnea were older, had higher blood pressure, and had longer duration of hypertension than the 50 patients without apnea. Patients with apnea-hypopnea index (AHI) ≥ 10 had higher blood pressure variability assessed by the standard deviation than patients with AHI < 10. Blood pressure variability assessed by the time rate index presented a trend for association during sleep (P = 0.07). Daytime blood pressure variability was not associated with the severity of sleep apnea. Sleep apnea increases nighttime blood pressure variability in patients with hypertension and may be another pathway linking sleep abnormalities to cardiovascular disease.
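
The two variability measures named in the abstract can be sketched directly: the time rate index as the mean absolute first derivative of blood pressure over time, and the standard deviation of the ambulatory readings (toy 24-h series; the 30-minute sampling interval and units are assumptions):

```python
import math
from statistics import pstdev

def time_rate_index(times_min, pressures_mmhg):
    """Time rate index: mean absolute first derivative of BP (mmHg/min)."""
    pairs = list(zip(times_min, pressures_mmhg))
    rates = [abs((p2 - p1) / (t2 - t1))
             for (t1, p1), (t2, p2) in zip(pairs, pairs[1:])]
    return sum(rates) / len(rates)

# Toy 24-h ambulatory recording: one reading every 30 minutes,
# systolic pressure oscillating sinusoidally around 120 mmHg
times = list(range(0, 24 * 60, 30))
pressures = [120 + 10 * math.sin(2 * math.pi * t / (24 * 60)) for t in times]

tri = time_rate_index(times, pressures)  # rate-based variability
sd = pstdev(pressures)                   # dispersion-based variability
```

In the study these statistics would be computed separately for the awake and sleep periods before testing their association with the apnea-hypopnea index.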

  17. Evaluation of measurement reproducibility using the standard-sites data, 1994 Fernald field characterization demonstration project

    International Nuclear Information System (INIS)

    Rautman, C.A.

    1996-02-01

    The US Department of Energy conducted the 1994 Fernald (Ohio) field characterization demonstration project to evaluate the performance of a group of both industry-standard and proposed alternative technologies in describing the nature and extent of uranium contamination in surficial soils. Detector stability and measurement reproducibility under actual operating conditions encountered in the field is critical to establishing the credibility of the proposed alternative characterization methods. Comparability of measured uranium activities to those reported by conventional, US Environmental Protection Agency (EPA)-certified laboratory methods is also required. The eleven (11) technologies demonstrated included (1) EPA-standard soil sampling and laboratory mass-spectroscopy analyses, and currently-accepted field-screening techniques using (2) sodium-iodide scintillometers, (3) FIDLER low-energy scintillometers, and (4) a field-portable x-ray fluorescence spectrometer. Proposed advanced characterization techniques included (5) alpha-track detectors, (6) a high-energy beta scintillometer, (7) electret ionization chambers, (8) and (9) a high-resolution gamma-ray spectrometer in two different configurations, (10) a field-adapted laser ablation-inductively coupled plasma-atomic emission spectroscopy (ICP-AES) technique, and (11) a long-range alpha detector. Measurement reproducibility and the accuracy of each method were tested by acquiring numerous replicate measurements of total uranium activity at each of two ''standard sites'' located within the main field demonstration area. Meteorological variables including temperature, relative humidity, and 24-hour rainfall quantities were also recorded in conjunction with the standard-sites measurements
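
Reproducibility and accuracy of the kind tested at the standard sites can be quantified with two simple statistics, sketched below on invented replicate activities (the detector name, values, and units are placeholders, not project data):

```python
from statistics import mean, stdev

def coefficient_of_variation(replicates):
    """Relative SD (%) of replicate measurements: a reproducibility metric."""
    return 100.0 * stdev(replicates) / mean(replicates)

def relative_bias(replicates, reference_value):
    """Accuracy (%) of a method's mean against the certified lab value."""
    return 100.0 * (mean(replicates) - reference_value) / reference_value

# Toy replicate total-uranium activities (pCi/g) at one standard site
fidler = [52.0, 49.5, 51.2, 50.8, 48.9]
lab_reference = 50.0  # hypothetical EPA-certified laboratory result

cv = coefficient_of_variation(fidler)
bias = relative_bias(fidler, lab_reference)
```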

  18. Evolving spiking networks with variable resistive memories.

    Science.gov (United States)

    Howard, Gerard; Bull, Larry; de Lacy Costello, Ben; Gale, Ella; Adamatzky, Andrew

    2014-01-01

    Neuromorphic computing is a brainlike information processing paradigm that requires adaptive learning mechanisms. A spiking neuro-evolutionary system is used for this purpose; plastic resistive memories are implemented as synapses in spiking neural networks. The evolutionary design process exploits parameter self-adaptation and allows the topology and synaptic weights to be evolved for each network in an autonomous manner. Variable resistive memories are the focus of this research; each synapse has its own conductance profile which modifies the plastic behaviour of the device and may be altered during evolution. These variable resistive networks are evaluated on a noisy robotic dynamic-reward scenario against two static resistive memories and a system containing standard connections only. The results indicate that the extra behavioural degrees of freedom available to the networks incorporating variable resistive memories enable them to outperform the comparative synapse types.

  19. Variability in Pretest-Posttest Correlation Coefficients by Student Achievement Level. NCEE 2011-4033

    Science.gov (United States)

    Cole, Russell; Haimson, Joshua; Perez-Johnson, Irma; May, Henry

    2011-01-01

    State assessments are increasingly used as outcome measures for education evaluations. The scaling of state assessments produces variability in measurement error, with the conditional standard error of measurement increasing as average student ability moves toward the tails of the achievement distribution. This report examines the variability in…

  20. DCC DIFFUSE Standards Frameworks: A Standards Path through the Curation Lifecycle

    Directory of Open Access Journals (Sweden)

    Sarah Higgins

    2009-10-01

    Full Text Available DCC DIFFUSE Standards Frameworks aims to offer domain-specific advice on standards relevant to digital preservation and curation, to help curators identify which standards they should be using and where they can be appropriately implemented, to ensure authoritative digital material. The Project uses the DCC Curation Lifecycle Model and Web 2.0 technology to visually present standards frameworks for a number of disciplines. The Digital Curation Centre (DCC) is actively working with different relevant organisations to present searchable frameworks of standards for a number of domains. These include digital repositories, records management, the geo-information sector, archives and the museum sector. Other domains, such as e-science, will shortly be investigated.

  1. Symmetry breaking, mixing, instability, and low-frequency variability in a minimal Lorenz-like system.

    Science.gov (United States)

    Lucarini, Valerio; Fraedrich, Klaus

    2009-08-01

    Starting from the classical Saltzman two-dimensional convection equations, we derive via a severe spectral truncation a minimal 10-ODE system which includes the thermal effect of viscous dissipation. Neglecting this process leads to a dynamical system which includes a decoupled generalized Lorenz system. The consideration of this process breaks an important symmetry and couples the dynamics of fast and slow variables, with the ensuing modifications to the structural properties of the attractor and of the spectral features. When the relevant nondimensional number (Eckert number Ec) is different from zero, an additional time scale of O(Ec(-1)) is introduced in the system, as shown with standard multiscale analysis and made clear by extensive numerical evidence. Moreover, the system is ergodic and hyperbolic, the slow variables feature long-term memory with 1/f(3/2) power spectra, and the fast variables feature amplitude modulation. Increasing the strength of the thermal-viscous feedback has a stabilizing effect, as both the metric entropy and the Kaplan-Yorke attractor dimension decrease monotonically with Ec. The analyzed system features very rich dynamics: it overcomes some of the limitations of the Lorenz system and might have prototypical value for relevant processes in complex systems dynamics, such as the interaction between slow and fast variables, the presence of long-term memory, and the associated extreme value statistics. This analysis shows how neglecting the coupling of slow and fast variables only on the basis of scale analysis can be catastrophic. In fact, this leads to spurious invariances that affect essential dynamical properties (ergodicity, hyperbolicity) and cause the model to lose the ability to describe intrinsically multiscale processes.
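
The key qualitative point, that a weak coupling of strength analogous to Ec introduces a slow O(Ec^-1) time scale, can be illustrated with a toy two-variable fast-slow system (this is not the paper's 10-ODE system; the equations, parameters, and fixed point below are invented for illustration):

```python
def rk4_step(f, state, dt):
    """One classical fourth-order Runge-Kutta step."""
    k1 = f(state)
    k2 = f([s + 0.5 * dt * k for s, k in zip(state, k1)])
    k3 = f([s + 0.5 * dt * k for s, k in zip(state, k2)])
    k4 = f([s + dt * k for s, k in zip(state, k3)])
    return [s + dt / 6.0 * (a + 2 * b + 2 * c + d)
            for s, a, b, c, d in zip(state, k1, k2, k3, k4)]

def fast_slow(eps):
    """Toy fast-slow pair: x relaxes on O(1), y on O(1/eps).

    Fixed point at (1, 1); eps plays the role of the weak coupling.
    """
    def f(state):
        x, y = state
        return [-y * x + 1.0,       # fast variable, damped by slow y
                eps * (x * x - y)]  # slow variable, driven by fast energy
    return f

def settle_time(eps, dt=0.01, tol=0.05, t_max=5000.0):
    """Time for the trajectory to come within tol of the fixed point (1, 1)."""
    state, t = [0.5, 2.0], 0.0
    while t < t_max:
        state = rk4_step(fast_slow(eps), state, dt)
        t += dt
        if abs(state[0] - 1.0) < tol and abs(state[1] - 1.0) < tol:
            return t
    return t_max
```

Linearizing at (1, 1) gives one O(1) eigenvalue and one of order -3*eps, so halving eps roughly doubles the settling time: the signature of the extra slow time scale.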

  2. Evaluation of variable advisory speed limits in work zones.

    Science.gov (United States)

    2013-08-01

    Variable advisory speed limit (VASL) systems could be effective at both urban and rural work zones, at both uncongested and congested sites. At uncongested urban work zones, the average speeds with VASL were lower than without VASL. But the standard ...

  3. Gap-crossing behavior in a standardized and a nonstandardized jumping stone configuration

    NARCIS (Netherlands)

    Sporrel, Karlijn; Caljouw, Simone R.; Withagen, Rob

    2017-01-01

    Over the last years, the omnipresent standardization of playgrounds - the distances between, for example, jumping stones tend to be equal - has been criticized by both scientists and architects. First, it has been argued that standardization fails to do justice to the variability in the children's

  4. Variability in carbon exchange of European croplands

    DEFF Research Database (Denmark)

    Eddy J, Moors; Jacobs, Cor; Jans, Wilma

    2010-01-01

    The estimated net ecosystem exchange (NEE) of CO2 based on measurements at 17 flux sites in Europe for 45 cropping periods showed an average loss of -38 gC m-2 per cropping period. The cropping period is defined as the period after sowing or planting until harvest. The variability, taken as the standard deviation of these cropping periods, was 251 gC m-2. These numbers do not include lateral inputs such as the carbon content of applied manure, nor the carbon exchange out of the cropping period. Both are expected to have a major effect on the C budget of high energy summer crops such as maize. NEE and gross primary production (GPP) can be estimated by crop net primary production based on inventories of biomass at these sites, independent of species and regions. NEE can also be estimated by the product of photosynthetic capacity and the number of days with the average air temperature >5 °C. Yield...
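
The last estimate mentioned, NEE approximated by the product of photosynthetic capacity and the number of days with average air temperature above 5 °C, is straightforward to sketch (toy temperature series; the capacity value and the sign convention, uptake negative, are assumptions):

```python
import math

def growing_days(daily_mean_temps_c, threshold_c=5.0):
    """Number of days with average air temperature above the threshold."""
    return sum(1 for t in daily_mean_temps_c if t > threshold_c)

def nee_proxy(photosynthetic_capacity_gc_m2_day, daily_mean_temps_c):
    """Crude NEE estimate: capacity times count of warm days (uptake < 0)."""
    return -photosynthetic_capacity_gc_m2_day * growing_days(daily_mean_temps_c)

# Toy year of daily mean temperatures for a temperate site (-2 to 18 degC)
temps = [8 + 10 * math.sin(2 * math.pi * (d - 100) / 365) for d in range(365)]
days = growing_days(temps)
estimate = nee_proxy(0.5, temps)  # 0.5 gC m-2 day-1 is an invented capacity
```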

  5. Evaluating variability and uncertainty in radiological impact assessment using SYMBIOSE

    International Nuclear Information System (INIS)

    Simon-Cornu, M.; Beaugelin-Seiller, K.; Boyer, P.; Calmon, P.; Garcia-Sanchez, L.; Mourlon, C.; Nicoulaud, V.; Sy, M.; Gonze, M.A.

    2015-01-01

    SYMBIOSE is a modelling platform that accounts for variability and uncertainty in radiological impact assessments when simulating the environmental fate of radionuclides and assessing doses to human populations. The default database of SYMBIOSE is partly based on parameter values that are summarized within International Atomic Energy Agency (IAEA) documents. To characterize uncertainty on the transfer parameters, 331 Probability Distribution Functions (PDFs) were defined from the summary statistics provided within the IAEA documents (i.e. sample size, minimum and maximum values, arithmetic and geometric means, standard and geometric standard deviations) and are made available as spreadsheet files. The methods used to derive the PDFs without complete data sets, but merely the summary statistics, are presented. Then, a simple case-study illustrates the use of the database in a second-order Monte Carlo calculation, separating parametric uncertainty and inter-individual variability. - Highlights: • Parametric uncertainty in radioecology was derived from IAEA documents. • 331 Probability Distribution Functions were defined for transfer parameters. • Parametric uncertainty and inter-individual variability were propagated
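
A second-order (two-dimensional) Monte Carlo of the kind described, with parametric uncertainty sampled in an outer loop and inter-individual variability in an inner loop, can be sketched as follows (the lognormal PDFs and their parameters are placeholders, not values from the SYMBIOSE database):

```python
import random
import statistics

def two_dimensional_monte_carlo(n_outer=200, n_inner=500, seed=1):
    """Outer loop: parametric uncertainty (e.g. a transfer factor).
    Inner loop: inter-individual variability (e.g. intake rate).
    Returns the distribution of population-mean doses across outer draws."""
    rng = random.Random(seed)
    mean_doses = []
    for _ in range(n_outer):
        # One draw of the uncertain parameter (assumed lognormal PDF)
        transfer = rng.lognormvariate(0.0, 0.5)
        # Many individuals sharing that parameter but varying in behaviour
        inner = [transfer * rng.lognormvariate(1.0, 0.3)
                 for _ in range(n_inner)]
        mean_doses.append(statistics.mean(inner))
    return mean_doses

doses = two_dimensional_monte_carlo()
central = statistics.median(doses)  # best estimate of the population mean
spread = statistics.pstdev(doses)   # parametric uncertainty on that mean
```

Keeping the two loops separate is what lets the analysis report uncertainty on a population statistic rather than mixing the two sources into one flat distribution.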

  6. Intelligent control for large-scale variable speed variable pitch wind turbines

    Institute of Scientific and Technical Information of China (English)

    Xinfang ZHANG; Daping XU; Yibing LIU

    2004-01-01

    Large-scale wind turbine generator systems have strong nonlinear multivariable characteristics with many uncertain factors and disturbances. Automatic control is crucial for the efficiency and reliability of wind turbines. On the basis of a simplified and proper model of variable speed variable pitch wind turbines, the effective wind speed is estimated using an extended Kalman filter. The intelligent control schemes proposed in the paper include two loops which operate in synchronism with each other. At below-rated wind speed, the inner loop adopts adaptive fuzzy control based on variable universe for generator torque regulation to realize maximum wind energy capture. At above-rated wind speed, a controller based on least squares support vector machines is proposed to adjust the pitch angle and keep rated output power. The simulation shows the effectiveness of the intelligent control.

  7. Building Standards based Science Information Systems: A Survey of ISO and other standards

    Science.gov (United States)

    King, Todd; Walker, Raymond

    Science Information systems began with individual researchers maintaining personal collections of data and managing them by using ad hoc, specialized approaches. Today information systems are an enterprise consisting of federated systems that manage and distribute both historical and contemporary data from distributed sources. Information systems have many components. Among these are metadata models, metadata registries, controlled vocabularies and ontologies, which are used to describe entities and resources. Other components include services to exchange information and data; tools to populate the system and tools to utilize available resources. When constructing information systems today a variety of standards can be useful. The benefit of adopting standards is clear; it can shorten the design cycle, enhance software reuse and enable interoperability. We look at standards from the International Standards Organization (ISO), International Telecommunication Union (ITU), Organization for the Advancement of Structured Information Standards (OASIS), Internet Engineering Task Force (IETF), and American National Standards Institute (ANSI) which have influenced the development of information systems in the Heliophysics and Planetary sciences. No standard can solve the needs of every community. Individual disciplines often must fill the gap between general-purpose standards and the unique needs of the discipline. To this end individual science disciplines are developing standards. Examples include the International Virtual Observatory Alliance (IVOA), Planetary Data System (PDS)/International Planetary Data Alliance (IPDA), Dublin-Core Science, and the Space Physics Archive Search and Extract (SPASE) consortium. This broad survey of ISO and other standards provides some guidance for the development of information systems. The development of the SPASE data model is reviewed and provides some insights into the value of applying appropriate standards and is used to illustrate

  8. Urban Stormwater Quality: Linking Pesticide Variability To Our Sustainable Water Future

    Science.gov (United States)

    Rippy, M.; Deletic, A.; Gernjak, W.

    2015-12-01

    Climate change and global population growth demand creative, multidisciplinary, and multi-benefit approaches for sustaining adequate fresh water resources and protecting ecosystem health. Currently, a driving factor of aquatic ecosystem degradation (stormwater) is also one of the largest untapped urban freshwater resources. This suggests that ecosystem protection and potable water security might both be achieved via treating and capturing stormwater for human use (e.g., potable substitution). The viability of such a scheme, however, depends on 1) initial stormwater quality (e.g., the contaminants present and their associated human/environmental health risks), 2) the spatial and temporal variability of contaminants in stormwater, and 3) the capacity of existing technologies to treat those contaminants to fit-for-purpose standards. Here we present results from a four-year study of urban stormwater conducted across ten catchments and four states in Australia that addresses these three issues relative to stormwater pesticides. In total, 19 pesticides were detected across all sites and times. In general, pesticide concentrations were lower than has been reported in other countries, including the United States, Canada and Europe. This is reflected in few exceedances of public health (< 1%) and aquatic ecosystem standards (0% for invertebrates and fish, < 1% for algae and plants). Interestingly, pesticide patterns were found to be stable across seasons and years, but varied across catchments. These catchment-specific fingerprints may reflect preferential commercial product use, as they map closely to co-occurrence patterns in registered Australian products. Importantly, the presence of catchment-specific pesticide variability has clear management implications; namely, urban stormwater must be managed at the catchment level and target local contaminant suites in order to best achieve desired human use and environmental protection standards.

  9. Natural circulation under variable primary mass inventories at BETHSY facility

    International Nuclear Information System (INIS)

    Bazin, P.; Clement, P.; Deruaz, R.

    1989-01-01

    BETHSY is a high pressure integral test facility which models a 3 loop Framatome PWR with the intent of studying PWR accidents. The BETHSY programme includes both accident transients and tests under successive steady state conditions. So far, tests of the latter type have been especially devoted to situations where natural circulation takes place in the primary coolant system (PCS). Tests 4.1a and 4.1a TC, the results of which are introduced, deal with PCS natural circulation patterns and related heat transport mechanisms under two different core power levels (2 and 5% of nominal power), variable primary mass inventory (100% to 30-40% according to core power) and at two different steam generator liquid levels (standard value and 1 meter). (orig.)

  10. Electrochemical state and internal variables estimation using a reduced-order physics-based model of a lithium-ion cell and an extended Kalman filter

    Energy Technology Data Exchange (ETDEWEB)

    Stetzel, KD; Aldrich, LL; Trimboli, MS; Plett, GL

    2015-03-15

    This paper addresses the problem of estimating the present value of electrochemical internal variables in a lithium-ion cell in real time, using readily available measurements of cell voltage, current, and temperature. The variables that can be estimated include any desired set of reaction fluxes and solid and electrolyte potentials and concentrations at any set of one-dimensional spatial locations, in addition to more standard quantities such as state of charge. The method uses an extended Kalman filter along with a one-dimensional physics-based reduced-order model of cell dynamics. Simulations show excellent and robust predictions with dependable error bounds for most internal variables. (C) 2014 Elsevier B.V. All rights reserved.
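
    The predict/correct mechanics of such an estimator can be sketched with a deliberately minimal one-state analogue: coulomb counting for state of charge (SOC) plus a toy linear open-circuit-voltage curve. All constants below are invented for illustration; this is not the paper's reduced-order physics model.

    ```python
    import numpy as np

    dt, Q_cap = 1.0, 3600.0   # time step [s], cell capacity [A s] (toy values)
    R0 = 0.01                 # ohmic resistance [ohm]
    Q, R = 1e-7, 1e-4         # process / measurement noise variances

    def ekf_step(z, P, i_meas, v_meas):
        # Predict: SOC decreases by coulomb counting (discharge positive)
        z_pred = z - dt / Q_cap * i_meas
        P_pred = P + Q
        # Measurement model: v = ocv(z) - R0*i, with toy ocv(z) = 3.5 + 0.7*z
        H = 0.7                                  # Jacobian d(ocv)/dz
        v_pred = 3.5 + 0.7 * z_pred - R0 * i_meas
        S = H * P_pred * H + R                   # innovation variance
        K = P_pred * H / S                       # Kalman gain
        z_new = z_pred + K * (v_meas - v_pred)   # correct with voltage residual
        P_new = (1 - K * H) * P_pred
        return z_new, P_new

    # Track a constant 1 A discharge from 90% SOC, starting from a wrong guess
    true_z, est_z, P = 0.9, 0.5, 1.0
    rng = np.random.default_rng(0)
    for _ in range(600):
        true_z -= dt / Q_cap * 1.0
        v = 3.5 + 0.7 * true_z - R0 * 1.0 + rng.normal(0, 0.001)
        est_z, P = ekf_step(est_z, P, 1.0, v)

    print(abs(est_z - true_z) < 0.01)  # the filter recovers the true SOC
    ```

    The full method replaces this scalar state with a vector of electrochemical variables and the toy voltage map with the reduced-order model's output equation; the filter structure is unchanged.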

  11. 1998 federal technical standards workshop: Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-10-01

    The theme for the 1998 workshop was Standards Management -- A World of Change and Opportunities. The workshop's goal was to further the implementation of the National Technology Transfer and Advancement Act of 1995 (Public Law 104-113) through the sharing of standards management success stories, lessons learned, and emerging initiatives within the Executive Branch of the Federal Government. The target audience for this workshop included agency/department and contractor personnel and representatives of standards developing organizations that either used technical standards in their work for the Federal Government or participated in standards writing/management activities in support of the missions and programs of Federal agencies/departments. As with previous standards workshops sponsored by the DOE, views on the technical subject areas under the workshop theme were solicited from and provided by agency Standards Executives and standards program managers, voluntary standards organizations, and the private sector. This report includes vugraphs of the presentations.

  12. IMPACT OF THE CONVERGENCE PROCESS TO INTERNATIONAL FINANCIAL REPORTING STANDARDS ON THE VALUE RELEVANCE OF FINANCIAL INFORMATION

    Directory of Open Access Journals (Sweden)

    Marcelo Alvaro da Silva Macedo

    2012-11-01

    Full Text Available Law 11.638/07 marked the start of a series of changes in the laws that regulate Brazilian accounting practices. The main reason for these changes is the process of convergence of local with international accounting standards. As a result of Law 11.638/07, the legal precedent was established to achieve convergence. In that context, the aim of this study is to analyze the impact of the convergence process with international accounting standards on the relevance of financial information, based on data for 2007, without and with the alterations Law 11.638/07 introduced and according to the CPC Pronouncements applicable from 2008 onwards. Therefore, a value relevance study is used, applying regression analysis to annual stock price (the dependent variable) and net profit per share (NPPS) and net equity per share (NEPS) as independent variables. The main results show that financial information on NPPS and NEPS for 2007, with and without the legal alterations, is relevant for the capital market. A comparison between both regressions used in the analysis, however, shows an information gain for financial information that includes the changes introduced in the first phase of the accounting convergence process with the international standards.
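
    The design described is an Ohlson-style price regression, P_i = b0 + b1·NPPS_i + b2·NEPS_i + e_i, with the R² read as the "relevance" of the accounting numbers. A minimal sketch on synthetic data (all coefficients, sample size, and noise invented; the study uses 2007 Brazilian firm data):

    ```python
    import numpy as np

    # Synthetic firm-level data standing in for the study's sample
    rng = np.random.default_rng(42)
    n = 200
    npps = rng.normal(2.0, 0.5, n)     # net profit per share
    neps = rng.normal(10.0, 2.0, n)    # net equity per share
    price = 1.0 + 3.0 * npps + 0.8 * neps + rng.normal(0, 1.0, n)

    # Ordinary least squares via the normal equations
    X = np.column_stack([np.ones(n), npps, neps])
    beta, *_ = np.linalg.lstsq(X, price, rcond=None)

    resid = price - X @ beta
    r2 = 1 - resid.var() / price.var()  # explanatory power = "value relevance"
    print(beta.round(2), round(r2, 2))
    ```

    Comparing the R² (or adjusted R²) of the pre- and post-Law 11.638/07 specifications is how an information gain of the kind reported would be assessed.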

  13. Can consistent benchmarking within a standardized pain management concept decrease postoperative pain after total hip arthroplasty? A prospective cohort study including 367 patients.

    Science.gov (United States)

    Benditz, Achim; Greimel, Felix; Auer, Patrick; Zeman, Florian; Göttermann, Antje; Grifka, Joachim; Meissner, Winfried; von Kunow, Frederik

    2016-01-01

    The number of total hip replacement surgeries has steadily increased over recent years. Reduction in postoperative pain increases patient satisfaction and enables better mobilization. Thus, pain management needs to be continuously improved. Problems are often caused not only by medical issues but also by organization and hospital structure. The present study shows how the quality of pain management can be increased by implementing a standardized pain concept and simple, consistent benchmarking. All patients included in the study had undergone total hip arthroplasty (THA). Outcome parameters were analyzed 24 hours after surgery by means of the questionnaires from the German-wide project "Quality Improvement in Postoperative Pain Management" (QUIPS). A pain nurse interviewed patients and continuously assessed outcome quality parameters. A multidisciplinary team of anesthetists, orthopedic surgeons, and nurses implemented a regular procedure of data analysis and internal benchmarking. The health care team was informed of any results and suggested improvements. Every staff member involved in pain management participated in educational lessons, and a special pain nurse was trained in each ward. From 2014 to 2015, 367 patients were included. The mean maximal pain score 24 hours after surgery was 4.0 (±3.0) on an 11-point numeric rating scale, and patient satisfaction was 9.0 (±1.2). Over time, the maximum pain score decreased (mean 3.0, ±2.0), whereas patient satisfaction significantly increased (mean 9.8, ±0.4). Good pain management was already achieved by benchmarking a standardized pain management concept. But regular benchmarking, implementation of feedback mechanisms, and staff education made the pain management concept even more successful. Multidisciplinary teamwork and flexibility in adapting processes seem to be highly important for successful pain management.

  14. Night-to-night arousal variability and interscorer reliability of arousal measurements.

    Science.gov (United States)

    Loredo, J S; Clausen, J L; Ancoli-Israel, S; Dimsdale, J E

    1999-11-01

    Measurement of arousals from sleep is clinically important; however, their definition is not well standardized, and little data exist on reliability. The purpose of this study is to determine factors that affect arousal scoring reliability and night-to-night arousal variability. Night-to-night arousal variability and interscorer reliability were assessed in 20 subjects with and without obstructive sleep apnea undergoing attended polysomnography during two consecutive nights. Five definitions of arousal were studied, assessing duration of electroencephalographic (EEG) frequency changes, increases in electromyographic (EMG) activity and leg movement, association with respiratory events, as well as the American Sleep Disorders Association (ASDA) definition of arousals. Interscorer reliability varied with the definition of arousal and ranged from an intraclass correlation (ICC) of 0.19 to 0.92. Arousals that included increases in EMG activity or leg movement had the greatest reliability, especially when associated with respiratory events (ICC 0.76 to 0.92). The ASDA arousal definition had high interscorer reliability (ICC 0.84). Reliability was lowest for arousals consisting of EEG changes lasting <3 seconds (ICC 0.19 to 0.37). The within-subjects night-to-night arousal variability was low for all arousal definitions. In a heterogeneous population, interscorer arousal reliability is enhanced by increases in EMG activity, leg movements, and respiratory events and decreased by short-duration EEG arousals. The arousal index night-to-night variability was low for all definitions.

  15. Trendy solutions: Why do states adopt Sustainable Energy Portfolio Standards?

    Energy Technology Data Exchange (ETDEWEB)

    Chandler, Jess [Georgia Institute of Technology, 685 Cherry Street, Atlanta, GA 30332-0345 (United States)], E-mail: jess.chandler@gatech.edu

    2009-08-15

    Thirty-four states had adopted Sustainable Energy Portfolio Standards (SEPS) or similar goals by the end of 2008, with 14 adoptions since 2006. There appears to be something trendy about SEPS and states may adopt SEPS when internal variables would indicate otherwise. This analysis extends the current discussion of SEPS adoption beyond internal variables, relying on innovation and diffusion theory. Logistic regression with SEPS adoption as the dependent variable is used to test internal determinants and diffusion measures for the years 1997-2008. Of the internal determinants variables, affluence and government ideology were found to be positive and significant. The results show that regional and neighbor diffusion variables are significant in SEPS adoption decisions, even when accounting for ideological distance from previous adopters.

  16. Trendy solutions. Why do states adopt Sustainable Energy Portfolio Standards?

    Energy Technology Data Exchange (ETDEWEB)

    Chandler, Jess [Georgia Institute of Technology, 685 Cherry Street, Atlanta, GA 30332-0345 (United States)

    2009-08-15

    Thirty-four states had adopted Sustainable Energy Portfolio Standards (SEPS) or similar goals by the end of 2008, with 14 adoptions since 2006. There appears to be something trendy about SEPS and states may adopt SEPS when internal variables would indicate otherwise. This analysis extends the current discussion of SEPS adoption beyond internal variables, relying on innovation and diffusion theory. Logistic regression with SEPS adoption as the dependent variable is used to test internal determinants and diffusion measures for the years 1997-2008. Of the internal determinants variables, affluence and government ideology were found to be positive and significant. The results show that regional and neighbor diffusion variables are significant in SEPS adoption decisions - even when accounting for ideological distance from previous adopters. (author)

  17. Trendy solutions: Why do states adopt Sustainable Energy Portfolio Standards?

    International Nuclear Information System (INIS)

    Chandler, Jess

    2009-01-01

    Thirty-four states had adopted Sustainable Energy Portfolio Standards (SEPS) or similar goals by the end of 2008, with 14 adoptions since 2006. There appears to be something trendy about SEPS and states may adopt SEPS when internal variables would indicate otherwise. This analysis extends the current discussion of SEPS adoption beyond internal variables, relying on innovation and diffusion theory. Logistic regression with SEPS adoption as the dependent variable is used to test internal determinants and diffusion measures for the years 1997-2008. Of the internal determinants variables, affluence and government ideology were found to be positive and significant. The results show that regional and neighbor diffusion variables are significant in SEPS adoption decisions, even when accounting for ideological distance from previous adopters.

  18. 40 CFR 60.2220 - What must I include in the deviation report?

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 6 2010-07-01 2010-07-01 false What must I include in the deviation... PROGRAMS (CONTINUED) STANDARDS OF PERFORMANCE FOR NEW STATIONARY SOURCES Standards of Performance for... Recordkeeping and Reporting § 60.2220 What must I include in the deviation report? In each report required under...

  19. EU-US standards harmonization task group report : feedback to standards development organizations - security

    Science.gov (United States)

    2012-11-12

    Harmonization Task Groups 1 and 3 (HTG1 and 3) were established by the EU-US International Standards Harmonization Working Group to attempt to harmonize standards (including ISO, CEN, ETSI, IEEE) on security (HTG1) and communications protocols (HTG3)...

  20. From Transition Systems to Variability Models and from Lifted Model Checking Back to UPPAAL

    DEFF Research Database (Denmark)

    Dimovski, Aleksandar; Wasowski, Andrzej

    2017-01-01

    ... efficient lifted (family-based) model checking for real-time variability models. This reduces the cost of maintaining specialized family-based real-time model checkers. Real-time variability models can be model checked using the standard UPPAAL. We have implemented abstractions as syntactic source...

  1. Tax Mechanism of Influence on the Financial Component of Russians’ Living Standards

    Directory of Open Access Journals (Sweden)

    Leyla Akifovna Mytareva

    2016-12-01

    Full Text Available In a socially oriented country, the level of development is determined by the living standards of the population. The article is devoted to a comprehensive presentation of the tax mechanism influencing the quality of Russians' life, based on the interdependence of tax revenues and spending. It presents and explains the variable combinations of tax techniques and tools that influence the financial component of the living standard of the population (individuals not engaged in entrepreneurial activity), including: the type and level of tax, required and elective elements of the tax, tax residency, tax audits, and combating tax evasion. The author presents the elements of the tax mechanism influencing the financial component of Russians' living standards. As the main indicator for evaluating this impact, the author proposes the tax burden, calculated both in total size and by structure: objects of taxation (income, property, and indirect taxation) and tax levels (federal, regional, and local). The author points to a slight increase in the tax burden on Russians between 2006 and 2015, against significant growth in both the amount of tax paid and cash incomes; a predominance of income and federal taxes in the structure of the tax burden; and a slight change in the structure of the tax burden across taxable items and tax rates.

  2. Variability in large-scale wind power generation

    DEFF Research Database (Denmark)

    Kiviluoma, Juha; Holttinen, Hannele; Weir, David

    2016-01-01

    The paper demonstrates the characteristics of wind power variability and net load variability in multiple power systems based on real data from multiple years. Demonstrated characteristics include probability distribution for different ramp durations, seasonal and diurnal variability and low net ...... with well-dispersed wind power. Copyright © 2015 John Wiley & Sons, Ltd....

  3. Sources of variability and systematic error in mouse timing behavior.

    Science.gov (United States)

    Gallistel, C R; King, Adam; McDonald, Robert

    2004-01-01

    In the peak procedure, starts and stops in responding bracket the target time at which food is expected. The variability in start and stop times is proportional to the target time (scalar variability), as is the systematic error in the mean center (scalar error). The authors investigated the source of the error and the variability, using head poking in the mouse, with target intervals of 5 s, 15 s, and 45 s, in the standard procedure, and in a variant with 3 different target intervals at 3 different locations in a single trial. The authors conclude that the systematic error is due to the asymmetric location of start and stop decision criteria, and the scalar variability derives primarily from sources other than memory.

  4. Daily affect variability and context-specific alcohol consumption.

    Science.gov (United States)

    Mohr, Cynthia D; Arpin, Sarah; McCabe, Cameron T

    2015-11-01

    Research explored the effects of variability in negative and positive affect on alcohol consumption, specifying daily fluctuation in affect as a critical form of emotion dysregulation. Using daily process methodology allows for a more objective calculation of affect variability relative to traditional self-reports. The present study models within-person negative and positive affect variabilities as predictors of context-specific consumption (i.e. solitary vs. social drinking), controlling for mean levels of affect. A community sample of moderate-to-heavy drinkers (n = 47; 49% women) from a US metropolitan area reported on affect and alcohol consumption thrice daily for 30 days via a handheld electronic interviewer. Within-person affect variability was calculated using daily standard deviations in positive and negative affect. Within person, greater negative and positive variabilities are related to greater daily solitary and social consumption. Across study days, mean levels of negative and positive affect variabilities related to greater social consumption between persons; yet, aggregated negative affect variability was related to less solitary consumption. Results affirm affect variability as a unique predictor of alcohol consumption, independent of mean affect levels. Yet, it is important to differentiate social context of consumption, as well as type of affect variability, particularly at the between-person level. These distinctions help clarify inconsistencies in the self-medication literature regarding associations between average levels of affect and consumption. Importantly, consistent within-person relationships for both variabilities support arguments that both negative and positive affect variabilities are detrimental and reflect an inability to regulate emotional experience. © 2015 Australasian Professional Society on Alcohol and other Drugs.
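
    The variability measure described, the per-person standard deviation of repeated daily affect reports, can be sketched directly. Ratings below are hypothetical negative-affect scores, not study data:

    ```python
    import statistics

    # Within-person affect variability as the intraindividual SD of daily
    # reports, following the study's approach of daily standard deviations.
    daily_affect = {
        "p1": [1.2, 1.3, 1.1, 1.4, 1.2],   # stable mood -> low variability
        "p2": [1.0, 3.5, 1.2, 4.0, 2.0],   # labile mood -> high variability
    }

    variability = {p: statistics.stdev(scores)
                   for p, scores in daily_affect.items()}
    print(variability["p2"] > variability["p1"])  # True: p2 fluctuates more
    ```

    Note that the two participants can have similar mean affect while differing sharply in variability, which is why the study controls for mean levels when modeling consumption.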

  5. The uncertainty of reference standards - a guide to understanding factors impacting uncertainty, uncertainty calculations, and vendor certifications.

    Science.gov (United States)

    Gates, Kevin; Chang, Ning; Dilek, Isil; Jian, Huahua; Pogue, Sherri; Sreenivasan, Uma

    2009-10-01

    Certified solution standards are widely used in forensic toxicological, clinical/diagnostic, and environmental testing. Typically, these standards are purchased as ampouled solutions with a certified concentration. Vendors present concentration and uncertainty differently on their Certificates of Analysis. Understanding the factors that impact uncertainty and which factors have been considered in the vendor's assignment of uncertainty are critical to understanding the accuracy of the standard and the impact on testing results. Understanding these variables is also important for laboratories seeking to comply with ISO/IEC 17025 requirements and for those preparing reference solutions from neat materials at the bench. The impact of uncertainty associated with the neat material purity (including residual water, residual solvent, and inorganic content), mass measurement (weighing techniques), and solvent addition (solution density) on the overall uncertainty of the certified concentration is described along with uncertainty calculations.
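
    Assuming the usual GUM-style quadrature rule for uncorrelated inputs, the combined standard uncertainty of a solution concentration c = purity × mass / volume follows from the relative uncertainties of the factors the abstract lists (purity, weighing, solvent addition). All numbers below are illustrative, not from any vendor certificate:

    ```python
    from math import sqrt

    # Inputs: value and standard uncertainty for each factor
    p, u_p = 0.998, 0.001        # neat-material purity (mass fraction)
    m, u_m = 10.00e-3, 0.02e-3   # mass weighed [g]
    V, u_V = 1.000e-3, 0.001e-3  # solution volume [L] (density/solvent effects)

    c = p * m / V                # certified concentration [g/L]

    # Combined standard uncertainty: relative uncertainties add in quadrature
    # for a product/quotient of uncorrelated quantities
    u_c = c * sqrt((u_p / p) ** 2 + (u_m / m) ** 2 + (u_V / V) ** 2)
    print(round(c, 3), round(u_c, 4))   # ~9.98 g/L with u_c ~0.025 g/L
    ```

    A vendor certificate would typically report an expanded uncertainty U = k·u_c (coverage factor k = 2 for ~95% confidence), which is one reason stated uncertainties differ between Certificates of Analysis.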

  6. On Solutions for Linear and Nonlinear Schrödinger Equations with Variable Coefficients: A Computational Approach

    Directory of Open Access Journals (Sweden)

    Gabriel Amador

    2016-05-01

    Full Text Available In this work, after reviewing two different ways to solve Riccati systems, we are able to present an extensive list of families of integrable nonlinear Schrödinger (NLS equations with variable coefficients. Using Riccati equations and similarity transformations, we are able to reduce them to the standard NLS models. Consequently, we can construct bright-, dark- and Peregrine-type soliton solutions for NLS with variable coefficients. As an important application of solutions for the Riccati equation with parameters, by means of computer algebra systems, it is shown that the parameters change the dynamics of the solutions. Finally, we test numerical approximations for the inhomogeneous paraxial wave equation by the Crank-Nicolson scheme with analytical solutions found using Riccati systems. These solutions include oscillating laser beams and Laguerre and Gaussian beams.
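
    As a hedged illustration of the Crank-Nicolson scheme mentioned above, here is the constant-coefficient free-particle Schrödinger equation i·psi_t = -(1/2)·psi_xx on a periodic grid. The paper's problems include variable coefficients and Riccati-based analytical comparisons, which this minimal sketch omits:

    ```python
    import numpy as np

    N, L, dt = 128, 20.0, 0.01
    x = np.linspace(-L / 2, L / 2, N, endpoint=False)
    dx = x[1] - x[0]

    # Periodic second-difference operator and Hamiltonian H = -(1/2) d^2/dx^2
    D2 = (np.diag(np.full(N - 1, 1.0), -1) + np.diag(np.full(N - 1, 1.0), 1)
          + np.diag(np.full(N, -2.0))) / dx**2
    D2[0, -1] = D2[-1, 0] = 1.0 / dx**2
    H = -0.5 * D2

    # Crank-Nicolson: (I + i dt/2 H) psi_next = (I - i dt/2 H) psi
    A = np.eye(N) + 0.5j * dt * H
    B = np.eye(N) - 0.5j * dt * H

    psi = np.exp(-x**2).astype(complex)            # Gaussian beam profile
    psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)    # normalize

    for _ in range(100):
        psi = np.linalg.solve(A, B @ psi)

    norm = np.sum(np.abs(psi)**2) * dx
    print(abs(norm - 1.0) < 1e-10)  # CN is a Cayley transform: norm conserved
    ```

    The scheme is unconditionally stable and norm-conserving because (I + iX)^(-1)(I - iX) is unitary for Hermitian X, which is what makes it a natural benchmark against exact solutions.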

  7. 12 YEARS OF X-RAY VARIABILITY IN M31 GLOBULAR CLUSTERS, INCLUDING 8 BLACK HOLE CANDIDATES, AS SEEN BY CHANDRA

    International Nuclear Information System (INIS)

    Barnard, R.; Garcia, M.; Murray, S. S.

    2012-01-01

    We examined 134 Chandra observations of the population of X-ray sources associated with globular clusters (GCs) in the central region of M31. These are expected to be X-ray binary systems (XBs), consisting of a neutron star or black hole accreting material from a close companion. We created long-term light curves for these sources, correcting for background, interstellar absorption, and instrumental effects. We tested for variability by examining the goodness of fit for the best-fit constant intensity. We also created structure functions (SFs) for every object in our sample, the first time this technique has been applied to XBs. We found significant variability in 28 out of 34 GCs and GC candidates; the other 6 sources had 0.3-10 keV luminosities fainter than ∼2 × 10^36 erg s^-1, limiting our ability to detect similar variability. The SFs of XBs with 0.3-10 keV luminosities ∼2-50 × 10^36 erg s^-1 generally showed considerably more variability than the published ensemble SF of active galactic nuclei (AGNs). Our brightest XBs were mostly consistent with the AGN SF; however, their 2-10 keV fluxes could be matched by <1 AGN per square degree. These encouraging results suggest that examining the long-term light curves of other X-ray sources in the field may provide an important distinction between X-ray binaries and background galaxies, as the X-ray emission spectra from these two classes of X-ray sources are similar. Additionally, we identify 3 new black hole candidates (BHCs) using additional XMM-Newton data, bringing the total number of M31 GC BHCs to 9, with 8 covered in this survey.

  8. Approaches for developing a sizing method for stand-alone PV systems with variable demand

    Energy Technology Data Exchange (ETDEWEB)

    Posadillo, R. [Grupo de Investigacion en Energias y Recursos Renovables, Dpto. de Fisica Aplicada, E.P.S., Universidad de Cordoba, Avda. Menendez Pidal s/n, 14004 Cordoba (Spain); Lopez Luque, R. [Grupo de Investigacion de Fisica para las Energias y Recursos Renovables, Dpto. de Fisica Aplicada. Edificio C2 Campus de Rabanales, 14071 Cordoba (Spain)

    2008-05-15

    Accurate sizing is one of the most important aspects to take into consideration when designing a stand-alone photovoltaic system (SAPV). Various methods, which differ in terms of their simplicity or reliability, have been developed for this purpose. Analytical methods, which seek functional relationships between variables of interest to the sizing problem, are one of these approaches. A series of rational considerations are presented in this paper with the aim of shedding light upon the basic principles and results of various sizing methods proposed by different authors. These considerations set the basis for a new analytical method that has been designed for systems with variable monthly energy demands. Following previous approaches, the method proposed is based on the concept of loss of load probability (LLP) - a parameter that is used to characterize system design. The method includes information on the standard deviation of loss of load probability (σ_LLP) and on two new parameters: annual number of system failures (f) and standard deviation of annual number of failures (σ_f). The method proves useful for sizing a PV system in a reliable manner and serves to explain the discrepancies found in the research on systems with LLP < 10^-2. We demonstrate that reliability depends not only on the sizing variables and on the distribution function of solar radiation, but on the minimum value as well, which in a given location and with a monthly average clearness index, achieves total solar radiation on the receiver surface. (author)
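
    The LLP concept itself (the long-run fraction of demand days on which the load is not fully served) can be illustrated by brute-force simulation of a toy system with a variable monthly demand. Everything below, including the Gaussian daily-generation model and the demand profile, is invented for illustration; the paper's method is analytical, not Monte Carlo:

    ```python
    import random

    random.seed(1)

    def simulate_llp(pv_size, batt_cap, years=200):
        """Estimate loss-of-load probability for a toy stand-alone PV system."""
        demand = [3.0] * 6 + [4.5] * 6      # kWh/day, variable by month
        soc, loss_days, total_days = batt_cap, 0, 0
        for _ in range(years):
            for month in range(12):
                for _ in range(30):
                    # Random daily PV energy, truncated at zero
                    gen = pv_size * max(random.gauss(1.0, 0.4), 0.0)
                    soc = min(soc + gen - demand[month], batt_cap)
                    if soc < 0:
                        loss_days += 1      # load not fully served today
                        soc = 0.0
                    total_days += 1
        return loss_days / total_days

    small = simulate_llp(3.0, 5.0)
    large = simulate_llp(6.0, 15.0)
    print(small > large)  # larger array + battery -> lower LLP
    ```

    The sizing problem is the inverse of this calculation: find the (array, battery) pairs whose LLP meets a target such as 10^-2, which is exactly the region where the paper notes that published methods disagree.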

  9. Variable selection and estimation for longitudinal survey data

    KAUST Repository

    Wang, Li

    2014-09-01

    There is wide interest in studying longitudinal surveys where sample subjects are observed successively over time. Longitudinal surveys have been used in many areas today, for example, in the health and social sciences, to explore relationships or to identify significant variables in regression settings. This paper develops a general strategy for the model selection problem in longitudinal sample surveys. A survey weighted penalized estimating equation approach is proposed to select significant variables and estimate the coefficients simultaneously. The proposed estimators are design consistent and perform as well as the oracle procedure when the correct submodel was known. The estimating function bootstrap is applied to obtain the standard errors of the estimated parameters with good accuracy. A fast and efficient variable selection algorithm is developed to identify significant variables for complex longitudinal survey data. Simulated examples are illustrated to show the usefulness of the proposed methodology under various model settings and sampling designs. © 2014 Elsevier Inc.

  10. A standard for test reliability in group research.

    Science.gov (United States)

    Ellis, Jules L

    2013-03-01

    Many authors adhere to the rule that test reliabilities should be at least .70 or .80 in group research. This article introduces a new standard according to which reliabilities can be evaluated. This standard is based on the costs or time of the experiment and of administering the test. For example, if test administration costs are 7 % of the total experimental costs, the efficient value of the reliability is .93. If the actual reliability of a test is equal to this efficient reliability, the test size maximizes the statistical power of the experiment, given the costs. As a standard in experimental research, it is proposed that the reliability of the dependent variable be close to the efficient reliability. Adhering to this standard will enhance the statistical power and reduce the costs of experiments.
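
    The abstract's worked example (test administration at 7% of total costs gives an efficient reliability of .93) is consistent with reading the efficient reliability as one minus the test's share of total experimental cost. A sketch under that assumption; the paper's actual formula may be more involved:

    ```python
    def efficient_reliability(test_cost, total_cost):
        """Efficient reliability as one minus the test's cost share.

        This is an assumed reading, matching the abstract's example
        (7% of costs -> .93); it is not taken from the paper itself.
        """
        return 1.0 - test_cost / total_cost

    print(round(efficient_reliability(7.0, 100.0), 2))  # 0.93
    ```

    On this reading, a cheap test can afford (and should have) high reliability, while a test that dominates the budget maximizes power at a lower reliability, inverting the fixed ".70 or .80" rule.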

  11. Increased Short-Term Beat-To-Beat Variability of QT Interval in Patients with Acromegaly

    Science.gov (United States)

    Orosz, Andrea; Csajbók, Éva; Czékus, Csilla; Gavallér, Henriette; Magony, Sándor; Valkusz, Zsuzsanna; Várkonyi, Tamás T.; Nemes, Attila; Baczkó, István; Forster, Tamás; Wittmann, Tibor; Papp, Julius Gy.; Varró, András; Lengyel, Csaba

    2015-01-01

    Cardiovascular diseases, including ventricular arrhythmias, are responsible for increased mortality in patients with acromegaly. Acromegaly may cause repolarization abnormalities such as QT prolongation and impairment of the repolarization reserve, enhancing liability to arrhythmia. The aim of this study was to determine the short-term beat-to-beat QT variability in patients with acromegaly. Thirty acromegalic patients (23 women and 7 men, mean age±SD: 55.7±10.4 years) were compared with age- and sex-matched volunteers (mean age 51.3±7.6 years). Cardiac repolarization parameters including the frequency-corrected QT interval, PQ and QRS intervals, duration of the terminal part of the T wave (Tpeak-Tend), and short-term variability of the QT interval were evaluated. All acromegalic patients and controls underwent transthoracic echocardiographic examination. Autonomic function was assessed by means of five standard cardiovascular reflex tests. Comparison of the two groups revealed no significant differences in the conventional ECG parameters of repolarization (QT: 401.1±30.6 ms vs 389.3±16.5 ms, corrected QT interval: 430.1±18.6 ms vs 425.6±17.3 ms, QT dispersion: 38.2±13.2 ms vs 36.6±10.2 ms; acromegaly vs control, respectively). However, short-term beat-to-beat QT variability was significantly increased in acromegalic patients (4.23±1.03 ms vs 3.02±0.80 ms), in spite of unchanged conventional parameters of ventricular repolarization. This enhanced temporal QT variability may be an early indicator of increased liability to arrhythmia. PMID:25915951
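
    Short-term beat-to-beat QT variability is conventionally computed as the mean orthogonal distance from the Poincaré-plot identity line, STV = Σ|QT[n+1] − QT[n]| / (N·√2), usually over 30 consecutive beats (the Thomsen formulation; the study's exact windowing may differ). A sketch with hypothetical, shortened beat series:

    ```python
    from math import sqrt

    def short_term_variability(qt_ms):
        """STV = sum of successive |differences| / (N * sqrt(2))."""
        n = len(qt_ms) - 1
        return sum(abs(b - a) for a, b in zip(qt_ms, qt_ms[1:])) / (n * sqrt(2))

    steady = [400, 401, 400, 399, 400, 401]   # ms, small beat-to-beat change
    labile = [400, 410, 395, 412, 392, 408]   # ms, large beat-to-beat change

    print(short_term_variability(labile) > short_term_variability(steady))
    ```

    Note that both series have nearly the same mean QT, which is how STV can be elevated, as in the acromegalic group, while conventional averaged QT parameters remain unchanged.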

  12. Soil variability in mountain areas

    OpenAIRE

    Zanini, E.; Freppaz, M.; Stanchi, S.; Bonifacio, E.; Egli, M.

    2015-01-01

    The high spatial variability of soils is a relevant issue at local and global scales, and determines the complexity of soil ecosystem functions and services. This variability derives from strong dependencies of soil ecosystems on parent materials, climate, relief and biosphere, including human impact. Although present in all environments, the interactions of soils with these forming factors are particularly striking in mountain areas.

  13. Variability in human body size

    Science.gov (United States)

    Annis, J. F.

    1978-01-01

    The range of variability found among homogeneous groups is described and illustrated. Those trends that show significantly marked differences between sexes and among a number of racial/ethnic groups are also presented. Causes of human-body size variability discussed include genetic endowment, aging, nutrition, protective garments, and occupation. The information is presented to aid design engineers of space flight hardware and equipment.

  14. Can co-activation reduce kinematic variability? A simulation study.

    NARCIS (Netherlands)

    Selen, L.P.J.; Beek, P.J.; van Dieen, J.H.

    2005-01-01

    Impedance modulation has been suggested as a means to suppress the effects of internal 'noise' on movement kinematics. We investigated this hypothesis in a neuro-musculo-skeletal model. A prerequisite is that the muscle model produces realistic force variability. We found that standard Hill-type ...

  15. Search for rapid spectral variability in Psi(9) Aurigae

    International Nuclear Information System (INIS)

    Ghosh, K.K.

    1989-01-01

    Observations of Psi(9) Aur on five nights between January 29 and February 3, 1988 were conducted as part of a search for rapid spectral variability in Be stars. In addition, a series of H-alpha profiles with a time resolution of about 45 s was obtained for the star. A method for obtaining the standard deviation in continuum counts measurements is proposed. The estimated value of the standard deviation of the measured equivalent widths of the H-alpha profiles was obtained using the method of Chalabaev and Maillard (1983). Rapid variations of the standard deviations of continuum counts and H-alpha equivalent widths were not observed. For the continuum counts measurement standard deviations a few hourly variations and two night-to-night variations were found. 16 refs

  16. High-resolution H -band Spectroscopy of Be Stars with SDSS-III/APOGEE. II. Line Profile and Radial Velocity Variability

    Energy Technology Data Exchange (ETDEWEB)

    Chojnowski, S. Drew; Holtzman, Jon A. [Apache Point Observatory and New Mexico State University, P.O. Box 59, Sunspot, NM, 88349-0059 (United States); Wisniewski, John P. [Department of Physics and Astronomy, The University of Oklahoma, 440 W. Brooks Street, Norman, OK 73019 (United States); Whelan, David G. [Department of Physics, Austin College, 900 N. Grand Avenue, Sherman, TX 75090 (United States); Labadie-Bartz, Jonathan; Pepper, Joshua [Department of Physics, Lehigh University, Bethlehem, PA 18015 (United States); Fernandes, Marcelo Borges [Observatório Nacional, Rua General José Cristino 77, 20921-400, São Cristovão, Rio de Janeiro (Brazil); Lin, Chien-Cheng [Key Laboratory for Research in Galaxies and Cosmology, Shanghai Astronomical Observatory, Chinese Academy of Sciences, 80 Nandan Road Shanghai 200030 (China); Majewski, Steven R. [Department of Astronomy, University of Virginia, P.O. Box 400325, Charlottesville, VA 22904-4325 (United States); Stringfellow, Guy S. [Center for Astrophysics and Space Astronomy, Department of Astrophysical and Planetary Sciences, University of Colorado, 389 UCB, Boulder, Colorado 80309-0389 (United States); Mennickent, Ronald E.; Tang, Baitian [Departamento de Astronomía, Universidad de Concepción, Concepción (Chile); Roman-Lopes, Alexandre [Departamento de Física, Facultad de Ciencias, Universidad de La Serena, Cisternas 1200, La Serena (Chile); Hearty, Fred R. [Department of Astronomy and Astrophysics, The Pennsylvania State University, University Park, PA 16802 (United States); Zasowski, Gail [Space Telescope Science Institute, 3700 San Martin Drive, Baltimore, MD, 21218 (United States)

    2017-04-01

    We report on the H -band spectral variability of classical Be stars observed over the course of the Apache Point Galactic Evolution Experiment (APOGEE), one of four subsurveys comprising SDSS-III. As described in the first paper of this series, the APOGEE B-type emission-line (ABE) star sample was culled from the large number of blue stars observed as telluric standards during APOGEE observations. In this paper, we explore the multi-epoch ABE sample, consisting of 1100 spectra for 213 stars. These “snapshots” of the circumstellar disk activity have revealed a wealth of temporal variability including, but not limited to, gradual disappearance of the line emission and vice versa over both short and long timescales. Other forms of variability include variation in emission strength, emission peak intensity ratios, and emission peak separations. We also analyze radial velocities (RVs) of the emission lines for a subsample of 162 stars with sufficiently strong features, and we discuss on a case-by-case basis whether the RV variability exhibited by some stars is caused by binary motion versus dynamical processes in the circumstellar disks. Ten systems are identified as convincing candidates for binary Be stars with as of yet undetected companions.

  18. Transformations for a generalized variable-coefficient Korteweg-de Vries model from blood vessels, Bose-Einstein condensates, rods and positons with symbolic computation

    International Nuclear Information System (INIS)

    Tian Bo; Wei Guangmei; Zhang Chunyi; Shan Wenrui; Gao Yitian

    2006-01-01

    The variable-coefficient Korteweg-de Vries (KdV)-type models, although often difficult to study, are of current interest for describing various real situations. Under investigation here is a large class of generalized variable-coefficient KdV models with external-force and perturbed/dissipative terms. Recent examples of this class include those in blood vessels and the circulatory system, arterial dynamics, trapped Bose-Einstein condensates related to matter waves and nonlinear atom optics, Bose gases of impenetrable bosons with longitudinal confinement, rods of compressible hyperelastic material, and semiconductor heterostructures with positonic phenomena. In this Letter, based on symbolic computation, four transformations are proposed from this class to either the cylindrical or the standard KdV equation when the respective constraint holds. The constraints are independent of the external-force term. Under these transformations, analytic solutions involving the Airy, Hermite, and Jacobian elliptic functions can be obtained, including solitonic profiles. The roles played by the perturbed and external-force terms are observed and discussed. Investigations of this class can be performed through the properties of solutions of the cylindrical and standard KdV equations

  19. State Standards and State Assessment Systems: A Guide to Alignment. Series on Standards and Assessments.

    Science.gov (United States)

    La Marca, Paul M.; Redfield, Doris; Winter, Phoebe C.

    Alignment of content standards, performance standards, and assessments is crucial. This guide contains information to assist states and districts in aligning their assessment systems to their content and performance standards. It includes a review of current literature, both published and fugitive. The research is woven together with a few basic…

  20. A New Browser-based, Ontology-driven Tool for Generating Standardized, Deep Descriptions of Geoscience Models

    Science.gov (United States)

    Peckham, S. D.; Kelbert, A.; Rudan, S.; Stoica, M.

    2016-12-01

    Standardized metadata for models is the key to reliable and greatly simplified coupling in model coupling frameworks like CSDMS (Community Surface Dynamics Modeling System). This model metadata also helps model users to understand the important details that underpin computational models and to compare the capabilities of different models. These details include simplifying assumptions on the physics, governing equations and the numerical methods used to solve them, discretization of space (the grid) and time (the time-stepping scheme), state variables (input or output), and model configuration parameters. This kind of metadata provides a "deep description" of a computational model that goes well beyond other types of metadata (e.g. author, purpose, scientific domain, programming language, digital rights, provenance, execution) and captures the science that underpins a model. While having this kind of standardized metadata for each model in a repository opens up a wide range of exciting possibilities, it is difficult to collect this information, and a carefully conceived "data model" or schema is needed to store it. Automated harvesting and scraping methods can provide some useful information, but they often result in metadata that is inaccurate or incomplete, which is not sufficient to enable the desired capabilities. To address this problem, we have developed a browser-based tool called the MCM Tool (Model Component Metadata) which runs on notebooks, tablets and smart phones. This tool was partially inspired by the TurboTax software, which greatly simplifies the necessary task of preparing tax documents. It allows a model developer or advanced user to provide a standardized, deep description of a computational geoscience model, including hydrologic models. Under the hood, the tool uses a new ontology for models built on the CSDMS Standard Names, expressed as a collection of RDF (Resource Description Framework) files. This ontology is based on core concepts
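
A "deep description" of this kind can be pictured as a small structured record. The sketch below is purely illustrative: the field names are hypothetical and do not reflect the MCM Tool's actual schema, and the variable names only approximate the object/quantity style of the CSDMS Standard Names.

```python
import json

# Hypothetical deep-description record for a geoscience model.
# Field names and standard-name strings are illustrative assumptions,
# not the actual MCM Tool schema or CSDMS vocabulary.
model_metadata = {
    "model_name": "ExampleChannelModel",
    "assumptions": ["kinematic wave approximation"],
    "governing_equations": ["conservation of mass", "Manning's equation"],
    "numerical_method": "explicit finite difference",
    "spatial_grid": {"type": "uniform rectangular", "cell_size_m": 100},
    "time_stepping": {"scheme": "fixed step", "dt_s": 60},
    "state_variables": {
        "input": ["atmosphere_water__precipitation_volume_flux"],
        "output": ["channel_water__depth"],
    },
}

# Serializing to JSON shows how such a record could be stored in, or
# exchanged with, a model repository.
serialized = json.dumps(model_metadata, indent=2)
```

The point of the structure is that every capability-relevant detail (grid, time stepping, state variables) is a named, machine-readable field rather than free text.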

  1. The influence of solar wind variability on magnetospheric ULF wave power

    International Nuclear Information System (INIS)

    Pokhotelov, D.; Rae, I.J.; Mann, I.R.

    2015-01-01

    Magnetospheric ultra-low frequency (ULF) oscillations in the Pc 4-5 frequency range play an important role in the dynamics of Earth's radiation belts, both by enhancing the radial diffusion through incoherent interactions and through the coherent drift-resonant interactions with trapped radiation belt electrons. The statistical distributions of magnetospheric ULF wave power are known to be strongly dependent on solar wind parameters such as solar wind speed and interplanetary magnetic field (IMF) orientation. Statistical characterisation of ULF wave power in the magnetosphere traditionally relies on average solar wind-IMF conditions over a specific time period. In this brief report, we perform an alternative characterisation of the solar wind influence on magnetospheric ULF wave activity through the characterisation of the solar wind driver by its variability using the standard deviation of solar wind parameters rather than a simple time average. We present a statistical study of nearly one solar cycle (1996-2004) of geosynchronous observations of magnetic ULF wave power and find that there is significant variation in ULF wave powers as a function of the dynamic properties of the solar wind. In particular, we find that the variability in IMF vector, rather than variabilities in other parameters (solar wind density, bulk velocity and ion temperature), plays the strongest role in controlling geosynchronous ULF power. We conclude that, although time-averaged bulk properties of the solar wind are a key factor in driving ULF powers in the magnetosphere, the solar wind variability can be an important contributor as well. This highlights the potential importance of including solar wind variability especially in studies of ULF wave dynamics in order to assess the efficiency of solar wind-magnetosphere coupling.
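
The core methodological move of the study, characterizing the solar wind driver by its standard deviation rather than a simple time average, can be sketched as follows. This is a minimal illustration with synthetic data, not the authors' processing pipeline; the window length and series are assumptions.

```python
import numpy as np

def characterize_driver(series, window=60):
    """Summarize a solar wind time series (e.g. an IMF component) by both
    its sliding-window mean and its sliding-window standard deviation,
    so that variability can be used as a driver parameter."""
    series = np.asarray(series, dtype=float)
    n = len(series) - window + 1
    means = np.array([series[i:i + window].mean() for i in range(n)])
    stds = np.array([series[i:i + window].std(ddof=1) for i in range(n)])
    return means, stds

# Synthetic IMF-like series: a quiet interval followed by a fluctuating one.
rng = np.random.default_rng(0)
quiet = rng.normal(0.0, 0.5, 300)
active = rng.normal(0.0, 3.0, 300)
means, stds = characterize_driver(np.concatenate([quiet, active]))
# The windowed mean stays near zero throughout, so a time-average
# characterization misses the change; the windowed standard deviation
# cleanly separates the quiet and fluctuating intervals.
```

This is exactly the distinction the abstract draws: two intervals with similar average conditions can differ greatly in variability, and it is the variability that correlates with ULF wave power.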

  2. Sleep and Physiological Dysregulation: A Closer Look at Sleep Intraindividual Variability.

    Science.gov (United States)

    Bei, Bei; Seeman, Teresa E; Carroll, Judith E; Wiley, Joshua F

    2017-09-01

    Variable daily sleep (ie, higher intraindividual variability; IIV) is associated with negative health consequences, but potential physiological mechanisms are poorly understood. This study examined how the IIV of sleep timing, duration, and quality is associated with physiological dysregulation, with diurnal cortisol trajectories as a proximal outcome and allostatic load (AL) as a multisystem distal outcome. Participants are 436 adults (mean age ± standard deviation = 54.1 ± 11.7 years, 60.3% women) from the Midlife in the United States study. Sleep was objectively assessed using 7-day actigraphy. Diurnal cortisol was measured via saliva samples (four/day for 4 consecutive days). AL was measured using 23 biomarkers from seven systems (inflammatory, hypothalamic-pituitary-adrenal axis, metabolic glucose and lipid, cardiovascular, parasympathetic, sympathetic) using a validated bifactor model. Linear and quadratic effects of sleep IIV were estimated using a validated Bayesian model. Controlling for covariates, more variable sleep timing (p = .04 for risetime, p = .097 for bedtime) and total sleep time (TST; p = .02), but not mean sleep variables, were associated with flatter cortisol diurnal slope. More variable sleep onset latency and wake after sleep onset, later average bedtime, and shorter TST were associated with higher AL adjusting for age and sex (p-values < .05). Overall, more variable sleep patterns were associated with blunted diurnal cortisol trajectories but not with higher multisystem physiological dysregulation. The associations between sleep IIV and overall health are likely complex, including multiple biopsychosocial determinants, and require further investigation.
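
The key distinction in the study is between a person's mean sleep and their night-to-night IIV. A minimal sketch of that computation, with hypothetical 7-night actigraphy values (not data from the study):

```python
import statistics

def sleep_iiv(nightly_values):
    """Intraindividual variability (IIV) of a sleep metric: the
    within-person standard deviation across nights, as distinct from the
    person's mean level of that metric."""
    return statistics.stdev(nightly_values)

# Two hypothetical participants with essentially the same mean total
# sleep time (minutes) across 7 nights, but very different IIV.
regular   = [420, 430, 425, 415, 420, 425, 425]
irregular = [300, 540, 380, 500, 340, 520, 380]
mean_reg = statistics.mean(regular)
mean_irr = statistics.mean(irregular)
# Means are nearly identical; only the IIV distinguishes the sleepers,
# which is why mean sleep variables alone can miss the association.
```

This illustrates why the paper models IIV separately from mean sleep: the two sleepers above look the same on average yet differ sharply in regularity.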

  3. Calculus of one variable

    CERN Document Server

    Grossman, Stanley I

    1986-01-01

    Calculus of One Variable, Second Edition presents the essential topics in the study of the techniques and theorems of calculus.The book provides a comprehensive introduction to calculus. It contains examples, exercises, the history and development of calculus, and various applications. Some of the topics discussed in the text include the concept of limits, one-variable theory, the derivatives of all six trigonometric functions, exponential and logarithmic functions, and infinite series.This textbook is intended for use by college students.

  4. Variability in Wechsler Adult Intelligence Scale-IV subtest performance across age.

    Science.gov (United States)

    Wisdom, Nick M; Mignogna, Joseph; Collins, Robert L

    2012-06-01

    Normal Wechsler Adult Intelligence Scale-IV (WAIS-IV) performance relative to average normative scores alone can be an oversimplification, as this fails to recognize the disparate subtest heterogeneity that occurs with increasing age. The purpose of the present study is to characterize the patterns of raw score change and associated variability on WAIS-IV subtests across age groupings. Raw WAIS-IV subtest means and standard deviations for each age group were tabulated from the WAIS-IV normative manual along with the coefficient of variation (CV), a measure of score dispersion calculated by dividing the standard deviation by the mean and multiplying by 100. The CV further informs the magnitude of variability represented by each standard deviation. Raw mean scores predictably decreased across age groups. Increased variability was noted in Perceptual Reasoning and Processing Speed Index subtests, as Block Design, Matrix Reasoning, Picture Completion, Symbol Search, and Coding had CV percentage increases ranging from 56% to 98%. In contrast, Working Memory and Verbal Comprehension subtests were more homogeneous, with Digit Span, Comprehension, Information, and Similarities showing increases ranging from 32% to 43%. Little change in the CV was noted on the Cancellation, Arithmetic, Letter/Number Sequencing, Figure Weights, Visual Puzzles, and Vocabulary subtests. These findings highlight test limitations as well as further our understanding of the cognitive domains that remain relatively steady versus those that steadily decline.
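
The CV computation described above is simple enough to state directly. The numbers below are illustrative, not values from the WAIS-IV normative manual:

```python
def coefficient_of_variation(mean, sd):
    """CV (%) = (standard deviation / mean) * 100 -- puts score dispersion
    for groups with different raw means on a common relative scale."""
    return sd / mean * 100.0

# Hypothetical raw subtest means and SDs for a younger and an older age
# group (illustrative values only).
young = coefficient_of_variation(mean=40.0, sd=8.0)   # 20.0
old   = coefficient_of_variation(mean=25.0, sd=9.0)   # 36.0
pct_increase = (old - young) / young * 100.0          # 80.0
```

Note why the CV matters here: the older group's SD rose only modestly (8 to 9), but because the mean also fell, relative dispersion increased by 80%, the kind of pattern the study reports for Perceptual Reasoning and Processing Speed subtests.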

  5. Hand Fatigue Analysis Using Quantitative Evaluation of Variability in Drawing Patterns

    Directory of Open Access Journals (Sweden)

    mohamadali Sanjari

    2015-02-01

    Background & aim: Muscle fatigue is defined as a reduction in the force-generating capacity of a muscle or muscle group after activity, and it can lead to a variety of lesions. The purpose of the present study was to quantify hand fatigue through analysis of drawing patterns. Methods: This cross-sectional study was conducted on 37 healthy volunteers (6 men and 31 women) aged 18-30 years. Before and immediately after a fatigue protocol, hand drawing skill was assessed quantitatively by having participants draw repeated, overlapping, concentric circles. The test was conducted in three sessions with an interval of 48-72 hours, and the drawings were recorded on a digital tablet. Data were analyzed using paired t-tests and repeated-measures ANOVA. Results: In the analysis of the drawing time series at the 100% fatigue level, the standard deviation along the x axis (SDx), the standard deviations of velocity along the x and y axes (SDVx and SDVy), and the standard deviation of the resultant velocity vector (SDVR) showed significant differences after fatigue (P<0.05). In the comparison across the three fatigue levels, SDx showed a significant difference (P<0.05). Conclusions: Full fatigue differed significantly from the other fatigue levels and contributed to significant variability in the drawing parameters. The method used in the present study also detected fatigue in high-frequency motion.

  6. Variable-Period Undulators For Synchrotron Radiation

    Science.gov (United States)

    Shenoy, Gopal; Lewellen, John; Shu, Deming; Vinokurov, Nikolai

    2005-02-22

    A new and improved undulator design is provided that enables a variable period length for the production of synchrotron radiation from both medium-energy and high-energy storage rings. The variable period length is achieved using a staggered array of pole pieces made up of high permeability material, permanent magnet material, or an electromagnetic structure. The pole pieces are separated by a variable width space. The sum of the variable width space and the pole width would therefore define the period of the undulator. Features and advantages of the invention include broad photon energy tunability, constant power operation and constant brilliance operation.

  8. CytometryML: a data standard which has been designed to interface with other standards

    Science.gov (United States)

    Leif, Robert C.

    2007-02-01

    Because of the differences in the requirements, needs, and past histories including existing standards of the creating organizations, a single encompassing cytology-pathology standard will not, in the near future, replace the multiple existing or under development standards. Except for DICOM and FCS, these standardization efforts are all based on XML. CytometryML is a collection of XML schemas, which are based on the Digital Imaging and Communications in Medicine (DICOM) and Flow Cytometry Standard (FCS) datatypes. The CytometryML schemas contain attributes that link them to the DICOM standard and FCS. Interoperability with DICOM has been facilitated by, wherever reasonable, limiting the difference between CytometryML and the previous standards to syntax. In order to permit the Resource Description Framework, RDF, to reference the CytometryML datatypes, id attributes have been added to many CytometryML elements. The Laboratory Digital Imaging Project (LDIP) Data Exchange Specification and the Flowcyt standards development effort employ RDF syntax. Documentation from DICOM has been reused in CytometryML. The unity of analytical cytology was demonstrated by deriving a microscope type and a flow cytometer type from a generic cytometry instrument type. The feasibility of incorporating the Flowcyt gating schemas into CytometryML has been demonstrated. CytometryML is being extended to include many of the new DICOM Working Group 26 datatypes, which describe patients, specimens, and analytes. In situations where multiple standards are being created, interoperability can be facilitated by employing datatypes based on a common set of semantics and building in links to standards that employ different syntax.

  9. Variable Coding and Modulation Experiment Using NASA's Space Communication and Navigation Testbed

    Science.gov (United States)

    Downey, Joseph A.; Mortensen, Dale J.; Evans, Michael A.; Tollis, Nicholas S.

    2016-01-01

    National Aeronautics and Space Administration (NASA)'s Space Communication and Navigation Testbed on the International Space Station provides a unique opportunity to evaluate advanced communication techniques in an operational system. The experimental nature of the Testbed allows for rapid demonstrations while using flight hardware in a deployed system within NASA's networks. One example is variable coding and modulation, which is a method to increase data-throughput in a communication link. This paper describes recent flight testing with variable coding and modulation over S-band using a direct-to-earth link between the SCaN Testbed and the Glenn Research Center. The testing leverages the established Digital Video Broadcasting Second Generation (DVB-S2) standard to provide various modulation and coding options. The experiment was conducted in a challenging environment due to the multipath and shadowing caused by the International Space Station structure. Performance of the variable coding and modulation system is evaluated and compared to the capacity of the link, as well as standard NASA waveforms.
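
The essence of variable coding and modulation is a lookup from the measured link quality to the highest-throughput modulation/coding pair that still closes the link. The sketch below illustrates that selection logic only; the Es/N0 thresholds are invented placeholders, not the DVB-S2 specification values used in the experiment.

```python
# Hypothetical DVB-S2-style modcod table:
# (required Es/N0 in dB, modulation, code rate, spectral efficiency factor).
# Threshold values are illustrative assumptions, not the standard's figures.
MODCODS = [
    (1.0,  "QPSK",   "1/2", 2 * 1 / 2),
    (5.2,  "8PSK",   "2/3", 3 * 2 / 3),
    (9.0,  "16APSK", "3/4", 4 * 3 / 4),
    (13.0, "32APSK", "4/5", 5 * 4 / 5),
]

def select_modcod(esn0_db, margin_db=1.0):
    """Pick the highest-throughput modcod whose required Es/N0 (plus a
    link margin) is met; fall back to the most robust entry otherwise.
    This is the core VCM idea: track link quality, maximize throughput."""
    best = MODCODS[0]  # most robust modcod as the floor
    for modcod in MODCODS:
        if esn0_db - margin_db >= modcod[0]:
            best = modcod
    return best
```

In the flight experiment's terms, as ISS structural shadowing degrades Es/N0, `select_modcod` would step down to a more robust pair, trading throughput for link closure.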

  10. Definition of a process standardization framework for a scientific journal based on a literature review and its performance objectives

    Directory of Open Access Journals (Sweden)

    José Augusto Campos Garcia

    2014-12-01

    This work presents the results of a study that aimed to establish a framework for standardizing the processes of a scientific journal. The journal has a team that performs operational routines regulated by standards (external) and patterns (internal and external). The high turnover rate of the support team has generated information loss and increased service variability. The research started from the assumption that process standardization (which includes formalization) could be a way to reduce this secondary effect. Standardization techniques were identified through a literature review of the main national and international databases of journals and conference proceedings. The identified techniques were analyzed considering the number of times they appeared in the papers reviewed and the performance objectives proposed by Slack et al. (2009). As a result, a framework was obtained for the standardization of processes adapted to the needs of the journal studied. The model is feasible for wider use, given its structural similarity to the one proposed by Campos (2004), a Brazilian model that is a reference in the field.

  11. Variability in effective radiating area and output power of new ultrasound transducers at 3 MHz.

    Science.gov (United States)

    Johns, Lennart D; Straub, Stephen J; Howard, Samuel M

    2007-01-01

    Spatial average intensity (SAI) is often used by clinicians to gauge therapeutic ultrasound dosage, yet SAI measures are not directly regulated by US Food and Drug Administration (FDA) standards. Current FDA guidelines permit a possible 50% to 150% minimum to maximum range of SAI values, potentially contributing to variability in clinical outcomes. To measure clinical values that describe ultrasound transducers and to determine the degree of intramanufacturer and intermanufacturer variability in effective radiating area, power, and SAI when the transducer is functioning at 3 MHz. A descriptive and inferential approach was taken in this quasi-experimental design. Measurement laboratory. Sixty-six 5-cm(2) ultrasound transducers were purchased from 6 different manufacturers. All transducers were calibrated and then assessed using standardized measurement techniques; SAI was normalized to account for variability in effective radiating area, resulting in an nSAI. Effective radiating area, power, and nSAI. All manufacturers with the exception of Omnisound (P = .534) showed a difference between the reported and measured effective radiating area values (P < .05), and Omnisound showed a higher nSAI (P < .05) than all other manufacturers functioning at 3 MHz. Intramanufacturer variability in SAI ranged from 16% to 35%, and intermanufacturer variability ranged from 22% to 61%. Clinicians should consider treatment values of each individual transducer, regardless of the manufacturer. In addition, clinicians should scrutinize the power calibration and recalibration record of the transducer and adjust clinical settings as needed for the desired level of heating. Our data may aid in explaining the reported heating differences among transducers from different manufacturers. Stricter FDA standards regarding effective radiating area and total power are needed, and standards regulating SAI should be established.
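
The quantity at issue reduces to a simple ratio, and the study's variability percentages come from comparing that ratio across units. A sketch with hypothetical measurements (not the study's data):

```python
def spatial_average_intensity(power_w, era_cm2):
    """Spatial average intensity (W/cm^2): total acoustic output power
    divided by the effective radiating area (ERA)."""
    return power_w / era_cm2

# Hypothetical measurements for three nominally identical 5-cm^2 heads
# from one manufacturer: (measured power in W, measured ERA in cm^2).
units = [(7.5, 4.6), (7.1, 4.9), (8.0, 4.4)]
sais = [spatial_average_intensity(p, a) for p, a in units]

# Intramanufacturer spread, expressed as a percentage of the lowest unit.
spread_pct = (max(sais) - min(sais)) / min(sais) * 100.0
```

Even with each head individually within tolerance, the delivered SAI differs by roughly a quarter between units in this made-up example, which is the scale of intramanufacturer variability (16% to 35%) the study reports.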

  12. Sea-Level Trend Uncertainty With Pacific Climatic Variability and Temporally-Correlated Noise

    Science.gov (United States)

    Royston, Sam; Watson, Christopher S.; Legrésy, Benoît; King, Matt A.; Church, John A.; Bos, Machiel S.

    2018-03-01

    Recent studies have identified climatic drivers of the east-west see-saw of Pacific Ocean satellite altimetry era sea level trends and a number of sea-level trend and acceleration assessments attempt to account for this. We investigate the effect of Pacific climate variability, together with temporally-correlated noise, on linear trend error estimates and determine new time-of-emergence (ToE) estimates across the Indian and Pacific Oceans. Sea-level trend studies often advocate the use of auto-regressive (AR) noise models to adequately assess formal uncertainties, yet sea level often exhibits colored but non-AR(1) noise. Standard error estimates are over- or under-estimated by an AR(1) model for much of the Indo-Pacific sea level. Allowing for PDO and ENSO variability in the trend estimate only reduces standard errors across the tropics and we find noise characteristics are largely unaffected. Of importance for trend and acceleration detection studies, formal error estimates remain on average up to 1.6 times those from an AR(1) model for long-duration tide gauge data. There is an even chance that the observed trend from the satellite altimetry era exceeds the noise in patches of the tropical Pacific and Indian Oceans and the south-west and north-east Pacific gyres. By including climate indices in the trend analysis, the time it takes for the observed linear sea-level trend to emerge from the noise reduces by up to 2 decades.
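
The AR(1) correction the paper tests against has a compact first-order form: temporally correlated residuals reduce the effective sample size, inflating the naive standard error. A minimal sketch of that approximation (a textbook effective-sample-size argument, not the authors' full noise model):

```python
import math

def ar1_se_inflation(phi):
    """Approximate factor by which AR(1) noise with lag-1 autocorrelation
    phi inflates the naive (white-noise) standard error of an estimated
    trend, via the effective-sample-size argument:
        n_eff = n * (1 - phi) / (1 + phi)
    so the SE scales by sqrt(n / n_eff) = sqrt((1 + phi) / (1 - phi))."""
    if not 0.0 <= phi < 1.0:
        raise ValueError("phi must be in [0, 1)")
    return math.sqrt((1.0 + phi) / (1.0 - phi))

# Moderately red noise (phi = 0.6) roughly doubles the naive SE.
factor = ar1_se_inflation(0.6)  # ≈ 2.0
```

The paper's point is that even this correction can be optimistic: when sea level exhibits colored but non-AR(1) noise, formal trend errors for long tide gauge records average up to 1.6 times the AR(1)-based estimates.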

  13. Standard NIM instrumentation system

    International Nuclear Information System (INIS)

    1990-05-01

    NIM is a standard modular instrumentation system that is in wide use throughout the world. As the NIM system developed and accommodations were made to a dynamic instrumentation field and a rapidly advancing technology, additions, revisions and clarifications were made. These were incorporated into the standard in the form of addenda and errata. This standard is a revision of the NIM document, AEC Report TID-20893 (Rev. 4), dated July 1974. It includes all the addenda and errata items that were previously issued, as well as numerous additional items to make the standard current with modern technology and manufacturing practice.

  14. Non-standard patch test

    Directory of Open Access Journals (Sweden)

    Astri Adelia

    2018-06-01

    In managing contact dermatitis, identification of the causative agent is essential to prevent recurrent complaints, and the patch test is the gold standard for identifying it. Many standard patch test materials are now available on the market, but they do not include all the materials that can potentially cause contact dermatitis. Patch testing with the patient's own products, hereafter referred to as non-standard materials, is very helpful in identifying the causative agents of contact dermatitis. Guidance is needed for preparing non-standard patch test materials in order to avoid discrepancies in test results.

  15. Implementation of Electrical Simulation Model for IEC Standard Type-3A Generator

    DEFF Research Database (Denmark)

    Subramanian, Chandrasekaran; Casadei, Domenico; Tani, Angelo

    2013-01-01

    This paper describes the implementation of an electrical simulation model for the IEC 61400-27-1 standard Type-3A generator. A general overview of the different wind electric generator (WEG) types is given, with the main focus on the Type-3A WEG standard model, namely a model for a variable speed wind turbine...

  16. Evaluation of solid particle number and black carbon for very low particulate matter emissions standards in light-duty vehicles.

    Science.gov (United States)

    Chang, M-C Oliver; Shields, J Erin

    2017-06-01

    To reliably measure at the low particulate matter (PM) levels needed to meet California's Low Emission Vehicle (LEV III) 3- and 1-mg/mile particulate matter (PM) standards, various approaches other than gravimetric measurement have been suggested for testing purposes. In this work, a feasibility study of solid particle number (SPN, d50 = 23 nm) and black carbon (BC) as alternatives to gravimetric PM mass was conducted, based on the relationship of these two metrics to gravimetric PM mass, as well as the variability of each of these metrics. More than 150 Federal Test Procedure (FTP-75) or Supplemental Federal Test Procedure (US06) tests were conducted on 46 light-duty vehicles, including port-fuel-injected and direct-injected gasoline vehicles, as well as several light-duty diesel vehicles equipped with diesel particle filters (LDD/DPF). For FTP tests, emission variability of gravimetric PM mass was found to be slightly less than that of either SPN or BC, whereas the opposite was observed for US06 tests. Emission variability of PM mass for LDD/DPF was higher than that of both SPN and BC, primarily because of higher PM mass measurement uncertainties (background and precision) near or below 0.1 mg/mile. While strong correlations were observed from both SPN and BC to PM mass, the slopes are dependent on engine technologies and driving cycles, and the proportionality between the metrics can vary over the course of the test. Replacement of the LEV III PM mass emission standard with one other measurement metric may imperil the effectiveness of emission reduction, as a correlation-based relationship may evolve over future technologies for meeting stringent greenhouse standards. Solid particle number and black carbon were suggested in place of PM mass for the California LEV III 1-mg/mile FTP standard. Their equivalence, proportionality, and emission variability in comparison to PM mass, based on the large light-duty vehicle fleet examined, are dependent on engine technologies and driving cycles.

  17. EU-US standards harmonization task group report : feedback to ITS standards development organizations communications.

    Science.gov (United States)

    2012-11-01

    Harmonization Task Groups 1 and 3 (HTG1 and 3) were established by the EU-US International Standards Harmonization Working Group to attempt to harmonize standards (including ISO, CEN, ETSI, IEEE) on security (HTG1) and communications protocols (HTG3)...

  18. Truck Drivers And Risk Of STDs Including HIV

    Directory of Open Access Journals (Sweden)

    Bansal R.K

    1995-01-01

    Research question: Are long-distance truck drivers at a higher risk of contracting and transmitting STDs, including HIV? Objectives: (i) To study the degree of knowledge of HIV and AIDS among long-distance truck drivers. (ii) To assess their sexual behaviour, including condom use. (iii) To explore the prevailing social influences and substance abuse patterns. (iv) To explore their treatment-seeking behaviour as regards STDs. (v) To deduce their risk of contracting and transmitting STDs, including HIV. Study design: Cross-sectional interview. Setting: Transport Nagar, Indore (M.P.). Participants: 210 senior drivers (first drivers) and 210 junior drivers (second drivers). Study variables: Extra-marital sexual intercourse, condom usage, past and present history of STDs, treatment and counselling, substance abuse, social-cultural milieu. Outcome variables: Risk of contracting STDs. Statistical analysis: Univariate analysis. Results: 94% of the drivers were totally ignorant about AIDS. 82.9% and 43.8% of the senior and junior drivers, respectively, had a history of extra-marital sex, and of these only 2 regularly used condoms. 13.8% and 3.3% of the senior and junior drivers had a past or present history suggestive of STD infection. Alcohol and opium were regularly used. Conclusion: The studied drivers are at a high risk of contracting and transmitting STDs, including HIV.

  19. Penalized variable selection in competing risks regression.

    Science.gov (United States)

    Fu, Zhixuan; Parikh, Chirag R; Zhou, Bingqing

    2017-07-01

    Penalized variable selection methods have been extensively studied for standard time-to-event data. Such methods cannot be directly applied when subjects are at risk of multiple mutually exclusive events, known as competing risks. The proportional subdistribution hazard (PSH) model proposed by Fine and Gray (J Am Stat Assoc 94:496-509, 1999) has become a popular semi-parametric model for time-to-event data with competing risks. It allows for direct assessment of covariate effects on the cumulative incidence function. In this paper, we propose a general penalized variable selection strategy that simultaneously handles variable selection and parameter estimation in the PSH model. We rigorously establish the asymptotic properties of the proposed penalized estimators and modify the coordinate descent algorithm for implementation. Simulation studies are conducted to demonstrate the good performance of the proposed method. Data from deceased donor kidney transplants from the United Network of Organ Sharing illustrate the utility of the proposed method.

  20. Efficient Business Service Consumption by Customization with Variability Modelling

    Directory of Open Access Journals (Sweden)

    Michael Stollberg

    2010-07-01

    The establishment of service orientation in industry creates the need for efficient engineering technologies that properly support the whole life cycle of service provision and consumption. A central challenge is adequate support for the efficient employment of complex services in their individual application context. This becomes particularly important for large-scale enterprise technologies where generic services are designed for reuse in several business scenarios. In this article we complement our earlier work on Service Variability Modelling, which presented an approach for customizing services for individual application contexts by creating simplified variants, based on model-driven variability management. The present article describes our revised service variability metamodel, new features of the variability tools, and an applicability study, which shows that substantial improvements in the efficiency of standard business service consumption can be achieved under both usability and economic aspects.

  1. SELECTING QUASARS BY THEIR INTRINSIC VARIABILITY

    International Nuclear Information System (INIS)

    Schmidt, Kasper B.; Rix, Hans-Walter; Jester, Sebastian; Hennawi, Joseph F.; Marshall, Philip J.; Dobler, Gregory

    2010-01-01

    We present a new and simple technique for selecting extensive, complete, and pure quasar samples, based on their intrinsic variability. We parameterize the single-band variability by a power-law model for the light-curve structure function, with amplitude A and power-law index γ. We show that quasars can be efficiently separated from other non-variable and variable sources by the location of the individual sources in the A-γ plane. We use ∼60 epochs of imaging data, taken over ∼5 years, from the SDSS stripe 82 (S82) survey, where extensive spectroscopy provides a reference sample of quasars, to demonstrate the power of variability as a quasar classifier in multi-epoch surveys. For UV-excess selected objects, variability performs just as well as the standard SDSS color selection, identifying quasars with a completeness of 90% and a purity of 95%. In the redshift range 2.5 < z < 3, where color selection is known to be problematic, variability can select quasars with a completeness of 90% and a purity of 96%. This is a factor of 5-10 times more pure than existing color selection of quasars in this redshift range. Selecting objects from a broad griz color box without u-band information, variability selection in S82 can afford completeness and purity of 92%, despite a factor of 30 more contaminants than quasars in the color-selected feeder sample. This confirms that the fraction of quasars hidden in the 'stellar locus' of color space is small. To test variability selection in the context of Pan-STARRS 1 (PS1) we created mock PS1 data by down-sampling the S82 data to just six epochs over 3 years. Even with this much sparser time sampling, variability is an encouragingly efficient classifier. For instance, a 92% pure and 44% complete quasar candidate sample is attainable from the above griz-selected catalog. Finally, we show that the presented A-γ technique, besides selecting clean and pure samples of quasars (which are stochastically varying objects), is also
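The power-law structure-function parameterization described above can be sketched numerically. This is an illustrative reconstruction, not the authors' pipeline: the light curve, epochs, and binning choices below are invented, and only the model SF(dt) = A * (dt/1 yr)^gamma comes from the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic single-band light curve: epochs in years, magnitudes as a
# random-walk-like series (stands in for a quasar's stochastic variability)
t = np.sort(rng.uniform(0.0, 5.0, 60))
mag = 19.0 + 0.3 * np.cumsum(rng.normal(0.0, 0.05, 60))

# All pairwise epoch differences and magnitude differences
i, j = np.triu_indices(len(t), k=1)
dt = t[j] - t[i]
dm = np.abs(mag[j] - mag[i])

# Bin by log(dt); RMS of dm per bin estimates the structure function
bins = np.logspace(np.log10(dt.min()), np.log10(dt.max()), 8)
idx = np.digitize(dt, bins)
sf_dt, sf = [], []
for b in range(1, len(bins)):
    sel = idx == b
    if sel.sum() > 5:                      # skip sparsely populated bins
        sf_dt.append(np.median(dt[sel]))
        sf.append(np.sqrt(np.mean(dm[sel] ** 2)))

# Fit log SF = log A + gamma * log dt; (A, gamma) locates the source
# in the A-gamma plane used for quasar/star separation
gamma, logA = np.polyfit(np.log10(sf_dt), np.log10(sf), 1)
A = 10 ** logA
print(f"A = {A:.3f} mag, gamma = {gamma:.2f}")
```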

  2. Cutting forces during turning with variable depth of cut

    Directory of Open Access Journals (Sweden)

    M. Sadílek

    2016-03-01

    The research presented in this paper is experimental work: measuring cutting forces and monitoring tool wear on the cutting edge. It compares turning with a standard roughing cycle against turning with the proposed roughing cycle with variable depth of cut.

  3. Stochastic Optimal Estimation with Fuzzy Random Variables and Fuzzy Kalman Filtering

    Institute of Scientific and Technical Information of China (English)

    FENG Yu-hu

    2005-01-01

    By constructing a mean-square performance index for the case of fuzzy random variables, the optimal estimation theorem for an unknown fuzzy state using fuzzy observation data is given. The state and output of a linear discrete-time dynamic fuzzy system with Gaussian noise are Gaussian fuzzy random variable sequences. An approach to fuzzy Kalman filtering is discussed. Fuzzy Kalman filtering consists of two parts: a real-valued non-random recurrence equation and standard Kalman filtering.

  4. Air Force standards for nickel hydrogen battery

    Science.gov (United States)

    Hwang, Warren; Milden, Martin

    1994-01-01

    The topics discussed are presented in viewgraph form and include Air Force nickel hydrogen standardization goals, philosophy, project outline, cell level standardization, battery level standardization, and schedule.

  5. Standard Guide for Testing Polymer Matrix Composite Materials

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2011-01-01

    1.1 This guide summarizes the application of ASTM standard test methods (and other supporting standards) to continuous-fiber reinforced polymer matrix composite materials. The most commonly used or most applicable ASTM standards are included, emphasizing use of standards of Committee D30 on Composite Materials. 1.2 This guide does not cover all possible standards that could apply to polymer matrix composites and restricts discussion to the documented scope. Commonly used but non-standard industry extensions of test method scopes, such as application of static test methods to fatigue testing, are not discussed. A more complete summary of general composite testing standards, including non-ASTM test methods, is included in the Composite Materials Handbook (MIL-HDBK-17). Additional specific recommendations for testing textile (fabric, braided) composites are contained in Guide D6856. 1.3 This guide does not specify a system of measurement; the systems specified within each of the referenced standards shall appl...

  6. Knee extension torque variability after exercise in ACL reconstructed knees.

    Science.gov (United States)

    Goetschius, John; Kuenze, Christopher M; Hart, Joseph M

    2015-08-01

    The purpose of this study was to compare knee extension torque variability in patients with ACL reconstructed knees before and after exercise. Thirty-two patients with an ACL reconstructed knee (ACL-R group) and 32 healthy controls (control group) completed measures of maximal isometric knee extension torque (90° flexion) at baseline and following a 30-min exercise protocol (post-exercise). Exercise included 30 min of repeated cycles of inclined treadmill walking and hopping tasks. Dependent variables were the coefficient of variation (CV) and raw change in CV (ΔCV): CV = (torque standard deviation/torque mean × 100), ΔCV = (post-exercise − baseline). There was a group-by-time interaction (p = 0.03) on CV. The ACL-R group demonstrated greater CV than the control group at baseline (ACL-R = 1.07 ± 0.55, control = 0.79 ± 0.42, p = 0.03) and post-exercise (ACL-R = 1.60 ± 0.91, control = 0.94 ± 0.41, p = 0.001). ΔCV was greater (p = 0.03) in the ACL-R group (0.52 ± 0.82) than the control group (0.15 ± 0.46). CV significantly increased from baseline to post-exercise (p = 0.001) in the ACL-R group, while the control group's did not (p = 0.06). The ACL-R group demonstrated greater knee extension torque variability than the control group. Exercise increased torque variability more in the ACL-R group than the control group. © 2015 Orthopaedic Research Society. Published by Wiley Periodicals, Inc.
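The CV and ΔCV definitions quoted in the abstract can be computed directly. The torque trials below are hypothetical; only the formulas come from the text.

```python
import numpy as np

def coefficient_of_variation(torques):
    """CV = (torque standard deviation / torque mean) x 100."""
    torques = np.asarray(torques, dtype=float)
    return torques.std(ddof=1) / torques.mean() * 100.0

# Hypothetical maximal isometric knee extension torque trials (N*m)
baseline = [182.0, 185.5, 180.1, 184.2]
post_exercise = [175.3, 168.9, 179.0, 171.4]

cv_base = coefficient_of_variation(baseline)
cv_post = coefficient_of_variation(post_exercise)
delta_cv = cv_post - cv_base        # raw change in CV (post-exercise - baseline)
print(f"CV baseline = {cv_base:.2f}%, post = {cv_post:.2f}%, dCV = {delta_cv:.2f}")
```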

  7. Interhospital Variability in Perioperative Red Blood Cell Ordering Patterns in United States Pediatric Surgical Patients.

    Science.gov (United States)

    Thompson, Rachel M; Thurm, Cary W; Rothstein, David H

    2016-10-01

    To evaluate perioperative red blood cell (RBC) ordering and interhospital variability patterns in pediatric patients undergoing surgical interventions at US children's hospitals. This is a multicenter cross-sectional study of children undergoing surgical procedures; blood type and crossmatch tests were included when done on the day before or the day of the surgical procedure. The RBC transfusions included were those given on the day of or the day after surgery. The type and crossmatch-to-transfusion ratio (TCTR) was calculated for each surgical procedure. An adjusted model for interhospital variability was created to account for variation in patient population by age, sex, race/ethnicity, payer type, and presence/number of complex chronic conditions (CCCs) per patient. A total of 357,007 surgical interventions were identified across all participating hospitals. Blood type and crossmatch was performed 55,632 times, and 13,736 transfusions were provided, for a TCTR of 4:1. There was an association between increasing age and TCTR (R² = 0.43). Patients with multiple CCCs had lower TCTRs, with a stronger relationship (R² = 0.77). There was broad variability in adjusted TCTRs among hospitals (range, 2.5-25). The average TCTR in US children's hospitals was double that of adult surgical data and was associated with wide interhospital variability. Age and the presence of CCCs markedly influenced this ratio. Studies to evaluate optimal preoperative RBC ordering and standardization of practices could potentially decrease unnecessary costs and wasted blood. Copyright © 2016 Elsevier Inc. All rights reserved.

  8. THE CHANDRA VARIABLE GUIDE STAR CATALOG

    International Nuclear Information System (INIS)

    Nichols, Joy S.; Lauer, Jennifer L.; Morgan, Douglas L.; Sundheim, Beth A.; Henden, Arne A.; Huenemoerder, David P.; Martin, Eric

    2010-01-01

    Variable stars have been identified among the optical-wavelength light curves of guide stars used for pointing control of the Chandra X-ray Observatory. We present a catalog of these variable stars along with their light curves and ancillary data. Variability was detected to a lower limit of 0.02 mag amplitude in the 4000-10000 A range using the photometrically stable Aspect Camera on board the Chandra spacecraft. The Chandra Variable Guide Star Catalog (VGUIDE) contains 827 stars, of which 586 are classified as definitely variable and 241 are identified as possibly variable. Of the 586 definite variable stars, we believe 319 are new variable star identifications. Types of variables in the catalog include eclipsing binaries, pulsating stars, and rotating stars. The variability was detected during the course of normal verification of each Chandra pointing and results from analysis of over 75,000 guide star light curves from the Chandra mission. The VGUIDE catalog represents data from only about 9 years of the Chandra mission. Future releases of VGUIDE will include newly identified variable guide stars as the mission proceeds. An important advantage of the use of space data to identify and analyze variable stars is the relatively long observations that are available. The Chandra orbit allows for observations up to 2 days in length. Also, guide stars were often used multiple times for Chandra observations, so many of the stars in the VGUIDE catalog have multiple light curves available from various times in the mission. The catalog is presented as both online data associated with this paper and as a public Web interface. Light curves with data at the instrumental time resolution of about 2 s, overplotted with the data binned at 1 ks, can be viewed on the public Web interface and downloaded for further analysis. VGUIDE is a unique project using data collected during the mission that would otherwise be ignored. The stars available for use as Chandra guide stars are

  9. Validity of (Ultra-)Short Recordings for Heart Rate Variability Measurements

    NARCIS (Netherlands)

    Muñoz Venegas, Loretto; van Roon, Arie; Riese, Harriette; Thio, Chris; Oostenbroek, Emma; Westrik, Iris; de Geus, Eco J. C.; Gansevoort, Ron; Lefrandt, Joop; Nolte, Ilja M.; Snieder, Harold

    2015-01-01

    Objectives In order to investigate the applicability of routine 10s electrocardiogram (ECG) recordings for time-domain heart rate variability (HRV) calculation we explored to what extent these (ultra-)short recordings capture the "actual" HRV. Methods The standard deviation of normal-to-normal

  10. The variable and chaotic nature of professional golf performance.

    Science.gov (United States)

    Stöckl, Michael; Lamb, Peter F

    2018-05-01

    In golf, unlike most other sports, individual performance is not the result of direct interactions between players. Instead decision-making and performance is influenced by numerous constraining factors affecting each shot. This study looked at the performance of PGA TOUR golfers in 2011 in terms of stability and variability on a shot-by-shot basis. Stability and variability were assessed using Recurrence Quantification Analysis (RQA) and standard deviation, respectively. About 10% of all shots comprised short stable phases of performance (3.7 ± 1.1 shots per stable phase). Stable phases tended to consist of shots of typical performance, rather than poor or exceptional shots; this finding was consistent for all shot categories. Overall, stability measures were not correlated with tournament performance. Variability across all shots was not related to tournament performance; however, variability in tee shots and short approach shots was higher than for other shot categories. Furthermore, tee shot variability was related to tournament standing: decreased variability was associated with better tournament ranking. The findings in this study showed that PGA TOUR golf performance is chaotic. Further research on amateur golf performance is required to determine whether the structure of amateur golf performance is universal.
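The shot-by-shot stability analysis described above can be illustrated with a minimal recurrence sketch. This is a toy reconstruction, not the study's RQA implementation: the tolerance `eps`, the `min_len` threshold, and the shot scores are all invented, and full RQA uses embedded state vectors and additional measures such as determinism.

```python
import numpy as np

def recurrence_matrix(x, eps):
    """1 where two shots are within eps of each other, else 0."""
    x = np.asarray(x, dtype=float)
    return (np.abs(x[:, None] - x[None, :]) < eps).astype(int)

def stable_phases(x, eps, min_len=3):
    """Maximal runs of >= min_len consecutive shots that are all mutually
    recurrent (every pair within eps), returned as (start, end) indices."""
    x = np.asarray(x, dtype=float)
    phases, start = [], 0
    for i in range(1, len(x) + 1):
        # extend the run while the new shot recurs with every shot in it
        if i < len(x) and np.abs(x[start:i] - x[i]).max() < eps:
            continue
        if i - start >= min_len:
            phases.append((start, i - 1))
        start = i
    return phases

shots = [1.0, 1.1, 1.05, 5.0, 5.1, 1.0, 2.0, 3.0]  # invented performance scores
print(stable_phases(shots, eps=0.3))               # one stable phase: shots 0-2
```

The abstract's finding that stable phases average 3.7 ± 1.1 shots corresponds to the lengths of such runs across a tournament's shot sequence.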

  11. Models of simulation and prediction of the behavior of dengue in four Colombian cities, including climate like modulating variable of the disease

    International Nuclear Information System (INIS)

    Garcia Giraldo, Jairo A; Boshell, Jose Francisco

    2004-01-01

    ARIMA-type models are proposed to simulate the behavior of dengue and to reveal its relations with climatic variability in four localities of Colombia. The climatic variable was introduced into the models as an index that modulates the behavior of the disease, obtained by means of a multivariate principal component analysis. The investigation was carried out with information for the epidemiological weeks from January 1997 to December 2000, both for the number of disease cases and for the meteorological variables. The study shows that climate variations 9 to 14 weeks earlier influence the appearance of new cases of dengue. In particular, precipitation in those weeks was greater when the disease later presented epidemic characteristics than when it remained within endemic limits.

  12. Classification and prediction of port variables

    Energy Technology Data Exchange (ETDEWEB)

    Molina Serrano, B.

    2016-07-01

    Many variables are included in the planning and management of port terminals. They can be economic, social, environmental and institutional. An agent needs to know the relationships between these variables in order to modify planning conditions. The use of Bayesian networks allows these variables to be classified, predicted and diagnosed. Bayesian networks allow the posterior probability of unknown variables to be estimated from known variables. At the planning level, this means that it is not necessary to know all variables, because their relationships are known. The agent can obtain useful information about how port variables are connected, which can be interpreted as cause-effect relationships. Bayesian networks can also be used to make optimal decisions by introducing possible actions and the utility of their results. In the proposed methodology, a database was generated with more than 40 port variables. They were classified into economic, social, environmental and institutional variables, in the same way as the smart port studies in the Spanish Port System. From this database, a network was generated using a directed acyclic graph, which reveals the relationships between port variables (parent-child relationships). The obtained network shows that, in cause-effect terms, economic variables are the cause of the other variable typologies; economic variables play the parent role in most cases. Moreover, when environmental variables are known, the obtained network allows the posterior probability of social variables to be estimated. It is concluded that Bayesian networks allow uncertainty to be modeled in a probabilistic way, even when the number of variables is high, as occurs in the planning and management of port terminals. (Author)
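The parent-child reasoning in this abstract can be sketched with a toy network. All structure and probabilities here are invented for illustration (the paper's network links 40+ variables learned from data); the sketch only shows how a posterior for a social variable follows from an observed environmental variable by enumerating over their shared economic parent.

```python
# Toy Bayesian network: economic -> environmental, economic -> social.
# Every probability below is made up for demonstration.
p_econ = {True: 0.6, False: 0.4}              # P(economic activity high)
p_env_given_econ = {True: 0.7, False: 0.2}    # P(environmental impact | economic)
p_soc_given_econ = {True: 0.8, False: 0.3}    # P(social benefit | economic)

def posterior_social_given_env(env_obs=True):
    """P(social | environmental) by enumeration over the hidden economic parent."""
    num = den = 0.0
    for econ in (True, False):
        pe = p_env_given_econ[econ] if env_obs else 1.0 - p_env_given_econ[econ]
        joint = p_econ[econ] * pe             # P(econ) * P(env obs | econ)
        den += joint
        num += joint * p_soc_given_econ[econ]
    return num / den

print(f"P(social | env observed) = {posterior_social_given_env(True):.3f}")
```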

  13. Standards and Professional Development

    Science.gov (United States)

    Zengler, Cynthia J.

    2017-01-01

    The purpose of this paper is to describe the professional development that has taken place in conjunction with Ohio adopting the College and Career Readiness (CCR) Standards. The professional development (PD) has changed over time to include not only training on the new standards and lesson plans but training on the concepts defined in the…

  14. Assessing compositional variability through graphical analysis and Bayesian statistical approaches: case studies on transgenic crops.

    Science.gov (United States)

    Harrigan, George G; Harrison, Jay M

    2012-01-01

    New transgenic (GM) crops are subjected to extensive safety assessments that include compositional comparisons with conventional counterparts as a cornerstone of the process. The influence of germplasm, location, environment, and agronomic treatments on compositional variability is, however, often obscured in these pair-wise comparisons. Furthermore, classical statistical significance testing can often provide an incomplete and over-simplified summary of highly responsive variables such as crop composition. In order to more clearly describe the influence of the numerous sources of compositional variation we present an introduction to two alternative but complementary approaches to data analysis and interpretation. These include i) exploratory data analysis (EDA) with its emphasis on visualization and graphics-based approaches and ii) Bayesian statistical methodology that provides easily interpretable and meaningful evaluations of data in terms of probability distributions. The EDA case-studies include analyses of herbicide-tolerant GM soybean and insect-protected GM maize and soybean. Bayesian approaches are presented in an analysis of herbicide-tolerant GM soybean. Advantages of these approaches over classical frequentist significance testing include the more direct interpretation of results in terms of probabilities pertaining to quantities of interest and no confusion over the application of corrections for multiple comparisons. It is concluded that a standardized framework for these methodologies could provide specific advantages through enhanced clarity of presentation and interpretation in comparative assessments of crop composition.

  15. Importance of international standards on hydrogen technologies

    International Nuclear Information System (INIS)

    Bose, T.K.; Gingras, S.

    2001-01-01

    This presentation provided some basic information regarding standards and the International Organization for Standardization (ISO). It also explained the importance of standardization activities, particularly ISO/TC 197 which applies to hydrogen technologies. Standards are established by consensus. They define the minimum requirements that will ensure that products and services are reliable and effective. Standards contribute to the elimination of technical barriers to trade (TBT). The harmonization of standards around the world is desirable in a free trade environment. The influence of the TBT on international standardization was discussed with particular reference to the objectives of ISO/TC 197 hydrogen technologies. One of the priorities for ISO/TC 197 is a hydrogen fuel infrastructure which includes refuelling stations, fuelling connectors, and storage technologies for gaseous and liquid hydrogen. Other priorities include an agreement between the International Electrotechnical Commission (IEC) and the ISO, in particular the IEC/TC 105 and ISO/TC 197 for the development of fuel cell standards. The international standards that have been published thus far include ISO 13984:1999 for liquid hydrogen, land vehicle fuelling system interface, and ISO 14687:1999 for hydrogen fuel product specification. Standards are currently under development for: liquid hydrogen; airport hydrogen fuelling facilities; gaseous hydrogen blends; basic considerations for the safety of hydrogen systems; gaseous hydrogen and hydrogen blends; and gaseous hydrogen for land vehicle filling connectors. It was concluded that the widespread use of hydrogen is dependent on international standardization

  16. The Future of Geospatial Standards

    Science.gov (United States)

    Bermudez, L. E.; Simonis, I.

    2016-12-01

    The OGC is an international not-for-profit standards development organization (SDO) committed to making quality standards for the geospatial community. A community of more than 500 member organizations with more than 6,000 people registered at the OGC communication platform drives the development of standards that are freely available for anyone to use and to improve sharing of the world's geospatial data. OGC standards are applied in a variety of application domains including Environment, Defense and Intelligence, Smart Cities, Aviation, Disaster Management, Agriculture, Business Development and Decision Support, and Meteorology. Profiles help to apply information models to different communities, thus adapting to particular needs of that community while ensuring interoperability by using common base models and appropriate support services. Other standards address orthogonal aspects such as handling of Big Data, Crowd-sourced information, Geosemantics, or container for offline data usage. Like most SDOs, the OGC develops and maintains standards through a formal consensus process under the OGC Standards Program (OGC-SP) wherein requirements and use cases are discussed in forums generally open to the public (Domain Working Groups, or DWGs), and Standards Working Groups (SWGs) are established to create standards. However, OGC is unique among SDOs in that it also operates the OGC Interoperability Program (OGC-IP) to provide real-world testing of existing and proposed standards. The OGC-IP is considered the experimental playground, where new technologies are researched and developed in a user-driven process. Its goal is to prototype, test, demonstrate, and promote OGC Standards in a structured environment. Results from the OGC-IP often become requirements for new OGC standards or identify deficiencies in existing OGC standards that can be addressed. This presentation will provide an analysis of the work advanced in the OGC consortium including standards and testbeds

  17. Individualized anemia management reduces hemoglobin variability in hemodialysis patients.

    Science.gov (United States)

    Gaweda, Adam E; Aronoff, George R; Jacobs, Alfred A; Rai, Shesh N; Brier, Michael E

    2014-01-01

    One-size-fits-all protocol-based approaches to anemia management with erythropoiesis-stimulating agents (ESAs) may result in undesired patterns of hemoglobin variability. In this single-center, double-blind, randomized controlled trial, we tested the hypothesis that individualized dosing of ESA improves hemoglobin variability over a standard population-based approach. We enrolled 62 hemodialysis patients and followed them over a 12-month period. Patients were randomly assigned to receive ESA doses guided by the Smart Anemia Manager algorithm (treatment) or by a standard protocol (control). Dose recommendations, performed on a monthly basis, were validated by an expert physician anemia manager. The primary outcome was the percentage of hemoglobin concentrations between 10 and 12 g/dl over the follow-up period. A total of 258 of 356 (72.5%) hemoglobin concentrations were between 10 and 12 g/dl in the treatment group, compared with 208 of 336 (61.9%) in the control group; 42 (11.8%) hemoglobin concentrations were >12 g/dl in the treatment group, compared with 46 (13.4%) in the control group. The median ESA dosage per patient was 2000 IU/wk in both groups. Five participants received 6 transfusions (21 U) in the treatment group, compared with 8 participants and 13 transfusions (31 U) in the control group. These results suggest that individualized ESA dosing decreases total hemoglobin variability compared with a population protocol-based approach. As hemoglobin levels are declining in hemodialysis patients, decreasing hemoglobin variability may help reduce the risk of transfusions in this population.

  18. A new approach for modelling variability in residential construction projects

    Directory of Open Access Journals (Sweden)

    Mehrdad Arashpour

    2013-06-01

    Full Text Available The construction industry is plagued by long cycle times caused by variability in the supply chain. Variations or undesirable situations are the result of factors such as non-standard practices, work site accidents, inclement weather conditions and faults in design. This paper uses a new approach for modelling variability in construction by linking relative variability indicators to processes. Mass homebuilding sector was chosen as the scope of the analysis because data is readily available. Numerous simulation experiments were designed by varying size of capacity buffers in front of trade contractors, availability of trade contractors, and level of variability in homebuilding processes. The measurements were shown to lead to an accurate determination of relationships between these factors and production parameters. The variability indicator was found to dramatically affect the tangible performance measures such as home completion rates. This study provides for future analysis of the production homebuilding sector, which may lead to improvements in performance and a faster product delivery to homebuyers.

  20. Diagnostic accuracy research in glaucoma is still incompletely reported: An application of Standards for Reporting of Diagnostic Accuracy Studies (STARD 2015).

    Directory of Open Access Journals (Sweden)

    Manuele Michelessi

    Research has shown a modest adherence of diagnostic test accuracy (DTA) studies in glaucoma to the Standards for Reporting of Diagnostic Accuracy Studies (STARD). We applied the updated 30-item STARD 2015 checklist to a set of studies included in a Cochrane DTA systematic review of imaging tools for diagnosing manifest glaucoma. Three pairs of reviewers, including one senior reviewer who assessed all studies, independently checked the adherence of each study to STARD 2015. Adherence was analyzed on an individual-item basis. Logistic regression was used to evaluate the effect of publication year and impact factor on adherence. We included 106 DTA studies, published between 2003 and 2014 in journals with a median impact factor of 2.6. Overall adherence was 54.1% for 3,286 individual ratings across 31 items, with a mean of 16.8 (SD: 3.1; range 8-23) items per study. Large variability in adherence to reporting standards was detected across individual STARD 2015 items, ranging from 0 to 100%. Nine items (1: identification as a diagnostic accuracy study in title/abstract; 6: eligibility criteria; 10: definition of the index test (a) and reference standard (b); 12: cut-off definitions for the index test (a) and reference standard (b); 14: estimation of diagnostic accuracy measures; 21a: severity spectrum of diseased; 23: cross-tabulation of the index and reference standard results) were adequately reported in more than 90% of the studies. Conversely, 10 items (3: scientific and clinical background of the index test; 11: rationale for the reference standard; 13b: blinding of index test results; 17: analyses of variability; 18: sample size calculation; 19: study flow diagram; 20: baseline characteristics of participants; 28: registration number and registry; 29: availability of study protocol; 30: sources of funding) were adequately reported in less than 30% of the studies. Only four items showed a statistically significant improvement over time: missing data (16), baseline

  1. Instrumental variables I: instrumental variables exploit natural variation in nonexperimental data to estimate causal relationships.

    Science.gov (United States)

    Rassen, Jeremy A; Brookhart, M Alan; Glynn, Robert J; Mittleman, Murray A; Schneeweiss, Sebastian

    2009-12-01

    The gold standard of study design for treatment evaluation is widely acknowledged to be the randomized controlled trial (RCT). Trials allow for the estimation of causal effect by randomly assigning participants either to an intervention or comparison group; through the assumption of "exchangeability" between groups, comparing the outcomes will yield an estimate of causal effect. In the many cases where RCTs are impractical or unethical, instrumental variable (IV) analysis offers a nonexperimental alternative based on many of the same principles. IV analysis relies on finding a naturally varying phenomenon, related to treatment but not to outcome except through the effect of treatment itself, and then using this phenomenon as a proxy for the confounded treatment variable. This article demonstrates how IV analysis arises from an analogous but potentially impossible RCT design, and outlines the assumptions necessary for valid estimation. It gives examples of instruments used in clinical epidemiology and concludes with an outline on estimation of effects.

  2. Cortical Brain Atrophy and Intra-Individual Variability in Neuropsychological Test Performance in HIV Disease

    Science.gov (United States)

    HINES, Lindsay J.; MILLER, Eric N.; HINKIN, Charles H.; ALGER, Jeffery R.; BARKER, Peter; GOODKIN, Karl; MARTIN, Eileen M.; MARUCA, Victoria; RAGIN, Ann; SACKTOR, Ned; SANDERS, Joanne; SELNES, Ola; BECKER, James T.

    2015-01-01

    Objective To characterize the relationship between dispersion-based intra-individual variability (IIVd) in neuropsychological test performance and brain volume among HIV seropositive and seronegative men, and to determine the effects of cardiovascular risk and HIV infection on this relationship. Methods Magnetic Resonance Imaging (MRI) was used to acquire high-resolution neuroanatomic data from 147 men aged 50 and over, including 80 HIV seropositive (HIV+) and 67 seronegative controls (HIV−), in this cross-sectional cohort study. Voxel Based Morphometry was used to derive volumetric measurements at the level of the individual voxel. These brain structure maps were analyzed using Statistical Parametric Mapping (SPM2). IIVd was measured by computing intra-individual standard deviations (ISDs) from the standardized performance scores of five neuropsychological tests: Wechsler Memory Scale-III Visual Reproduction I and II, Logical Memory I and II, and Wechsler Adult Intelligence Scale-III Letter Number Sequencing. Results Total gray matter (GM) volume was inversely associated with IIVd. Among all subjects, IIVd-related GM atrophy was observed primarily in: 1) the inferior frontal gyrus bilaterally and the left inferior temporal gyrus extending to the supramarginal gyrus, spanning the lateral sulcus; 2) the right superior parietal lobule and intraparietal sulcus; and 3) dorsal/ventral regions of the posterior section of the transverse temporal gyrus. HIV status, biological, and cardiovascular disease (CVD) variables were not linked to IIVd-related GM atrophy. Conclusions IIVd in neuropsychological test performance may be a sensitive marker of cortical integrity in older adults, regardless of HIV infection status or CVD risk factors; the degree of intra-individual variability is linked with volume loss in specific cortical regions, independent of mean-level performance on neuropsychological tests. PMID:26303224
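
Dispersion-based intra-individual variability of the kind described here is typically computed as each person's standard deviation across their own standardized subtest scores. A minimal sketch (sample size, subtest count, and raw-score scales are invented for illustration, not taken from the study):

```python
import numpy as np

# Hypothetical raw scores: rows = participants, columns = five subtests
# (scales differ across subtests, as with real neuropsychological batteries).
rng = np.random.default_rng(42)
raw = rng.normal(loc=[50, 100, 30, 25, 12], scale=[10, 15, 6, 5, 3], size=(8, 5))

# Step 1: standardize each subtest across the sample (z-scores),
# so tests with different scales become comparable.
z = (raw - raw.mean(axis=0)) / raw.std(axis=0, ddof=1)

# Step 2: IIVd is the intra-individual standard deviation (ISD) of each
# person's z-scores across the subtests.
iivd = z.std(axis=1, ddof=1)

for i, v in enumerate(iivd):
    print(f"participant {i}: ISD = {v:.2f}")
```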

  3. Nonlinear method for including the mass uncertainty of standards and the system measurement errors in the fitting of calibration curves

    International Nuclear Information System (INIS)

    Pickles, W.L.; McClure, J.W.; Howell, R.H.

    1978-01-01

    A sophisticated nonlinear multiparameter fitting program was used to produce a best-fit calibration curve for the response of an x-ray fluorescence analyzer to uranium nitrate, freeze-dried, 0.2% accurate, gravimetric standards. The program is based on the unconstrained minimization subroutine VA02A. The program considers the mass values of the gravimetric standards as parameters to be fit along with the normal calibration curve parameters. The fitting procedure weights the residuals with the system errors and the mass errors in a consistent way. The resulting best-fit calibration curve parameters reflect the fact that the masses of the standard samples are measured quantities with a known error. Error estimates for the calibration curve parameters can be obtained from the curvature of the ''Chi-Squared Matrix'' or from error relaxation techniques. It was shown that nondispersive XRFA of 0.1 to 1 mg of freeze-dried UNO 3 can have an accuracy of 0.2% in 1000 s. 5 figures
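
The key idea, treating the standards' masses as fitted parameters constrained by their 0.2% gravimetric uncertainty while weighting response residuals by the system error, can be sketched with a generic least-squares routine. This is not the original VA02A program; the quadratic response model and all numerical values are illustrative assumptions:

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(1)

# Nominal masses of gravimetric standards (mg) with 0.2% relative uncertainty,
# and a hypothetical quadratic detector response (illustrative numbers only).
m_nominal = np.array([0.1, 0.2, 0.4, 0.6, 0.8, 1.0])
sigma_m = 0.002 * m_nominal
a_true, b_true = 950.0, -40.0
m_true = m_nominal * (1 + 0.002 * rng.normal(size=m_nominal.size))
sigma_r = 5.0
resp = a_true * m_true + b_true * m_true**2 + sigma_r * rng.normal(size=m_nominal.size)

def residuals(p):
    # p = [a, b, m_1..m_n]: the standards' masses are fitted parameters,
    # pulled toward their certified values in proportion to the mass error,
    # so both error sources enter the chi-square consistently.
    a, b = p[:2]
    m = p[2:]
    r_resp = (resp - (a * m + b * m**2)) / sigma_r   # system (counting) error
    r_mass = (m - m_nominal) / sigma_m               # gravimetric error
    return np.concatenate([r_resp, r_mass])

p0 = np.concatenate([[900.0, 0.0], m_nominal])
fit = least_squares(residuals, p0)
a_fit, b_fit = fit.x[:2]
print(f"a = {a_fit:.1f}, b = {b_fit:.1f}")
```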

  4. Is Your Biobank Up to Standards? A Review of the National Canadian Tissue Repository Network Required Operational Practice Standards and the Controlled Documents of a Certified Biobank.

    Science.gov (United States)

    Hartman, Victoria; Castillo-Pelayo, Tania; Babinszky, Sindy; Dee, Simon; Leblanc, Jodi; Matzke, Lise; O'Donoghue, Sheila; Carpenter, Jane; Carter, Candace; Rush, Amanda; Byrne, Jennifer; Barnes, Rebecca; Mes-Messons, Anne-Marie; Watson, Peter

    2018-02-01

    Ongoing quality management is an essential part of biobank operations and the creation of high quality biospecimen resources. Adhering to the standards of a national biobanking network is a way to reduce variability between individual biobank processes, resulting in cross biobank compatibility and more consistent support for health researchers. The Canadian Tissue Repository Network (CTRNet) implemented a set of required operational practices (ROPs) in 2011 and these serve as the standards and basis for the CTRNet biobank certification program. A review of these 13 ROPs covering 314 directives was conducted after 5 years to identify areas for revision and update, leading to changes to 7/314 directives (2.3%). A review of all internal controlled documents (including policies, standard operating procedures and guides, and forms for actions and processes) used by the BC Cancer Agency's Tumor Tissue Repository (BCCA-TTR) to conform to these ROPs was then conducted. Changes were made to 20/106 (19%) of BCCA-TTR documents. We conclude that a substantial fraction of internal controlled documents require updates at regular intervals to accommodate changes in best practices. Reviewing documentation is an essential aspect of keeping up to date with best practices and ensuring the quality of biospecimens and data managed by biobanks.

  5. Inlet-engine matching for SCAR including application of a bicone variable geometry inlet. [Supersonic Cruise Aircraft Research

    Science.gov (United States)

    Wasserbauer, J. F.; Gerstenmaier, W. H.

    1978-01-01

    Airflow characteristics of variable cycle engines (VCE) designed for Mach 2.32 can have transonic airflow requirements as high as 1.6 times the cruise airflow. This is a formidable requirement for conventional, high performance, axisymmetric, translating centerbody mixed compression inlets. An alternate inlet is defined where the second cone of a two cone centerbody collapses to the initial cone angle to provide a large off-design airflow capability, and incorporates modest centerbody translation to minimize spillage drag. Estimates of transonic spillage drag are competitive with those of conventional translating centerbody inlets. The inlet's cruise performance exhibits very low bleed requirements with good recovery and high angle of attack capability.

  6. 76 FR 59014 - Standard for the Flammability of Mattresses and Mattress Pads; Technical Amendment

    Science.gov (United States)

    2011-09-23

    ... of the Standard; rather, SRM usage ensures continuity of a reliably high PFLB with low variability in... demonstrates that the PFLB performance of commercial cigarettes is subject to significant variability that can... industry has sufficient test data to support the hypothesis that RIP cigarettes consistently self...

  7. Short-timescale variability in cataclysmic binaries

    International Nuclear Information System (INIS)

    Cordova, F.A.; Mason, K.O.

    1982-01-01

    Rapid variability, including flickering and pulsations, has been detected in cataclysmic binaries at optical and x-ray frequencies. In the case of the novalike variable TT Arietis, simultaneous observations reveal that the x-ray and optical flickering activity is strongly correlated, while short-period pulsations are observed at the same frequencies in both wavelength bands.

  8. Slit-scanning technique using standard cell sorter instruments for analyzing and sorting nonacrocentric human chromosomes, including small ones

    NARCIS (Netherlands)

    Rens, W.; van Oven, C. H.; Stap, J.; Jakobs, M. E.; Aten, J. A.

    1994-01-01

    We have investigated the performance of two types of standard flow cell sorter instruments, a System 50 Cytofluorograph and a FACSTar PLUS cell sorter, for the on-line centromeric index (CI) analysis of human chromosomes. To optimize the results, we improved the detection efficiency for centromeres

  9. Study on distributed generation algorithm of variable precision concept lattice based on ontology heterogeneous database

    Science.gov (United States)

    WANG, Qingrong; ZHU, Changfeng

    2017-06-01

    Integration of distributed heterogeneous data sources is a key issue in big data applications. In this paper, the strategy of variable precision is introduced into the concept lattice, and a one-to-one mapping between the variable precision concept lattice and the ontology concept lattice is constructed to produce a local ontology by building the variable precision concept lattice for each subsystem. A distributed generation algorithm for the variable precision concept lattice based on an ontology heterogeneous database is then proposed, drawing on the close relationship between concept lattices and ontology construction. Finally, based on the main concept lattice generated from the existing heterogeneous database, a case study is carried out to test the feasibility and validity of the algorithm, and the differences between the main concept lattice and the standard concept lattice are compared. The analysis shows that the algorithm can automatically carry out the construction of a distributed concept lattice over heterogeneous data sources.

  10. ENDF/B-5 Standards Data Library (including modifications made in 1986). Summary of contents and documentation

    International Nuclear Information System (INIS)

    DayDay, N.; Lemmel, H.D.

    1986-01-01

    This document summarizes the contents and documentation of the ENDF/B-5 Standards Data Library (EN5-ST) released in September 1979. The library contains complete evaluations for all significant neutron reactions in the energy range 10⁻⁵ eV to 20 MeV for the H-1, He-3, Li-6, B-10, C-12, Au-197 and U-235 isotopes. In 1986 the files for C-12, Au-197 and U-235 were slightly modified. The entire library or selective retrievals from it can be obtained free of charge from the IAEA Nuclear Data Section. (author)

  11. ENDF/B-5 Standards Data Library (including modifications made in 1986). Summary of contents and documentation

    Energy Technology Data Exchange (ETDEWEB)

    DayDay, N; Lemmel, H D

    1986-05-01

    This document summarizes the contents and documentation of the ENDF/B-5 Standards Data Library (EN5-ST) released in September 1979. The library contains complete evaluations for all significant neutron reactions in the energy range 10⁻⁵ eV to 20 MeV for the H-1, He-3, Li-6, B-10, C-12, Au-197 and U-235 isotopes. In 1986 the files for C-12, Au-197 and U-235 were slightly modified. The entire library or selective retrievals from it can be obtained free of charge from the IAEA Nuclear Data Section. (author) Refs, figs, tabs

  12. The influence of solar wind variability on magnetospheric ULF wave power

    Directory of Open Access Journals (Sweden)

    D. Pokhotelov

    2015-06-01

    Magnetospheric ultra-low frequency (ULF) oscillations in the Pc 4–5 frequency range play an important role in the dynamics of Earth's radiation belts, both by enhancing radial diffusion through incoherent interactions and through coherent drift-resonant interactions with trapped radiation belt electrons. The statistical distributions of magnetospheric ULF wave power are known to be strongly dependent on solar wind parameters such as solar wind speed and interplanetary magnetic field (IMF) orientation. Statistical characterisation of ULF wave power in the magnetosphere traditionally relies on average solar wind–IMF conditions over a specific time period. In this brief report, we perform an alternative characterisation of the solar wind influence on magnetospheric ULF wave activity by characterising the solar wind driver through its variability, using the standard deviation of solar wind parameters rather than a simple time average. We present a statistical study of nearly one solar cycle (1996–2004) of geosynchronous observations of magnetic ULF wave power and find that there is significant variation in ULF wave power as a function of the dynamic properties of the solar wind. In particular, we find that the variability in the IMF vector, rather than variabilities in other parameters (solar wind density, bulk velocity and ion temperature), plays the strongest role in controlling geosynchronous ULF power. We conclude that, although time-averaged bulk properties of the solar wind are a key factor in driving ULF power in the magnetosphere, solar wind variability can be an important contributor as well. This highlights the potential importance of including solar wind variability, especially in studies of ULF wave dynamics, in order to assess the efficiency of solar wind–magnetosphere coupling.
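
The variability-based characterisation of the driver amounts to replacing the windowed mean of the solar wind input with its windowed standard deviation. The sketch below contrasts the two on a synthetic IMF Bz series; the series, window length, and amplitudes are all invented for illustration (a real study would use e.g. OMNI solar wind data):

```python
import numpy as np

# Synthetic 1-min IMF Bz series (nT): slowly varying background plus
# alternating 6-hour blocks of quiet and noisy (bursty) conditions.
rng = np.random.default_rng(7)
minutes = 24 * 60
bz = -2.0 + 1.5 * np.sin(2 * np.pi * np.arange(minutes) / 720)
bz += rng.normal(scale=np.where((np.arange(minutes) // 360) % 2 == 0, 0.3, 2.0))

window = 60  # characterize the driver in 1-hour windows
n_win = minutes // window
bz_win = bz[: n_win * window].reshape(n_win, window)

# Time-averaged driver (traditional) vs. variability-based driver (this study's idea):
bz_mean = bz_win.mean(axis=1)
bz_std = bz_win.std(axis=1, ddof=1)   # standard deviation within each hour

for h in range(0, n_win, 6):
    print(f"hour {h:2d}: <Bz> = {bz_mean[h]:+.2f} nT, sigma(Bz) = {bz_std[h]:.2f} nT")
```

Two windows can share nearly the same mean Bz yet differ sharply in sigma(Bz), which is exactly the distinction the study exploits.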

  13. Financial development and investment market integration: An approach of underlying financial variables & indicators for corporate governance growth empirical approach

    Directory of Open Access Journals (Sweden)

    Vojinovič Borut

    2005-01-01

    Financial development is correlated with several underlying regulatory variables (such as indicators of investor protection, market transparency variables for corporate governance growth, and rules for capital market development), which are under the control of national legislators and EU directives. This paper provides estimates of the relationship between financial market development and corporate growth and assesses the impact of financial market integration on this relationship with reference to European Union (EU) countries. The regression results obtained using this panel support the hypothesis that financial development promotes growth, particularly in industries that are more dependent on external finance. For policy purposes, analyzing changes in these regulatory variables may be a more interesting exercise than analyzing integration of the financial systems themselves. Since assuming that EU countries will raise their regulatory and legal standards to U.S. standards appears unrealistic, we examine a scenario in which EU countries raise their standards to the highest current EU standard.

  14. Factors influencing incidence of acute grade 2 morbidity in conformal and standard radiation treatment of prostate cancer

    International Nuclear Information System (INIS)

    Hanks, Gerald E.; Schultheiss, Timothy E.; Hunt, Margie A.; Epstein, Barry

    1995-01-01

    Purpose: The fundamental hypothesis of conformal radiation therapy is that tumor control can be increased by using conformal treatment techniques that allow a higher tumor dose while maintaining an acceptable level of complications. To test this hypothesis, it is necessary first to estimate the incidence of morbidity for both standard and conformal fields. In this study, we examine factors that influence the incidence of acute grade 2 morbidity in patients treated with conformal and standard radiation treatment for prostate cancer. Methods and Materials: Two hundred and forty-seven consecutive patients treated with conformal technique are combined with and compared to 162 consecutive patients treated with standard techniques. The conformal technique includes special immobilization by a cast, careful identification of the target volume in three dimensions, localization of the inferior border of the prostate using the retrograde urethrogram, and individually shaped portals that conform to the Planning Target Volume (PTV). Univariate analysis compares differences in the incidence of RTOG-EORTC grade 2 acute morbidity by technique, T stage, age, irradiated volume, and dose. Multivariate logistic regression includes these same variables. Results: In nearly all categories, the conformal treatment group experienced significantly fewer acute grade 2 complications than the standard treatment group. Only volume (prostate ± whole pelvis) and technique (conformal vs. standard) were significantly related to incidence of morbidity on multivariate analysis. When dose is treated as a continuous variable (rather than being dichotomized into two levels), a trend is observed on multivariate analysis, but it does not reach statistical significance. The incidence of acute grade 2 morbidity in patients 65 years or older is significantly reduced by use of the conformal technique. Conclusion: The conformal technique is associated with fewer grade 2 acute toxicities for all patients. This

  15. Changes in heart rate variability and QT variability during the first trimester of pregnancy.

    Science.gov (United States)

    Carpenter, R E; D'Silva, L A; Emery, S J; Uzun, O; Rassi, D; Lewis, M J

    2015-03-01

    The risk of new-onset arrhythmia during pregnancy is high, presumably relating to changes in both haemodynamic and cardiac autonomic function. The ability to non-invasively assess an individual's risk of developing arrhythmia during pregnancy would therefore be clinically significant. We aimed to quantify electrocardiographic temporal characteristics during the first trimester of pregnancy and to compare these with non-pregnant controls. Ninety-nine pregnant women and sixty-three non-pregnant women underwent non-invasive cardiovascular and haemodynamic assessment during a protocol consisting of various physiological states (postural manoeuvres, light exercise and metronomic breathing). Variables measured included stroke volume, cardiac output, heart rate, heart rate variability, QT and QT variability and QTVI (a measure of the variability of QT relative to that of RR). Heart rate (p pregnancy only during the supine position (p pregnancy in all physiological states (p pregnancy in all states (p pregnancy is associated with substantial changes in heart rate variability, reflecting a reduction in parasympathetic tone and an increase in sympathetic activity. QTVI shifted to a less favourable value, reflecting a greater than normal amount of QT variability. QTVI appears to be a useful method for quantifying changes in QT variability relative to RR (or heart rate) variability, being sensitive not only to physiological state but also to gestational age. We support the use of non-invasive markers of cardiac electrical variability to evaluate the risk of arrhythmic events in pregnancy, and we recommend the use of multiple physiological states during the assessment protocol.
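
A QT variability index of the kind used here is commonly computed, following Berger and colleagues, as the log10 ratio of QT variance normalized by squared mean QT to heart-rate variance normalized by squared mean heart rate. A minimal sketch with simulated beat-to-beat series (all values illustrative, not from this study):

```python
import numpy as np

def qtvi(qt_ms, hr_bpm):
    """QT variability index (Berger-style): log10 of normalized QT variance
    over normalized heart-rate variance. Less negative = more QT lability."""
    qt_m, qt_v = np.mean(qt_ms), np.var(qt_ms)
    hr_m, hr_v = np.mean(hr_bpm), np.var(hr_bpm)
    return np.log10((qt_v / qt_m**2) / (hr_v / hr_m**2))

# Illustrative beat-to-beat series (values assumed, not from the study):
rng = np.random.default_rng(3)
hr = 75 + 5 * rng.normal(size=300)             # heart rate, bpm
qt_stable = 380 + 3 * rng.normal(size=300)     # QT in ms, low QT variability
qt_labile = 380 + 12 * rng.normal(size=300)    # QT in ms, high QT variability

print(f"QTVI (stable QT): {qtvi(qt_stable, hr):+.2f}")
print(f"QTVI (labile QT): {qtvi(qt_labile, hr):+.2f}")
```

The normalization by the squared means is what makes QTVI a ratio of QT variability relative to heart-rate variability, as the abstract describes.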

  16. 32 CFR 147.19 - The three standards.

    Science.gov (United States)

    2010-07-01

    ...) The investigation standard for “Q” access authorizations and for access to top secret (including top secret Special Access Programs) and Sensitive Compartmented Information; (c) The reinvestigation standard... authorizations and for access to confidential and secret (including all secret-level Special Access Programs not...

  17. NASA software documentation standard software engineering program

    Science.gov (United States)

    1991-01-01

    The NASA Software Documentation Standard (hereinafter referred to as Standard) can be applied to the documentation of all NASA software. This Standard is limited to documentation format and content requirements. It does not mandate specific management, engineering, or assurance standards or techniques. This Standard defines the format and content of documentation for software acquisition, development, and sustaining engineering. Format requirements address where information shall be recorded and content requirements address what information shall be recorded. This Standard provides a framework to allow consistency of documentation across NASA and visibility into the completeness of project documentation. This basic framework consists of four major sections (or volumes). The Management Plan contains all planning and business aspects of a software project, including engineering and assurance planning. The Product Specification contains all technical engineering information, including software requirements and design. The Assurance and Test Procedures contains all technical assurance information, including Test, Quality Assurance (QA), and Verification and Validation (V&V). The Management, Engineering, and Assurance Reports is the library and/or listing of all project reports.

  18. Studies for the elaboration of standards of energy efficiency in constructions; Estudios para la elaboracion de normas de eficiencia energetica en edificaciones

    Energy Technology Data Exchange (ETDEWEB)

    Ramos Niembro, Gaudencio; Heard, Christopher [Instituto de Investigaciones Electricas, Temixco, Morelos (Mexico); Hernandez Pensado, Fernando [Comision Nacional para el Ahorro de Energia (Mexico)

    1999-07-01

    Producing the first drafts of an energy efficiency standard involves diverse research work prior to its elaboration, including the justification and analysis of the rationale for the variables to be considered, the corresponding cost-benefit study that establishes the viability of the standard, and a calculation procedure for verifying compliance. This paper describes the various aspects that were analyzed in the justification, revision and drafting of the first standards for energy efficiency in nonresidential buildings that are expected to be implemented in Mexico. [Spanish] La obtencion de los anteproyectos de norma de eficiencia energetica implica diversos trabajos de investigacion previos a su elaboracion, los cuales incluyen la justificacion y el analisis del porque de las variables a considerar, el estudio costo- beneficio correspondiente, que da viabilidad a la norma y un procedimiento de calculo para su cumplimiento. El presente trabajo relata los diversos aspectos que se analizaron para la justificacion, la revision y la obtencion de los anteproyectos de las primeras normas de eficiencia energetica en edificaciones no residenciales que se esperan implantar en Mexico.

  19. Brown Dwarf Variability: What's Varying and Why?

    Science.gov (United States)

    Marley, Mark Scott

    2014-01-01

    Surveys by ground-based telescopes, HST, and Spitzer have revealed that brown dwarfs of most spectral classes exhibit variability. The spectral and temporal signatures of the variability are complex and apparently defy simplistic classification, which complicates efforts to model the changes. Important questions include understanding whether clearings are forming in an otherwise uniform cloud deck or whether thermal perturbations, perhaps associated with breaking gravity waves, are responsible. If clouds are responsible, how long does it take for the atmospheric thermal profile to relax from a hot cloudy to a cooler cloudless state? If thermal perturbations are responsible, then what atmospheric layers are varying? How do the observed variability timescales compare to atmospheric radiative, chemical, and dynamical timescales? I will address such questions by presenting modeling results for time-varying partly cloudy atmospheres and explore the importance of various atmospheric processes over the relevant timescales for brown dwarfs of a range of effective temperatures. Regardless of the origin of the observed variability, the complexity seen in the atmospheres of the field dwarfs hints at the variability that we may encounter in the next few years in directly imaged young Jupiters. Thus understanding the nature of variability in the field dwarfs, including its sensitivity to gravity and metallicity, is of particular importance for exoplanet characterization.

  20. Adherence to a Standardized Order Form for Gastric Cancer in a Referral Chemotherapy Teaching Hospital, Mashhad, Iran

    Directory of Open Access Journals (Sweden)

    Mitra Asgarian

    2017-09-01

    Background: Standardized forms for prescription and medication administration are one solution to reduce medication errors in the chemotherapy process. Gastric cancer is the most common cancer in Iran. In this study, we attempted to design and validate a standard printed chemotherapy form and evaluate adherence by oncologists and nurses to this form. Methods: We performed this cross-sectional study in a teaching hospital in Mashhad, Iran, from August 2015 until January 2016. A clinical pharmacist designed the chemotherapy form, which included various demographic and clinical parameters and approved chemotherapy regimens for gastric cancer. Clinical oncologists who worked in this center validated the form. We included all eligible patients. A pharmacy student identified adherence by the oncologists and nurses to this form and probable medication errors. Results are mean ± standard deviation or number (percentage) for nominal variables. Data analysis was performed using the SPSS 16.0 statistical package. Results: We evaluated 54 patients and a total of 249 chemotherapy courses. In 146 (58.63%) chemotherapy sessions, the administered regimens lacked compatibility with the standard form. Approximately 66% of recorded errors occurred in the prescription phase and the remainder during the administration phase. The most common errors included improper dose (61%) and wrong infusion time (34%). We observed that 37 dose calculation errors occurred in 32 chemotherapy sessions. Conclusions: In general, adherence by oncologists and nurses to the developed form for chemotherapy treatment of gastric cancer was not acceptable. These findings indicate the necessity of a standardized order sheet to simplify the chemotherapy process for clinicians and reduce prescription and administration errors.

  1. Evolution and Outbursts of Cataclysmic Variables

    Directory of Open Access Journals (Sweden)

    S.-B. Qian

    2015-02-01

    Mass transfer and accretion are very important for understanding the evolution and observational properties of cataclysmic variables (CVs). Due to the lack of an accretion disk, eclipsing profiles of polars are the best source for studying the character of mass transfer in CVs. By analyzing long-term photometric variations in the eclipsing polar HU Aqr, the properties of mass transfer and accretion are investigated. The correlation between the brightness state change and the variation of the ingress profile suggests that both the accretion hot spot and the accretion stream are produced instantaneously. The observations clearly show that the brightness state changes are caused by variations in mass transfer, which is direct evidence of variable mass transfer in a CV. It is shown that the change in the mass transfer rate is caused by local dark-spot activity near the L1 point rather than by the activity cycles of the cool secondary star. Our results suggest that the evolution of CVs is more complex than that predicted by the standard model, and that the effect of variable mass accretion should be considered in nova and dwarf nova outbursts.

  2. NACP Site: Terrestrial Biosphere Model and Aggregated Flux Data in Standard Format

    Data.gov (United States)

    National Aeronautics and Space Administration — This data set provides standardized output variables for gross primary productivity (GPP), net ecosystem exchange (NEE), leaf area index (LAI), ecosystem respiration...

  3. NACP Site: Terrestrial Biosphere Model and Aggregated Flux Data in Standard Format

    Data.gov (United States)

    National Aeronautics and Space Administration — This data set provides standardized output variables for gross primary productivity (GPP), net ecosystem exchange (NEE), leaf area index (LAI), ecosystem...

  4. Effects of central nervous system drugs on driving: speed variability versus standard deviation of lateral position as outcome measure of the on-the-road driving test.

    Science.gov (United States)

    Verster, Joris C; Roth, Thomas

    2014-01-01

    The on-the-road driving test in normal traffic is used to examine the impact of drugs on driving performance. This paper compares the sensitivity of standard deviation of lateral position (SDLP) and SD speed in detecting driving impairment. A literature search was conducted to identify studies applying the on-the-road driving test, examining the effects of anxiolytics, antidepressants, antihistamines, and hypnotics. The proportion of comparisons (treatment versus placebo) where a significant impairment was detected with SDLP and SD speed was compared. About 40% of 53 relevant papers did not report data on SD speed and/or SDLP. After placebo administration, the correlation between SDLP and SD speed was significant but did not explain much variance (r = 0.253, p = 0.0001). A significant correlation was found between ΔSDLP and ΔSD speed (treatment-placebo), explaining 48% of variance. When using SDLP as outcome measure, 67 significant treatment-placebo comparisons were found. Only 17 (25.4%) were significant when SD speed was used as outcome measure. Alternatively, for five treatment-placebo comparisons, a significant difference was found for SD speed but not for SDLP. Standard deviation of lateral position is a more sensitive outcome measure to detect driving impairment than speed variability.
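
SDLP itself is simply the standard deviation of the vehicle's lateral position over the test drive, and SD speed the analogous statistic for speed. The sketch below computes both from simulated samples; the sampling rate, amplitudes, and the size of the "drug" weaving effect are assumptions for illustration, not values from the reviewed studies:

```python
import numpy as np

# Simulated 1-hour highway drive sampled at 4 Hz:
# lateral position in cm relative to lane centre, speed in km/h.
rng = np.random.default_rng(11)
n = 3600 * 4
lat_placebo = 20 * rng.standard_normal(n)            # typical SDLP around 20 cm
lat_drug = lat_placebo + 8 * rng.standard_normal(n)  # added weaving under "drug"
speed = 95 + 2 * rng.standard_normal(n)

sdlp_placebo = lat_placebo.std(ddof=1)
sdlp_drug = lat_drug.std(ddof=1)
sd_speed = speed.std(ddof=1)

# A treatment effect is usually expressed as the SDLP difference vs. placebo:
print(f"SDLP placebo: {sdlp_placebo:.1f} cm, SDLP drug: {sdlp_drug:.1f} cm, "
      f"delta: {sdlp_drug - sdlp_placebo:.1f} cm; SD speed: {sd_speed:.1f} km/h")
```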

  5. A novel synthetic quantification standard including virus and internal report targets: application for the detection and quantification of emerging begomoviruses on tomato.

    Science.gov (United States)

    Péréfarres, Frédéric; Hoareau, Murielle; Chiroleu, Frédéric; Reynaud, Bernard; Dintinger, Jacques; Lett, Jean-Michel

    2011-08-05

    Begomovirus is a genus of phytopathogenic single-stranded DNA viruses, transmitted by the whitefly Bemisia tabaci. This genus includes emerging and economically significant viruses such as those associated with Tomato Yellow Leaf Curl Disease, for which diagnostic tools are needed to prevent dispersion and new introductions. Five real-time PCRs with an internal tomato reporter gene were developed for accurate detection and quantification of monopartite begomoviruses, including two strains of the Tomato yellow leaf curl virus (TYLCV; Mld and IL strains), the Tomato leaf curl Comoros virus-like viruses (ToLCKMV-like viruses) and the two molecules of the bipartite Potato yellow mosaic virus. These diagnostic tools have a unique standard quantification, comprising the targeted viral and internal report amplicons. These duplex real-time PCRs were applied to artificially inoculated plants to monitor and compare their viral development. Real-time PCRs were optimized for accurate detection and quantification over a range of 2 × 10⁹ to 2 × 10³ copies of genomic viral DNA/μL for TYLCV-Mld, TYLCV-IL and PYMV-B and 2 × 10⁸ to 2 × 10³ copies of genomic viral DNA/μL for PYMV-A and ToLCKMV-like viruses. These real-time PCRs were applied to artificially inoculated plants and viral loads were compared at 10, 20 and 30 days post-inoculation. Different patterns of viral accumulation were observed between the bipartite and the monopartite begomoviruses. Interestingly, PYMV accumulated more viral DNA at each date for both genomic components compared to all the monopartite viruses. Also, PYMV reached its highest viral load at 10 dpi contrary to the other viruses (20 dpi). The accumulation kinetics of the two strains of emergent TYLCV differed from the ToLCKMV-like viruses in the higher quantities of viral DNA produced in the early phase of the infection and in the shorter time to reach this peak viral load. To detect and quantify a wide range of begomoviruses, five duplex
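
Absolute quantification against a synthetic standard of this kind typically works by regressing Ct on log10 copy number for a dilution series, then inverting the curve for unknown samples. The copy-number range below matches the range quoted in the abstract, but the Ct values and the quantify helper are illustrative assumptions, not the study's data:

```python
import numpy as np

# Ten-fold dilution series of the synthetic quantification standard
# (copies/uL as in the quoted range; Ct values invented for illustration).
copies_std = np.array([2e9, 2e8, 2e7, 2e6, 2e5, 2e4, 2e3])
ct_std = np.array([10.1, 13.5, 16.8, 20.2, 23.6, 27.0, 30.3])

# Standard curve: Ct = slope * log10(copies) + intercept.
slope, intercept = np.polyfit(np.log10(copies_std), ct_std, 1)
efficiency = 10 ** (-1 / slope) - 1   # amplification efficiency implied by the slope

def quantify(ct):
    """Convert a sample Ct to genomic viral DNA copies/uL via the standard curve."""
    return 10 ** ((ct - intercept) / slope)

print(f"slope = {slope:.2f}, efficiency = {efficiency:.1%}")
print(f"Ct 18.5 -> {quantify(18.5):.2e} copies/uL")
```

A slope near -3.32 corresponds to 100% amplification efficiency, which is why qPCR validations report the slope alongside the dynamic range.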

  6. A novel synthetic quantification standard including virus and internal report targets: application for the detection and quantification of emerging begomoviruses on tomato

    Directory of Open Access Journals (Sweden)

    Lett Jean-Michel

    2011-08-01

    Background: Begomovirus is a genus of phytopathogenic single-stranded DNA viruses, transmitted by the whitefly Bemisia tabaci. This genus includes emerging and economically significant viruses such as those associated with Tomato Yellow Leaf Curl Disease, for which diagnostic tools are needed to prevent dispersion and new introductions. Five real-time PCRs with an internal tomato reporter gene were developed for accurate detection and quantification of monopartite begomoviruses, including two strains of the Tomato yellow leaf curl virus (TYLCV; Mld and IL strains), the Tomato leaf curl Comoros virus-like viruses (ToLCKMV-like viruses) and the two molecules of the bipartite Potato yellow mosaic virus. These diagnostic tools have a unique standard quantification, comprising the targeted viral and internal report amplicons. These duplex real-time PCRs were applied to artificially inoculated plants to monitor and compare their viral development. Results: Real-time PCRs were optimized for accurate detection and quantification over a range of 2 × 10⁹ to 2 × 10³ copies of genomic viral DNA/μL for TYLCV-Mld, TYLCV-IL and PYMV-B and 2 × 10⁸ to 2 × 10³ copies of genomic viral DNA/μL for PYMV-A and ToLCKMV-like viruses. These real-time PCRs were applied to artificially inoculated plants and viral loads were compared at 10, 20 and 30 days post-inoculation. Different patterns of viral accumulation were observed between the bipartite and the monopartite begomoviruses. Interestingly, PYMV accumulated more viral DNA at each date for both genomic components compared to all the monopartite viruses. Also, PYMV reached its highest viral load at 10 dpi contrary to the other viruses (20 dpi). The accumulation kinetics of the two strains of emergent TYLCV differed from the ToLCKMV-like viruses in the higher quantities of viral DNA produced in the early phase of the infection and in the shorter time to reach this peak viral load. Conclusions: To detect and

  7. MR imaging of the articular cartilage of the knee with arthroscopy as gold standard: assessment of methodological quality of clinical studies

    International Nuclear Information System (INIS)

    Duchateau, Florence; Berg, Bruno C. vande

    2002-01-01

    The purpose of this study was to assess the methodological quality of articles addressing the value of MR imaging of the knee cartilage with arthroscopy as a standard. Relevant papers were selected after Medline review (MEDLINE database search including the terms "cartilage", "knee", "MR" and "arthroscopy"). Two observers independently reviewed 29 selected articles to determine how each study had met 15 individual standards that had been previously developed to assess the methodological quality of clinical investigations. The following criteria were met in a variable percentage of articles: adequate definition of purpose (100%), statistical analysis (90%), avoidance of verification bias (86%), patient population description (83%), reference standard (79%), review bias (79%), study design (66%), inclusion criteria (41%), method of analysis (41.5%), avoidance of diagnostic-review bias (24%), exclusion criteria (21%), indeterminate examination results (17%), analysis criteria (14%), interobserver reliability (14%) and intraobserver reliability (7%). The assessment of the methodological quality of clinical investigations addressing the value of MR imaging in the evaluation of the articular cartilage of the knee with arthroscopy as the standard of reference demonstrated that several standards were rarely met in the literature. Efforts should be made to rely on clearly defined lesion criteria and to determine the reliability of the observations. (orig.)

  8. Standard deviation and standard error of the mean.

    Science.gov (United States)

    Lee, Dong Kyu; In, Junyong; Lee, Sangseok

    2015-06-01

    In most clinical and experimental studies, the standard deviation (SD) and the estimated standard error of the mean (SEM) are used to present the characteristics of sample data and to explain statistical analysis results. However, some authors occasionally muddle the distinction between the SD and the SEM in the medical literature. Because the processes of calculating the SD and the SEM involve different statistical inferences, each has its own meaning. The SD describes the dispersion of data in a normal distribution; in other words, it indicates how accurately the mean represents the sample data. The meaning of the SEM, however, includes statistical inference based on the sampling distribution: the SEM is the SD of the theoretical distribution of the sample means (the sampling distribution). While either the SD or the SEM can be applied to describe data and statistical results, one should be aware of the appropriate use of each. We aim to elucidate the distinctions between the SD and the SEM and to provide proper usage guidelines for summarizing data and describing statistical results.
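
The distinction is easy to see in code. A minimal sketch in plain Python (no external libraries; the sample values are invented for illustration):

```python
import math

def sd_and_sem(sample):
    """Return the sample standard deviation (SD) and the standard
    error of the mean (SEM) for a list of observations."""
    n = len(sample)
    mean = sum(sample) / n
    # SD: dispersion of the observations around the sample mean
    # (n - 1 denominator gives the unbiased sample variance).
    sd = math.sqrt(sum((x - mean) ** 2 for x in sample) / (n - 1))
    # SEM: SD of the sampling distribution of the mean; it shrinks
    # as the sample grows, while the SD does not.
    sem = sd / math.sqrt(n)
    return sd, sem

sd, sem = sd_and_sem([4.0, 5.0, 6.0, 5.0])  # sd ≈ 0.816, sem ≈ 0.408
```

Reporting mean ± SD describes the data; mean ± SEM describes the precision of the mean estimate, which is why the two should not be interchanged.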

  9. Implementation of the Brazilian primary standard for x-rays

    International Nuclear Information System (INIS)

    Peixoto, J.G.P.; Almeida, C.E.V. de

    2002-01-01

    In the field of ionizing radiation metrology, a primary standard of a given physical quantity is essentially an experimental set-up which allows one to attribute a numerical value to a particular sample of that quantity in terms of a unit given by an abstract definition. The absolute measurement of the radiation quantity air kerma is performed with a free-air ionization chamber. A great deal of research into this absolute measurement has resulted in different designs for primary standard free-air ionization chambers, such as cylindrical or plane-parallel chambers. The implementation of primary standard dosimetry with free-air ionization chambers is limited to the National Metrology Institutes - NMIs. Since 1975, the Bureau International des Poids et Mesures - BIPM has been conducting comparisons of NMIs' primary free-air standard chambers in the medium-energy x-ray range. These comparisons are carried out indirectly through the calibration, at both the BIPM and the NMI, of one or more transfer ionization chambers at a series of four reference radiation qualities. The scientific work programme of the National Laboratory for Ionizing Radiation Metrology - LNMRI of the Institute of Radioprotection and Dosimetry - IRD, which belongs to the National Commission of Nuclear Energy - CNEN, includes the establishment of a primary standard for x-rays in the medium-energy x-ray range. This activity is justified by the need to periodically calibrate the Brazilian network of secondary standards without losing measurement quality. The LNMRI decided to implement four reference radiation qualities, establishing the use of a transfer chamber calibrated at the BIPM, and to implement primary standard dosimetry using a free-air ionization chamber with variable volume, made by Victoreen, model 480. Parameters related to the measurement of the quantity air kerma were evaluated, such as: air absorption, scattering inside the ionization chamber, saturation, beam

  10. IAEA Safety Standards

    International Nuclear Information System (INIS)

    2016-09-01

    The IAEA Safety Standards Series comprises publications of a regulatory nature covering nuclear safety, radiation protection, radioactive waste management, the transport of radioactive material, the safety of nuclear fuel cycle facilities and management systems. These publications are issued under the terms of Article III of the IAEA’s Statute, which authorizes the IAEA to establish “standards of safety for protection of health and minimization of danger to life and property”. Safety standards are categorized into: • Safety Fundamentals, stating the basic objective, concepts and principles of safety; • Safety Requirements, establishing the requirements that must be fulfilled to ensure safety; and • Safety Guides, recommending measures for complying with these requirements for safety. For numbering purposes, the IAEA Safety Standards Series is subdivided into General Safety Requirements and General Safety Guides (GSR and GSG), which are applicable to all types of facilities and activities, and Specific Safety Requirements and Specific Safety Guides (SSR and SSG), which are for application in particular thematic areas. This booklet lists all current IAEA Safety Standards, including those forthcoming.

  11. SOFG: Standards requirements

    International Nuclear Information System (INIS)

    Gerganov, T.; Grigorov, S.; Kozhukharov, V.; Brashkova, N.

    2005-01-01

    It is well known that solid oxide fuel cells will have industrial applications in the near future. In this context, the standardization of SOFC materials and SOFC systems is a high priority. In the present study, attention is focused on methods for the physical and chemical characterization of the materials used to fabricate SOFC components and on requirements for tests of single SOFC cells. The status of the CEN, ISO, ASTM (ANSI, ASSN) and JIS classes of standards has been verified. Standards regarding test methods for the physical-chemical characterization of vitreous materials (as the sealing SOFC component), ceramic materials (as electrode and electrolyte components, including alternative materials used) and metallic materials (interconnect components) are the subject of this overview. It is established that electrical, mechanical, surface and interfacial phenomena, chemical durability and thermal corrosion behaviour are the key areas for standardization of the materials for SOFC components.

  12. The effect of short-range spatial variability on soil sampling uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Perk, Marcel van der [Department of Physical Geography, Utrecht University, P.O. Box 80115, 3508 TC Utrecht (Netherlands)], E-mail: m.vanderperk@geo.uu.nl; De Zorzi, Paolo; Barbizzi, Sabrina; Belli, Maria [Agenzia per la Protezione dell' Ambiente e per i Servizi Tecnici (APAT), Servizio Laboratori, Misure ed Attivita di Campo, Via di Castel Romano, 100-00128 Roma (Italy); Fajgelj, Ales; Sansone, Umberto [International Atomic Energy Agency (IAEA), Agency' s Laboratories Seibersdorf, A-1400 Vienna (Austria); Jeran, Zvonka; Jacimovic, Radojko [Jozef Stefan Institute, Jamova 39, 1000 Ljubljana (Slovenia)

    2008-11-15

    This paper aims to quantify the soil sampling uncertainty arising from the short-range spatial variability of elemental concentrations in the topsoils of agricultural, semi-natural, and contaminated environments. For the agricultural site, the relative standard sampling uncertainty ranges between 1% and 5.5%. For the semi-natural area, the sampling uncertainties are 2-4 times larger than in the agricultural area. The contaminated site exhibited significant short-range spatial variability in elemental composition, which resulted in sampling uncertainties of 20-30%.

  13. The effect of short-range spatial variability on soil sampling uncertainty.

    Science.gov (United States)

    Van der Perk, Marcel; de Zorzi, Paolo; Barbizzi, Sabrina; Belli, Maria; Fajgelj, Ales; Sansone, Umberto; Jeran, Zvonka; Jaćimović, Radojko

    2008-11-01

    This paper aims to quantify the soil sampling uncertainty arising from the short-range spatial variability of elemental concentrations in the topsoils of agricultural, semi-natural, and contaminated environments. For the agricultural site, the relative standard sampling uncertainty ranges between 1% and 5.5%. For the semi-natural area, the sampling uncertainties are 2-4 times larger than in the agricultural area. The contaminated site exhibited significant short-range spatial variability in elemental composition, which resulted in sampling uncertainties of 20-30%.
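
One common way to estimate a relative standard sampling uncertainty of the kind reported above is from duplicate field samples taken at the same locations. The paired-duplicate estimator below is a generic sketch under that assumption, not necessarily the estimator used in the study, and the concentration values are invented:

```python
import math

def relative_sampling_uncertainty(duplicate_pairs):
    """Relative standard sampling uncertainty (%) estimated from
    duplicate field samples taken at the same locations.
    duplicate_pairs: list of (sample_1, sample_2) concentrations."""
    n = len(duplicate_pairs)
    # Paired-duplicate estimate of the sampling standard deviation:
    # s = sqrt(sum(d_i^2) / (2 n)), with d_i the within-pair difference.
    s = math.sqrt(sum((a - b) ** 2 for a, b in duplicate_pairs) / (2 * n))
    overall_mean = sum(a + b for a, b in duplicate_pairs) / (2 * n)
    return 100.0 * s / overall_mean

# Hypothetical duplicate concentrations (e.g., mg/kg) at three locations.
u = relative_sampling_uncertainty([(10.2, 10.0), (9.8, 10.1), (10.3, 9.9)])
```

Larger within-pair differences, as on the contaminated site, drive the relative uncertainty from a few percent up toward the 20-30% range reported.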

  14. Climatological variability in regional air pollution

    International Nuclear Information System (INIS)

    Shannon, J.D.; Trexler, E.C. Jr.

    1995-01-01

    Although some air pollution modeling studies examine events that have already occurred (e.g., the Chernobyl plume) with relevant meteorological conditions largely known, most pollution modeling studies address expected or potential scenarios for the future. Future meteorological conditions, the major pollutant forcing function other than emissions, are inherently uncertain although much relevant information is contained in past observational data. For convenience in our discussions of regional pollutant variability unrelated to emission changes, we define meteorological variability as short-term (within-season) pollutant variability and climatological variability as year-to-year changes in seasonal averages and accumulations of pollutant variables. In observations and in some of our simulations the effects are confounded because for seasons of two different years both the mean and the within-season character of a pollutant variable may change. Effects of climatological and meteorological variability on means and distributions of air pollution parameters, particularly those related to regional visibility, are illustrated. Over periods of up to a decade climatological variability may mask or overstate improvements resulting from emission controls. The importance of including climatological uncertainties in assessing potential policies, particularly when based partly on calculated source-receptor relationships, is highlighted.

  15. The Effect of Chicken Extract on Mood, Cognition and Heart Rate Variability

    Directory of Open Access Journals (Sweden)

    Hayley Young

    2015-01-01

    Full Text Available Chicken extract, which is rich in anserine and carnosine, has been widely taken in Asian countries as a traditional remedy with various aims, including attenuation of psychological fatigue. The effects of consuming BRAND’S Essence of Chicken (EOC) or a placebo on 46 young adults’ responses to a standard psychological “stressor” were considered. Heart rate variability (HRV), cortisol responses, mood and cognition were measured at baseline and after ten days of supplementation. EOC resulted in participants feeling less anxious, depressed and confused, and more agreeable and clearheaded. A decrease in HRV was observed after EOC, but only in females. Cognition and cortisol levels were not influenced by EOC. Findings suggest that EOC may be a promising supplement to improve mood in a healthy population.

  16. Identifying populations sensitive to environmental chemicals by simulating toxicokinetic variability.

    Science.gov (United States)

    Ring, Caroline L; Pearce, Robert G; Setzer, R Woodrow; Wetmore, Barbara A; Wambaugh, John F

    2017-09-01

    The thousands of chemicals present in the environment (USGAO, 2013) must be triaged to identify priority chemicals for human health risk research. Most chemicals have little of the toxicokinetic (TK) data that are necessary for relating exposures to tissue concentrations that are believed to be toxic. Ongoing efforts have collected limited, in vitro TK data for a few hundred chemicals. These data have been combined with biomonitoring data to estimate an approximate margin between potential hazard and exposure. The most "at risk" 95th percentile of adults have been identified from simulated populations that are generated either using standard "average" adult human parameters or very specific cohorts such as Northern Europeans. To better reflect the modern U.S. population, we developed a population simulation using physiologies based on distributions of demographic and anthropometric quantities from the most recent U.S. Centers for Disease Control and Prevention National Health and Nutrition Examination Survey (NHANES) data. This allowed incorporation of inter-individual variability, including variability across relevant demographic subgroups. Variability was analyzed with a Monte Carlo approach that accounted for the correlation structure in physiological parameters. To identify portions of the U.S. population that are more at risk for specific chemicals, physiologic variability was incorporated within an open-source high-throughput (HT) TK modeling framework. We prioritized 50 chemicals based on estimates of both potential hazard and exposure. Potential hazard was estimated from in vitro HT screening assays (i.e., the Tox21 and ToxCast programs). Bioactive in vitro concentrations were extrapolated to doses that produce equivalent concentrations in body tissues using a reverse dosimetry approach in which generic TK models are parameterized with: 1) chemical-specific parameters derived from in vitro measurements and predicted from chemical structure; and 2) with
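
Monte Carlo sampling that preserves a correlation structure between physiological parameters can be sketched with a Cholesky factor of the correlation matrix. The parameters, moments, and correlation below are invented for illustration; they are not NHANES-derived values and this is not the study's modeling framework:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative physiological parameters (NOT NHANES values): body
# weight (kg) and liver blood flow (L/h), with an assumed correlation.
means = np.array([70.0, 90.0])
sds = np.array([15.0, 20.0])
corr = np.array([[1.0, 0.6],
                 [0.6, 1.0]])

# Correlated draws via the Cholesky factor of the correlation matrix:
# if z ~ N(0, I), then z @ L.T ~ N(0, corr).
L = np.linalg.cholesky(corr)
z = rng.standard_normal((10_000, 2))
samples = means + (z @ L.T) * sds

# Upper tail of the simulated population, e.g. toward identifying
# the most "at risk" 95th percentile.
p95 = np.percentile(samples, 95, axis=0)
```

The same construction extends to many parameters at once, which is what allows demographic subgroups with distinct parameter distributions to be simulated and compared.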

  17. Use of Standard Deviations as Predictors in Models Using Large-Scale International Data Sets

    Science.gov (United States)

    Austin, Bruce; French, Brian; Adesope, Olusola; Gotch, Chad

    2017-01-01

    Measures of variability are successfully used in predictive modeling in research areas outside of education. This study examined how standard deviations can be used to address research questions not easily addressed using traditional measures such as group means based on index variables. Student survey data were obtained from the Organisation for…

  18. The role of food standards in development

    DEFF Research Database (Denmark)

    Trifkovic, Neda

    The thesis consists of three papers based on the original data collected through fieldwork in Mekong River Delta, Vietnam. It is focused on understanding the implications of modern agri-food sector restructuring for farmers in developing countries. The thesis particularly looks at (i) the impact...... — for Middle-Class Farmers, joint with Henrik Hansen, estimates the impact of food standards on farmers’ wellbeing using the data from the Vietnamese pangasius sector. In this paper we estimate both the average effect as well as the effects on poorer and richer farmers using the instrumental variable quantile...... regression. We find that large returns from food standards are possible but the gains are substantial only for the ‘middle-class’ farmers, occupying the range between 50% and 85% quantiles of the expenditure distribution. Overall, this result points to an exclusionary impact of food standards for the poorest...

  19. Robust Confidence Interval for a Ratio of Standard Deviations

    Science.gov (United States)

    Bonett, Douglas G.

    2006-01-01

    Comparing variability of test scores across alternate forms, test conditions, or subpopulations is a fundamental problem in psychometrics. A confidence interval for a ratio of standard deviations is proposed that performs as well as the classic method with normal distributions and performs dramatically better with nonnormal distributions. A simple…
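
For context, the classic normal-theory interval that the proposed robust method improves upon follows directly from the F distribution of the sample variance ratio. The sketch below shows only that baseline (Bonett's robust kurtosis adjustment is not reproduced here), and the function name is illustrative:

```python
import numpy as np
from scipy import stats

def classic_sd_ratio_ci(x, y, alpha=0.05):
    """Classic normal-theory confidence interval for sd(x)/sd(y),
    based on the F distribution of the sample variance ratio.
    Performs poorly when the data are not normal."""
    nx, ny = len(x), len(y)
    ratio_sq = np.var(x, ddof=1) / np.var(y, ddof=1)
    lower = ratio_sq / stats.f.ppf(1 - alpha / 2, nx - 1, ny - 1)
    upper = ratio_sq / stats.f.ppf(alpha / 2, nx - 1, ny - 1)
    return float(np.sqrt(lower)), float(np.sqrt(upper))
```

The sensitivity of this interval to nonnormal kurtosis is exactly what motivates the robust alternative described in the abstract.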

  20. Designing neural networks that process mean values of random variables

    International Nuclear Information System (INIS)

    Barber, Michael J.; Clark, John W.

    2014-01-01

    We develop a class of neural networks derived from probabilistic models posed in the form of Bayesian networks. Making biologically and technically plausible assumptions about the nature of the probabilistic models to be represented in the networks, we derive neural networks exhibiting standard dynamics that require no training to determine the synaptic weights, that perform accurate calculation of the mean values of the relevant random variables, that can pool multiple sources of evidence, and that deal appropriately with ambivalent, inconsistent, or contradictory evidence. - Highlights: • High-level neural computations are specified by Bayesian belief networks of random variables. • Probability densities of random variables are encoded in activities of populations of neurons. • Top-down algorithm generates specific neural network implementation of given computation. • Resulting “neural belief networks” process mean values of random variables. • Such networks pool multiple sources of evidence and deal properly with inconsistent evidence

  1. Designing neural networks that process mean values of random variables

    Energy Technology Data Exchange (ETDEWEB)

    Barber, Michael J. [AIT Austrian Institute of Technology, Innovation Systems Department, 1220 Vienna (Austria); Clark, John W. [Department of Physics and McDonnell Center for the Space Sciences, Washington University, St. Louis, MO 63130 (United States); Centro de Ciências Matemáticas, Universidade de Madeira, 9000-390 Funchal (Portugal)

    2014-06-13

    We develop a class of neural networks derived from probabilistic models posed in the form of Bayesian networks. Making biologically and technically plausible assumptions about the nature of the probabilistic models to be represented in the networks, we derive neural networks exhibiting standard dynamics that require no training to determine the synaptic weights, that perform accurate calculation of the mean values of the relevant random variables, that can pool multiple sources of evidence, and that deal appropriately with ambivalent, inconsistent, or contradictory evidence. - Highlights: • High-level neural computations are specified by Bayesian belief networks of random variables. • Probability densities of random variables are encoded in activities of populations of neurons. • Top-down algorithm generates specific neural network implementation of given computation. • Resulting “neural belief networks” process mean values of random variables. • Such networks pool multiple sources of evidence and deal properly with inconsistent evidence.

  2. Short- and long-term variations in non-linear dynamics of heart rate variability

    DEFF Research Database (Denmark)

    Kanters, J K; Højgaard, M V; Agner, E

    1996-01-01

    OBJECTIVES: The purpose of the study was to investigate the short- and long-term variations in the non-linear dynamics of heart rate variability, and to determine the relationships between conventional time and frequency domain methods and the newer non-linear methods of characterizing heart rate...... rate and describes mainly linear correlations. Non-linear predictability is correlated with heart rate variability measured as the standard deviation of the R-R intervals and the respiratory activity expressed as power of the high-frequency band. The dynamics of heart rate variability changes suddenly...
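
The time-domain measure mentioned above, the standard deviation of the R-R intervals (SDNN), and its short-term companion RMSSD can be computed directly from an interval series. This is a generic sketch with invented interval values, not the study's analysis code:

```python
import math

def hrv_time_domain(rr_ms):
    """SDNN (standard deviation of all R-R intervals) and RMSSD (root
    mean square of successive differences), both in milliseconds."""
    n = len(rr_ms)
    mean_rr = sum(rr_ms) / n
    sdnn = math.sqrt(sum((rr - mean_rr) ** 2 for rr in rr_ms) / (n - 1))
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    rmssd = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    return sdnn, rmssd

sdnn, rmssd = hrv_time_domain([800, 810, 790, 805, 795])
```

SDNN captures overall variability, while RMSSD emphasizes beat-to-beat (high-frequency, largely respiratory) variation, which is why the two correlate with different spectral bands.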

  3. Variably insulating portable heater/cooler

    Science.gov (United States)

    Potter, T.F.

    1998-09-29

    A compact vacuum insulation panel is described comprising a chamber enclosed by two sheets of metal, glass-like spacers disposed in the chamber between the sidewalls, and a high-grade vacuum in the chamber, and includes apparatus and methods for enabling and disabling, or turning "on" and "off", the thermal insulating capability of the panel. One type of enabling and disabling apparatus and method includes a metal hydride for releasing hydrogen gas into the chamber in response to heat, and a hydrogen grate between the metal hydride and the chamber for selectively preventing and allowing return of the hydrogen gas to the metal hydride. Another type of enabling and disabling apparatus and method includes a variable-emissivity coating on the sheets of metal in which the emissivity is controllably variable by heat or electricity. Still another type of enabling and disabling apparatus and method includes metal-to-metal contact devices that can be actuated to establish or break metal-to-metal heat paths, or thermal short circuits, between the metal sidewalls. 25 figs.

  4. South African address standard and initiatives towards an international address standard

    CSIR Research Space (South Africa)

    Cooper, Antony K

    2008-10-01

    Full Text Available ; visiting friends; and providing a reference context for presenting other information. The benefits of an international address standard include: enabling address interoperability across boundaries; reducing service delivery costs; enabling development...

  5. Current situation of International Organization for Standardization/Technical Committee 249 international standards of traditional Chinese medicine.

    Science.gov (United States)

    Liu, Yu-Qi; Wang, Yue-Xi; Shi, Nan-Nan; Han, Xue-Jie; Lu, Ai-Ping

    2017-05-01

    To review the current situation and progress of traditional Chinese medicine (TCM) international standards, standard projects and proposals in the International Organization for Standardization (ISO)/Technical Committee (TC) 249. ISO/TC 249 standards and standard projects on the ISO website were searched, and information on new standard proposals was collected from the ISO/TC 249 National Mirror Committee in China. All the available data were then summarized under 5 closely related items: proposed time, proposing country, assigned working group (WG), current stage and classification. In ISO/TC 249, there were 2 international standards, 18 standard projects and 24 new standard proposals proposed in 2014. These 44 standard subjects have increased year by year since 2011. Twenty-nine of them were proposed by China, 15 were assigned to WG 4, 36 were in the preliminary and preparatory stages, and 8 were categorized into 4 fields, 7 groups and sub-groups based on the International Classification for Standards. A rapid and steady development of international standardization in TCM can be observed in ISO/TC 249.

  6. Variability of critical frequency and M(3000)F2 at Tucuman and San Juan

    International Nuclear Information System (INIS)

    Ezquer, R.G.; Mosert, M.; Corbella, R.J.

    2002-01-01

    The variability of the M(3000)F2 factor and the critical frequencies of the E and F2 ionospheric regions over two Argentine stations under middle solar activity conditions is studied. To this end, different parameters are used to specify variability, namely: the standard deviation and the differences between the median and the lower and upper quartiles. The results show low variability for foE and the M(3000)F2 factor at both stations for equinoxes and solstices; the coefficients of variability are lower than 10% for foE and the M(3000)F2 factor. The highest variability was observed for foF2: in general, the foF2 coefficient of variability ranges between 0 and 30% at both stations. (author)
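
Variability parameters of the kind listed above can be sketched as follows. This is an illustration with invented values; `statistics.quantiles` with its default exclusive method stands in for whatever quartile definition the authors used:

```python
import statistics

def variability_coefficients(values):
    """Simple variability measures for an ionospheric parameter: the
    coefficient of variation (SD as % of the mean) and the quartile
    spreads (median - Q1 and Q3 - median)."""
    mean = statistics.mean(values)
    cv = 100.0 * statistics.stdev(values) / mean
    q1, median, q3 = statistics.quantiles(values, n=4)
    return cv, median - q1, q3 - median

cv, low_spread, high_spread = variability_coefficients([4, 5, 6, 7, 8, 9, 10])
```

The quartile-based spreads are less sensitive to occasional extreme values than the standard deviation, which is why both kinds of measure are reported together.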

  7. Beyond the standard model; Au-dela du modele standard

    Energy Technology Data Exchange (ETDEWEB)

    Cuypers, F. [Paul Scherrer Inst. (PSI), Villigen (Switzerland)

    1997-05-01

    These lecture notes are intended as a pedagogical introduction to several popular extensions of the standard model of strong and electroweak interactions. The topics include the Higgs sector, the left-right symmetric model, grand unification and supersymmetry. Phenomenological consequences and search procedures are emphasized. (author) figs., tabs., 18 refs.

  8. Linear variable voltage diode capacitor and adaptive matching networks

    NARCIS (Netherlands)

    Larson, L.E.; De Vreede, L.C.N.

    2006-01-01

    An integrated variable voltage diode capacitor topology applied to a circuit providing a variable voltage load for controlling variable capacitance. The topology includes a first pair of anti-series varactor diodes, wherein the diode power-law exponent n for the first pair of anti-series varactor

  9. Impact of region contouring variability on image-based focal therapy evaluation

    Science.gov (United States)

    Gibson, Eli; Donaldson, Ian A.; Shah, Taimur T.; Hu, Yipeng; Ahmed, Hashim U.; Barratt, Dean C.

    2016-03-01

    Motivation: Focal therapy is an emerging low-morbidity treatment option for low-intermediate risk prostate cancer; however, challenges remain in accurately delivering treatment to specified targets and determining treatment success. Registered multi-parametric magnetic resonance imaging (MPMRI) acquired before and after treatment can support focal therapy evaluation and optimization; however, contouring variability, when defining the prostate, the clinical target volume (CTV) and the ablation region in images, reduces the precision of quantitative image-based focal therapy evaluation metrics. To inform the interpretation and clarify the limitations of such metrics, we investigated inter-observer contouring variability and its impact on four metrics. Methods: Pre-therapy and 2-week-post-therapy standard-of-care MPMRI were acquired from 5 focal cryotherapy patients. Two clinicians independently contoured, on each slice, the prostate (pre- and post-treatment) and the dominant index lesion CTV (pre-treatment) in the T2-weighted MRI, and the ablated region (post-treatment) in the dynamic-contrast-enhanced MRI. For each combination of clinician contours, post-treatment images were registered to pre-treatment images using a 3D biomechanical-model-based registration of prostate surfaces, and four metrics were computed: the proportion of the target tissue region that was ablated and the target:ablated region volume ratio for each of two targets (the CTV and an expanded planning target volume). Variance components analysis was used to measure the contribution of each type of contour to the variance in the therapy evaluation metrics. Conclusions: 14-23% of evaluation metric variance was attributable to contouring variability (including 6-12% from ablation region contouring); reducing this variability could improve the precision of focal therapy evaluation metrics.
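
Once the images are registered, the two overlap metrics named above reduce to simple operations on voxel masks. The function below is a hypothetical illustration on boolean arrays, not the study's implementation:

```python
import numpy as np

def therapy_metrics(target_mask, ablated_mask):
    """Proportion of the target region that was ablated, and the
    target:ablated volume ratio, from boolean voxel masks of equal
    shape (voxel volume cancels out of both metrics)."""
    target = np.asarray(target_mask, dtype=bool)
    ablated = np.asarray(ablated_mask, dtype=bool)
    proportion_ablated = (target & ablated).sum() / target.sum()
    volume_ratio = target.sum() / ablated.sum()
    return float(proportion_ablated), float(volume_ratio)
```

Because both metrics depend on the contoured boundaries voxel by voxel, inter-observer contour differences propagate directly into them, which is what the variance components analysis quantifies.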

  10. An ArcGIS approach to include tectonic structures in point data regionalization.

    Science.gov (United States)

    Darsow, Andreas; Schafmeister, Maria-Theresia; Hofmann, Thilo

    2009-01-01

    Point data derived from drilling logs must often be regionalized. However, aquifers may show discontinuous surface structures, such as the offset of an aquitard caused by tectonic faults. One main challenge has been to incorporate these structures into the regionalization process of point data. We combined ordinary kriging and inverse distance weighted (IDW) interpolation to account for neotectonic structures in the regionalization process. The study area chosen to test this approach is the largest porous aquifer in Austria. It consists of three basins formed by neotectonic events and delimited by steep faults with a vertical offset of the aquitard up to 70 m within very short distances. First, ordinary kriging was used to incorporate the characteristic spatial variability of the aquitard location by means of a variogram. The tectonic faults could be included into the regionalization process by using breaklines with buffer zones. All data points inside the buffer were deleted. Last, IDW was performed, resulting in an aquitard map representing the discontinuous surface structures. This approach enables one to account for such surfaces using the standard software package ArcGIS; therefore, it could be adopted in many practical applications.
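
Of the steps above, the final IDW interpolation can be sketched in a few lines; the kriging and breakline-buffer stages are ArcGIS operations not reproduced here, and this minimal version is an illustration rather than the ArcGIS algorithm:

```python
import numpy as np

def idw(points, values, query, power=2.0):
    """Minimal inverse-distance-weighted interpolation at one query
    location. points: (n, 2) array of x/y coordinates; values: (n,)."""
    points = np.asarray(points, dtype=float)
    values = np.asarray(values, dtype=float)
    d = np.linalg.norm(points - np.asarray(query, dtype=float), axis=1)
    if np.any(d == 0):                  # query coincides with a data point
        return float(values[np.argmin(d)])
    w = d ** -power                     # nearer points weigh more
    return float(np.sum(w * values) / np.sum(w))
```

Deleting data points inside the fault buffers before this step is what lets the interpolated aquitard surface remain discontinuous across the breaklines instead of being smoothed over them.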

  11. Comparison of seasonal variability in European domestic radon measurements

    Science.gov (United States)

    Groves-Kirkby, C. J.; Denman, A. R.; Phillips, P. S.; Crockett, R. G. M.; Sinclair, J. M.

    2010-03-01

    Analysis of published data characterising seasonal variability of domestic radon concentrations in Europe and elsewhere shows significant variability between different countries and between regions where regional data is available. Comparison is facilitated by application of the Gini Coefficient methodology to reported seasonal variation data. Overall, radon-rich sedimentary strata, particularly high-porosity limestones, exhibit high seasonal variation, while radon-rich igneous lithologies demonstrate relatively constant, but somewhat higher, radon concentrations. High-variability regions include the Pennines and South Downs in England, Languedoc and Brittany in France, and especially Switzerland. Low-variability high-radon regions include the granite-rich Cornwall/Devon peninsula in England, and Auvergne and Ardennes in France, all components of the Devonian-Carboniferous Hercynian belt.
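
A Gini coefficient of the kind applied here can be computed from, e.g., monthly mean radon concentrations. The normalization below follows the common mean-absolute-difference definition, which may differ in detail from the authors' implementation:

```python
import numpy as np

def gini(values):
    """Gini coefficient of non-negative values: 0 for a perfectly even
    series, approaching 1 when concentrated in a few values."""
    x = np.sort(np.asarray(values, dtype=float))
    n = x.size
    cum = np.cumsum(x)
    # Equivalent to the mean-absolute-difference definition.
    return float((n + 1 - 2 * np.sum(cum) / cum[-1]) / n)
```

A site with near-constant monthly radon (the igneous-lithology pattern) scores near 0, while strong winter/summer contrast (the porous-limestone pattern) pushes the coefficient up.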

  12. Development of a quantitative risk standard

    International Nuclear Information System (INIS)

    Temme, M.I.

    1982-01-01

    IEEE Working Group SC-5.4 is developing a quantitative risk standard for LWR plant design and operation. The paper describes the Working Group's conclusions on significant issues, including the scope of the standard, the need to define the process (i.e., PRA calculation) for meeting risk criteria, the need for PRA quality requirements and the importance of distinguishing standards from goals. The paper also describes the Working Group's approach to writing this standard.

  13. Avatar Embodiment. Towards a Standardized Questionnaire

    Directory of Open Access Journals (Sweden)

    Mar Gonzalez-Franco

    2018-06-01

    Full Text Available Inside virtual reality, users can embody avatars that are collocated from a first-person perspective. When doing so, participants have the feeling that their own body has been substituted by the self-avatar, and that the new body is the source of the sensations. Embodiment is complex as it includes not only body ownership over the avatar, but also agency, co-location, and external appearance. Despite the multiple variables that influence it, the illusion is quite robust, and it can be produced even if the self-avatar is of a different age, size, gender, or race from the participant's own body. Embodiment illusions are therefore the basis for many social VR experiences and a current active research area in the community. Researchers are interested both in the body manipulations that can be accepted and in how different self-avatars produce different attitudinal, social, perceptual, and behavioral effects. However, findings suggest that despite embodiment being strongly associated with the performance and reactions inside virtual reality, the extent to which the illusion is experienced varies between participants. In this paper, we review the questionnaires used in past experiments and propose a standardized embodiment questionnaire based on 25 questions that are prevalent in the literature. We encourage future virtual reality experiments that include first-person virtual avatars to administer this questionnaire in order to evaluate the degree of embodiment.

  14. The Harvard Automated Processing Pipeline for Electroencephalography (HAPPE): Standardized Processing Software for Developmental and High-Artifact Data.

    Science.gov (United States)

    Gabard-Durnam, Laurel J; Mendez Leal, Adriana S; Wilkinson, Carol L; Levin, April R

    2018-01-01

    Electroencephalography (EEG) recordings collected with developmental populations present particular challenges from a data processing perspective. These EEGs have a high degree of artifact contamination and often short recording lengths. As both sample sizes and EEG channel densities increase, traditional processing approaches like manual data rejection are becoming unsustainable. Moreover, such subjective approaches preclude standardized metrics of data quality, despite the heightened importance of such measures for EEGs with high rates of initial artifact contamination. There is presently a paucity of automated resources for processing these EEG data and no consistent reporting of data quality measures. To address these challenges, we propose the Harvard Automated Processing Pipeline for EEG (HAPPE) as a standardized, automated pipeline compatible with EEG recordings of variable lengths and artifact contamination levels, including high-artifact and short EEG recordings from young children or those with neurodevelopmental disorders. HAPPE processes event-related and resting-state EEG data from raw files through a series of filtering, artifact rejection, and re-referencing steps to processed EEG suitable for time-frequency-domain analyses. HAPPE also includes a post-processing report of data quality metrics to facilitate the evaluation and reporting of data quality in a standardized manner. Here, we describe each processing step in HAPPE, perform an example analysis with EEG files we have made freely available, and show that HAPPE outperforms seven alternative, widely-used processing approaches. HAPPE removes more artifact than all alternative approaches while simultaneously preserving greater or equivalent amounts of EEG signal in almost all instances. We also provide distributions of HAPPE's data quality metrics in an 867 file dataset as a reference distribution and in support of HAPPE's performance across EEG data with variable artifact contamination and

  15. The Harvard Automated Processing Pipeline for Electroencephalography (HAPPE): Standardized Processing Software for Developmental and High-Artifact Data

    Directory of Open Access Journals (Sweden)

    Laurel J. Gabard-Durnam

    2018-02-01

    Full Text Available Electroencephalography (EEG) recordings collected with developmental populations present particular challenges from a data processing perspective. These EEGs have a high degree of artifact contamination and often short recording lengths. As both sample sizes and EEG channel densities increase, traditional processing approaches like manual data rejection are becoming unsustainable. Moreover, such subjective approaches preclude standardized metrics of data quality, despite the heightened importance of such measures for EEGs with high rates of initial artifact contamination. There is presently a paucity of automated resources for processing these EEG data and no consistent reporting of data quality measures. To address these challenges, we propose the Harvard Automated Processing Pipeline for EEG (HAPPE) as a standardized, automated pipeline compatible with EEG recordings of variable lengths and artifact contamination levels, including high-artifact and short EEG recordings from young children or those with neurodevelopmental disorders. HAPPE processes event-related and resting-state EEG data from raw files through a series of filtering, artifact rejection, and re-referencing steps to processed EEG suitable for time-frequency-domain analyses. HAPPE also includes a post-processing report of data quality metrics to facilitate the evaluation and reporting of data quality in a standardized manner. Here, we describe each processing step in HAPPE, perform an example analysis with EEG files we have made freely available, and show that HAPPE outperforms seven alternative, widely-used processing approaches. HAPPE removes more artifact than all alternative approaches while simultaneously preserving greater or equivalent amounts of EEG signal in almost all instances. We also provide distributions of HAPPE's data quality metrics in an 867 file dataset as a reference distribution and in support of HAPPE's performance across EEG data with variable artifact

  16. Pan-Canadian Respiratory Standards Initiative for Electronic Health Records (PRESTINE): 2011 National Forum Proceedings

    Directory of Open Access Journals (Sweden)

    M Diane Lougheed

    2012-01-01

    Full Text Available In a novel knowledge translation initiative, the Government of Ontario’s Asthma Plan of Action funded the development of an Asthma Care Map to enable adherence with the Canadian Asthma Consensus Guidelines developed under the auspices of the Canadian Thoracic Society (CTS). Following its successful evaluation within the Primary Care Asthma Pilot Project, respiratory clinicians from the Asthma Research Unit, Queen’s University (Kingston, Ontario) are leading an initiative to incorporate standardized Asthma Care Map data elements into electronic health records in primary care in Ontario. Acknowledging that the issue of data standards affects all respiratory conditions, and all provinces and territories, the Government of Ontario approached the CTS Respiratory Guidelines Committee. At its meeting in September 2010, the CTS Respiratory Guidelines Committee agreed that developing and standardizing respiratory data elements for electronic health records are strategically important. In follow-up to that commitment, representatives from the CTS, the Lung Association, the Government of Ontario, the National Lung Health Framework and Canada Health Infoway came together to form a planning committee. The planning committee proposed a phased approach to inform stakeholders about the issue, and engage them in the development, implementation and evaluation of a standardized dataset. An environmental scan was completed in July 2011, which identified data definitions and standards currently available for clinical variables that are likely to be included in electronic medical records in primary care for diagnosis, management and patient education related to asthma and COPD. The scan, sponsored by the Government of Ontario, includes compliance with clinical nomenclatures such as SNOMED-CT® and LOINC®. To help launch and create momentum for this initiative, a national forum was convened on October 2 and 3, 2011, in Toronto, Ontario. The forum was designed to

  17. Why We Should Establish a National System of Standards.

    Science.gov (United States)

    Hennen, Thomas J., Jr.

    2000-01-01

    Explains the need to establish a national system of standards for public libraries. Discusses local standards, state standards, and international standards, and suggests adopting a tiered approach including three levels: minimum standards; target standards; and benchmarking standards, as found in total quality management. (LRW)

  18. Standards in neurosonology. Part I

    Directory of Open Access Journals (Sweden)

    Joanna Wojczal

    2015-09-01

    Full Text Available The paper presents standards related to ultrasound imaging of the cerebral vasculature and structures. The aim of this paper is to standardize both the performance and description of ultrasound imaging of the extracranial and intracranial cerebral arteries as well as a study of a specific brain structure, i.e. substantia nigra hyperechogenicity. The following aspects are included in the description of standards for each ultrasonographic method: equipment requirements, patient preparation, study technique and documentation as well as the required elements of ultrasound description. Practical criteria for the diagnosis of certain pathologies in accordance with the latest literature were also presented. Furthermore, additional comments were included in some of the sections. Part I discusses standards for the performance, documentation and description of different ultrasound methods (Duplex, Doppler). Parts II and III are devoted to standards for specific clinical situations (vasospasm, monitoring after the acute stage of stroke, detection of right-to-left shunts, confirmation of the arrest of the cerebral circulation, an assessment of the functional efficiency of the circle of Willis, an assessment of the cerebrovascular vasomotor reserve as well as the measurement of substantia nigra hyperechogenicity).

  19. Standards in neurosonology. Part III

    Directory of Open Access Journals (Sweden)

    Joanna Wojczal

    2016-06-01

    Full Text Available The paper presents standards related to ultrasound imaging of the cerebral vasculature and structures. The aim of this paper is to standardize both the performance and description of ultrasound imaging of the extracranial and intracranial cerebral arteries as well as a study of a specific brain structure, i.e. substantia nigra hyperechogenicity. The following aspects are included in the description of standards for each ultrasonographic method: equipment requirements, patient preparation, study technique and documentation as well as the required elements of ultrasound description. Practical criteria for the diagnosis of certain pathologies in accordance with the latest literature were also presented. Furthermore, additional comments were included in some of the sections. Part I discusses standards for the performance, documentation and description of different ultrasound methods (Duplex, Doppler). Parts II and III are devoted to standards for specific clinical situations (vasospasm, monitoring after the acute stage of stroke, detection of right-to-left shunts, confirmation of the arrest of the cerebral circulation, an assessment of the functional efficiency of the circle of Willis, an assessment of the cerebrovascular vasomotor reserve as well as the measurement of substantia nigra hyperechogenicity).

  20. Standard values of quality and ore mining costs in management of multi-plant mining company

    Energy Technology Data Exchange (ETDEWEB)

    Kudelko, Jan [KGHM CUPRUM Research and Development Center, Wroclaw (Poland); Wirth, Herbert [KGHM Polska Miedz S.A., Lubin (Poland)

    2010-03-15

    Profitability of copper deposit mining depends on three basic variables: the electrolytic copper price, the costs of manufacturing and selling the copper, and the company property involved in the production process. If the company property is adjusted to its tasks, then mining profitability depends on the costs of copper mining and selling, because the price is an external variable defined by the market. Costs can be shaped in two complementary ways: traditionally, by reducing labor, material and power consumption, and by adjusting the quality of the mined ore (copper content) to the level required by current copper prices. The required quality of copper ore for the whole company is determined according to the accepted profitability criteria, and quality standards are then determined for the individual mines. Algorithms determining the ore quality standard resulting from the current market price of copper are presented in the paper. Calculation models for the mined ore quality standards, unit mining costs per ton of copper, electrolytic copper production and ore output are given. Standards were established for one variable, assuming that the other variables are determined in this calculation. The innovative solution presented in the paper is the method of decomposing the company's controllable variables into tasks for individual mines, ensuring that the targets are reached for the whole technological circuit. Using the models, with relatively few data, it is possible to quickly calculate the values of interest to managers, such as the prognosis of the rate of return (economic or operational), the required copper content in the mined ore for the whole company and individual mines at a given rate of return, or the boundary level of copper content in comparison with cost and production levels. Examples of calculation are provided. (orig.)

  1. Development of Extended Content Standards for Biodiversity Data

    Science.gov (United States)

    Hugo, Wim; Schmidt, Jochen; Saarenmaa, Hannu

    2013-04-01

    Interoperability in the field of Biodiversity observation has been strongly driven by the development of a number of global initiatives (GEO, GBIF, OGC, TDWG, GenBank, …) and its supporting standards (OGC-WxS, OGC-SOS, Darwin Core (DwC), NetCDF, …). To a large extent, these initiatives have focused on discoverability and standardization of syntactic and schematic interoperability. Semantic interoperability is more complex, requiring development of domain-dependent conceptual data models, and extension of these models with appropriate ontologies (typically manifested as controlled vocabularies). Biodiversity content has been standardized partly, for example through Darwin Core for occurrence data and associated taxonomy, and through GenBank for genetic data, but other contexts of biodiversity observation have lagged behind - making it difficult to achieve semantic interoperability between distributed data sources. With this in mind, WG8 of GEO BON (charged with data and systems interoperability) has started a work programme to address a number of concerns, one of which is the gap in content standards required to make Biodiversity data truly interoperable. The paper reports on the framework developed by WG8 for the classification of Biodiversity observation data into 'families' of use cases and its supporting data schema, where gaps, if any, in the availability of content standards have been identified, and how these are to be addressed by way of an abstract data model and the development of associated content standards. It is proposed that a minimum set of standards (1) will be required to address the scope of Biodiversity content, aligned with levels and dimensions of observation, and based on the 'Essential Biodiversity Variables' (2) being developed by GEO BON. The content standards are envisaged as loosely separated from the syntactic and schematic standards used for the base data exchange: typically, services would offer an existing data standard (DwC, WFS

  2. Governmental standard drink definitions and low-risk alcohol consumption guidelines in 37 countries.

    Science.gov (United States)

    Kalinowski, Agnieszka; Humphreys, Keith

    2016-07-01

    One of the challenges of international alcohol research and policy is the variability in and lack of knowledge of how governments in different nations define a standard drink and low-risk drinking. This study gathered such information from governmental agencies in 37 countries. A pool of 75 countries that might have definitions was created using World Health Organization (WHO) information and the authors' own judgement. Structured internet searches of relevant terms for each country were supplemented by efforts to contact government agencies directly and to consult with alcohol experts in the country. Most of the 75 national governments examined were not identified as having adopted a standard drink definition. Among the 37 that were so identified, the modal standard drink size was 10 g pure ethanol, but variation was wide (8-20 g). Significant variability was also evident for low-risk drinking guidelines, ranging from 10-42 g per day for women and 10-56 g per day for men to 98-140 g per week for women and 150-280 g per week for men. Researchers working and communicating across national boundaries should be sensitive to the substantial variability in 'standard' drink definitions and low-risk drinking guidelines. The potential impact of guidelines, both in general and in specific national cases, remains an important question for public health research. © 2016 Society for the Study of Addiction.
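    The gram-based arithmetic behind these comparisons is simple; the sketch below (illustrative only, not from the study) converts a beverage to grams of pure ethanol and counts "standard drinks" under two of the national definitions mentioned above.

```python
ETHANOL_DENSITY_G_PER_ML = 0.789  # density of pure ethanol at room temperature

def grams_of_ethanol(volume_ml: float, abv_percent: float) -> float:
    """Grams of pure ethanol in a beverage of the given volume and strength."""
    return volume_ml * (abv_percent / 100.0) * ETHANOL_DENSITY_G_PER_ML

def standard_drinks(volume_ml: float, abv_percent: float,
                    drink_size_g: float = 10.0) -> float:
    """Number of 'standard drinks', given a national drink-size definition."""
    return grams_of_ethanol(volume_ml, abv_percent) / drink_size_g

# A 330 ml beer at 5% ABV contains about 13 g of ethanol:
g = grams_of_ethanol(330, 5.0)
# ~1.3 drinks under a 10 g definition, but only ~0.65 under a 20 g one
d10 = standard_drinks(330, 5.0, 10.0)
d20 = standard_drinks(330, 5.0, 20.0)
```

    The same beverage therefore counts as twice as many "drinks" in a 10 g country as in a 20 g country, which is exactly why the variability documented in the study matters for cross-national guidance.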

  3. Intraindividual variability in cognitive performance in persons with chronic fatigue syndrome.

    Science.gov (United States)

    Fuentes, K; Hunter, M A; Strauss, E; Hultsch, D F

    2001-05-01

    Studies of cognitive performance among persons with chronic fatigue syndrome (CFS) have yielded inconsistent results. We sought to contribute to findings in this area by examining intraindividual variability as well as level of performance in cognitive functioning. A battery of cognitive measures was administered to 14 CFS patients and 16 healthy individuals on 10 weekly occasions. Analyses comparing the two groups in terms of level of performance defined by latency and accuracy scores revealed that the CFS patients were slower but not less accurate than healthy persons. The CFS group showed greater intraindividual variability (as measured by intraindividual standard deviations and coefficients of variation) than the healthy group, although the results varied by task and time frame. Intraindividual variability was found to be stable across time and correlated across tasks at each testing occasion. Intraindividual variability also uniquely differentiated the groups. The present findings support the proposition that intraindividual variability is a meaningful indicator of cognitive functioning in CFS patients.
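    The two dispersion measures named above are straightforward to compute; here is a minimal sketch (with hypothetical reaction-time data, not the study's) of an intraindividual standard deviation and coefficient of variation across repeated testing occasions.

```python
import statistics

def intraindividual_sd(scores):
    # sample SD of one person's scores across repeated occasions
    return statistics.stdev(scores)

def coefficient_of_variation(scores):
    # SD scaled by the mean, so faster and slower responders are comparable
    return statistics.stdev(scores) / statistics.mean(scores)

# Hypothetical response latencies (ms) for one participant over 5 weekly sessions
latencies = [520, 560, 610, 540, 575]
isd = intraindividual_sd(latencies)       # about 34.4 ms
cv = coefficient_of_variation(latencies)  # about 0.061
```

    Dividing by the mean is what lets the CV compare variability between a group that is slower overall (like the CFS patients here) and a faster control group.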

  4. Gene expression variability in human hepatic drug metabolizing enzymes and transporters.

    Directory of Open Access Journals (Sweden)

    Lun Yang

    Full Text Available Interindividual variability in the expression of drug-metabolizing enzymes and transporters (DMETs in human liver may contribute to interindividual differences in drug efficacy and adverse reactions. Published studies that analyzed variability in the expression of DMET genes were limited by sample sizes and the number of genes profiled. We systematically analyzed the expression of 374 DMETs from a microarray data set consisting of gene expression profiles derived from 427 human liver samples. The standard deviation of interindividual expression for DMET genes was much higher than that for non-DMET genes. The 20 DMET genes with the largest variability in the expression provided examples of the interindividual variation. Gene expression data were also analyzed using network analysis methods, which delineates the similarities of biological functionalities and regulation mechanisms for these highly variable DMET genes. Expression variability of human hepatic DMET genes may affect drug-gene interactions and disease susceptibility, with concomitant clinical implications.

  5. Can Images Obtained With High Field Strength Magnetic Resonance Imaging Reduce Contouring Variability of the Prostate?

    International Nuclear Information System (INIS)

    Usmani, Nawaid; Sloboda, Ron; Kamal, Wafa; Ghosh, Sunita; Pervez, Nadeem; Pedersen, John; Yee, Don; Danielson, Brita; Murtha, Albert; Amanie, John; Monajemi, Tara

    2011-01-01

    Purpose: The objective of this study is to determine whether there is less contouring variability of the prostate using higher-strength magnetic resonance images (MRI) compared with standard MRI and computed tomography (CT). Methods and Materials: Forty patients treated with prostate brachytherapy were accrued to a prospective study that included the acquisition of 1.5-T MR and CT images at specified time points. A subset of 10 patients had additional 3.0-T MR images acquired at the same time as their 1.5-T MR scans. Images from each of these patients were contoured by 5 radiation oncologists, with a random subset of patients repeated to quantify intraobserver contouring variability. To minimize bias in contouring the prostate, the image sets were placed in folders in a random order with all identifiers removed from the images. Results: Although there was less interobserver contouring variability in the overall prostate volumes in 1.5-T MRI compared with 3.0-T MRI (p < 0.01), there were no significant differences in contouring variability in the different regions of the prostate between 1.5-T MRI and 3.0-T MRI. MRI demonstrated significantly less interobserver contouring variability in both 1.5-T and 3.0-T compared with CT in overall prostate volumes (p < 0.01, p = 0.01), with the greatest benefits being appreciated in the base of the prostate. Overall, there was less intraobserver contouring variability than interobserver contouring variability for all of the measurements analyzed. Conclusions: Use of 3.0-T MRI does not demonstrate a significant improvement in contouring variability compared with 1.5-T MRI, although both magnetic strengths demonstrated less contouring variability compared with CT.

  6. Characterization and Comparison of the 10-2 SITA-Standard and Fast Algorithms

    Directory of Open Access Journals (Sweden)

    Yaniv Barkana

    2012-01-01

    Full Text Available Purpose: To compare the 10-2 SITA-standard and SITA-fast visual field programs in patients with glaucoma. Methods: We enrolled 26 patients with open angle glaucoma with involvement of at least one paracentral location on the 24-2 SITA-standard field test. Each subject performed 10-2 SITA-standard and SITA-fast tests. Within 2 months this sequence of tests was repeated. Results: SITA-fast was 30% shorter than SITA-standard (5.5±1.1 vs 7.9±1.1 minutes, p<0.001). Mean MD was statistically significantly higher for SITA-standard compared with SITA-fast at the first visit (Δ=0.3 dB, p=0.017) but not the second visit. Inter-visit difference in MD or in number of depressed points was not significant for either program. Bland-Altman analysis showed that clinically significant variations can exist in individual instances between the 2 programs and between repeat tests with the same program. Conclusions: The 10-2 SITA-fast algorithm is significantly shorter than SITA-standard. The two programs have similar long-term variability. Average same-visit between-program and same-program between-visit sensitivity results were similar for the study population, but clinically significant variability was observed for some individual test pairs. Group inter- and intra-program test results may be comparable, but in the management of the individual patient, field change should be verified by repeat testing.

  7. Variability of femoral muscle attachments.

    Science.gov (United States)

    Duda, G N; Brand, D; Freitag, S; Lierse, W; Schneider, E

    1996-09-01

    Analytical and experimental models of the musculoskeletal system often assume single values rather than ranges for anatomical input parameters. The hypothesis of the present study was that anatomical variability significantly influences the results of biomechanical analyses, specifically regarding the moment arms of the various thigh muscles. Insertions and origins of muscles crossing or attaching to the femur were digitized in six specimens. Muscle volumes were measured; muscle attachment area and centroid location were computed. To demonstrate the influence of inter-individual anatomic variability on a mechanical modeling parameter, the corresponding range of muscle moment arms were calculated. Standard deviations, as a percentage of the mean, were about 70% for attachment area and 80% for muscle volume and attachment centroid location. The resulting moment arms of the m. gluteus maximus and m. rectus femoris were especially sensitive to anatomical variations (SD 65%). The results indicate that sensitivity to anatomical variations should be analyzed in any investigation simulating musculoskeletal interactions. To avoid misinterpretations, investigators should consider using several anatomical configurations rather than relying on a mean data set.

  8. Measurement uncertainties for vacuum standards at Korea Research Institute of Standards and Science

    International Nuclear Information System (INIS)

    Hong, S. S.; Shin, Y. H.; Chung, K. H.

    2006-01-01

    The Korea Research Institute of Standards and Science has three major vacuum systems: an ultrasonic interferometer manometer (UIM) (Sec. II, Figs. 1 and 2) for low vacuum, a static expansion system (SES) (Sec. III, Figs. 3 and 4) for medium vacuum, and an orifice-type dynamic expansion system (DES) (Sec. IV, Figs. 5 and 6) for high and ultrahigh vacuum. For each system, explicit measurement model equations with multiple variables are given. According to ISO standards, all of these system variable errors were used to calculate the expanded uncertainty (U). For each system the expanded uncertainties (k=2, confidence level=95%) and relative expanded uncertainties (expanded uncertainty/generated pressure) are summarized in Table IV and are estimated to be as follows. For UIM, at 2.5-300 Pa generated pressure, the expanded uncertainty is 10⁻² Pa and the relative expanded uncertainty is 10⁻²; at 1-100 kPa generated pressure, the expanded uncertainty is 10⁻⁵. For SES, at 3-100 Pa generated pressure, the expanded uncertainty is 10⁻¹ Pa and the relative expanded uncertainty is 10⁻³. For DES, at 4.6×10⁻³-1.3×10⁻² Pa generated pressure, the expanded uncertainty is 10⁻⁴ Pa and the relative expanded uncertainty is 10⁻³; at 3.0×10⁻⁶-9.0×10⁻⁴ Pa generated pressure, the expanded uncertainty is 10⁻⁶ Pa and the relative expanded uncertainty is 10⁻². Within uncertainty limits, our bilateral and key comparisons [CCM.P-K4 (10 Pa-1 kPa)] are extensive and in good agreement with those of other nations (Fig. 8 and Table V)
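    The ISO/GUM procedure referred to here combines independent standard-uncertainty components in quadrature and scales the result by a coverage factor. A generic sketch (the component values are hypothetical; the actual KRISS model equations are system-specific and not reproduced here):

```python
import math

def combined_standard_uncertainty(components):
    # root-sum-square of independent standard-uncertainty contributions u_i
    return math.sqrt(sum(u ** 2 for u in components))

def expanded_uncertainty(components, k=2.0):
    # coverage factor k = 2 corresponds to ~95% confidence for a
    # normally distributed measurand (GUM convention)
    return k * combined_standard_uncertainty(components)

# Hypothetical contributions (Pa), e.g. from pressure reading and temperature
u_c = combined_standard_uncertainty([3e-3, 4e-3])  # combined: 5e-3 Pa
U = expanded_uncertainty([3e-3, 4e-3])             # expanded: 1e-2 Pa
```

    Dividing U by the generated pressure then gives the dimensionless relative expanded uncertainty quoted for each system above.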

  9. Manipulating continuous variable photonic entanglement

    International Nuclear Information System (INIS)

    Plenio, M.B.

    2005-01-01

    I will review our work on photonic entanglement in the continuous variable regime, including both Gaussian and non-Gaussian states. The feasibility and efficiency of various entanglement purification protocols are discussed in this context. (author)

  10. Musculoskeletal ultrasound including definitions for ultrasonographic pathology

    DEFF Research Database (Denmark)

    Wakefield, RJ; Balint, PV; Szkudlarek, Marcin

    2005-01-01

    Ultrasound (US) has great potential as an outcome in rheumatoid arthritis trials for detecting bone erosions, synovitis, tendon disease, and enthesopathy. It has a number of distinct advantages over magnetic resonance imaging, including good patient tolerability and the ability to scan multiple joints in a short period of time. However, there are scarce data regarding its validity, reproducibility, and responsiveness to change, making interpretation and comparison of studies difficult. In particular, there are limited data describing standardized scanning methodology and standardized definitions of US pathologies. This article presents the first report from the OMERACT ultrasound special interest group, which has compared US against the criteria of the OMERACT filter. Also proposed for the first time are consensus US definitions for common pathological lesions seen in patients with inflammatory arthritis.

  11. Standardization of diagnostic PCR for the detection of foodborne pathogens

    DEFF Research Database (Denmark)

    Malorny, B.; Tassios, P.T.; Radstrom, P.

    2003-01-01

    In vitro amplification of nucleic acids using the polymerase chain reaction (PCR) has become, since its discovery in the 1980s, a powerful diagnostic tool for the analysis of microbial infections as well as for the analysis of microorganisms in food samples. However, despite its potential, PCR has neither gained wide acceptance in routine diagnostics nor been widely incorporated in standardized methods. Lack of validation and standard protocols, as well as variable quality of reagents and equipment, influence the efficient dissemination of PCR methodology from expert research laboratories to end-user laboratories. Moreover, the food industry understandably requires and expects officially approved standards. Recognizing this, in 1999, the European Commission approved the research project FOOD-PCR (http://www.PCR.dk), which aims to validate and standardize the use of diagnostic PCR for the detection of foodborne pathogens.

  12. Advocacy: Making the Gold Standard School a Reality

    Science.gov (United States)

    Roberts, Julia Link; Inman, Tracy Ford

    2011-01-01

    In their last column, the authors described a Gold Standard School--a place in which all children thrive, including the gifted and talented. The Checklist for a Gold Standard School, which is included in this article, highlights the main characteristics of such a school, including a focus on continuous progress, talent development, policies that…

  13. Estimating structural equation models with non-normal variables by using transformations

    NARCIS (Netherlands)

    Montfort, van K.; Mooijaart, A.; Meijerink, F.

    2009-01-01

    We discuss structural equation models for non-normal variables. In this situation the maximum likelihood and the generalized least-squares estimates of the model parameters can give incorrect estimates of the standard errors and the associated goodness-of-fit chi-squared statistics. If the sample

  14. Internal variables in thermoelasticity

    CERN Document Server

    Berezovski, Arkadi

    2017-01-01

    This book describes an effective method for modeling advanced materials like polymers, composite materials and biomaterials, which are, as a rule, inhomogeneous. The thermoelastic theory with internal variables presented here provides a general framework for predicting a material’s reaction to external loading. The basic physical principles provide the primary theoretical information, including the evolution equations of the internal variables. The cornerstones of this framework are the material representation of continuum mechanics, a weak nonlocality, a non-zero extra entropy flux, and a consecutive employment of the dissipation inequality. Examples of thermoelastic phenomena are provided, accompanied by detailed procedures demonstrating how to simulate them.

  15. Impact of pretreatment variables on the outcome of 131I therapy with a standardized dose of 150 Gray in Graves' disease

    International Nuclear Information System (INIS)

    Pfeilschifter, J.; Elser, H.; Haufe, S.; Ziegler, R.; Georgi, P.

    1997-01-01

    Aim: We examined the impact of several pretreatment variables on thyroid size and function in 61 patients with Graves' disease one year after a standardized 131I treatment with 150 Gray. Methods: FT3, FT4, and TSH serum concentrations were determined before and 1.5, 3, 6, and 12 months after therapy. Thyroid size was measured by ultrasound and scintigraphy before and one year after therapy. Results: One year after therapy, 30% of the patients had latent or manifest hyperthyroidism, 24% were euthyroid, and 46% had developed latent or manifest hypothyroidism. Age and initial thyroid volume were major predictors of posttherapeutical thyroid function. Thus, persistent hyperthyroidism was observed in 70% of the patients aged 50 years and older with a thyroid size of more than 50 ml. With few exceptions, thyroid size markedly decreased after therapy. Initial thyroid size and age were also major predictors of posttherapeutical thyroid volume. Thyroid size normalized in all patients younger than 50 years of age, independent of initial thyroid size. Conclusion: Radioiodine treatment with 150 Gray causes a considerable decrease in thyroid size in most patients with Graves' disease. Age and initial thyroid volume are important determinants of thyroid function and size after therapy and should be considered in dose calculation. (orig.)

  16. Tibiofemoral wear in standard and non-standard squat: implication for total knee arthroplasty.

    Science.gov (United States)

    Fekete, Gusztáv; Sun, Dong; Gu, Yaodong; Neis, Patric Daniel; Ferreira, Ney Francisco; Innocenti, Bernardo; Csizmadia, Béla M

    2017-01-01

    Due to more resilient biomaterials, problems related to wear in total knee replacements (TKRs) have decreased but not disappeared. Among design-related factors, wear is still the second most important mechanical factor limiting the lifetime of TKRs, and it is highly influenced by the local kinematics of the knee. During wear experiments, a constant load and slide-roll ratio are frequently applied in tribo-tests besides other important parameters. Nevertheless, numerous studies have demonstrated that a constant slide-roll ratio is not an accurate approach to modelling TKR wear, and that instead of a constant load, a flexion-angle-dependent tibiofemoral force should be included in the wear model to obtain realistic results. A new analytical wear model, based upon Archard's law, is introduced, which can determine the effect of the tibiofemoral force and of varying slide-roll on wear in the tibiofemoral connection under standard and non-standard squat movement. The calculated total wear with constant slide-roll during standard squat was 5.5 times higher than the reference value, while if total wear included varying slide-roll during standard squat, the calculated wear was approximately 6.25 times higher. With regard to non-standard squat, total wear with constant slide-roll was 4.16 times higher than the reference value; if total wear included varying slide-roll, the calculated wear was approximately 4.75 times higher. It was demonstrated that the augmented force parameter alone caused a 65% higher wear volume, while the slide-roll ratio itself increased wear volume by 15% compared to the reference value. These results indicate that the force component has the major effect on wear propagation, and that non-standard squat should be proposed for TKR patients as a rehabilitation exercise.
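    Archard's law, on which the model above is built, states that wear volume is proportional to the normal load times the sliding distance. A minimal sketch of the idea, using invented loads, slide-roll profiles, and wear coefficient rather than the paper's values:

```python
import math

def archard_wear(load_profile, slide_roll_profile, k=1e-9, step_mm=1.0):
    """Accumulate Archard wear, V = k * F * s, over a motion cycle.
    Profiles are sampled per degree of flexion; the sliding part of each
    step is step_mm times the slide-roll ratio. k and step_mm are
    illustrative, not the paper's values."""
    volume = 0.0
    for force_n, slide_roll in zip(load_profile, slide_roll_profile):
        volume += k * force_n * (step_mm * slide_roll)
    return volume

angles = [math.radians(a) for a in range(0, 91)]
# Constant-parameter case vs. flexion-dependent force with varying slide-roll:
constant = archard_wear([1500.0] * len(angles), [0.3] * len(angles))
varying = archard_wear([1500.0 * (1 + math.sin(a)) for a in angles],
                       [0.2 + 0.4 * a / math.radians(90) for a in angles])
```

    With these made-up profiles, the flexion-dependent load and growing sliding fraction accumulate more wear than the constant-parameter case, mirroring the qualitative trend reported above.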

  17. Adaptive Torque Control of Variable Speed Wind Turbines

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, K. E.

    2004-08-01

    The primary focus of this work is a new adaptive controller that is designed to resemble the standard non-adaptive controller used by the wind industry for variable speed wind turbines below rated power. This adaptive controller uses a simple, highly intuitive gain adaptation law designed to seek out the optimal gain for maximizing the turbine's energy capture. It is designed to work even in real, time-varying winds.
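    The standard non-adaptive controller referred to here is the region-2 torque law, tau = k * omega^2; the adaptive version searches for the gain k that maximizes energy capture. A hedged sketch of one hill-climbing adaptation step (an illustrative stand-in, not the report's exact adaptation law):

```python
def torque_command(omega, k):
    """Region-2 torque law used below rated power: tau = k * omega**2."""
    return k * omega ** 2

def adapt_gain(k, prev_k, capture, prev_capture, gamma=0.1):
    """Hill-climbing gain adaptation (illustrative): step the gain in the
    direction that improved the measured energy capture, scaled by the
    last step size."""
    step = abs(k - prev_k) or 1e-3
    direction = 1.0 if (capture - prev_capture) * (k - prev_k) >= 0 else -1.0
    return k + gamma * direction * step

# If raising k from 1.0 to 1.1 improved capture, keep raising it:
new_k = adapt_gain(1.1, 1.0, capture=11.0, prev_capture=10.0)
```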

  18. Interdecadal variability of winter precipitation in Southeast China

    OpenAIRE

    Zhang, L.; Zhu, X.; Fraedrich, K.; Sielmann, F.; Zhi, X.

    2014-01-01

    Interdecadal variability of observed winter precipitation in Southeast China (1961–2010) is characterized by the first empirical orthogonal function of the three-monthly Standardized Precipitation Index (SPI) subjected to a 9-year running mean. For interdecadal time scales the dominating spatial modes represent monopole features involving the Arctic Oscillation (AO) and the sea surface temperature (SST) anomalies. Dynamic composite analysis (based on NCEP/NCAR reanalyses) reveals the followin...
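    The pre-processing described (standardize precipitation, then smooth with a 9-year running mean before the EOF analysis) can be sketched as follows; the z-score standardization here is a simplified stand-in for the distribution-fitted SPI:

```python
from statistics import mean, pstdev

def standardize(series):
    """Z-score standardization: a simplified stand-in for the
    distribution-fitted SPI used in the paper."""
    m, s = mean(series), pstdev(series)
    return [(x - m) / s for x in series]

def running_mean(series, window=9):
    """Centered running mean; the smoothed series loses window // 2
    points at each edge."""
    half = window // 2
    return [mean(series[i - half:i + half + 1])
            for i in range(half, len(series) - half)]
```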

  19. A new standard for core training in radiation safety

    International Nuclear Information System (INIS)

    Trinoskey, P.A.

    1997-02-01

    A new American National Standard for radiation worker training was recently developed. The standard emphasizes performance-based training and establishing a training program rather than simply prescribing objectives. The standard also addresses basic criteria, including instructor qualifications. The standard is based on input from a wide array of regulatory agencies, universities, national laboratories, and nuclear power entities. This paper presents an overview of the new standard and the philosophy behind it. The target audience includes radiation workers, management and supervisory personnel, contractors, students, emergency personnel, and visitors

  20. Comparison of seasonal variability in European domestic radon measurements

    Directory of Open Access Journals (Sweden)

    C. J. Groves-Kirkby

    2010-03-01

    Analysis of published data characterising seasonal variability of domestic radon concentrations in Europe and elsewhere shows significant variability between different countries and between regions where regional data are available. Comparison is facilitated by application of the Gini Coefficient methodology to reported seasonal variation data. Overall, radon-rich sedimentary strata, particularly high-porosity limestones, exhibit high seasonal variation, while radon-rich igneous lithologies demonstrate relatively constant, but somewhat higher, radon concentrations. High-variability regions include the Pennines and South Downs in England, Languedoc and Brittany in France, and especially Switzerland. Low-variability high-radon regions include the granite-rich Cornwall/Devon peninsula in England, and Auvergne and Ardennes in France, all components of the Devonian-Carboniferous Hercynian belt.
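    The Gini Coefficient methodology mentioned here reduces a set of monthly (seasonal) radon values to a single inequality score. A minimal sketch, assuming the sorted mean-difference form of the coefficient and hypothetical monthly concentrations:

```python
def gini(values):
    """Gini coefficient via the sorted mean-difference formula:
    0 = perfectly uniform, values near 1 = concentrated in a few months."""
    xs = sorted(values)
    n = len(xs)
    cum = sum((2 * i - n - 1) * x for i, x in enumerate(xs, start=1))
    return cum / (n * sum(xs))

# Hypothetical monthly radon concentrations (Bq/m^3), flat vs. peaked:
flat = gini([55.0, 52.0, 48.0, 45.0] * 3)
peaked = gini([10.0] * 9 + [80.0, 120.0, 150.0])
```

    A flat seasonal profile scores near 0, while a profile concentrated in a few months scores much higher, which is how high- and low-variability regions can be ranked on a common scale.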

  1. Neurocognitive functioning as an intermediary variable between psychopathology and insight in schizophrenia.

    Science.gov (United States)

    Hwang, Samuel Suk-Hyun; Ahn, Yong Min; Kim, Yong Sik

    2015-12-30

    Based on the neuropsychological deficit model of insight in schizophrenia, we constructed exploratory prediction models for insight, designating neurocognitive measures as intermediary variables between psychopathology and insight in patients with schizophrenia. The models included the positive, negative, and autistic preoccupation symptoms as primary predictors, and activation symptoms as an intermediary variable for insight. Fifty-six Korean patients in the acute stage of schizophrenia completed the Positive and Negative Syndrome Scale, as well as a comprehensive neurocognitive battery of tests, at baseline and at the 8-week and 1-year follow-ups. Among the neurocognitive measures, the Korean Wechsler Adult Intelligence Scale (K-WAIS) picture arrangement, Controlled Oral Word Association Test (COWAT) perseverative response, and the Continuous Performance Test (CPT) standard error of reaction time showed significant correlations with the symptoms and the insight. When these measures were fitted into the model as intermediaries between the symptoms and the insight, only the perseverative response was found to have a partial mediating effect, both cross-sectionally and in the 8-week longitudinal change. Overall, the relationship between insight and neurocognitive functioning measures was found to be selective and weak. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  2. The ASAS-SN Catalog of Variable Stars I: The Serendipitous Survey

    Science.gov (United States)

    Jayasinghe, T.; Kochanek, C. S.; Stanek, K. Z.; Shappee, B. J.; Holoien, T. W.-S.; Thompson, Todd A.; Prieto, J. L.; Dong, Subo; Pawlak, M.; Shields, J. V.; Pojmanski, G.; Otero, S.; Britt, C. A.; Will, D.

    2018-04-01

    The All-Sky Automated Survey for Supernovae (ASAS-SN) is the first optical survey to routinely monitor the whole sky with a cadence of ~2-3 days down to V ≲ 17 mag. ASAS-SN has monitored the whole sky since 2014, collecting ~100-500 epochs of observations per field. The V-band light curves for candidate variables identified during the search for supernovae are classified using a random forest classifier and visually verified. We present a catalog of 66,533 bright, new variable stars discovered during our search for supernovae, including 27,753 periodic variables and 38,780 irregular variables. V-band light curves for the ASAS-SN variables are available through the ASAS-SN variable stars database (https://asas-sn.osu.edu/variables). The database will begin to include the light curves of known variable stars in the near future along with the results for a systematic, all-sky variability survey.

  3. Instrumental Variables in the Long Run

    DEFF Research Database (Denmark)

    Casey, Gregory; Klemp, Marc Patrick Brag

    2017-01-01

    In the study of long-run economic growth, it is common to use historical or geographical variables as instruments for contemporary endogenous regressors. We study the interpretation of these conventional instrumental variable (IV) regressions in a general, yet simple, framework. Our aim is to estimate the long-run causal effect of changes in the endogenous explanatory variable. We find that conventional IV regressions generally cannot recover this parameter of interest. To estimate this parameter, therefore, we develop an augmented IV estimator that combines the conventional regression... We also use our framework to examine related empirical techniques. We find that two prominent regression methodologies - using gravity-based instruments for trade and including ancestry-adjusted variables in linear regression models - have... quantitative implications for the field of long-run economic growth.
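    A conventional IV regression of the kind discussed is usually estimated by two-stage least squares. A self-contained sketch on simulated data (the variable names and data-generating process are invented for illustration):

```python
import random

def ols_slope(x, y):
    """Slope of y on x (with intercept), via covariance over variance."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var = sum((a - mx) ** 2 for a in x)
    return cov / var

def two_stage_least_squares(z, x, y):
    """Conventional IV via 2SLS: project x on the instrument z, then
    regress y on the fitted values."""
    pi = ols_slope(z, x)
    mx, mz = sum(x) / len(x), sum(z) / len(z)
    x_hat = [mx + pi * (zi - mz) for zi in z]
    return ols_slope(x_hat, y)

random.seed(0)
n = 2000
z = [random.gauss(0, 1) for _ in range(n)]       # exogenous instrument
u = [random.gauss(0, 1) for _ in range(n)]       # confounding error
x = [zi + ui for zi, ui in zip(z, u)]            # x endogenous via u
y = [2 * xi + ui for xi, ui in zip(x, u)]        # true causal effect is 2
beta_ols = ols_slope(x, y)                       # biased upward by u
beta_iv = two_stage_least_squares(z, x, y)
```

    In this simulation OLS is biased upward by the shared error term while the instrumented estimate recovers the true slope of 2; whether such a recovered parameter is the long-run causal effect of interest is precisely the question the paper addresses.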

  4. Variable volume combustor with a conical liner support

    Science.gov (United States)

    Johnson, Thomas Edward; McConnaughhay, Johnie Franklin; Keener, Chrisophter Paul; Ostebee, Heath Michael

    2017-06-27

    The present application provides a variable volume combustor for use with a gas turbine engine. The variable volume combustor may include a liner, a number of micro-mixer fuel nozzles positioned within the liner, and a conical liner support supporting the liner.

  5. Managers’ perspectives: practical experience and challenges associated with variable-density operations and uneven-aged management

    Science.gov (United States)

    Kurtis E. Steele

    2013-01-01

    Variable-density thinning has received a lot of public attention in recent years and has subsequently become standard language in most of the Willamette National Forest’s timber management projects. Many techniques have been tried, with varying on-the-ground successes. To accomplish variable-density thinning, the McKenzie River Ranger District currently uses...

  6. Vitamin D measurement standardization: the way out of the chaos

    Science.gov (United States)

    Substantial variability is associated with laboratory measurement of serum total 25-hydroxyvitamin D [25(OH)D]. The resulting chaos impedes development of consensus 25(OH)D values to define stages of vitamin D status. As resolving this situation requires standardized measurement of 25(OH)D, the Vita...

  7. WEATHER-RELATED VARIABILITY OF CALORIMETRY PERFORMANCE IN A POORLY-CONTROLLED ENVIRONMENT

    International Nuclear Information System (INIS)

    CAMERON, M.A.

    2007-01-01

    Four Antech airbath calorimeters at the Hanford site were studied for three summers and two winters in a location not well-shielded from outside temperature changes. All calorimeters showed significant increases in variability of standard measurements during hot weather. The increased variability is postulated to be due to a low setting of the Peltier cold face temperature, which doesn't allow the instrument to drain heat fast enough in a hot environment. A higher setting of the Peltier cold face might lead to better performance in environments subjected to a broad range of temperatures

  8. Interpreting the concordance statistic of a logistic regression model: relation to the variance and odds ratio of a continuous explanatory variable.

    Science.gov (United States)

    Austin, Peter C; Steyerberg, Ewout W

    2012-06-20

    When outcomes are binary, the c-statistic (equivalent to the area under the Receiver Operating Characteristic curve) is a standard measure of the predictive accuracy of a logistic regression model. An analytical expression was derived under the assumption that a continuous explanatory variable follows a normal distribution in those with and without the condition. We then conducted an extensive set of Monte Carlo simulations to examine whether the expressions derived under the assumption of binormality allowed for accurate prediction of the empirical c-statistic when the explanatory variable followed a normal distribution in the combined sample of those with and without the condition. We also examined the accuracy of the predicted c-statistic when the explanatory variable followed a gamma, log-normal, or uniform distribution in the combined sample of those with and without the condition. Under the assumption of binormality with equality of variances, the c-statistic follows a standard normal cumulative distribution function with dependence on the product of the standard deviation of the normal components (reflecting more heterogeneity) and the log-odds ratio (reflecting larger effects). Under the assumption of binormality with unequal variances, the c-statistic follows a standard normal cumulative distribution function with dependence on the standardized difference of the explanatory variable in those with and without the condition. In our Monte Carlo simulations, we found that these expressions allowed for reasonably accurate prediction of the empirical c-statistic when the distribution of the explanatory variable was normal, gamma, log-normal, and uniform in the entire sample of those with and without the condition. The discriminative ability of a continuous explanatory variable cannot be judged by its odds ratio alone, but always needs to be considered in relation to the heterogeneity of the population.
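    Under binormality with equal variances, the c-statistic has the closed form c = Φ(d/√2), where d is the standardized difference between those with and without the condition. A quick numerical check of that expression against the empirical c-statistic (the simulation parameters are arbitrary):

```python
import math
import random

def normal_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def binormal_c(delta):
    """Analytic c-statistic for equal-variance binormal groups with
    standardized difference delta: Phi(delta / sqrt(2))."""
    return normal_cdf(delta / math.sqrt(2.0))

def empirical_c(cases, controls):
    """Empirical AUC: probability a random case outranks a random
    control, counting ties as one half."""
    wins = sum((x > y) + 0.5 * (x == y) for x in cases for y in controls)
    return wins / (len(cases) * len(controls))

random.seed(1)
controls = [random.gauss(0.0, 1.0) for _ in range(1000)]
cases = [random.gauss(1.0, 1.0) for _ in range(1000)]
analytic = binormal_c(1.0)          # Phi(1/sqrt(2)) ≈ 0.76
empirical = empirical_c(cases, controls)
```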

  9. 41 CFR 101-29.218 - Voluntary standards.

    Science.gov (United States)

    2010-07-01

    ... Regulations System FEDERAL PROPERTY MANAGEMENT REGULATIONS SUPPLY AND PROCUREMENT 29-FEDERAL PRODUCT... standards,” but does not include professional standards of personal conduct, institutional codes of ethics...

  10. 26 CFR 1.801-7 - Variable annuities.

    Science.gov (United States)

    2010-04-01

    ... 26 Internal Revenue 8 2010-04-01 2010-04-01 false Variable annuities. 1.801-7 Section 1.801-7...) INCOME TAXES Life Insurance Companies § 1.801-7 Variable annuities. (a) In general. (1) Section 801(g)(1) provides that for purposes of part I, subchapter L, chapter 1 of the Code, an annuity contract includes a...

  11. Using Derivative Estimates to Describe Intraindividual Variability at Multiple Time Scales

    Science.gov (United States)

    Deboeck, Pascal R.; Montpetit, Mignon A.; Bergeman, C. S.; Boker, Steven M.

    2009-01-01

    The study of intraindividual variability is central to the study of individuals in psychology. Previous research has related the variance observed in repeated measurements (time series) of individuals to traitlike measures that are logically related. Intraindividual measures, such as intraindividual standard deviation or the coefficient of…

  12. Building prognostic models for breast cancer patients using clinical variables and hundreds of gene expression signatures

    Directory of Open Access Journals (Sweden)

    Liu Yufeng

    2011-01-01

    Background: Multiple breast cancer gene expression profiles have been developed that appear to provide similar abilities to predict outcome and may outperform clinical-pathologic criteria; however, the extent to which seemingly disparate profiles provide additive prognostic information is not known, nor do we know whether prognostic profiles perform equally across clinically defined breast cancer subtypes. We evaluated whether combining the prognostic powers of standard breast cancer clinical variables with a large set of gene expression signatures could improve on our ability to predict patient outcomes. Methods: Using clinical-pathological variables and a collection of 323 gene expression "modules", including 115 previously published signatures, we built multivariate Cox proportional hazards models using a dataset of 550 node-negative, systemically untreated breast cancer patients. Models predictive of pathological complete response (pCR) to neoadjuvant chemotherapy were also built using this approach. Results: We identified statistically significant prognostic models for relapse-free survival (RFS) at 7 years for the entire population, and for the subgroups of patients with ER-positive or Luminal tumors. Furthermore, we found that combined models that included both clinical and genomic parameters improved prognostication compared with models with either clinical or genomic variables alone. Finally, we were able to build statistically significant combined models for pathological complete response (pCR) prediction for the entire population. Conclusions: Integration of gene expression signatures and clinical-pathological factors is an improved method over either variable type alone. Highly prognostic models could be created when using all patients, and for the subset of patients with lymph node-negative and ER-positive breast cancers.
Other variables beyond gene expression and clinical-pathological variables, like gene mutation status or DNA

  13. Use of a non-linear method for including the mass uncertainty of gravimetric standards and system measurement errors in the fitting of calibration curves for XRFA freeze-dried UNO3 standards

    International Nuclear Information System (INIS)

    Pickles, W.L.; McClure, J.W.; Howell, R.H.

    1978-05-01

    A sophisticated nonlinear multiparameter fitting program was used to produce a best-fit calibration curve for the response of an x-ray fluorescence analyzer to uranium nitrate, freeze-dried, 0.2%-accurate gravimetric standards. The program is based on the unconstrained minimization subroutine VA02A. The program considers the mass values of the gravimetric standards as parameters to be fit along with the normal calibration curve parameters. The fitting procedure applies weights based on the system errors and the mass errors in a consistent way. The resulting best-fit calibration curve parameters reflect the fact that the masses of the standard samples are measured quantities with a known error. Error estimates for the calibration curve parameters can be obtained from the curvature of the ''Chi-Squared Matrix'' or from error relaxation techniques. It was shown that nondispersive XRFA of 0.1 to 1 mg freeze-dried UNO3 can have an accuracy of 0.2% in 1000 s
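    A lighter-weight way to fold a known standard-mass uncertainty into a calibration fit, shown here only to illustrate the core idea of weighting system and mass errors consistently (this is the effective-variance approximation, not the paper's VA02A-based multiparameter fit), is to inflate each point's response error by the slope-scaled mass error and iterate:

```python
def weighted_linear_fit(x, y, sigma):
    """Weighted least squares for y = a + b*x with per-point sigmas."""
    w = [1.0 / s ** 2 for s in sigma]
    sw = sum(w)
    swx = sum(wi * xi for wi, xi in zip(w, x))
    swy = sum(wi * yi for wi, yi in zip(w, y))
    swxx = sum(wi * xi * xi for wi, xi in zip(w, x))
    swxy = sum(wi * xi * yi for wi, xi, yi in zip(w, x, y))
    d = sw * swxx - swx ** 2
    a = (swxx * swy - swx * swxy) / d
    b = (sw * swxy - swx * swy) / d
    return a, b

def effective_variance_fit(mass, counts, sigma_counts, sigma_mass, iters=5):
    """Fold the standard-mass uncertainty into the response error:
    sigma_eff^2 = sigma_y^2 + (b * sigma_m)^2, iterating until b settles."""
    b = 0.0
    for _ in range(iters):
        sigma_eff = [(sy ** 2 + (b * sm) ** 2) ** 0.5
                     for sy, sm in zip(sigma_counts, sigma_mass)]
        a, b = weighted_linear_fit(mass, counts, sigma_eff)
    return a, b

# Synthetic calibration: exact line with 0.2% mass errors on the standards.
mass = [0.1 * i for i in range(1, 11)]
counts = [5.0 + 100.0 * m for m in mass]
a_fit, b_fit = effective_variance_fit(mass, counts,
                                      [1.0] * 10, [0.002 * m for m in mass])
```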

  14. Whole-body-MR imaging including DWIBS in the work-up of patients with head and neck squamous cell carcinoma: A feasibility study

    Energy Technology Data Exchange (ETDEWEB)

    Noij, Daniel P., E-mail: d.noij@vumc.nl [Department of Radiology and Nuclear Medicine, VU University Medical Center, De Boelelaan 1117, PO Box 7057, 1007 MB Amsterdam (Netherlands); Boerhout, Els J., E-mail: e.boerhout@vumc.nl [Department of Radiology and Nuclear Medicine, VU University Medical Center, De Boelelaan 1117, PO Box 7057, 1007 MB Amsterdam (Netherlands); Pieters-van den Bos, Indra C., E-mail: i.pieters@vumc.nl [Department of Radiology and Nuclear Medicine, VU University Medical Center, De Boelelaan 1117, PO Box 7057, 1007 MB Amsterdam (Netherlands); Comans, Emile F., E-mail: efi.comans@vumc.nl [Department of Radiology and Nuclear Medicine, VU University Medical Center, De Boelelaan 1117, PO Box 7057, 1007 MB Amsterdam (Netherlands); Oprea-Lager, Daniela, E-mail: d.oprea-lager@vumc.nl [Department of Radiology and Nuclear Medicine, VU University Medical Center, De Boelelaan 1117, PO Box 7057, 1007 MB Amsterdam (Netherlands); Reinhard, Rinze, E-mail: r.reinhard@vumc.nl [Department of Radiology and Nuclear Medicine, VU University Medical Center, De Boelelaan 1117, PO Box 7057, 1007 MB Amsterdam (Netherlands); Hoekstra, Otto S., E-mail: os.hoekstra@vumc.nl [Department of Radiology and Nuclear Medicine, VU University Medical Center, De Boelelaan 1117, PO Box 7057, 1007 MB Amsterdam (Netherlands); Bree, Remco de, E-mail: r.debree@vumc.nl [Department Otolaryngology/Head and Neck Surgery, VU University Medical Center, De Boelelaan 1117, PO Box 7057, 1007 MB Amsterdam (Netherlands); Graaf, Pim de, E-mail: p.degraaf@vumc.nl [Department of Radiology and Nuclear Medicine, VU University Medical Center, De Boelelaan 1117, PO Box 7057, 1007 MB Amsterdam (Netherlands); Castelijns, Jonas A., E-mail: j.castelijns@vumc.nl [Department of Radiology and Nuclear Medicine, VU University Medical Center, De Boelelaan 1117, PO Box 7057, 1007 MB Amsterdam (Netherlands)

    2014-07-15

    Objectives: To assess the feasibility of whole-body magnetic resonance imaging (WB-MRI) including diffusion-weighted whole-body imaging with background-body-signal-suppression (DWIBS) for the evaluation of distant malignancies in head and neck squamous cell carcinoma (HNSCC); and to compare WB-MRI findings with 18F-fluorodeoxyglucose positron emission tomography/computed tomography (18F-FDG-PET/CT) and chest-CT. Methods: Thirty-three patients with high risk for metastatic spread (26 males; age range 48–79 years, mean age 63 ± 7.9 years (mean ± standard deviation)) were prospectively included with a follow-up of six months. The WB-MRI protocol included short-TI inversion recovery and T1-weighted sequences in the coronal plane and half-Fourier acquisition single-shot turbo spin-echo T2 and contrast-enhanced T1-weighted sequences in the axial plane. Axial DWIBS was reformatted in the coronal plane. Interobserver variability was assessed using weighted kappa and the proportion of specific agreement (PA). Results: Two second primary tumors and one metastasis were detected on WB-MRI. WB-MRI yielded seven clinically indeterminate lesions which did not progress at follow-up. The metastasis and one second primary tumor were found when combining 18F-FDG-PET/CT and chest-CT findings. Interobserver variability for WB-MRI was κ = 0.91 with PA ranging from 0.82 to 1.00. For 18F-FDG-PET/CT, κ could not be calculated due to a constant variable in the table; PA ranged from 0.40 to 0.99. Conclusions: Our WB-MRI protocol with DWIBS is feasible in the work-up of HNSCC patients for detection and characterization of distant pathology. WB-MRI can be complementary to 18F-FDG-PET/CT, especially in the detection of non-18F-FDG-avid second primary tumors.

  15. Association between different measurements of blood pressure variability by ABP monitoring and ankle-brachial index.

    Science.gov (United States)

    Wittke, Estefânia; Fuchs, Sandra C; Fuchs, Flávio D; Moreira, Leila B; Ferlin, Elton; Cichelero, Fábio T; Moreira, Carolina M; Neyeloff, Jeruza; Moreira, Marina B; Gus, Miguel

    2010-11-05

    Blood pressure (BP) variability has been associated with cardiovascular outcomes, but there is no consensus about the more effective method to measure it by ambulatory blood pressure monitoring (ABPM). We evaluated the association between three different methods to estimate BP variability by ABPM and the ankle brachial index (ABI). In a cross-sectional study of patients with hypertension, BP variability was estimated by the time rate index (the first derivative of SBP over time), standard deviation (SD) of 24-hour SBP; and coefficient of variability of 24-hour SBP. ABI was measured with a doppler probe. The sample included 425 patients with a mean age of 57 ± 12 years, being 69.2% women, 26.1% current smokers and 22.1% diabetics. Abnormal ABI (≤ 0.90 or ≥ 1.40) was present in 58 patients. The time rate index was 0.516 ± 0.146 mmHg/min in patients with abnormal ABI versus 0.476 ± 0.124 mmHg/min in patients with normal ABI (P = 0.007). In a logistic regression model the time rate index was associated with ABI, regardless of age (OR = 6.9, 95% CI = 1.1- 42.1; P = 0.04). In a multiple linear regression model, adjusting for age, SBP and diabetes, the time rate index was strongly associated with ABI (P < 0.01). None of the other indexes of BP variability were associated with ABI in univariate and multivariate analyses. Time rate index is a sensible method to measure BP variability by ABPM. Its performance for risk stratification of patients with hypertension should be explored in longitudinal studies.
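    The time rate index described here is simply the mean absolute first derivative of SBP across successive ABPM readings. A minimal sketch (the reading times and pressures are invented):

```python
def time_rate_index(times_min, sbp_mmhg):
    """Time rate index: mean absolute first derivative of SBP over time,
    in mmHg/min, across successive ABPM readings."""
    pairs = list(zip(times_min, sbp_mmhg))
    rates = [abs((s2 - s1) / (t2 - t1))
             for (t1, s1), (t2, s2) in zip(pairs, pairs[1:])]
    return sum(rates) / len(rates)

# Hypothetical readings taken every 20 minutes:
tri = time_rate_index([0, 20, 40, 60], [120, 130, 125, 125])
```

    Unlike the 24-hour SD or coefficient of variability, this index depends on the ordering of the readings, which is why it captures the speed of BP fluctuation rather than only its spread.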

  16. Sex and family history of cardiovascular disease influence heart rate variability during stress among healthy adults.

    Science.gov (United States)

    Emery, Charles F; Stoney, Catherine M; Thayer, Julian F; Williams, DeWayne; Bodine, Andrew

    2018-07-01

    Studies of sex differences in heart rate variability (HRV) typically have not accounted for the influence of family history (FH) of cardiovascular disease (CVD). This study evaluated sex differences in HRV response to speech stress among men and women (age range 30-49 years) with and without a documented FH of CVD. Participants were 77 adults (mean age = 39.8 ± 6.2 years; range: 30-49 years; 52% female) with positive FH (FH+, n = 32) and negative FH (FH-, n = 45) of CVD, verified with relatives of participants. Cardiac activity for all participants was recorded via electrocardiogram during a standardized speech stress task with three phases: 5-minute rest, 5-minute speech, and 5-minute recovery. Outcomes included time domain and frequency domain indicators of HRV and heart rate (HR) at rest and during stress. Data were analyzed with repeated measures analysis of variance, with sex and FH as between subject variables and time/phase as a within subject variable. Women exhibited higher HR than did men and greater HR reactivity in response to the speech stress. However, women also exhibited greater HRV in both the time and frequency domains. FH+ women generally exhibited elevated HRV, despite the elevated risk of CVD associated with FH+. Although women participants exhibited higher HR at rest and during stress, women (both FH+ and FH-) also exhibited elevated HRV reactivity, reflecting greater autonomic control. Thus, enhanced autonomic function observed in prior studies of HRV among women is also evident among FH+ women during a standardized stress task. Copyright © 2018 Elsevier Inc. All rights reserved.
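    Conventional time-domain HRV indicators of the kind referred to above are SDNN and RMSSD, computed from successive RR intervals; the study's exact measures are not named here, so this is a generic sketch:

```python
import math
from statistics import pstdev

def sdnn(rr_ms):
    """SDNN: standard deviation of the RR intervals (ms)."""
    return pstdev(rr_ms)

def rmssd(rr_ms):
    """RMSSD: root mean square of successive RR-interval differences (ms),
    a short-term index dominated by parasympathetic (vagal) activity."""
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))
```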

  17. Mathematical Model of Thyristor Inverter Including a Series-parallel Resonant Circuit

    OpenAIRE

    Miroslaw Luft; Elzbieta Szychta

    2008-01-01

    The article presents a mathematical model of thyristor inverter including a series-parallel resonant circuit with theaid of state variable method. Maple procedures are used to compute current and voltage waveforms in the inverter.

  18. China's High-technology Standards Development

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    There are several major technology standards, including audio video coding (AVS), automotive electronics, third generation (3G) mobile phones, mobile television, wireless networks and digital terrestrial television broadcasting, that have been released or are currently under development in China. This article offers a detailed analysis of each standard and studies their impact on China's high-technology industry.

  19. Robotic and endoscopic transaxillary thyroidectomies may be cost prohibitive when compared to standard cervical thyroidectomy: a cost analysis.

    Science.gov (United States)

    Cabot, Jennifer C; Lee, Cho Rok; Brunaud, Laurent; Kleiman, David A; Chung, Woong Youn; Fahey, Thomas J; Zarnegar, Rasa

    2012-12-01

    This study presents a cost analysis of the standard cervical, gasless transaxillary endoscopic, and gasless transaxillary robotic thyroidectomy approaches based on medical costs in the United States. A retrospective review of 140 patients who underwent standard cervical, transaxillary endoscopic, or transaxillary robotic thyroidectomy at 2 tertiary centers was conducted. The cost model included operating room charges, anesthesia fee, consumables cost, equipment depreciation, and maintenance cost. Sensitivity analyses assessed individual cost variables. The mean operative times for the standard cervical, transaxillary endoscopic, and transaxillary robotic approaches were 121 ± 18.9, 185 ± 26.0, and 166 ± 29.4 minutes, respectively. The total costs for the standard cervical, transaxillary endoscopic, and transaxillary robotic approaches were $9,028 ± $891, $12,505 ± $1,222, and $13,670 ± $1,384, respectively. Both transaxillary approaches were significantly more expensive than the standard cervical technique; the approaches reached cost equivalence when transaxillary endoscopic operative time decreased to 111 minutes and transaxillary robotic operative time decreased to 68 minutes. Increasing the case load did not resolve the cost difference. Transaxillary endoscopic and transaxillary robotic thyroidectomies are significantly more expensive than the standard cervical approach. Decreasing operative times reduces this cost difference. The greater expense may be prohibitive in countries with a flat reimbursement schedule. Copyright © 2012 Mosby, Inc. All rights reserved.
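    The cost model described (operating-room time charge plus fixed per-case components) can be sketched as a linear function, which also makes the operative-time break-even calculation explicit. The $40/min rate and back-calculated fixed costs below are hypothetical, chosen only to reproduce the reported mean totals:

```python
def procedure_cost(or_minutes, or_rate_per_min, fixed_costs):
    """Total cost = OR time charge + fixed per-case costs
    (anesthesia fee, consumables, depreciation, maintenance)."""
    return or_minutes * or_rate_per_min + fixed_costs

def break_even_minutes(target_cost, or_rate_per_min, fixed_costs):
    """Operative time at which a costlier approach matches target_cost."""
    return (target_cost - fixed_costs) / or_rate_per_min

or_rate = 40.0                          # hypothetical $/min OR charge
std_fixed = 9028 - 121 * or_rate        # fixed costs backed out of the means
endo_fixed = 12505 - 185 * or_rate
endo_break_even = break_even_minutes(9028, or_rate, endo_fixed)
```

    Under these assumed numbers the endoscopic approach would have to shave roughly half its fixed-cost disadvantage off its operative time to match the standard cervical total, which is the logic behind the paper's 111- and 68-minute break-even figures.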

  20. On the Occurrence of Standardized Regression Coefficients Greater than One.

    Science.gov (United States)

    Deegan, John, Jr.

    1978-01-01

    It is demonstrated here that standardized regression coefficients greater than one can legitimately occur. Furthermore, the relationship between the occurrence of such coefficients and the extent of multicollinearity present among the set of predictor variables in an equation is examined. Comments on the interpretation of these coefficients are…
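    With two standardized predictors the coefficients have a closed form, which makes it easy to exhibit a legitimate case exceeding one. The correlations below are invented but jointly feasible (the 3x3 correlation matrix is positive definite):

```python
def standardized_betas(r_y1, r_y2, r12):
    """Closed-form standardized regression coefficients for two predictors:
    b1 = (r_y1 - r_y2*r12) / (1 - r12**2), and symmetrically for b2."""
    denom = 1.0 - r12 ** 2
    return (r_y1 - r_y2 * r12) / denom, (r_y2 - r_y1 * r12) / denom

def corr_det(r_y1, r_y2, r12):
    """Determinant of the 3x3 correlation matrix of (y, x1, x2); a
    positive value confirms the correlations are jointly feasible."""
    return 1 + 2 * r_y1 * r_y2 * r12 - r_y1**2 - r_y2**2 - r12**2

# A suppressor situation: x2 correlates weakly with y but strongly with x1.
b1, b2 = standardized_betas(0.8, 0.1, 0.6)    # b1 = 0.74 / 0.64 ≈ 1.156
```

    Here the collinearity between the predictors pushes b1 above one while b2 turns negative, illustrating how such coefficients arise legitimately rather than from computational error.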

  1. Variable effects of high-dose adrenaline relative to standard-dose adrenaline on resuscitation outcomes according to cardiac arrest duration.

    Science.gov (United States)

    Jeung, Kyung Woon; Ryu, Hyun Ho; Song, Kyung Hwan; Lee, Byung Kook; Lee, Hyoung Youn; Heo, Tag; Min, Yong Il

    2011-07-01

    Adjustment of adrenaline (epinephrine) dosage according to cardiac arrest (CA) duration, rather than administering the same dose, may theoretically improve resuscitation outcomes. We evaluated variable effects of high-dose adrenaline (HDA) relative to standard-dose adrenaline (SDA) on resuscitation outcomes according to CA duration. Twenty-eight male domestic pigs were randomised to the following 4 groups according to the dosage of adrenaline (SDA 0.02 mg/kg vs. HDA 0.2 mg/kg) and duration of CA before beginning cardiopulmonary resuscitation (CPR): 6 min SDA, 6 min HDA, 13 min SDA, or 13 min HDA. After the predetermined duration of untreated ventricular fibrillation, CPR was provided. All animals in the 6 min SDA, 6 min HDA, and 13 min HDA groups were successfully resuscitated, while only 4 of 7 pigs in the 13 min SDA group were successfully resuscitated (p=0.043). HDA groups showed higher right atrial pressure, more frequent ventricular ectopic beats, higher blood glucose, higher troponin-I, and more severe metabolic acidosis than SDA groups. Animals of the 13 min groups showed more severe metabolic acidosis and higher troponin-I than animals of the 6 min groups. All successfully resuscitated animals, except two animals in the 13 min HDA group, survived for 7 days (p=0.121). Neurologic deficit score was not affected by the dose of adrenaline. HDA showed benefit in achieving restoration of spontaneous circulation after 13 min CA, when compared with 6 min CA. However, this benefit did not translate into improved long-term survival or neurologic outcome. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  2. Variable Selection via Partial Correlation.

    Science.gov (United States)

    Li, Runze; Liu, Jingyuan; Lou, Lejia

    2017-07-01

    A partial correlation based variable selection method was proposed for normal linear regression models by Bühlmann, Kalisch and Maathuis (2010) as a comparable alternative to regularization methods for variable selection. This paper addresses two important issues related to partial correlation based variable selection: (a) whether this method is sensitive to the normality assumption, and (b) whether this method is valid when the dimension of the predictor increases at an exponential rate of the sample size. To address issue (a), we systematically study this method for elliptical linear regression models. Our finding indicates that the original proposal may lead to inferior performance when the marginal kurtosis of the predictor is not close to that of the normal distribution. Our simulation results further confirm this finding. To ensure the superior performance of the partial correlation based variable selection procedure, we propose a thresholded partial correlation (TPC) approach to select significant variables in linear regression models. We establish the selection consistency of the TPC in the presence of ultrahigh dimensional predictors. Since the TPC procedure includes the original proposal as a special case, our theoretical results address issue (b) directly. As a by-product, the sure screening property of the first step of TPC was obtained. The numerical examples also illustrate that the TPC is competitively comparable to the commonly-used regularization methods for variable selection.
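    The first-order partial correlation and a threshold-based selection rule in the spirit of TPC can be sketched as follows; the toy correlations and the fixed threshold are illustrative (the paper's procedure chooses the threshold to guarantee selection consistency):

```python
def partial_corr(r_xy, r_xz, r_yz):
    """First-order partial correlation of x and y controlling for z."""
    num = r_xy - r_xz * r_yz
    den = ((1 - r_xz ** 2) * (1 - r_yz ** 2)) ** 0.5
    return num / den

def select_by_threshold(partial_corrs, tau):
    """Thresholding rule (sketch): keep predictors whose partial
    correlation with the response exceeds tau in magnitude."""
    return [name for name, r in partial_corrs.items() if abs(r) > tau]
```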

  3. Variability of excretion rates of 210Pb and 210Po of humans at environmental levels

    International Nuclear Information System (INIS)

    Spencer, H.; Holtzman, R.B.; Ilcewicz, F.H.; Kramer, L.

    1977-01-01

    Variability of the excretion rates of the nuclides 210Pb and 210Po at natural levels was studied in samples collected from men maintained under the carefully controlled conditions of a metabolic ward. They consumed only the standard diet of the ward, in which they had been resident for at least several months prior to this study. The mean urinary rates were about 0.1 to 0.5 pCi/day for both 210Pb and 210Po, while fecal rates ranged from 1 to 2.7 pCi/day for the two nuclides. For urinary 210Pb, the coefficients of variation (ratio of standard deviation to mean) for three subjects ranged from 19 to 45 percent for eight consecutive 24-hr samples, compared to 11 to 13 percent for subsequently collected multiday samples (4 to 9 days each) for each subject. However, the standard errors of the means for the one-day collections were about equal to the standard deviations of the pooled samples. Similar variability was noted in the 210Po data. Six-day fecal collections from these time periods exhibited higher variabilities than did the urine, from about 12 percent to 50 percent for each of the nuclides. Multiday collections for 12 subjects showed mean coefficients of variation of about 16 percent for 210Pb and 13 percent for 210Po in urine, and 21 and 25 percent, respectively, in fecal collections. Since dietary intake was kept fairly constant, excreta collections were carefully controlled, and the analytical precision was about 5 percent, these variabilities appear to be due to biological variations and are characteristic of the individuals studied. Some possible causes of these effects are discussed.
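
    The variance-reduction effect behind the lower multiday coefficients of variation can be sketched numerically: pooling m days averages out day-to-day fluctuations, shrinking the CV by roughly sqrt(m). All numbers below (a 0.3 pCi/day mean with 30% day-to-day CV) are illustrative assumptions, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(1)
mu, sigma = 0.3, 0.09   # illustrative: 0.3 pCi/day mean, 30% day-to-day CV

daily = rng.normal(mu, sigma, size=8)                    # eight 24-hr samples
pools = rng.normal(mu, sigma, size=(6, 6)).mean(axis=1)  # six 6-day pools

cv = lambda x: x.std(ddof=1) / x.mean()
cv_daily, cv_pooled = cv(daily), cv(pools)               # pooled CV ~ daily/sqrt(6)
se_daily_mean = daily.std(ddof=1) / np.sqrt(daily.size)  # sigma/sqrt(8)
```

    Note that the standard error of the mean of the eight daily samples (roughly sigma/sqrt(8)) is close to the standard deviation of the 6-day pools (roughly sigma/sqrt(6)), mirroring the observation in the study.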

  4. Comparisons of ANSI standards cited in the NRC standard review plan, NUREG-0800 and related documents

    International Nuclear Information System (INIS)

    Ankrum, A.R.; Bohlander, K.L.; Gilbert, E.R.; Pawlowski, R.A.; Spiesman, J.B.

    1995-11-01

    This report provides the results of comparisons of the cited and latest versions of ANSI standards cited in the NRC Standard Review Plan for the Review of Safety Analysis Reports for Nuclear Power Plants (NUREG-0800) and related documents. The comparisons were performed by Battelle Pacific Northwest Laboratories in support of the NRC's Standard Review Plan Update and Development Program. Significant changes to the standards, from the cited version to the latest version, are described and discussed in a tabular format for each standard. Recommendations for updating each citation in the Standard Review Plan are presented. Technical considerations and suggested changes are included for related regulatory documents (i.e., Regulatory Guides and the Code of Federal Regulations) citing the standard. The results and recommendations presented in this document have not been subjected to NRC staff review.

  5. Comparisons of ASTM standards cited in the NRC standard review plan, NUREG-0800 and related documents

    International Nuclear Information System (INIS)

    Ankrum, A.R.; Bohlander, K.L.; Gilbert, E.R.; Pawlowski, R.A.; Spiesman, J.B.

    1995-10-01

    This report provides the results of comparisons of the cited and latest versions of ASTM standards cited in the NRC Standard Review Plan for the Review of Safety Analysis Reports for Nuclear Power Plants (NUREG-0800) and related documents. The comparisons were performed by Battelle Pacific Northwest Laboratories in support of the NRC's Standard Review Plan Update and Development Program. Significant changes to the standards, from the cited version to the latest version, are described and discussed in a tabular format for each standard. Recommendations for updating each citation in the Standard Review Plan are presented. Technical considerations and suggested changes are included for related regulatory documents (i.e., Regulatory Guides and the Code of Federal Regulations) citing the standard. The results and recommendations presented in this document have not been subjected to NRC staff review.

  6. Mathematical model of thyristor inverter including a series-parallel resonant circuit

    OpenAIRE

    Luft, M.; Szychta, E.

    2008-01-01

    The article presents a mathematical model of a thyristor inverter including a series-parallel resonant circuit, using the state-variable method. Maple procedures are used to compute current and voltage waveforms in the inverter.
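
    A state-variable model of this kind can be sketched with three states (the inductor current and the two capacitor voltages) driven by an idealised square-wave inverter output. The circuit topology and all component values below are assumptions for illustration, not the paper's, and SciPy's ODE solver stands in for the Maple procedures.

```python
import numpy as np
from scipy.integrate import solve_ivp

# illustrative (not the paper's) parameters for an LCC resonant tank
L, Cs, Cp, R = 100e-6, 1e-6, 1e-6, 20.0   # H, F, F, ohm
f_sw = 16e3                                # inverter switching frequency, Hz

def u(t, amplitude=300.0):
    """Idealised square-wave inverter output voltage."""
    return amplitude * np.sign(np.sin(2 * np.pi * f_sw * t))

def dstate(t, x):
    # state variables: inductor current, series- and parallel-cap voltages
    i_L, v_cs, v_cp = x
    di_L = (u(t) - v_cs - v_cp) / L
    dv_cs = i_L / Cs
    dv_cp = (i_L - v_cp / R) / Cp
    return [di_L, dv_cs, dv_cp]

t_end = 20 / f_sw   # simulate 20 switching periods from zero initial state
sol = solve_ivp(dstate, (0.0, t_end), [0.0, 0.0, 0.0],
                max_step=1 / (200 * f_sw))
i_L, v_cp = sol.y[0], sol.y[2]   # waveforms of interest
```

    The small `max_step` forces the solver to resolve each square-wave edge; the returned arrays give the current and voltage waveforms directly.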

  7. HbA1C variability and the risk of renal status progression in Diabetes Mellitus: a meta-analysis.

    Directory of Open Access Journals (Sweden)

    Dongsheng Cheng

    Full Text Available To explore the association between glycated hemoglobin (A1C) variability and renal disease progression in patients with diabetes mellitus. A comprehensive search was performed using the PubMed and Embase databases (up to April 26, 2014). The hazard ratio (HR) was pooled per unit increase in the standard deviation of A1C (A1C-SD) to evaluate the dose-response relationship between A1C-SD and the risk of nephropathy. Eight studies with a total of 17,758 subjects provided the HR for A1C-SD and were included in the final meta-analysis. The pooled results demonstrated that A1C-SD was significantly associated with the progression of renal status (HR for both T1DM and T2DM 1.43, 95% confidence interval [CI] 1.24-1.64; HR for T1DM 1.70, 95% CI 1.41-2.05; HR for T2DM 1.20, 95% CI 1.12-1.28). A1C-SD was also significantly correlated with new-onset microalbuminuria (HR for T1DM 1.63, 95% CI 1.28-2.07; HR for T2DM 1.23, 95% CI 1.08-1.39). These outcomes were supported in subgroup analyses, and sensitivity analyses demonstrated that the results were robust. A1C variability is independently associated with the development of microalbuminuria and the progression of renal status in both type 1 and type 2 diabetes patients. A standard method for measuring A1C variability is essential for further and deeper analyses. In addition, future studies should assess the effect of reducing A1C variability on nephropathy complications.
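
    The pooling step of such a meta-analysis can be sketched as inverse-variance weighting of log hazard ratios, with each study's standard error recovered from its 95% CI. A fixed-effect pool is shown for brevity (published meta-analyses, including this one, typically also fit random-effects models); the input HRs are hypothetical, not the paper's study-level data.

```python
import numpy as np

def pool_hazard_ratios(hrs, ci_low, ci_high):
    """Fixed-effect inverse-variance pooling of log hazard ratios;
    each SE is recovered from the 95% CI as (ln(hi) - ln(lo)) / 3.92."""
    log_hr = np.log(hrs)
    se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)
    w = 1.0 / se**2
    pooled = np.sum(w * log_hr) / np.sum(w)
    pooled_se = 1.0 / np.sqrt(np.sum(w))
    ci = np.exp([pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se])
    return float(np.exp(pooled)), ci

# hypothetical per-study HRs per 1-SD increase in A1C (not the paper's data)
hr, ci = pool_hazard_ratios([1.70, 1.20], [1.41, 1.12], [2.05, 1.28])
```

    The narrower CI carries the larger weight, so the pooled HR sits closer to the more precise study.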

  8. DISCOVERY OF A NOVA-LIKE CATACLYSMIC VARIABLE IN THE KEPLER MISSION FIELD

    International Nuclear Information System (INIS)

    Williams, Kurtis A.; De Martino, Domitilla; Silvotti, Roberto; Bruni, Ivan; Dufour, Patrick; Riecken, Thomas S.; Kronberg, Martin; Mukadam, Anjum; Handler, G.

    2010-01-01

    We announce the identification of a new cataclysmic variable (CV) star in the field of the Kepler Mission, KIC J192410.81+445934.9. This system was identified during a search for compact pulsators in the Kepler field. High-speed photometry reveals coherent large-amplitude variability with a period of 2.94 hr. Rapid, large-amplitude quasi-periodic variations are also detected on time scales of ∼1200 s and ∼650 s. Time-resolved spectroscopy covering one half photometric period shows shallow, broad Balmer and He I absorption lines with bright emission cores as well as strong He II and Bowen blend emission. Radial velocity variations are also observed in the Balmer and He I emission lines that are consistent with the photometric period. We therefore conclude that KIC J192410.81+445934.9 is a nova-like (NL) variable of the UX UMa class in or near the period gap, and it may belong to the rapidly growing subclass of SW Sex systems. Based on Two Micron All Sky Survey photometry and companion star models, we place a lower limit on the distance to the system of ∼500 pc. Due to limitations of our discovery data, additional observations including spectroscopy and polarimetry are needed to confirm the nature of this object. Such data will enable further understanding of the behavior of NL variables in the critical period range of 3-4 hr, where standard CV evolutionary theory finds major problems. The presence of this system in the Kepler Mission field of view also presents a unique opportunity to obtain a continuous photometric data stream of unparalleled length and precision on a CV system.

  9. Tracer transport in fractures: analysis of field data based on a variable - aperture channel model

    International Nuclear Information System (INIS)

    Tsang, C.F.; Tsang, Y.W.; Hale, F.V.

    1991-06-01

    A variable-aperture channel model is used as the basis to interpret data from a three-year tracer transport experiment in fractured rocks. The data come from the so-called Stripa-3D experiment performed by Neretnieks and coworkers. Within the framework of the variable-aperture channel conceptual model, tracers are envisioned as travelling along a number of variable-aperture flow channels, whose properties are related to the mean (b̄) and standard deviation (σ_b) of the fracture aperture distribution. Two methods are developed to address the strong time variation of the tracer injection flow rate in this experiment. The first approximates the early part of the injection history by an exponential decay function and is applicable to the early-time tracer breakthrough data. The second is a deconvolution method involving the use of Toeplitz matrices and is applicable over the complete period of variable injection of the tracers. Both methods give consistent results. These results include not only estimates of b̄ and σ_b, but also ranges of Peclet numbers, dispersivity and an estimate of the number of channels involved in the tracer transport. An interesting and surprising observation is that the data indicate that the Peclet number increases with the mean travel time, i.e., dispersivity decreases with mean travel time. This trend is consistent with calculated results of tracer transport in multiple variable-aperture fractures in series. The meaning of this trend is discussed in terms of the strong heterogeneity of the flow system. (au) (22 refs.)
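
    The Toeplitz deconvolution idea can be sketched as follows: if the breakthrough curve is the convolution of the injection history with an unknown unit-response function, stacking the injection history into a lower-triangular Toeplitz matrix turns deconvolution into a linear solve. The kernel shapes and the regularisation weight `alpha` below are illustrative assumptions, not the Stripa-3D analysis itself.

```python
import numpy as np
from scipy.linalg import toeplitz

def deconvolve_toeplitz(injection, breakthrough, alpha=1e-6):
    """Recover the unit-response function h from breakthrough = A @ h,
    where A is the lower-triangular Toeplitz matrix built from the
    variable injection history; a small Tikhonov term (alpha)
    stabilises the inversion when the data are noisy."""
    n = len(injection)
    A = toeplitz(injection, np.zeros(n))
    return np.linalg.solve(A.T @ A + alpha * np.eye(n), A.T @ breakthrough)

# synthetic check: exponentially decaying injection, known unit response
t = np.arange(50.0)
injection = np.exp(-t / 10.0)                  # illustrative injection history
h_true = np.exp(-((t - 15.0) ** 2) / 20.0)     # assumed unit response
breakthrough = toeplitz(injection, np.zeros(t.size)) @ h_true
h_est = deconvolve_toeplitz(injection, breakthrough)
```

    With noise-free synthetic data the recovered response matches the assumed one; with field data the Tikhonov weight trades off fidelity against noise amplification.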

  10. Association between different measurements of blood pressure variability by ABP monitoring and ankle-brachial index

    Directory of Open Access Journals (Sweden)

    Moreira Leila B

    2010-11-01

    Full Text Available Abstract Background: Blood pressure (BP) variability has been associated with cardiovascular outcomes, but there is no consensus on the most effective method of measuring it by ambulatory blood pressure monitoring (ABPM). We evaluated the association between three different methods of estimating BP variability by ABPM and the ankle-brachial index (ABI). Methods and Results: In a cross-sectional study of patients with hypertension, BP variability was estimated by the time rate index (the first derivative of SBP over time), the standard deviation (SD) of 24-hour SBP, and the coefficient of variability of 24-hour SBP. ABI was measured with a Doppler probe. The sample included 425 patients with a mean age of 57 ± 12 years, of whom 69.2% were women, 26.1% current smokers and 22.1% diabetics. Abnormal ABI (≤ 0.90 or ≥ 1.40) was present in 58 patients. The time rate index was 0.516 ± 0.146 mmHg/min in patients with abnormal ABI versus 0.476 ± 0.124 mmHg/min in patients with normal ABI (P = 0.007). In a logistic regression model the time rate index was associated with ABI, regardless of age (OR = 6.9, 95% CI = 1.1-42.1; P = 0.04). In a multiple linear regression model adjusting for age, SBP and diabetes, the time rate index was strongly associated with ABI. Conclusion: The time rate index is a sensitive method for measuring BP variability by ABPM. Its performance for risk stratification of patients with hypertension should be explored in longitudinal studies.
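
    One common way to compute a time rate index from ABPM data is the mean absolute first derivative of SBP between successive readings, in mmHg/min; exact definitions vary across studies, and the readings below are hypothetical.

```python
import numpy as np

def time_rate_index(t_min, sbp):
    """Mean absolute first derivative of SBP between successive ABPM
    readings, in mmHg/min (one common definition; details vary)."""
    return float(np.mean(np.abs(np.diff(sbp) / np.diff(t_min))))

# hypothetical readings every 20 minutes
t = np.arange(0.0, 180.0, 20.0)
sbp = np.array([128, 135, 130, 142, 138, 133, 140, 137, 131], dtype=float)
tri = time_rate_index(t, sbp)   # 0.30625 mmHg/min for these readings
```

    Dividing each successive difference by its own time gap makes the index robust to the irregular sampling intervals typical of real ABPM recordings.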

  11. Biofeedback training effects on minimum toe clearance variability during treadmill walking.

    Science.gov (United States)

    Tirosh, Oren; Cambell, Amity; Begg, Rezaul K; Sparrow, W A

    2013-08-01

    A number of variability analysis techniques, including Poincaré plots and detrended fluctuation analysis (DFA), were used to investigate minimum toe clearance (MTC) control during walking. Ten young adults walked on a treadmill for 10 min at preferred speed in three conditions: (i) no-intervention baseline, (ii) with biofeedback of MTC within a target range, and (iii) no-biofeedback retention. Mean, median, standard deviation (SD), and interquartile range of MTC during biofeedback (45.57 ± 11.65, 44.98 ± 11.57, 7.08 ± 2.61, 8.58 ± 2.77 mm, respectively) and retention (56.95 ± 20.31, 56.69 ± 20.94, 10.68 ± 5.41, 15.38 ± 10.19 mm) were significantly greater than baseline (30.77 ± 9.49, 30.51 ± 9.49, 3.04 ± 0.77, 3.66 ± 0.91 mm). Relative to baseline, skewness was reduced in biofeedback and retention but only significantly for retention (0.88 ± 0.51, 0.63 ± 0.55, and 0.40 ± 0.40, respectively). Baseline Poincaré measures (SD1 = 0.25, SD2 = 0.34) and DFA (α1 = 0.72 and α2 = 0.64) were lower than biofeedback (SD1 = 0.58, SD2 = 0.83, DFA α1 = 0.76 and α2 = 0.92), with significantly greater variability in retention compared to biofeedback only in the long-term SD2 and α2 analyses. Increased DFA longer-term correlations α2 in retention confirm that a novel gait pattern was acquired with a longer-term variability structure. Short- and long-term variability analyses were both useful in quantifying gait adaptations with biofeedback. The findings provide evidence that MTC can be modified with feedback, suggesting future applications in gait training procedures for impaired populations designed to reduce tripping risk.
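
    The Poincaré descriptors used above can be computed directly from successive-value differences: SD1 reflects step-to-step (short-term) variability and SD2 the longer-term spread. The synthetic MTC series below is an assumption for illustration, not the study's data.

```python
import numpy as np

def poincare_sd(x):
    """SD1/SD2 of the Poincaré plot of successive values (x_n, x_{n+1}):
    SD1 captures short-term, SD2 longer-term variability."""
    x = np.asarray(x, dtype=float)
    d = np.diff(x)
    sd1 = np.sqrt(np.var(d) / 2.0)
    sd2 = np.sqrt(2.0 * np.var(x) - np.var(d) / 2.0)
    return sd1, sd2

# synthetic MTC series (mm): slow drift plus step-to-step noise
rng = np.random.default_rng(2)
mtc = 30.0 + np.cumsum(rng.normal(0.0, 0.5, 400)) + rng.normal(0.0, 1.0, 400)
sd1, sd2 = poincare_sd(mtc)   # drifting series: sd2 well above sd1
```

    For an uncorrelated series SD1 and SD2 are similar; slow drifts inflate SD2 relative to SD1, which is why the two capture different timescales of gait variability.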

  12. MANUAL OF STANDARDS FOR REHABILITATION CENTERS AND FACILITIES.

    Science.gov (United States)

    CANIFF, CHARLES E.; AND OTHERS

    A 5-year project to specify standards for rehabilitation centers and facilities resulted in three publications. This manual includes the characteristics and goals of rehabilitation facilities. The standards for organization, services that should be provided, personnel included, records and reports, fiscal management, and the physical plant are…

  13. Painlevé analysis and transformations for a generalized two-dimensional variable-coefficient Burgers model from fluid mechanics, acoustics and cosmic-ray astrophysics

    International Nuclear Information System (INIS)

    Wei, Guang-Mei

    2006-01-01

    The generalized two-dimensional variable-coefficient Burgers model is of current interest in fluid mechanics, acoustics and cosmic-ray astrophysics. In this paper, Painlevé analysis leads to the constraints on the variable coefficients for such a model to pass the Painlevé test, and to an auto-Bäcklund transformation. Moreover, four transformations from this model to the standard two-dimensional and one-dimensional Burgers models are constructed via symbolic computation, with the relevant constraints on the variable coefficients. By virtue of the given transformations, the properties and solutions of this model can be obtained from those of the standard two-dimensional and one-dimensional ones.

  14. Variability Bugs:

    DEFF Research Database (Denmark)

    Melo, Jean

    Although many researchers suggest that preprocessor-based variability amplifies maintenance problems, there is little to no hard evidence on how variability actually affects programs and programmers. Specifically, how does variability affect programmers during maintenance tasks (bug finding in particular)? How much harder is it to debug a program as variability increases? How do developers debug programs with variability? In what ways does variability affect bugs? In this Ph.D. thesis, I set off to address such issues from different perspectives using empirical research (based on controlled experiments) in order to understand quantitatively and qualitatively the impact of variability on programmers at bug finding and on buggy programs. From the program (and bug) perspective, the results show that variability is ubiquitous. There appears to be no specific nature of variability bugs that could…

  15. Factors in Variability of Serial Gabapentin Concentrations in Elderly Patients with Epilepsy.

    Science.gov (United States)

    Conway, Jeannine M; Eberly, Lynn E; Collins, Joseph F; Macias, Flavia M; Ramsay, R Eugene; Leppik, Ilo E; Birnbaum, Angela K

    2017-10-01

    To characterize and quantify the variability of serial gabapentin concentrations in elderly patients with epilepsy. This study included 83 patients (age ≥ 60 yrs) from an 18-center randomized double-blind double-dummy parallel study from the Veterans Affairs Cooperative 428 Study. All patients were taking 1500 mg/day gabapentin. The within-person coefficient of variation (CV) in gabapentin concentrations, measured weekly to bimonthly for up to 52 weeks and quarterly thereafter, was computed. The impacts of patient characteristics on gabapentin concentrations (linear mixed model) and on CV (linear regression) were estimated. A total of 482 gabapentin concentration measurements were available for analysis. Gabapentin concentrations and intrapatient CVs ranged from 0.5 to 22.6 μg/ml (mean 7.9 μg/ml, standard deviation [SD] 4.1 μg/ml) and 2% to 79% (mean 27.9%, SD 15.3%), respectively, across all visits. Intrapatient CV was higher by 7.3% for those with a body mass index of ≥ 30 kg/m² (coefficient = 7.3, p=0.04). CVs were on average 0.5% higher for each 1-unit higher CV in creatinine clearance (coefficient = 0.5, p=0.03) and 1.2% higher for each 1-hour longer mean time after dose (coefficient = 1.2, p=0.04). Substantial intrapatient variability in serial gabapentin concentration was noted in elderly patients with epilepsy. Creatinine clearance, time of sampling relative to dose, and obesity were found to be positively associated with variability. © 2017 Pharmacotherapy Publications, Inc.

  16. Perspectives for short timescale variability studies with Gaia

    Science.gov (United States)

    Roelens, M.; Eyer, L.; Mowlavi, N.; Lecoeur-Taïbi, I.; Rimoldini, L.; Blanco-Cuaresma, S.; Palaversa, L.; Süveges, M.; Charnas, J.; Wevers, T.

    2017-12-01

    We assess the potential of Gaia for detecting and characterizing short-timescale variables, i.e., variables at timescales from a few seconds to a dozen hours, through extensive light-curve simulations for various short-timescale variable types, including both periodic and non-periodic variability. We show that variogram analysis applied to Gaia photometry should enable the detection of such fast variability phenomena, down to amplitudes of a few millimagnitudes, with limited contamination from longer-timescale variables or constant sources. This approach also gives valuable information on the typical timescale(s) of the considered variation, which could complement the results of classical period-search methods and help prepare ground-based follow-up of the Gaia short-timescale candidates.
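
    A minimal sketch of the variogram approach, assuming irregular sampling: gamma(h) is half the mean squared magnitude difference over all observation pairs separated by a time lag near h. A short-lag value well above the photometric noise floor signals fast variability, and the lag at which gamma saturates tracks the variation timescale. The light curve and bin edges below are illustrative, not Gaia data.

```python
import numpy as np

def variogram(t, mag, lag_bins):
    """First-order variogram gamma(h) = <(m_i - m_j)^2>/2 over all
    observation pairs, binned by their time separation h."""
    dt = np.abs(t[:, None] - t[None, :])
    dm2 = (mag[:, None] - mag[None, :]) ** 2
    iu = np.triu_indices(t.size, k=1)          # each pair counted once
    dt, dm2 = dt[iu], dm2[iu]
    return np.array([dm2[(dt >= lo) & (dt < hi)].mean() / 2.0
                     for lo, hi in zip(lag_bins[:-1], lag_bins[1:])])

rng = np.random.default_rng(3)
t = np.sort(rng.uniform(0.0, 10.0, 300))       # hours, irregular sampling
mag = (0.05 * np.sin(2 * np.pi * t / 0.5)      # 30-min periodic signal, mag
       + rng.normal(0.0, 0.005, 300))          # photometric noise

bins = np.array([0.0, 0.1, 0.25, 0.5, 1.0, 2.0])
g = variogram(t, mag, bins)   # rises from short lags to a plateau
```

    For this 30-minute sinusoid the shortest-lag bin already sits far above the noise floor, and the variogram saturates near the signal variance once the lag exceeds the period.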

  17. [The association between blood pressure variability and sleep stability in essential hypertensive patients with sleep disorder].

    Science.gov (United States)

    Zhu, Y Q; Long, Q; Xiao, Q F; Zhang, M; Wei, Y L; Jiang, H; Tang, B

    2018-03-13

    Objective: To investigate the association between blood pressure variability and sleep stability in essential hypertensive patients with sleep disorder, using cardiopulmonary coupling. Methods: According to strict inclusion and exclusion criteria, 88 new cases of essential hypertension from the international department and the cardiology department of the China-Japan Friendship Hospital were enrolled. Sleep stability and 24 h ambulatory blood pressure data were collected by a portable sleep monitor based on the cardiopulmonary coupling technique and a 24 h ambulatory blood pressure monitor, and the correlation between blood pressure variability and sleep stability was analysed. Results: In the nighttime, the systolic blood pressure standard deviation, systolic blood pressure variation coefficient, ratio of the systolic blood pressure minimum to the maximum, diastolic blood pressure standard deviation and diastolic blood pressure variation coefficient were positively correlated with unstable sleep duration (r = 0.185, 0.24, 0.237, 0.43, 0.276). Conclusion: Blood pressure variability is associated with sleep stability, especially at night: the longer the unstable sleep duration, the greater the variability in night blood pressure.

  18. Variability of mechanical properties of nuclear pressure vessel steels

    International Nuclear Information System (INIS)

    Petrequin, P.; Soulat, P.

    1980-01-01

    Causes of variability in the mechanical properties of nuclear pressure vessel steels are reviewed and discussed. The effects of product shape and size, processing history and heat treatment are investigated. Some quantitative information is given on the scatter of mechanical properties of typical pressure vessel components. The necessity of using recommended or standardized properties when comparing mechanical properties before and after irradiation is pinpointed. (orig.) [de]

  19. Classification criteria of syndromes by latent variable models

    DEFF Research Database (Denmark)

    Petersen, Janne

    2010-01-01

    The thesis has two parts; one clinical part: studying the dimensions of the human immunodeficiency virus-associated lipodystrophy syndrome (HALS) by latent class models, and a more statistical part: investigating how to predict scores of latent variables so these can be used in subsequent regression… patient's characteristics. These methods may erroneously reduce multiplicity either by combining markers of different phenotypes or by mixing HALS with other processes such as aging. Latent class models identify homogeneous groups of patients based on sets of variables, for example symptoms. As no gold standard exists for diagnosing HALS, the normally applied diagnostic models cannot be used. Latent class models, which have never before been used to diagnose HALS, make it possible, under certain assumptions, to: statistically evaluate the number of phenotypes, test for mixing of HALS with other processes…

  20. Cephalometric Standards of Pre-University Boys with Normal Occlusion in Hamadan

    Directory of Open Access Journals (Sweden)

    N. Farhadian

    2005-04-01

    Full Text Available The important basis of orthodontic treatment is correct diagnosis. One of the diagnostic tools is the lateral cephalogram. There are some differences in normal standards between different races. The present study was carried out with the aim of determining and assessing the cephalometric standards of boys aged 17 to 20 years in Hamadan. Among 1204 boys of pre-university centers, 27 were selected based on IOTN and normal occlusal standards. Lateral cephalograms were obtained in Natural Head Position. 22 cephalometric variables (15 angles, 5 lines, 2 ratios) were determined and measured three times by an orthodontist. Student's t-test was used for analysis. The mean age of the cases was 18.2 ± 1.4 years. The range of the reliability coefficient was 0.901 to 0.986. In comparison with similar studies, the following variables were statistically different at the p < 0.05 level: Articular Angle = 146, Gonial Angle = 118, NPog-TH = 89, AB-TH = 4.6, L1-TH = 116, GoGn-TH = 20, Ant. Cranial Base = 76 mm. The length of the anterior cranial base in our study was significantly less than the Michigan standards, and there was a tendency to a straighter profile in this evaluation. In comparison with the Cooke standards, there was less protrusion of the mandibular incisors and more counter-clockwise rotation of the mandible. In comparison with a similar study on girls (with normal occlusion, aged 18.2 ± 1.1 years), linear measurements were generally greater in boys. It is therefore important to consider ethnic and racial variations in the ideal treatment plan.

  1. Effects of heat loss as percentage of fuel's energy, friction and variable specific heats of working fluid on performance of air standard Otto cycle

    International Nuclear Information System (INIS)

    Lin, J.-C.; Hou, S.-S.

    2008-01-01

    The objective of this study is to analyze the effects of heat loss (characterized as a percentage of the fuel's energy), friction and the variable specific heats of the working fluid on the performance of an air-standard Otto cycle with a restriction on the maximum cycle temperature. A more realistic and precise relationship between the fuel's chemical energy and the heat leakage, based on a pair of inequalities, is derived through the resulting temperature. The variations in power output and thermal efficiency with compression ratio, and the relations between the power output and the thermal efficiency of the cycle, are presented. The results show that the power output, as well as the efficiency at which maximum power output occurs, increases with the maximum cycle temperature. The temperature-dependent specific heats of the working fluid have a significant influence on the performance: the power output and the working range of the cycle increase with the specific heats of the working fluid, while the efficiency decreases. Friction loss has a negative effect on the performance, so the power output and efficiency of the cycle decrease with increasing friction loss. It is noteworthy that the effects of heat loss, friction and variable specific heats of the working fluid on the performance of an Otto cycle engine are significant and should be considered in practical cycle analysis. The results obtained in the present study provide good guidance for the performance evaluation and improvement of practical Otto engines.
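
    The role of temperature-dependent specific heats can be sketched for the ideal air-standard Otto cycle: with cv(T) = a + b·T, the isentropic relation becomes a·ln(T2/T1) + b·(T2 − T1) = R·ln(v1/v2), solved numerically for the end-of-process temperature. Heat-loss and friction terms are omitted for brevity, and the constants a, b and the cycle conditions are illustrative assumptions, not the paper's values.

```python
import numpy as np
from scipy.optimize import brentq

R_gas = 0.287            # kJ/(kg K), air
a, b = 0.7, 2.0e-4       # cv(T) = a + b*T in kJ/(kg K); illustrative

def isentropic_T(T_start, vol_ratio):
    """Solve a*ln(T/T_start) + b*(T - T_start) = R*ln(vol_ratio) for T,
    i.e. ds = 0 with temperature-dependent cv (vol_ratio = v_start/v_end)."""
    f = lambda T: (a * np.log(T / T_start) + b * (T - T_start)
                   - R_gas * np.log(vol_ratio))
    return brentq(f, 50.0, 5000.0)

def otto_efficiency(r=9.0, T1=350.0, T3=2200.0):
    T2 = isentropic_T(T1, r)          # isentropic compression 1 -> 2
    T4 = isentropic_T(T3, 1.0 / r)    # isentropic expansion 3 -> 4
    q_in = a * (T3 - T2) + 0.5 * b * (T3**2 - T2**2)    # integral of cv dT, 2 -> 3
    q_out = a * (T4 - T1) + 0.5 * b * (T4**2 - T1**2)   # integral of cv dT, 4 -> 1
    return 1.0 - q_out / q_in

eta = otto_efficiency()   # thermal efficiency at compression ratio 9
```

    Setting b = 0 recovers the constant-cv textbook cycle; increasing b lowers the computed efficiency, consistent with the trend reported above.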

  2. Accredited Standards Committee N15 Developments And Future Directions

    International Nuclear Information System (INIS)

    Mathews, Caroline E.; May, Melanie; Preston, Lynne

    2009-01-01

    Accredited Standards Committee (ASC) N15, Methods of Nuclear Material Control, is sponsored by the Institute of Nuclear Materials Management (INMM) to develop standards for protection, control and accounting of special nuclear materials in all phases of the nuclear fuel cycle, including analytical procedures where necessary and special to this purpose, except that physical protection of special nuclear material within a nuclear power plant is not included. Voluntary consensus standards complement federal regulations and technical standards and fulfill an important role for the nuclear regulatory agencies. This paper describes the N15 standards development process, with INMM as the Standards Developing Organization (SDO) and the N15 Committee responsible for implementation. Key components of the N15 standards development process include ANSI accreditation; compliance with the ANSI Essential Requirements (ER), coordination with other SDOs, communication with stakeholders, maintenance of balance between interest categories, and ANSI periodic audits. Recent and future ASC N15 activities are discussed, with a particular focus on new directions in anticipation of renewed growth in nuclear power.

  3. Electroweak interaction: Standard and beyond

    International Nuclear Information System (INIS)

    Harari, H.

    1987-02-01

    Several important topics within the standard model raise questions which are likely to be answered only by further theoretical understanding which goes beyond the standard model. In these lectures we present a discussion of some of these problems, including the quark masses and angles, the Higgs sector, neutrino masses, W and Z properties and possible deviations from a pointlike structure. 44 refs

  4. ISO radiation sterilization standards

    International Nuclear Information System (INIS)

    Lambert, Byron J.; Hansen, Joyce M.

    1998-01-01

    This presentation provides an overview of the current status of the ISO radiation sterilization standards. The ISO standards are voluntary standards which detail both the validation and the routine control of the sterilization process. ISO 11137 was approved in 1994 and published in 1995. When reviewing the standard you will note that less than 20% of it is devoted to requirements; the remainder is guidance on how to comply with the requirements. Future standards development in radiation sterilization is focused on providing additional guidance. The guidance currently provided in informative annexes of ISO 11137 covers device/packaging materials, dose-setting methods, and dosimeters and dose measurement. Currently, four Technical Reports are being developed to provide additional guidance: 1. AAMI Draft TIR, 'Radiation Sterilization Material Qualification'; 2. ISO TR 13409-1996, 'Sterilization of health care products - Radiation sterilization - Substantiation of 25 kGy as a sterilization dose for small or infrequent production batches'; 3. ISO Draft TR, 'Sterilization of health care products - Radiation sterilization - Selection of a sterilization dose for a single production batch'; 4. ISO Draft TR, 'Sterilization of health care products - Radiation sterilization - Product Families, Plans for Sampling and Frequency of Dose Audits'

  5. Mathematical Model of Thyristor Inverter Including a Series-parallel Resonant Circuit

    Directory of Open Access Journals (Sweden)

    Miroslaw Luft

    2008-01-01

    Full Text Available The article presents a mathematical model of a thyristor inverter including a series-parallel resonant circuit, using the state-variable method. Maple procedures are used to compute current and voltage waveforms in the inverter.

  6. Uganda; Financial System Stability Assessment, including Reports on the Observance of Standards and Codes on the following topics: Monetary and Financial Policy Transparency, Banking Supervision, Securities Regulation, and Payment Systems

    OpenAIRE

    International Monetary Fund

    2003-01-01

    This paper presents findings of Uganda’s Financial System Stability Assessment, including Reports on the Observance of Standards and Codes on Monetary and Financial Policy Transparency, Banking Supervision, Securities Regulation, Insurance Regulation, Corporate Governance, and Payment Systems. The banking system in Uganda, which dominates the financial system, is fundamentally sound, more resilient than in the past, and currently poses no threat to macroeconomic stability. A major disruption ...

  7. Analysis of Factors Influencing Fur Quality in Minks of Standard, Pastel, Platinum and White Hedlunda Colour Strains

    Directory of Open Access Journals (Sweden)

    Stanisław Socha

    2010-10-01

    Full Text Available The work aimed at analysing the factors that influence conformation traits, including animal size and fur quality traits, in four colour types of mink: standard, pastel, platinum and white Hedlunda. The data concern the evaluation of animal conformation traits over a period of three years. Analysis of variance of particular traits indicates a statistically significant effect of the year of birth, colour type and animal sex on the majority of the analysed traits. Males obtained higher mean licence-evaluation scores for the majority of traits. Statistical analysis of body weight showed that males of the platinum and white Hedlunda colour types had the highest body weight, while minks of the standard and pastel colour types were characterised by lower body weight. The mean body weight of males was 2581.17 g and of females 1401.42 g (there is a clear sexual dimorphism in minks). Minks of the white Hedlunda colour type, both males and females, were characterised by the highest means for colour purity; the other colour types obtained lower means. Platinum minks were characterised by the best fur quality. The variability of traits, measured by the coefficient of variability, had the highest values for animal weight (in grams), ranging from 6.0 to 32.0%. The variability of the total number of scores ranged from 2.00 to 8.20%. The highest positive phenotypic correlation was between body size (in points) and total number of scores (0.676), while the lowest was between body size (in points) and fur quality (-0.178).

  8. International Spinal Cord Injury Core Data Set (version 2.0)-including standardization of reporting

    NARCIS (Netherlands)

    Biering-Sorensen, F.; DeVivo, M. J.; Charlifue, S.; Chen, Y.; New, P. W.; Noonan, V.; Post, M. W. M.; Vogel, L.

    Study design: The study design includes expert opinion, feedback, revisions and final consensus. Objectives: The objective of the study was to present the new knowledge obtained since the International Spinal Cord Injury (SCI) Core Data Set (Version 1.0) published in 2006, and describe the

  9. International Spinal Cord Injury Core Data Set (version 2.0)-including standardization of reporting

    NARCIS (Netherlands)

    Biering-Sørensen, F; DeVivo, M J; Charlifue, Susan; Chen, Y; New, P.W.; Noonan, V.; Post, M W M; Vogel, L.

    STUDY DESIGN: The study design includes expert opinion, feedback, revisions and final consensus. OBJECTIVES: The objective of the study was to present the new knowledge obtained since the International Spinal Cord Injury (SCI) Core Data Set (Version 1.0) published in 2006, and describe the

  10. Incorporation of electricity GHG emissions intensity variability into building environmental assessment

    International Nuclear Information System (INIS)

    Cubi, Eduard; Doluweera, Ganesh; Bergerson, Joule

    2015-01-01

    Highlights: • Current building assessment does not account for variability in the electric grid. • A new method incorporates hourly grid variability into building assessment. • The method is complementary with peak-shaving policies. • The assessment method can affect building design decisions. - Abstract: Current building energy and GHG emissions assessments do not account for the variable performance of the electric grid. Incorporating hourly grid variability into building assessment methods can help to better prioritize energy efficiency measures that result in the largest environmental benefits. This article proposes a method to incorporate GHG emissions intensity changes due to grid variability into building environmental assessment. The proposed method encourages building systems that reduce electricity use during peak periods while accounting for differences in grid GHG emissions intensity (i.e., peak shaving is more strongly encouraged in grids that have GHG-intensive peak generation). A set of energy saving building technologies is evaluated in a set of building variants (office, residential) and grid types (hydro/nuclear dominated, coal/gas dominated) to demonstrate the proposed method. Differences between total GHG emissions calculated with the new method compared with the standard (which assumes a constant GHG emissions intensity throughout the year) are in the 5–15% range when the contribution of electricity to total GHG emissions is more significant. The influence of the method on the assessment of the relative performance of some energy efficiency measures is much higher. For example, the estimated GHG emissions savings with heat pumps and photovoltaics can change by −40% and +20%, respectively, using the new assessment method instead of the standard. These differences in GHG emissions estimates can influence building design decisions. The new method could be implemented easily, and would lead to better decision making and more accurate
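    The accounting change at the heart of the method can be sketched in a few lines of Python (the loads and intensities below are illustrative, not from the article):

    ```python
    # Hourly-resolved GHG accounting vs. the constant-intensity standard.
    # All numbers are invented for illustration.
    load = [20.0, 35.0, 50.0, 30.0]        # building electricity use per period, kWh
    intensity = [0.20, 0.45, 0.80, 0.40]   # grid GHG intensity per period, kg CO2e/kWh

    # Proposed method: weight each period's consumption by that period's intensity
    hourly_total = sum(l * i for l, i in zip(load, intensity))

    # Standard method: one average intensity applied to total consumption
    flat_total = sum(load) * (sum(intensity) / len(intensity))

    print(round(hourly_total, 2), round(flat_total, 2))  # 71.75 62.44
    ```

    Because this load peaks exactly when the grid is most GHG-intensive, the hourly method reports higher emissions than the flat average, which is the signal that rewards peak shaving.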

  11. Light-water reactor pressure vessel surveillance standards

    International Nuclear Information System (INIS)

    Anon.

    1981-01-01

    The master matrix standard describes a series of standard practices, guides, and methods for the prediction of neutron-induced changes in light-water reactor (LWR) pressure vessel steels throughout a pressure vessel's service life. Some of these are existing American Society for Testing and Materials (ASTM) standards, some are ASTM standards that have been modified, and some are newly proposed ASTM standards. The current (1) scope, (2) areas of application, (3) interrelationships, and (4) status and timetable of development, improvement, validation, and calibration for a series of 16 ASTM standards are defined. The standard also includes a discussion of LWR pressure vessel surveillance - justification, requirements, and status of work

  12. Dealing with missing standard deviation and mean values in meta-analysis of continuous outcomes: a systematic review.

    Science.gov (United States)

    Weir, Christopher J; Butcher, Isabella; Assi, Valentina; Lewis, Stephanie C; Murray, Gordon D; Langhorne, Peter; Brady, Marian C

    2018-03-07

    Rigorous, informative meta-analyses rely on availability of appropriate summary statistics or individual participant data. For continuous outcomes, especially those with naturally skewed distributions, summary information on the mean or variability often goes unreported. While full reporting of original trial data is the ideal, we sought to identify methods for handling unreported mean or variability summary statistics in meta-analysis. We undertook two systematic literature reviews to identify methodological approaches used to deal with missing mean or variability summary statistics. Five electronic databases were searched, in addition to the Cochrane Colloquium abstract books and the Cochrane Statistics Methods Group mailing list archive. We also conducted cited reference searching and emailed topic experts to identify recent methodological developments. Details recorded included the description of the method, the information required to implement the method, any underlying assumptions and whether the method could be readily applied in standard statistical software. We provided a summary description of the methods identified, illustrating selected methods in example meta-analysis scenarios. For missing standard deviations (SDs), following screening of 503 articles, fifteen methods were identified in addition to those reported in a previous review. These included Bayesian hierarchical modelling at the meta-analysis level; summary statistic level imputation based on observed SD values from other trials in the meta-analysis; a practical approximation based on the range; and algebraic estimation of the SD based on other summary statistics. Following screening of 1124 articles for methods estimating the mean, one approximate Bayesian computation approach and three papers based on alternative summary statistics were identified. 
Illustrative meta-analyses showed that when replacing a missing SD the approximation using the range minimised loss of precision and generally
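    Several of the recovery methods identified above reduce to simple algebraic identities; the following is a minimal Python sketch of three common ones (standard textbook formulas, not the authors' code, with hypothetical example numbers):

    ```python
    import math

    def sd_from_se(se, n):
        """Recover the SD from a reported standard error of the mean: SD = SE * sqrt(n)."""
        return se * math.sqrt(n)

    def sd_from_ci(lower, upper, n, z=1.96):
        """Recover the SD from a reported 95% confidence interval for the mean."""
        return math.sqrt(n) * (upper - lower) / (2 * z)

    def sd_from_range(minimum, maximum):
        """Crude approximation: the range spans roughly 4 SDs for near-normal data."""
        return (maximum - minimum) / 4

    # Hypothetical trial reporting SE 0.8 with n = 25
    print(round(sd_from_se(0.8, 25), 2))  # 4.0
    ```

    The range-based rule is the roughest of the three and, like any imputation of a missing SD, deserves a sensitivity check against SDs observed in the other trials in the meta-analysis.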

  13. Eye-size variability in deep-sea lanternfishes (Myctophidae): an ecological and phylogenetic study.

    Science.gov (United States)

    de Busserolles, Fanny; Fitzpatrick, John L; Paxton, John R; Marshall, N Justin; Collin, Shaun P

    2013-01-01

    One of the most common visual adaptations seen in the mesopelagic zone (200-1000 m), where the amount of light diminishes exponentially with depth and where bioluminescent organisms predominate, is the enlargement of the eye and pupil area. However, it remains unclear how eye size is influenced by depth, other environmental conditions and phylogeny. In this study, we determine the factors influencing variability in eye size and assess whether this variability is explained by ecological differences in habitat and lifestyle within a family of mesopelagic fishes characterized by broad intra- and interspecific variance in depth range and luminous patterns. We focus our study on the lanternfish family (Myctophidae) and hypothesise that lanternfishes with a deeper distribution and/or a reduction of bioluminescent emissions have smaller eyes and that ecological factors rather than phylogenetic relationships will drive the evolution of the visual system. Eye diameter and standard length were measured in 237 individuals from 61 species of lanternfishes representing all the recognised tribes within the family in addition to compiling an ecological dataset including depth distribution during night and day and the location and sexual dimorphism of luminous organs. Hypotheses were tested by investigating the relationship between the relative size of the eye (corrected for body size) and variations in depth and/or patterns of luminous-organs using phylogenetic comparative analyses. Results show a great variability in relative eye size within the Myctophidae at all taxonomic levels (from subfamily to genus), suggesting that this character may have evolved several times. However, variability in eye size within the family could not be explained by any of our ecological variables (bioluminescence and depth patterns), and appears to be driven solely by phylogenetic relationships.

  14. Position paper on standardization

    International Nuclear Information System (INIS)

    1991-04-01

    The ''NPOC Strategic Plan for Building New Nuclear Plants'' creates a framework within which new standardized nuclear plants may be built. The Strategic Plan is an expression of the nuclear energy industry's serious intent to create the necessary conditions for new plant construction and operation. One of the key elements of the Strategic Plan is a comprehensive industry commitment to standardization: through design certification, combined license, first-of-a-kind engineering, construction, operation and maintenance of nuclear power plants. The NPOC plan proposes four stages of standardization in advanced light water reactors (ALWRs). The first stage is established by the ALWR Utility Requirements Document, which specifies owner/operator requirements at a functional level covering all elements of plant design and construction, and many aspects of operations and maintenance. The second stage of standardization is that achieved in the NRC design certification. This certification level includes requirements, design criteria and bases, functional descriptions and performance requirements for systems to assure plant safety. The third stage of standardization, commercial standardization, carries the design to a level of completion beyond that required for design certification to enable the industry to achieve potential increases in efficiency and economy. The final stage of standardization is enhanced standardization beyond design. A standardized approach is being developed in construction practices, operating and maintenance training, and procurement practices. This comprehensive standardization program enables the NRC to proceed with design certification with the confidence that standardization beyond the regulations will be achieved. This confidence should answer the question of design detail required for design certification, and demonstrate that the NRC should require no further regulatory review beyond that required by 10 CFR Part 52

  15. Metronome Cueing of Walking Reduces Gait Variability after a Cerebellar Stroke.

    Science.gov (United States)

    Wright, Rachel L; Bevins, Joseph W; Pratt, David; Sackley, Catherine M; Wing, Alan M

    2016-01-01

    Cerebellar stroke typically results in increased variability during walking. Previous research has suggested that auditory cueing reduces excessive variability in conditions such as Parkinson's disease and post-stroke hemiparesis. The aim of this case report was to investigate whether the use of a metronome cue during walking could reduce excessive variability in gait parameters after a cerebellar stroke. An elderly female with a history of cerebellar stroke and recurrent falling undertook three standard gait trials and three gait trials with an auditory metronome. A Vicon system was used to collect 3-D marker trajectory data. The coefficient of variation was calculated for temporal and spatial gait parameters. SDs of the joint angles were calculated and used to give a measure of joint kinematic variability. Step time, stance time, and double support time variability were reduced with metronome cueing. Variability in the sagittal hip, knee, and ankle angles were reduced to normal values when walking to the metronome. In summary, metronome cueing resulted in a decrease in variability for step, stance, and double support times and joint kinematics. Further research is needed to establish whether a metronome may be useful in gait rehabilitation after cerebellar stroke and whether this leads to a decreased risk of falling.
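    The coefficient of variation used for the temporal gait parameters is simply the SD expressed as a percentage of the mean; here is a toy Python illustration with invented step times (not the patient's data):

    ```python
    import statistics

    def coefficient_of_variation(values):
        """CV (%) = 100 * SD / mean, as commonly used for gait timing variability."""
        return 100 * statistics.stdev(values) / statistics.mean(values)

    # Hypothetical step times (s) without and with metronome cueing
    uncued = [0.62, 0.55, 0.71, 0.58, 0.66]
    cued = [0.60, 0.61, 0.59, 0.62, 0.60]

    print(coefficient_of_variation(uncued) > coefficient_of_variation(cued))  # True
    ```

    A lower CV with cueing corresponds to the reduced step-time variability reported in the case study.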

  16. Major histocompatibility complex harbors widespread genotypic variability of non-additive risk of rheumatoid arthritis including epistasis.

    Science.gov (United States)

    Wei, Wen-Hua; Bowes, John; Plant, Darren; Viatte, Sebastien; Yarwood, Annie; Massey, Jonathan; Worthington, Jane; Eyre, Stephen

    2016-04-25

    Genotypic variability based genome-wide association studies (vGWASs) can identify potentially interacting loci without prior knowledge of the interacting factors. We report a two-stage approach to make vGWAS applicable to diseases: first using a mixed model approach to partition dichotomous phenotypes into additive risk and non-additive environmental residuals on the liability scale, and second using Levene's (Brown-Forsythe) test to assess equality of the residual variances across genotype groups per marker. We found widespread significant (P < 5e-05) vGWAS signals within the major histocompatibility complex (MHC) across all three study cohorts of rheumatoid arthritis. We further identified 10 epistatic interactions between the vGWAS signals independent of the MHC additive effects, each with a weak effect but jointly explaining 1.9% of phenotypic variance. PTPN22 was also identified in the discovery cohort but was replicated in only one independent cohort. Combining the three cohorts boosted the power of vGWAS and additionally identified TYK2 and ANKRD55. Both PTPN22 and TYK2 had evidence of interactions reported elsewhere. We conclude that vGWAS can help discover interacting loci for complex diseases but requires large samples to find additional signals.
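    The second-stage test is Levene's test in its Brown-Forsythe (median-centred) form, i.e. a one-way ANOVA on absolute deviations from each genotype group's median. A self-contained Python sketch on simulated residuals (group sizes and variances are made up for illustration; only the test statistic is computed here, not its p-value):

    ```python
    import random
    import statistics

    def brown_forsythe_F(groups):
        """Brown-Forsythe statistic: one-way ANOVA F on |x - group median|."""
        z = [[abs(x - statistics.median(g)) for x in g] for g in groups]
        k = len(z)                                   # number of genotype groups
        n = sum(len(g) for g in z)                   # total sample size
        grand = sum(sum(g) for g in z) / n
        means = [statistics.mean(g) for g in z]
        ss_between = sum(len(g) * (m - grand) ** 2 for g, m in zip(z, means))
        ss_within = sum((x - m) ** 2 for g, m in zip(z, means) for x in g)
        return (ss_between / (k - 1)) / (ss_within / (n - k))

    random.seed(1)
    # Hypothetical phenotype residuals for the three genotype groups at one marker
    aa = [random.gauss(0, 1.0) for _ in range(500)]  # homozygous reference
    ab = [random.gauss(0, 1.5) for _ in range(500)]  # heterozygous: inflated variance
    bb = [random.gauss(0, 2.0) for _ in range(500)]  # homozygous alternate

    F = brown_forsythe_F([aa, ab, bb])
    print(F > 10)  # strong evidence of variance heterogeneity across genotypes
    ```

    In practice the statistic would be referred to an F distribution with (k - 1, n - k) degrees of freedom to obtain the per-marker p-value.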

  17. Variability in recording and scoring of respiratory events during sleep in Europe: a need for uniform standards.

    Science.gov (United States)

    Arnardottir, Erna S; Verbraecken, Johan; Gonçalves, Marta; Gjerstad, Michaela D; Grote, Ludger; Puertas, Francisco Javier; Mihaicuta, Stefan; McNicholas, Walter T; Parrino, Liborio

    2016-04-01

    Uniform standards for the recording and scoring of respiratory events during sleep are lacking in Europe, although many centres follow the published recommendations of the American Academy of Sleep Medicine. The aim of this study was to assess the practice for the diagnosis of sleep-disordered breathing throughout Europe. A specially developed questionnaire was sent to representatives of the 31 national sleep societies in the Assembly of National Sleep Societies of the European Sleep Research Society, and a total of 29 countries completed the questionnaire. Polysomnography was considered the primary diagnostic method for sleep apnea diagnosis in 10 (34.5%), whereas polygraphy was used primarily in six (20.7%) European countries. In the remaining 13 countries (44.8%), no preferred methodology was used. Fifteen countries (51.7%) had developed some type of national uniform standards, but these standards varied significantly in terms of scoring criteria, device specifications and quality assurance procedures between countries. Only five countries (17.2%) had published these standards. Most respondents supported the development of uniform recording and scoring criteria for Europe, which might be based partly on the existing American Academy of Sleep Medicine rules, but also take into account differences in European practice when compared to North America. This survey highlights the current varying approaches to the assessment of patients with sleep-disordered breathing throughout Europe and supports the need for the development of practice parameters in the assessment of such patients that would be suited to European clinical practice. © 2015 European Sleep Research Society.

  18. Including estimates of the future in today's financial statements

    OpenAIRE

    Mary Barth

    2006-01-01

    This paper explains why the question is how, not if, today's financial statements should include estimates of the future. Including such estimates is not new, but their use is increasing. This increase results primarily because standard setters believe asset and liability measures that reflect current economic conditions and up-to-date expectations of the future will result in more useful information for making economic decisions, which is the objective of financial reporting. This is why sta...

  19. Impact of climate variability on tropospheric ozone

    International Nuclear Information System (INIS)

    Grewe, Volker

    2007-01-01

    A simulation with the climate-chemistry model (CCM) E39/C is presented, which covers both troposphere and stratosphere dynamics and chemistry during the period 1960 to 1999. Although the CCM, by its nature, does not exactly represent the observed day-by-day meteorology, the model shows an overall tendency to correctly reproduce the variability pattern, owing to the inclusion of realistic external forcings such as observed sea surface temperatures (e.g. El Nino), major volcanic eruptions, the solar cycle, concentrations of greenhouse gases, and the Quasi-Biennial Oscillation. Additionally, climate-chemistry interactions are included, such as the impact of ozone, methane, and other species on radiation and dynamics, and the impact of dynamics on emissions (lightning). However, a number of important feedbacks are not yet included (e.g. feedbacks related to biogenic emissions and emissions due to biomass burning). The results show a good representation of the evolution of the stratospheric ozone layer, including the ozone hole, which plays an important role in the simulation of natural variability of tropospheric ozone. Anthropogenic NOx emissions are included with a step-wise linear trend for each sector, but no interannual variability is included. The application of a number of diagnostics (e.g. marked ozone tracers) allows the separation of the impact of various processes/emissions on tropospheric ozone and shows that the simulated Northern Hemisphere tropospheric ozone budget is dominated not only by nitrogen oxide emissions and other ozone precursors, but also by changes in the stratospheric ozone budget and its flux into the troposphere, which tends to reduce the simulated positive trend in tropospheric ozone due to emissions from industry and traffic during the late 80s and early 90s. For tropical regions the variability in ozone is dominated by variability in lightning (related to ENSO) and stratosphere-troposphere exchange (related to Northern Hemisphere Stratospheric

  20. Inferring climate variability from skewed proxy records

    Science.gov (United States)

    Emile-Geay, J.; Tingley, M.

    2013-12-01

    Many paleoclimate analyses assume a linear relationship between the proxy and the target climate variable, and that both the climate quantity and the errors follow normal distributions. An ever-increasing number of proxy records, however, are better modeled using distributions that are heavy-tailed, skewed, or otherwise non-normal, on account of the proxies reflecting non-normally distributed climate variables, or having non-linear relationships with a normally distributed climate variable. The analysis of such proxies requires a different set of tools, and this work serves as a cautionary tale on the danger of making conclusions about the underlying climate from applications of classic statistical procedures to heavily skewed proxy records. Inspired by runoff proxies, we consider an idealized proxy characterized by a nonlinear, thresholded relationship with climate, and describe three approaches to using such a record to infer past climate: (i) applying standard methods commonly used in the paleoclimate literature, without considering the non-linearities inherent to the proxy record; (ii) applying a power transform prior to using these standard methods; (iii) constructing a Bayesian model to invert the mechanistic relationship between the climate and the proxy. We find that neglecting the skewness in the proxy leads to erroneous conclusions and often exaggerates changes in climate variability between different time intervals. In contrast, an explicit treatment of the skewness, using either power transforms or a Bayesian inversion of the mechanistic model for the proxy, yields significantly better estimates of past climate variations. We apply these insights in two paleoclimate settings: (1) a classical sedimentary record from Laguna Pallcacocha, Ecuador (Moy et al., 2002). Our results agree with the qualitative aspects of previous analyses of this record, but quantitative departures are evident and hold implications for how such records are interpreted, and
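    Approach (ii), transforming before applying standard methods, can be illustrated in a few lines of Python with a synthetic lognormal "runoff-like" proxy (purely illustrative, not the Laguna Pallcacocha data):

    ```python
    import math
    import random
    import statistics

    def skewness(xs):
        """Sample skewness: mean cubed z-score."""
        m, s = statistics.mean(xs), statistics.stdev(xs)
        return sum(((x - m) / s) ** 3 for x in xs) / len(xs)

    random.seed(42)
    # Hypothetical runoff-like proxy: lognormal, hence heavily right-skewed
    proxy = [random.lognormvariate(0, 1) for _ in range(2000)]

    # Apply a power/log transform before any standard normal-theory analysis
    transformed = [math.log(x) for x in proxy]

    print(abs(skewness(transformed)) < abs(skewness(proxy)))  # True
    ```

    On the raw record, normal-theory variance estimates are dominated by a few extreme values; after the transform the residuals are far closer to Gaussian, which is what makes the subsequent "standard" analysis defensible.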

  1. SOUR CHERRY (Prunus cerasus L. GENETIC VARIABILITY AND PHOTOSYNTHETIC EFFICIENCY DURING DROUGHT

    Directory of Open Access Journals (Sweden)

    Marija Viljevac

    2012-12-01

    Full Text Available Sour cherry is an important fruit in Croatian orchards. The cultivar Oblačinska predominates in existing orchards, with noted intracultivar phenotypic heterogeneity. In this study, the genetic variability of 22 genotypes of cvs. Oblačinska, Maraska and Cigančica, as well as of the standard cvs. Kelleris 14, Kelleris 16, Kereška, Rexelle and Heimann, was investigated. Two types of molecular markers were used: microsatellite (SSR) markers to identify intercultivar variability, and AFLP markers to identify intracultivar variability. A set of 12 SSR markers revealed a small genetic distance between cvs. Maraska and Oblačinska, while cv. Cigančica is closely related to cv. Oblačinska. Furthermore, cvs. Oblačinska, Cigančica and Maraska were characterized against the standard cultivars. AFLP markers did not confirm significant intracultivar variability of cv. Oblačinska, although variability has been demonstrated at the morphological, chemical and pomological level. A significant correlation between SSR and AFLP markers was found. Identification of sour cherry cultivars tolerant to drought will enable the sustainability of fruit production with respect to climate change in the future. For this purpose, the tolerance to drought conditions of seven sour cherry genotypes (cvs. Kelleris 16, Maraska, Cigančica and Oblačinska, the latter represented by 4 genotypes: OS, 18, D6 and BOR) was tested in order to isolate genotypes with the desired properties. In the greenhouse experiment, cherry plants were exposed to drought stress. The leaf relative water content, OJIP-test parameters (which quantify the efficiency of the photosynthetic system from measurements of chlorophyll a fluorescence), and concentrations of photosynthetic pigments were measured during the experiment as markers of drought tolerance. The photosynthetic performance index (PIABS) comprises three key events in the reaction centre of photosystem II affecting photosynthetic activity: the absorption of energy

  2. Is the choice of renewable portfolio standards random?

    International Nuclear Information System (INIS)

    Huang Mingyuan; Alavalapati, Janaki R.R.; Carter, Douglas R.; Langholtz, Matthew H.

    2007-01-01

    This study investigated factors influencing the adoption, or intention to adopt, renewable portfolio standards (RPS) by individual states in the United States (U.S.). The theory of adoption of innovation was applied as a conceptual framework, and a logistic model was used to achieve the task. Gross state product (GSP), growth rate of population (GRP), political party dominancy, education level, natural resources expenditure, and share of coal in electricity generation were used as explanatory variables. Results indicated that the model correctly predicts the dependent variable (a state's choice of adopting or not adopting RPS) 82 times out of 100. Results also suggested that education, followed by political party dominancy, GSP and GRP, has a large impact on the probability of RPS adoption
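    As a sketch of the kind of model used, here is a plain gradient-ascent logistic regression on toy data (the data and the single "education-like" covariate are invented; this is not the study's actual estimation):

    ```python
    import math
    import random

    def fit_logistic(X, y, lr=0.1, iters=2000):
        """Gradient-ascent fit of P(adopt) = sigmoid(b0 + b . x)."""
        w = [0.0] * (len(X[0]) + 1)            # intercept + one weight per covariate
        for _ in range(iters):
            grad = [0.0] * len(w)
            for xi, yi in zip(X, y):
                z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
                err = yi - 1 / (1 + math.exp(-z))
                grad[0] += err
                for j, xj in enumerate(xi):
                    grad[j + 1] += err * xj
            w = [wj + lr * g / len(X) for wj, g in zip(w, grad)]
        return w

    def predict(w, xi):
        z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
        return 1 if 1 / (1 + math.exp(-z)) >= 0.5 else 0

    # Toy sample: one standardized explanatory variable per "state"
    random.seed(0)
    X = [[random.gauss(0, 1)] for _ in range(200)]
    y = [1 if x[0] + random.gauss(0, 0.5) > 0 else 0 for x in X]

    w = fit_logistic(X, y)
    accuracy = sum(predict(w, xi) == yi for xi, yi in zip(X, y)) / len(y)
    print(accuracy > 0.75)  # the fitted model classifies most toy "states" correctly
    ```

    The study's reported "82 times correctly out of 100" is exactly this kind of in-sample classification rate, computed with six explanatory variables rather than one.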

  3. ['Gold standard', not 'golden standard'

    NARCIS (Netherlands)

    Claassen, J.A.H.R.

    2005-01-01

    In medical literature, both 'gold standard' and 'golden standard' are employed to describe a reference test used for comparison with a novel method. The term 'gold standard' in its current sense in medical research was coined by Rudd in 1979, in reference to the monetary gold standard. In the same

  4. A moving mesh method with variable relaxation time

    OpenAIRE

    Soheili, Ali Reza; Stockie, John M.

    2006-01-01

    We propose a moving mesh adaptive approach for solving time-dependent partial differential equations. The motion of spatial grid points is governed by a moving mesh PDE (MMPDE) in which a mesh relaxation time τ is employed as a regularization parameter. Previously reported results on MMPDEs have invariably employed a constant value of the parameter τ. We extend this standard approach by incorporating a variable relaxation time that is calculated adaptively alongside the solution in orde...

  5. Common Core Science Standards: Implications for Students with Learning Disabilities

    Science.gov (United States)

    Scruggs, Thomas E.; Brigham, Frederick J.; Mastropieri, Margo A.

    2013-01-01

    The Common Core Science Standards represent a new effort to increase science learning for all students. These standards include a focus on English and language arts aspects of science learning, and three dimensions of science standards, including practices of science, crosscutting concepts of science, and disciplinary core ideas in the various…

  6. Targeted neonatal echocardiography services: need for standardized training and quality assurance.

    LENUS (Irish Health Repository)

    Finan, Emer

    2014-10-01

    Targeted neonatal echocardiography refers to a focused assessment of myocardial performance and hemodynamics directed by a specific clinical question. It has become the standard of care in many parts of the world, but practice is variable, and there has been a lack of standardized training and evaluation to date. Targeted neonatal echocardiography was first introduced to Canada in 2006. The purpose of this study was to examine the characteristics of targeted neonatal echocardiography practice and training methods in Canadian neonatal intensive care units (NICUs).

  7. Capturing heterogeneity in gene expression studies by surrogate variable analysis.

    Directory of Open Access Journals (Sweden)

    Jeffrey T Leek

    2007-09-01

    Full Text Available It has unambiguously been shown that genetic, environmental, demographic, and technical factors may have substantial effects on gene expression levels. In addition to the measured variable(s) of interest, there will tend to be sources of signal due to factors that are unknown, unmeasured, or too complicated to capture through simple models. We show that failing to incorporate these sources of heterogeneity into an analysis can have widespread and detrimental effects on the study. Not only can this reduce power or induce unwanted dependence across genes, but it can also introduce sources of spurious signal to many genes. This phenomenon is true even for well-designed, randomized studies. We introduce "surrogate variable analysis" (SVA) to overcome the problems caused by heterogeneity in expression studies. SVA can be applied in conjunction with standard analysis techniques to accurately capture the relationship between expression and any modeled variables of interest. We apply SVA to disease class, time course, and genetics of gene expression studies. We show that SVA increases the biological accuracy and reproducibility of analyses in genome-wide expression studies.

  8. Construction of Database for Pulsating Variable Stars

    Science.gov (United States)

    Chen, B. Q.; Yang, M.; Jiang, B. W.

    2011-07-01

    A database of pulsating variable stars has been constructed so that Chinese astronomers can study variable stars conveniently. At present the database includes about 230000 variable stars in the Galactic bulge, the LMC and the SMC, observed by the MACHO (MAssive Compact Halo Objects) and OGLE (Optical Gravitational Lensing Experiment) projects. The software used for the construction is LAMP, i.e., Linux+Apache+MySQL+PHP. A web page is provided to search the photometric data and light curves in the database by the right ascension and declination of the object. More data will be incorporated into the database.
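    A box search by right ascension and declination of the kind the web page performs can be sketched with Python's stdlib sqlite3 (the table layout and star identifiers below are assumed for illustration; the actual service uses MySQL):

    ```python
    import sqlite3

    # Hypothetical minimal schema for the variable-star catalogue
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE star (star_id TEXT, ra_deg REAL, dec_deg REAL, survey TEXT)")
    con.executemany(
        "INSERT INTO star VALUES (?, ?, ?, ?)",
        [("OGLE-LMC-CEP-0001", 80.12, -69.55, "OGLE"),
         ("MACHO-77.7789.57", 81.90, -69.10, "MACHO"),
         ("OGLE-SMC-RRLYR-0042", 12.35, -73.20, "OGLE")],
    )

    # Box search on right ascension and declination (degrees), as the web page does
    rows = con.execute(
        "SELECT star_id FROM star "
        "WHERE ra_deg BETWEEN ? AND ? AND dec_deg BETWEEN ? AND ?",
        (80.0, 80.5, -69.8, -69.3),
    ).fetchall()
    print(rows)  # [('OGLE-LMC-CEP-0001',)]
    ```

    A production catalogue would index (ra_deg, dec_deg) and handle the RA wrap-around at 0/360 degrees, which a plain BETWEEN filter does not.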

  9. The influence of anthropometrics on physical employment standard performance.

    Science.gov (United States)

    Reilly, T; Spivock, M; Prayal-Brown, A; Stockbrugger, B; Blacklock, R

    2016-10-01

    The Canadian Armed Forces (CAF) recently implemented the Fitness for Operational Requirements of CAF Employment (FORCE), a new physical employment standard (PES). Data collection throughout development included anthropometric profiles of the CAF. To determine whether anthropometric measurements and demographic information predict the performance outcomes of the FORCE and/or the Common Military Task Fitness Evaluation (CMTFE), we conducted a secondary analysis of data from FORCE research. We obtained bioelectrical impedance and segmental analysis. Statistical analysis included correlation and linear regression analyses. Among the 668 study subjects, as predicted, performance on any task requiring lifting, pulling or moving an object was significantly and positively correlated (r > 0.67) with lean body mass (LBM). LBM correlated with the stretcher carry (r = 0.78) and with lifting actions such as the sand bag drag (r = 0.77), vehicle extrication (r = 0.71), sand bag fortification (r = 0.68) and sand bag lift time (r = -0.67). The difference between the correlation of dead mass (DM) with task performance and that of LBM was not statistically significant. DM and LBM can be used in a PES to predict success on military tasks such as casualty evacuation and manual material handling. However, there is no minimum LBM required to perform these tasks successfully. These data inform future research on how to diversify research participants by anthropometrics, in addition to the traditional demographic variables of gender and age, to highlight potentially important adverse impacts in PES design. In addition, the results can be used to develop better training regimens to facilitate passing a PES. © All rights reserved. ‘The Influence of Anthropometrics on Physical Employment Standard Performance’ has been reproduced with the permission of DND, 2016.
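    The correlation analysis reported above is ordinary Pearson correlation; here is a stdlib-only Python sketch with invented LBM and lift-time values that mimic the sign of the reported r = -0.67 (more lean mass, faster lift):

    ```python
    import statistics

    def pearson_r(xs, ys):
        """Pearson product-moment correlation coefficient."""
        mx, my = statistics.mean(xs), statistics.mean(ys)
        sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sxx = sum((x - mx) ** 2 for x in xs)
        syy = sum((y - my) ** 2 for y in ys)
        return sxy / (sxx * syy) ** 0.5

    # Hypothetical data: lean body mass (kg) vs. sand bag lift time (s)
    lbm = [48, 55, 60, 66, 72, 80]
    lift = [95, 88, 84, 78, 70, 62]

    print(pearson_r(lbm, lift) < -0.9)  # True: strong negative correlation
    ```

    In the study the negative sign on a time variable is a good outcome: heavier lean mass predicts faster task completion.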

  10. Variability of insulin-stimulated myocardial glucose uptake in healthy elderly subjects

    DEFF Research Database (Denmark)

    Kofoed, Klaus F; Hove, Jens D; Freiberg, Jacob

    2002-01-01

    The aim of this study was to assess regional and global variability of insulin-stimulated myocardial glucose uptake in healthy elderly subjects and to evaluate potentially responsible factors. Twenty men with a mean age of 64 years, no history of cardiovascular disease, and normal blood pressure...... rest and hyperaemic blood flow during dipyridamole infusion were measured with nitrogen-13 ammonia and positron emission tomography in 16 left ventricular myocardial segments. Intra-individual and inter-individual variability of insulin-stimulated myocardial glucose uptake [relative dispersion = standard deviation/mean] was 13% and 29%, respectively. Although inter-individual variability of glucose uptake and blood flow at rest was of the same magnitude, no correlation was found between these measures. Regional and global insulin-stimulated myocardial glucose uptake correlated linearly with whole...

  11. Management system - correlation study between new IAEA standards and the market standards

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira, Dirceu Paulo de [Centro Tecnologico da Marinha em Sao Paulo (CTMSP), Ipero, SP (Brazil)], e-mail: dirceupo@hotmail.com; Zouain, Desiree Moraes [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)], e-mail: dmzouain@ipen.br

    2009-07-01

    In order to address society's growing concern with the aspects that affect quality of life, international and national regulatory bodies have developed standards that enable organizations to establish management systems for quality, environment and sustainable development, health, safety and social responsibility, among other functions. Within this context it is necessary to structure an integrated management system that reconciles the interests of the several distinct and complementary functions involved. Considering this vision of management system integration, the International Atomic Energy Agency (IAEA) decided to revise the structure of its safety standards on Quality Assurance - code and guides 50-C/SGQ1/14:1996 - publishing, in 2006, the IAEA GS-R-3 and IAEA GS-G-3.1 standards, which enlarge the management approach of the previous standards and allow for the integration of the functions mentioned above. This paper presents the results of a correlation study between the IAEA management system standards - IAEA GS-R-3:2006, IAEA GS-G-3.1:2006 and IAEA DS349 rev. 2007, the latter still a draft - and the market management system standards for quality (ISO 9001:2008), environment (ISO 14001:2004) and occupational health and safety (BS OHSAS 18001:2007), identifying gaps, redundancies and complementarities among their requirements and guidance. The purpose of the study is to provide input that could contribute to structuring a management system for nuclear facilities that satisfies, in an integrated manner, the common and complementary requirements and guidance of the IAEA and market standards. (author)

  12. Management system - correlation study between new IAEA standards and the market standards

    International Nuclear Information System (INIS)

    Oliveira, Dirceu Paulo de; Zouain, Desiree Moraes

    2009-01-01

    In order to address society's growing concern with the aspects that affect quality of life, international and national regulatory bodies have developed standards that enable organizations to establish management systems for quality, environment and sustainable development, health, safety and social responsibility, among other functions. Within this context it is necessary to structure an integrated management system that reconciles the interests of the several distinct and complementary functions involved. Considering this vision of management system integration, the International Atomic Energy Agency (IAEA) decided to revise the structure of its safety standards on Quality Assurance - code and guides 50-C/SGQ1/14:1996 - publishing, in 2006, the IAEA GS-R-3 and IAEA GS-G-3.1 standards, which enlarge the management approach of the previous standards and allow for the integration of the functions mentioned above. This paper presents the results of a correlation study between the IAEA management system standards - IAEA GS-R-3:2006, IAEA GS-G-3.1:2006 and IAEA DS349 rev. 2007, the latter still a draft - and the market management system standards for quality (ISO 9001:2008), environment (ISO 14001:2004) and occupational health and safety (BS OHSAS 18001:2007), identifying gaps, redundancies and complementarities among their requirements and guidance. The purpose of the study is to provide input that could contribute to structuring a management system for nuclear facilities that satisfies, in an integrated manner, the common and complementary requirements and guidance of the IAEA and market standards. (author)

  13. Decomposing global crop yield variability

    Science.gov (United States)

    Ben-Ari, Tamara; Makowski, David

    2014-11-01

    Recent food crises have highlighted the need to better understand the between-year variability of agricultural production. Although increasing future production seems necessary, the globalization of commodity markets suggests that the food system would also benefit from enhanced supply stability through a reduction in year-to-year variability. Here, we develop an analytical expression decomposing the interannual variability of global crop yield into three informative components that quantify how evenly croplands are distributed across the world, the proportion of cultivated area allocated to regions of above- or below-average variability, and the covariation between yields in distinct world regions. This decomposition is used to identify drivers of interannual yield variations for four major crops (i.e., maize, rice, soybean and wheat) over the period 1961-2012. We show that maize production is fairly evenly spread but marked by one prominent region with a high level of interannual yield variability (encompassing the North American corn belt in the USA and Canada). In contrast, global rice yields have small variability because, although spatially concentrated, much of the production is located in regions of below-average variability (i.e., South, Eastern and South Eastern Asia). Because of these contrasting land-use allocations, an even distribution of cultivated land across regions would reduce the variance of global maize yield but increase the variance of global rice yield. Intermediate results are obtained for soybean and wheat, whose croplands are mainly located in regions of close-to-average variability. At the scale of large world regions, we find that covariances between regional yields make a negligible contribution to global yield variance. The proposed decomposition could be applied at any spatial and time scale, including the yearly time step. By addressing the stability (or lack thereof) of global crop production, our results contribute to the understanding of a key
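    The decomposition rests on the fact that the variance of an area-weighted sum of regional yields splits exactly into within-region variance terms and cross-region covariance terms. The sketch below illustrates that identity on synthetic data; the three regions, weights, and yield series are made up for illustration and are not the paper's data or its exact formulas.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical yearly yields (t/ha) for 3 regions over 20 years.
yields = rng.normal(loc=[[5.0, 4.0, 3.0]], scale=[[0.6, 0.2, 0.3]], size=(20, 3))
area_share = np.array([0.5, 0.3, 0.2])   # fraction of global cropland per region

global_yield = yields @ area_share        # area-weighted global yield per year

cov = np.cov(yields, rowvar=False)        # 3x3 sample covariance of regional yields
var_terms = area_share**2 * np.diag(cov)  # within-region parts: w_i^2 Var(y_i)
cov_terms = area_share @ cov @ area_share - var_terms.sum()  # cross-region part

# The two components reconstruct the global yield variance exactly.
assert np.isclose(var_terms.sum() + cov_terms, np.var(global_yield, ddof=1))
```

Comparing `var_terms` against `cov_terms` shows directly whether a crop's global variance is driven by a few highly variable regions or by yields co-varying across regions, which is the question the paper's drivers analysis addresses.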

  14. Management plan for the Nuclear Standards Program

    International Nuclear Information System (INIS)

    1979-11-01

    This Management Plan was prepared to describe the manner in which Oak Ridge National Laboratory will provide technical management of the Nuclear Standards Program. The organizational structure that has been established within ORNL for this function is the Nuclear Standards Management Center, which includes the Nuclear Standards Office (NSO) already in existence at ORNL. This plan is intended to support the policies and practices for the development and application of technical standards in ETN projects, programs, and technology developments as set forth in a standards policy memorandum from the DOE Program Director for Nuclear Energy

  15. Glycaemic variability in patients with severe sepsis or septic shock admitted to an Intensive Care Unit.

    Science.gov (United States)

    Silveira, L M; Basile-Filho, A; Nicolini, E A; Dessotte, C A M; Aguiar, G C S; Stabile, A M

    2017-08-01

    Sepsis is associated with morbidity and mortality, which imposes high costs on the global health system. Metabolic alterations that increase glycaemia and glycaemic variability occur during sepsis. The objective was to verify mean blood glucose levels and glycaemic variability in Intensive Care Unit (ICU) patients with severe sepsis or septic shock. This retrospective, exploratory study involved collection of patients' sociodemographic and clinical data and calculation of severity scores. Glycaemia measurements were used to determine glycaemic variability through the standard deviation and the mean amplitude of glycaemic excursions. Analysis of 116 medical charts and 6730 glycaemia measurements revealed that the majority of patients were male and aged over 60 years. Surgical treatment was the main reason for ICU admission. High blood pressure and diabetes mellitus were the most common comorbidities. Patients who died during the ICU stay presented the highest SOFA scores and mean glycaemia; they also experienced more hypoglycaemia events. Patients with diabetes had higher mean glycaemia and variability, as evaluated through the standard deviation and the mean amplitude of glycaemic excursions. Organic impairment at ICU admission may underlie glycaemic variability and lead to a less favourable outcome. The high glycaemic variability in patients with diabetes indicates that monitoring of these individuals is crucial to ensure better outcomes. Copyright © 2017 Elsevier Ltd. All rights reserved.
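    The two dispersion metrics used in this study can be computed directly from a series of glucose readings. The sketch below is a simplified reading of the mean amplitude of glycaemic excursions (MAGE), keeping only peak-to-nadir swings that exceed one standard deviation; the sample readings and the turning-point rule are illustrative assumptions, not the study's exact protocol.

```python
import numpy as np

def glycaemic_variability(glucose):
    """Return (SD, simplified MAGE) for a series of glucose readings (mg/dL).

    MAGE here is a simplified version of the classic metric: the mean of
    peak-to-nadir excursions whose amplitude exceeds one standard deviation.
    """
    g = np.asarray(glucose, dtype=float)
    sd = g.std(ddof=1)

    # Turning points: local maxima/minima of the glucose trace.
    diffs = np.sign(np.diff(g))
    turns = [g[0]]
    for i in range(1, len(g) - 1):
        if diffs[i - 1] != 0 and diffs[i] != 0 and diffs[i - 1] != diffs[i]:
            turns.append(g[i])
    turns.append(g[-1])

    # Excursions between consecutive turning points, kept only if > 1 SD.
    excursions = [abs(b - a) for a, b in zip(turns, turns[1:]) if abs(b - a) > sd]
    mage = float(np.mean(excursions)) if excursions else 0.0
    return float(sd), mage

sd, mage = glycaemic_variability([110, 180, 95, 140, 200, 120, 105, 160])
```

A clinical implementation would additionally handle irregular sampling intervals and sensor artefacts; this version only shows why MAGE and SD can rank the same patients differently.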

  16. Bounds on the Higgs mass in the standard model and the minimal supersymmetric standard model

    CERN Document Server

    Quiros, M.

    1995-01-01

    Depending on the Higgs-boson and top-quark masses, M_H and M_t, the effective potential of the Standard Model can develop a non-standard minimum for values of the field much larger than the weak scale. In those cases the standard minimum becomes metastable and the possibility of decay to the non-standard one arises. Comparing the decay rate to the non-standard minimum at finite (and zero) temperature with the corresponding expansion rate of the Universe makes it possible to identify the region, in the (M_H, M_t) plane, where the Higgs field sits at the standard electroweak minimum. In the Minimal Supersymmetric Standard Model, approximate analytical expressions for the Higgs mass spectrum and couplings are worked out, providing an excellent approximation to the numerical results, which include all next-to-leading-log corrections. An appropriate treatment of squark decoupling allows large values of the stop and/or sbottom mixing parameters to be considered, and thus fixes a reliable upper bound on the mass o...

  17. Productivity standards for histology laboratories.

    Science.gov (United States)

    Buesa, René J

    2010-04-01

    The information from 221 US histology laboratories (histolabs) and 104 from 24 other countries with workloads from 600 to 116 000 cases per year was used to calculate productivity standards for 23 technical and 27 nontechnical tasks and for 4 types of work flow indicators. The sample includes 254 human, 40 forensic, and 31 veterinary pathology services. Statistical analyses demonstrate that most productivity standards are not different between services or worldwide. The total workload for the US human pathology histolabs averaged 26 061 cases per year, with 54% between 10 000 and less than 30 000. The total workload for 70% of the histolabs from other countries was less than 20 000, with an average of 15 226 cases per year. The fundamental manual technical tasks in the histolab and their productivity standards are as follows: grossing (14 cases per hour), cassetting (54 cassettes per hour), embedding (50 blocks per hour), and cutting (24 blocks per hour). All the other tasks, each with their own productivity standards, can be completed by auxiliary staff or using automatic instruments. Depending on the level of automation of the histolab, all the tasks derived from a workload of 25 cases will require 15.8 to 17.7 hours of work completed by 2.4 to 2.7 employees with 18% of their working time not directly dedicated to the production of diagnostic slides. This article explains how to extrapolate this productivity calculation for any workload and different levels of automation. The overall performance standard for all the tasks, including 8 hours for automated tissue processing, is 3.2 to 3.5 blocks per hour; and its best indicator is the value of the gross work flow productivity that is essentially dependent on how the work is organized. This article also includes productivity standards for forensic and veterinary histolabs, but the staffing benchmarks for histolabs will be the subject of a separate article. Copyright 2010 Elsevier Inc. All rights reserved.
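    As an illustration of the extrapolation the article describes, the sketch below converts a daily caseload into labour hours by dividing each task's unit count by its published productivity standard, then inflating for the 18% of time not spent producing slides. It covers only the four fundamental manual tasks, and the cassettes-per-case ratio and one-block-per-cassette assumption are hypothetical, so it deliberately underestimates the article's full 15.8 to 17.7 hour figure, which spans all 23 technical and 27 nontechnical tasks.

```python
PRODUCTIVITY = {           # units per hour, from the published standards
    "grossing": 14,        # cases/hour
    "cassetting": 54,      # cassettes/hour
    "embedding": 50,       # blocks/hour
    "cutting": 24,         # blocks/hour
}

def staffing_estimate(cases, cassettes_per_case=3.0, overhead=0.18, shift_hours=8.0):
    """Estimate daily labour hours and head count for a given caseload.

    cassettes_per_case and the one-block-per-cassette mapping are
    illustrative assumptions, not figures from the article.
    """
    blocks = cases * cassettes_per_case
    units = {
        "grossing": cases,
        "cassetting": blocks,
        "embedding": blocks,
        "cutting": blocks,
    }
    hands_on = sum(units[t] / PRODUCTIVITY[t] for t in PRODUCTIVITY)
    total_hours = hands_on / (1.0 - overhead)   # inflate for non-production time
    return total_hours, total_hours / shift_hours

hours, staff = staffing_estimate(25)
```

Extending the `PRODUCTIVITY` table with the remaining tasks (and automated processing time) would reproduce the article's full staffing calculation for any workload.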

  18. Delivery of care consistent with the psychosocial standards in pediatric cancer: Current practices in the United States.

    Science.gov (United States)

    Scialla, Michele A; Canter, Kimberly S; Chen, Fang Fang; Kolb, E Anders; Sandler, Eric; Wiener, Lori; Kazak, Anne E

    2018-03-01

    With published evidence-based Standards for Psychosocial Care for Children with Cancer and their Families, it is important to know the current status of their implementation. This paper presents data on delivery of psychosocial care related to the Standards in the United States. Pediatric oncologists, psychosocial leaders, and administrators in pediatric oncology from 144 programs completed an online survey. Participants reported on the extent to which psychosocial care consistent with the Standards was implemented and was comprehensive and state of the art. They also reported on specific practices and services for each Standard and the extent to which psychosocial care was integrated into broader medical care. Participants indicated that psychosocial care consistent with the Standards was usually or always provided at their center for most of the Standards. However, only half of the oncologists (55.6%) and psychosocial leaders (45.6%) agreed or strongly agreed that their psychosocial care was comprehensive and state of the art. Types of psychosocial care provided included evidence-based and less established approaches but were most often provided when problems were identified, rather than proactively. The perception of state of the art care was associated with practices indicative of integrated psychosocial care and the extent to which the Standards are currently implemented. Many oncologists and psychosocial leaders perceive that the delivery of psychosocial care at their center is consistent with the Standards. However, care is quite variable, with evidence for the value of more integrated models of psychosocial services. © 2017 Wiley Periodicals, Inc.

  19. Establishing a Network of Next Generation SED standards with DA White Dwarfs

    Science.gov (United States)

    Saha, Abhijit

    2014-10-01

    Photometric calibration uncertainties are the dominant source of error in current type Ia supernova dark energy studies and other forefront cosmology efforts, e.g., photometric redshifts for weak-lensing mass tomography. Modern 'all-sky' surveys require a network of calibration stars that 1) have known SEDs (to properly and unambiguously account for filter differences) and 2) are on a common photometric zero-point scale. HST enables us to establish this essential network of faint spectrophotometric standards by eliminating the time-variable Earth's atmosphere and by exploiting the well-understood energy distributions of DA white dwarfs. We have selected a set of DA WD targets that will have SNR ~200 in the LSST (and Pan-STARRS and Dark Energy Survey) survey images while avoiding saturation. This means they will be included in the surveys and will directly calibrate the data products. By using ground-based spectra of Balmer lines to obtain the two parameters (temperature and log(g)) that determine the SED, we can use broadband HST photometry to set the overall flux scale for each source and determine any applicable reddening. Thus calibrated, these standards can then be used as flux standards at wavelengths well beyond the range of HST, and in any arbitrary but defined passband. This precision photometric heritage from HST benefits essentially all existing and upcoming surveys, standardizes (spectro)photometry across observatories and facilities, and directly addresses one of the current barriers to understanding the nature of dark energy. In our Cycle 20 program we achieve sub-percent accuracy. Here we propose improvements in experimental design based on lessons learned.
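    The calibration chain described above ultimately reduces to synthetic photometry: integrating a model SED against a passband response to predict a magnitude in any defined band. The sketch below shows a minimal photon-weighted AB-magnitude computation; the Gaussian passband and the flat test SED are illustrative assumptions, not the actual HST or LSST bandpasses or the program's fitting procedure.

```python
import numpy as np

def synthetic_ab_mag(wave_nm, f_nu, throughput):
    """Photon-weighted synthetic AB magnitude of an SED in a passband.

    Assumes a uniform wavelength grid (the grid spacing cancels in the ratio).
    wave_nm    : wavelength grid (nm)
    f_nu       : flux density (erg s^-1 cm^-2 Hz^-1) on that grid
    throughput : dimensionless passband response on the same grid
    """
    AB_REF = 3.631e-20            # 3631 Jy in erg s^-1 cm^-2 Hz^-1
    w = throughput / wave_nm      # photon-counting weight ~ S(lambda)/lambda
    return -2.5 * np.log10(np.sum(f_nu * w) / np.sum(AB_REF * w))

# Sanity check: an SED equal to the flat AB reference spectrum comes out
# at magnitude 0 in any passband, by construction.
wave = np.linspace(400.0, 550.0, 301)               # uniform grid, nm
band = np.exp(-0.5 * ((wave - 475.0) / 30.0) ** 2)  # toy Gaussian passband
mag = synthetic_ab_mag(wave, np.full_like(wave, 3.631e-20), band)
```

Because the magnitude depends only on the SED-passband integral, a DA white dwarf model pinned by temperature, log(g), and one HST flux point can be propagated to any survey's filters this way.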

  20. Nuclear performance standards: Promoting efficient generation

    International Nuclear Information System (INIS)

    Nagelhout, M.

    1990-01-01

    Nuclear plant performance standards are designed to share the risks of operation associated with nuclear generation. Such standards often shift risks from ratepayers to utility shareholders, even without a finding of imprudence or mismanagement. The rationale underlying nuclear performance standards is that ratepayers should not be responsible for excessive replacement power costs incurred as a result of unreasonable decisions by utility management, especially because the high fixed costs of nuclear plants are already included in base rates. In addition, performance standards can be designed to provide incentives to reward utilities that achieve superior nuclear performance, for the benefit of both ratepayers and shareholders