WorldWideScience

Sample records for variables included standardized

  1. Developing standard transmission system for radiology reporting including key images

    Kim, Seon Chil

    2007-01-01

    Development of hospital information systems and Picture Archiving and Communication Systems (PACS) is not new in the medical field, and the development of the internet and information technology is likewise universal. In the course of such development, however, it is hard to share medical information without a refined standard format. Especially in the department of radiology, the role of PACS has become very important in interchanging information with other disparate hospital information systems. A specific system needs to be developed in which radiological reports are archived into a database efficiently; this includes sharing of medical images. This study suggests a model in which an internal system is developed where radiologists store necessary images, transmit them in the standard international clinical format, Clinical Document Architecture (CDA), and share the information with hospitals. A CDA document generator was built to generate the new file format and to separate the existing storage system from the new system, ensuring access to the required data in XML documents. The model presented in this study adds a process by which images crucial to the reading are inserted into the CDA radiological report generator. This study therefore suggests a storage and transmission model for CDA documents that differs from the existing DICOM SR. Radiological reports could be better shared once the function for inserting images and the analysis of standard clinical terms are completed.
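    The transmission model above centres on generating a CDA report that carries key images. A schematic sketch of such a document follows; the section layout, element IDs, and file path are invented for illustration, and the output is not schema-valid HL7 CDA:

```python
import xml.etree.ElementTree as ET

# Schematic CDA-like fragment: a report section whose text is accompanied
# by an embedded key-image reference via observationMedia.
doc = ET.Element("ClinicalDocument")
section = ET.SubElement(ET.SubElement(doc, "component"), "section")
ET.SubElement(section, "title").text = "Radiology Report"
ET.SubElement(section, "text").text = "Key image referenced below."
media = ET.SubElement(section, "observationMedia", ID="keyimg1")
value = ET.SubElement(media, "value", mediaType="image/jpeg")
ET.SubElement(value, "reference", value="images/key_image_001.jpg")

xml_bytes = ET.tostring(doc, encoding="utf-8")
print(xml_bytes.decode())
```

    The point of the design is visible in the output: the report text and the image reference travel in one XML document, so a receiving hospital needs no access to the sender's PACS to display the key image pointer.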

  2. A tool for standardized collector performance calculations including PVT

    Perers, Bengt; Kovacs, Peter; Olsson, Marcus

    2012-01-01

    A tool for standardized calculation of solar collector performance has been developed in cooperation between SP Technical Research Institute of Sweden, DTU Denmark and SERC Dalarna University. The tool is designed to calculate the annual performance of solar collectors at representative locations. […] can be tested and modeled as a thermal collector, when the PV electric part is active with an MPP tracker in operation. The thermal collector parameters from this operation mode are used for the PVT calculations. […]

  3. Individual variability in heart rate recovery after standardized submaximal exercise

    van der Does, Hendrike; Brink, Michel; Visscher, Chris; Lemmink, Koen

    2012-01-01

    To optimize performance, coaches and athletes are always looking for the right balance between training load and recovery. Close monitoring of athletes is therefore important. Heart rate recovery (HRR) after standardized submaximal exercise has been proposed as a useful variable to monitor […]

  4. The gait standard deviation, a single measure of kinematic variability.

    Sangeux, Morgan; Passmore, Elyse; Graham, H Kerr; Tirosh, Oren

    2016-05-01

    Measurement of gait kinematic variability provides relevant clinical information in certain conditions affecting the neuromotor control of movement. In this article, we present a measure of overall gait kinematic variability, GaitSD, based on the combination of waveform standard deviations. The waveform standard deviation is the common numerator in established indices of variability such as Kadaba's coefficient of multiple correlation or Winter's waveform coefficient of variation. Gait data were collected on typically developing children aged 6-17 years. A large number of strides was captured for each child: on average 45 (SD: 11) for kinematics and 19 (SD: 5) for kinetics. We used a bootstrap procedure to determine the precision of GaitSD as a function of the number of strides processed. We compared the within-subject (stride-to-stride) variability with the between-subject variability of the normative pattern. Finally, we investigated the correlation between age and gait kinematic, kinetic and spatio-temporal variability. In typically developing children, the relative precision of GaitSD was 10% as soon as 6 strides were captured. As a comparison, spatio-temporal parameters required 30 strides to reach the same relative precision. The ratio of stride-to-stride to normative-pattern variability was smaller in kinematic variables (smallest for pelvic tilt, 28%) than in kinetic and spatio-temporal variables (largest for normalised stride length, 95%). GaitSD had a strong, negative correlation with age. We show that gait consistency may stabilise only at, or after, skeletal maturity. Copyright © 2016 Elsevier B.V. All rights reserved.
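    The GaitSD construction (a pointwise waveform SD collapsed to a single number, with a bootstrap to gauge precision versus stride count) can be sketched as follows. The waveform shape, stride count, and noise level below are synthetic stand-ins, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(0)

def waveform_sd(strides):
    """Pointwise SD across strides (rows): the waveform standard deviation."""
    return strides.std(axis=0, ddof=1)

def gait_sd(strides):
    """Collapse the waveform SD into one number (RMS over the gait cycle)."""
    return np.sqrt(np.mean(waveform_sd(strides) ** 2))

def bootstrap_precision(strides, n_strides, n_boot=500):
    """Relative precision (CV of the bootstrap distribution) of GaitSD
    when only n_strides strides are processed."""
    stats = []
    for _ in range(n_boot):
        idx = rng.integers(0, len(strides), size=n_strides)
        stats.append(gait_sd(strides[idx]))
    stats = np.asarray(stats)
    return stats.std(ddof=1) / stats.mean()

# Synthetic example: 45 strides of a 101-point kinematic waveform
base = np.sin(np.linspace(0, 2 * np.pi, 101))
strides = base + rng.normal(0, 2.0, size=(45, 101))
print(gait_sd(strides), bootstrap_precision(strides, n_strides=6))
```

    With independent noise this toy data gives a tighter precision than the 10% reported for real gait, where deviations are correlated along the cycle; the mechanics of the estimate are the same.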

  5. 2016 Updated American Society of Clinical Oncology/Oncology Nursing Society Chemotherapy Administration Safety Standards, Including Standards for Pediatric Oncology.

    Neuss, Michael N; Gilmore, Terry R; Belderson, Kristin M; Billett, Amy L; Conti-Kalchik, Tara; Harvey, Brittany E; Hendricks, Carolyn; LeFebvre, Kristine B; Mangu, Pamela B; McNiff, Kristen; Olsen, MiKaela; Schulmeister, Lisa; Von Gehr, Ann; Polovich, Martha

    2016-12-01

    Purpose To update the ASCO/Oncology Nursing Society (ONS) Chemotherapy Administration Safety Standards and to highlight standards for pediatric oncology. Methods The ASCO/ONS Chemotherapy Administration Safety Standards were first published in 2009 and updated in 2011 to include inpatient settings. A subsequent 2013 revision expanded the standards to include the safe administration and management of oral chemotherapy. A joint ASCO/ONS workshop with stakeholder participation, including that of the Association of Pediatric Hematology Oncology Nurses and American Society of Pediatric Hematology/Oncology, was held on May 12, 2015, to review the 2013 standards. An extensive literature search was subsequently conducted, and public comments on the revised draft standards were solicited. Results The updated 2016 standards presented here include clarification and expansion of existing standards to include pediatric oncology and to introduce new standards: most notably, two-person verification of chemotherapy preparation processes, administration of vinca alkaloids via minibags in facilities in which intrathecal medications are administered, and labeling of medications dispensed from the health care setting to be taken by the patient at home. The standards were reordered and renumbered to align with the sequential processes of chemotherapy prescription, preparation, and administration. Several standards were separated into their respective components for clarity and to facilitate measurement of adherence to a standard. Conclusion As oncology practice has changed, so have chemotherapy administration safety standards. Advances in technology, cancer treatment, and education and training have prompted the need for periodic review and revision of the standards. Additional information is available at http://www.asco.org/chemo-standards.

  6. Variability of consumer impacts from energy efficiency standards

    McMahon, James E.; Liu, Xiaomin

    2000-06-15

    A typical prospective analysis of the expected impact of energy efficiency standards on consumers is based on average economic conditions (e.g., energy price) and operating characteristics. In fact, different consumers face different economic conditions and exhibit different behaviors when using an appliance. A method has been developed to characterize the variability among individual households and to calculate the life-cycle cost of appliances taking into account those differences. Using survey data, this method is applied to a distribution of consumers representing the U.S. Examples of clothes washer standards are shown for which 70-90% of the population benefit, compared to 10-30% who are expected to bear increased costs due to new standards. In some cases, sufficient data exist to distinguish among demographic subgroups (for example, low income or elderly households) who are impacted differently from the general population. Rank order correlations between the sampled input distributions and the sampled output distributions are calculated to determine which variability inputs are main factors. This "importance analysis" identifies the key drivers contributing to the range of results. Conversely, the importance analysis identifies variables that, while uncertain, make so little difference as to be irrelevant in deciding a particular policy. Examples from the analysis of water heaters illustrate how a few key variables dominate the policy implications.
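    A minimal sketch of the household-variability method, with made-up input distributions standing in for the survey data: sample households, compute each one's life-cycle cost change under a standard, then rank-correlate each input with the output as the "importance analysis":

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10_000  # sampled households

# Hypothetical input distributions (stand-ins for survey data)
energy_price = rng.lognormal(np.log(0.10), 0.3, n)   # $/kWh
usage = rng.lognormal(np.log(200), 0.4, n)           # kWh/yr saved by the standard
extra_cost = rng.normal(200, 50, n)                  # $ incremental purchase price
life = rng.uniform(10, 16, n)                        # appliance lifetime, yr

# Change in life-cycle cost: positive means the household loses under the standard
delta_lcc = extra_cost - energy_price * usage * life
share_benefit = np.mean(delta_lcc < 0)

def spearman(x, y):
    """Rank-order (Spearman) correlation without SciPy; no ties expected here."""
    rx = np.argsort(np.argsort(x))
    ry = np.argsort(np.argsort(y))
    return np.corrcoef(rx, ry)[0, 1]

# "Importance analysis": which variability inputs drive the output range?
for name, x in [("energy_price", energy_price), ("usage", usage),
                ("extra_cost", extra_cost), ("life", life)]:
    print(f"{name:12s} rho = {spearman(x, delta_lcc):+.2f}")
print(f"fraction of households benefiting: {share_benefit:.0%}")
```

    Inputs with rank correlations near zero are the "irrelevant" variables the abstract mentions: uncertain, but not decision-relevant.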

  7. Impact of including surface currents on simulation of Indian Ocean variability with the POAMA coupled model

    Zhao, Mei; Wang, Guomin; Hendon, Harry H.; Alves, Oscar (Bureau of Meteorology, Centre for Australian Weather and Climate Research, Melbourne, Australia)

    2011-04-15

    Impacts on the coupled variability of the Indo-Pacific by including the effects of surface currents on surface stress are explored in four extended integrations of an experimental version of the Bureau of Meteorology's coupled seasonal forecast model POAMA. The first pair of simulations differs only in their treatment of momentum coupling: one version includes the effects of surface currents on the surface stress computation and the other does not. The version that includes the effect of surface currents has less mean-state bias in the equatorial Pacific cold tongue but produces relatively weak coupled variability in the Tropics, especially that related to the Indian Ocean dipole (IOD) and El Nino/Southern Oscillation (ENSO). The version without the effects of surface currents has greater bias in the Pacific cold tongue but stronger IOD and ENSO variability. In order to diagnose the role of changes in local coupling from changes in remote forcing by ENSO for causing changes in IOD variability, a second set of simulations is conducted where effects of surface currents are included only in the Indian Ocean and only in the Pacific Ocean. IOD variability is found to be equally reduced by inclusion of the local effects of surface currents in the Indian Ocean and by the reduction of ENSO variability as a result of including effects of surface currents in the Pacific. Some implications of these results for predictability of the IOD and its dependence on ENSO, and for ocean subsurface data assimilation are discussed. (orig.)
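    The momentum-coupling change at issue can be illustrated with a bulk stress formula: including the surface current means computing stress from the air-sea relative velocity rather than the wind alone. The drag coefficient and velocities below are assumed round numbers, not POAMA's parameterization:

```python
RHO_AIR = 1.2   # air density, kg/m^3
CD = 1.3e-3     # bulk drag coefficient (held constant in this sketch)

def wind_stress(u_air, u_ocean=0.0):
    """Zonal surface stress from a bulk formula. Including the surface
    current means using the relative velocity (u_air - u_ocean)."""
    rel = u_air - u_ocean
    return RHO_AIR * CD * abs(rel) * rel

# 6 m/s easterly wind; 0.5 m/s surface current moving the same way
tau_fixed = wind_stress(-6.0)               # ocean treated as motionless
tau_moving = wind_stress(-6.0, u_ocean=-0.5)
print(tau_fixed, tau_moving)
```

    Because the wind-driven current typically moves with the wind, the relative velocity, and hence the stress, is smaller. This damping is one mechanical reason the current-coupled run shows weaker equatorial variability.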

  8. What to use to express the variability of data: Standard deviation or standard error of mean?

    Barde, Mohini P.; Barde, Prajakt J.

    2012-01-01

    Statistics plays a vital role in biomedical research. It helps present data precisely and draw meaningful conclusions. While presenting data, one should be aware of using adequate statistical measures. In biomedical journals, Standard Error of Mean (SEM) and Standard Deviation (SD) are used interchangeably to express variability, though they measure different parameters. SEM quantifies uncertainty in the estimate of the mean whereas SD indicates dispersion of the data from the mean. As readers are generally interested in knowing the variability within the sample, descriptive data should be summarized with SD. Use of SEM should be limited to computing the CI, which measures the precision of the population estimate. Journals can avoid such errors by requiring authors to adhere to their guidelines.

  9. 32 CFR 37.620 - What financial management standards do I include for nonprofit participants?

    2010-07-01

    ... 32 National Defense 1 2010-07-01 2010-07-01 false What financial management standards do I include... financial management standards do I include for nonprofit participants? So as not to force system changes..., your expenditure-based TIA's requirements for the financial management system of any nonprofit...

  10. What to use to express the variability of data: Standard deviation or standard error of mean?

    Barde, Mohini P; Barde, Prajakt J

    2012-07-01

    Statistics plays a vital role in biomedical research. It helps present data precisely and draw meaningful conclusions. While presenting data, one should be aware of using adequate statistical measures. In biomedical journals, Standard Error of Mean (SEM) and Standard Deviation (SD) are used interchangeably to express variability, though they measure different parameters. SEM quantifies uncertainty in the estimate of the mean whereas SD indicates dispersion of the data from the mean. As readers are generally interested in knowing the variability within the sample, descriptive data should be summarized with SD. Use of SEM should be limited to computing the CI, which measures the precision of the population estimate. Journals can avoid such errors by requiring authors to adhere to their guidelines.
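    The SD/SEM distinction is easy to make concrete. The sample below is hypothetical; note that SEM shrinks with sample size while SD does not:

```python
import math
import statistics

data = [4.2, 5.1, 3.8, 4.9, 5.4, 4.4, 4.7, 5.0]  # hypothetical sample

n = len(data)
mean = statistics.mean(data)
sd = statistics.stdev(data)   # dispersion of individual observations
sem = sd / math.sqrt(n)       # uncertainty of the estimated mean

# 95% CI for the population mean uses SEM (1.96 is the normal approximation;
# a t critical value would be more exact for n = 8)
ci = (mean - 1.96 * sem, mean + 1.96 * sem)
print(f"mean={mean:.2f} SD={sd:.2f} SEM={sem:.2f} CI=({ci[0]:.2f}, {ci[1]:.2f})")
```

    Reporting "mean ± SD" describes the sample; "mean ± SEM" (or better, the CI) describes how well the mean is estimated.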

  11. Taylor Series Trajectory Calculations Including Oblateness Effects and Variable Atmospheric Density

    Scott, James R.

    2011-01-01

    Taylor series integration is implemented in NASA Glenn's Spacecraft N-body Analysis Program and compared head-to-head with the code's existing 8th-order Runge-Kutta-Fehlberg time integration scheme. This paper focuses on trajectory problems that include oblateness and/or variable atmospheric density. Taylor series is shown to be significantly faster and more accurate for oblateness problems up through a 4x4 field, with speedups ranging from a factor of 2 to 13. For problems with variable atmospheric density, speedups average 24 for atmospheric density alone, and average 1.6 to 8.2 when density and oblateness are combined.
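    A toy version of Taylor series integration shows the mechanics: derive the series coefficients by a recurrence from the ODE, then evaluate the truncated series as the step. The example uses the linear ODE y' = -y, not the paper's N-body problem, purely to keep the recurrence one line:

```python
import math

def taylor_step(y, h, order=8):
    """One Taylor-series step for y' = -y.
    Differentiating the ODE repeatedly gives c[j+1] = -c[j] / (j + 1),
    so the step is a truncated power series evaluated at h."""
    c = [y]
    for j in range(order):
        c.append(-c[j] / (j + 1))
    # Horner evaluation of sum(c[j] * h**j)
    acc = 0.0
    for coeff in reversed(c):
        acc = acc * h + coeff
    return acc

y, t, h = 1.0, 0.0, 0.25
while t < 2.0 - 1e-12:
    y = taylor_step(y, h)
    t += h
print(y, math.exp(-2.0))
```

    The appeal over fixed-order Runge-Kutta is that the order (and step size) can be raised cheaply once the coefficient recurrence is in hand, which is where the reported speedups come from.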

  12. Microscopic age determination of human skeletons including an unknown but calculable variable

    Wallin, Johan Albert; Tkocz, Izabella; Kristensen, Gustav

    1994-01-01

    […] estimation, which includes the covariance matrix of four single-equation residuals, improves the accuracy of age determination. The standard deviation of age prediction, however, remains 12.58 years. An experimental split of the data was made in order to demonstrate that the use of subgroups gives a false […]

  13. A Case for Including Atmospheric Thermodynamic Variables in Wind Turbine Fatigue Loading Parameter Identification

    Kelley, Neil D.

    1999-01-01

    This paper makes the case for establishing efficient predictor variables for atmospheric thermodynamics that can be used to statistically correlate the fatigue accumulation seen on wind turbines. Recently, two approaches to this issue have been reported. One uses multiple linear-regression analysis to establish the relative causality between a number of predictors related to the turbulent inflow and turbine loads. The other approach, using many of the same predictors, applies the technique of principal component analysis. An examination of the ensemble of predictor variables revealed that they were all kinematic in nature; i.e., they were only related to the description of the velocity field. Boundary-layer turbulence dynamics depends upon a description of the thermal field and its interaction with the velocity distribution. We used a series of measurements taken within a multi-row wind farm to demonstrate the need to include atmospheric thermodynamic variables as well as velocity-related ones in the search for efficient turbulence loading predictors in various turbine-operating environments. Our results show that a combination of vertical stability and hub-height mean shearing stress variables meet this need over a period of 10 minutes.
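    The first approach mentioned, multiple linear regression of a load metric on thermodynamic predictors, can be sketched on synthetic ten-minute records. The predictor ranges and coefficients below are invented for illustration, not the paper's measurements:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200  # synthetic ten-minute records

richardson = rng.uniform(-0.5, 0.25, size=n)   # vertical stability proxy
shear_stress = rng.uniform(0.05, 0.6, size=n)  # hub-height mean shearing stress

# Synthetic "fatigue" response: loads grow with shear stress and unstable air
fatigue = 1.0 + 2.5 * shear_stress - 1.8 * richardson + rng.normal(0, 0.1, n)

# Ordinary least squares: fatigue ~ 1 + richardson + shear_stress
X = np.column_stack([np.ones(n), richardson, shear_stress])
coef, *_ = np.linalg.lstsq(X, fatigue, rcond=None)
print("intercept, stability, stress coefficients:", coef)
```

    A regression like this quantifies how much of the load variance the thermodynamic predictors explain beyond the kinematic ones, which is the paper's central argument.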

  14. Preliminary Safety Information Document for the Standard MHTGR. Volume 1, (includes latest Amendments)

    None

    1986-01-01

    With NRC concurrence, the Licensing Plan for the Standard HTGR describes an application program consistent with 10CFR50, Appendix O to support a US Nuclear Regulatory Commission (NRC) review and design certification of an advanced Standard modular High Temperature Gas-Cooled Reactor (MHTGR) design. Consistent with the NRC's Advanced Reactor Policy, the Plan also outlines a series of preapplication activities which have as an objective the early issuance of an NRC Licensability Statement on the Standard MHTGR conceptual design. This Preliminary Safety Information Document (PSID) has been prepared as one of the submittals to the NRC by the US Department of Energy in support of preapplication activities on the Standard MHTGR. Other submittals to be provided include a Probabilistic Risk Assessment, a Regulatory Technology Development Plan, and an Emergency Planning Bases Report.

  15. Standardized Competencies for Parenteral Nutrition Order Review and Parenteral Nutrition Preparation, Including Compounding: The ASPEN Model.

    Boullata, Joseph I; Holcombe, Beverly; Sacks, Gordon; Gervasio, Jane; Adams, Stephen C; Christensen, Michael; Durfee, Sharon; Ayers, Phil; Marshall, Neil; Guenter, Peggi

    2016-08-01

    Parenteral nutrition (PN) is a high-alert medication with a complex drug use process. Key steps in the process include the review of each PN prescription followed by the preparation of the formulation. The preparation step includes compounding the PN or activating a standardized commercially available PN product. The verification and review, as well as preparation of this complex therapy, require competency that may be determined by using a standardized process for pharmacists and for pharmacy technicians involved with PN. An American Society for Parenteral and Enteral Nutrition (ASPEN) standardized model for PN order review and PN preparation competencies is proposed based on a competency framework, the ASPEN-published interdisciplinary core competencies, safe practice recommendations, and clinical guidelines, and is intended for institutions and agencies to use with their staff. © 2016 American Society for Parenteral and Enteral Nutrition.

  16. Direct-phase-variable model of a synchronous reluctance motor including all slot and winding harmonics

    Obe, Emeka S.; Binder, A.

    2011-01-01

    A detailed model in direct-phase variables of a synchronous reluctance motor operating at mains voltage and frequency is presented. The model includes the stator and rotor slot openings, the actual winding layout and the reluctance rotor geometry. Hence, all mmf and permeance harmonics are taken into account. It is seen that non-negligible harmonics introduced by slots are present in the inductances computed by the winding function procedure. These harmonics are usually ignored in d-q models. The machine performance is simulated in the stator reference frame to depict the difference between this new direct-phase model including all harmonics and the conventional rotor reference frame d-q model. Saturation is included by using a polynomial fit to the variation of d-axis inductance with stator current, obtained with the finite-element software FEMAG DC®. The detailed phase-variable model can yield torque pulsations comparable to those obtained from finite elements, while the d-q model cannot.

  17. How to include the variability of TMS responses in simulations: a speech mapping case study

    De Geeter, N.; Lioumis, P.; Laakso, A.; Crevecoeur, G.; Dupré, L.

    2016-11-01

    When delivered over a specific cortical site, TMS can temporarily disrupt the ongoing process in that area. This allows mapping of speech-related areas for preoperative evaluation purposes. We numerically explore the observed variability of TMS responses during a speech mapping experiment performed with a neuronavigation system. We selected four cases with very small perturbations in coil position and orientation. In one case (E) a naming error occurred, while in the other cases (NEA, B, C) the subject appointed the images as smoothly as without TMS. A realistic anisotropic head model was constructed of the subject from T1-weighted and diffusion-weighted MRI. The induced electric field distributions were computed, associated to the coil parameters retrieved from the neuronavigation system. Finally, the membrane potentials along relevant white matter fibre tracts, extracted from DTI-based tractography, were computed using a compartmental cable equation. While only minor differences could be noticed between the induced electric field distributions of the four cases, computing the corresponding membrane potentials revealed different subsets of tracts were activated. A single tract was activated for all coil positions. Another tract was only triggered for case E. NEA induced action potentials in 13 tracts, while NEB stimulated 11 tracts and NEC one. The calculated results are certainly sensitive to the coil specifications, demonstrating the observed variability in this study. However, even though a tract connecting Broca’s with Wernicke’s area is only triggered for the error case, further research is needed on other study cases and on refining the neural model with synapses and network connections. Case- and subject-specific modelling that includes both electromagnetic fields and neuronal activity enables demonstration of the variability in TMS experiments and can capture the interaction with complex neural networks.

  18. Evaluation of Dogs with Border Collie Collapse, Including Response to Two Standardized Strenuous Exercise Protocols.

    Taylor, Susan; Shmon, Cindy; Su, Lillian; Epp, Tasha; Minor, Katie; Mickelson, James; Patterson, Edward; Shelton, G Diane

    2016-01-01

    Clinical and metabolic variables were evaluated in 13 dogs with border collie collapse (BCC) before, during, and following completion of standardized strenuous exercise protocols. Six dogs participated in a ball-retrieving protocol, and seven dogs participated in a sheep-herding protocol. Findings were compared with 16 normal border collies participating in the same exercise protocols (11 retrieving, five herding). Twelve dogs with BCC developed abnormal mentation and/or an abnormal gait during evaluation. All dogs had post-exercise elevations in rectal temperature, pulse rate, arterial blood pH, PaO2, and lactate, and decreased PaCO2 and bicarbonate, as expected with strenuous exercise, but there were no significant differences between BCC dogs and normal dogs. Electrocardiography demonstrated sinus tachycardia in all dogs following exercise. Needle electromyography was normal, and evaluation of muscle biopsy cryosections using a standard panel of histochemical stains and reactions did not reveal a reason for collapse in 10 dogs with BCC in which these tests were performed. Genetic testing excluded the dynamin-1 related exercise-induced collapse mutation and the V547A malignant hyperthermia mutation as the cause of BCC. Common reasons for exercise intolerance were eliminated. Although a genetic basis is suspected, the cause of collapse in BCC was not determined.

  19. Including alternative resources in state renewable portfolio standards: Current design and implementation experience

    Heeter, Jenny; Bird, Lori

    2013-01-01

    As of October 2012, 29 states, the District of Columbia, and Puerto Rico have instituted a renewable portfolio standard (RPS). Each state policy is unique, varying in percentage targets, timetables, and eligible resources. Increasingly, new RPS policies have included alternative resources. Alternative resources have included energy efficiency, thermal resources, and, to a lesser extent, non-renewables. This paper examines state experience with implementing renewable portfolio standards that include energy efficiency, thermal resources, and non-renewable energy and explores compliance experience, costs, and how states evaluate, measure, and verify energy efficiency and convert thermal energy. It aims to gain insights from the experience of states for possible federal clean energy policy as well as to share experience and lessons for state RPS implementation. - Highlights: • Increasingly, new RPS policies have included alternative resources. • Nearly all states provide a separate tier or cap on the quantity of eligible alternative resources. • Where allowed, non-renewables and energy efficiency are being heavily utilized

  20. Including Alternative Resources in State Renewable Portfolio Standards: Current Design and Implementation Experience

    Heeter, J.; Bird, L.

    2012-11-01

    Currently, 29 states, the District of Columbia, and Puerto Rico have instituted a renewable portfolio standard (RPS). An RPS sets a minimum threshold for how much renewable energy must be generated in a given year. Each state policy is unique, varying in percentage targets, timetables, and eligible resources. This paper examines state experience with implementing renewable portfolio standards that include energy efficiency, thermal resources, and non-renewable energy and explores compliance experience, costs, and how states evaluate, measure, and verify energy efficiency and convert thermal energy. It aims to gain insights from the experience of states for possible federal clean energy policy as well as to share experience and lessons for state RPS implementation.

  1. Variability of gastric emptying time using standardized radiolabeled meals

    Christian, P.E.; Brophy, C.M.; Egger, M.J.; Taylor, A.; Moore, J.G.

    1984-01-01

    To define the range of inter- and intra-subject variability on gastric emptying measurements, eight healthy male subjects (ages 19-40) received meals on four separate occasions. The meal consisted of 150 g of beef stew labeled with Tc-99m SC labeled liver (600 μCi) and 150 g of orange juice containing In-111 DTPA (100 μCi) as the solid- and liquid-phase markers respectively. Images of the solid and liquid phases were obtained at 20 min intervals immediately after meal ingestion. The stomach region was selected from digital images and data were corrected for radionuclide interference, radioactive decay and the geometric mean of anterior and posterior counts. More absolute variability was seen with the solid than the liquid marker emptying for the group. The mean solid half-emptying time was 58 ± 17 min (range 29-92) while the mean liquid half-emptying time was 24 ± 8 min (range 12-37). A nested random effects analysis of variance showed moderate intra-subject variability for solid half-emptying times (rho = 0.4594), and high intra-subject variability was implied by a low correlation (rho = 0.2084) for liquid half-emptying. The average inter-subject differences were 58.3% of the total variance for solids (rho = 0.0017). For liquids, the inter-subject variability was 69.1% of the total variance, but was only suggestive of statistical significance (rho = 0.0666). The normal half emptying time for gastric emptying of liquids and solids is a variable phenomenon in healthy subjects and has great inter- and intra-individual day-to-day differences.
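    Two of the corrections mentioned, the geometric mean of opposed views and decay correction, can be sketched as follows. The ROI counts are hypothetical; the Tc-99m half-life is the standard 6.01 h:

```python
import math

TC99M_HALF_LIFE_MIN = 360.6  # Tc-99m half-life in minutes (6.01 h)

def geometric_mean_counts(anterior, posterior):
    """Depth-independent estimate of gastric counts from opposed views."""
    return math.sqrt(anterior * posterior)

def decay_corrected(counts, t_min, half_life=TC99M_HALF_LIFE_MIN):
    """Correct counts measured at time t back to t = 0."""
    return counts * 2.0 ** (t_min / half_life)

# Hypothetical 20-min frame: anterior/posterior ROI counts
raw = geometric_mean_counts(52_000, 34_000)
corrected = decay_corrected(raw, t_min=20)
print(raw, corrected)
```

    The geometric mean removes the depth dependence of attenuation as the meal moves within the stomach, and the decay correction keeps later frames comparable to the first, so the emptying curve reflects physiology rather than physics.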

  2. Variability of gastric emptying time using standardized radiolabeled meals

    Christian, P.E.; Brophy, C.M.; Egger, M.J.; Taylor, A.; Moore, J.G.

    1984-01-01

    To define the range of inter- and intra-subject variability on gastric emptying measurements, eight healthy male subjects (ages 19-40) received meals on four separate occasions. The meal consisted of 150 g of beef stew labeled with Tc-99m SC labeled liver (600 μCi) and 150 g of orange juice containing In-111 DTPA (100 μCi) as the solid- and liquid-phase markers respectively. Images of the solid and liquid phases were obtained at 20 min intervals immediately after meal ingestion. The stomach region was selected from digital images and data were corrected for radionuclide interference, radioactive decay and the geometric mean of anterior and posterior counts. More absolute variability was seen with the solid than the liquid marker emptying for the group. The mean solid half-emptying time was 58 ± 17 min (range 29-92) while the mean liquid half-emptying time was 24 ± 8 min (range 12-37). A nested random effects analysis of variance showed moderate intra-subject variability for solid half-emptying times (rho = 0.4594), and high intra-subject variability was implied by a low correlation (rho = 0.2084) for liquid half-emptying. The average inter-subject differences were 58.3% of the total variance for solids (rho = 0.0017). For liquids, the inter-subject variability was 69.1% of the total variance, but was only suggestive of statistical significance (rho = 0.0666). The normal half emptying time for gastric emptying of liquids and solids is a variable phenomenon in healthy subjects and has great inter- and intra-individual day-to-day differences.

  3. Understanding morphological variability in a taxonomic context in Chilean diplomystids (Teleostei: Siluriformes), including the description of a new species

    Gloria Arratia

    2017-02-01

    Following study of the external morphology and its unmatched variability throughout ontogeny, and a re-examination of selected morphological characters based on many specimens of diplomystids from Central and South Chile, we revised and emended previous specific diagnoses and consider Diplomystes chilensis, D. nahuelbutaensis, D. camposensis, and Olivaichthys viedmensis (Baker River) to be valid species. Another group, previously identified as Diplomystes sp., D. spec., D. aff. chilensis, and D. cf. chilensis, inhabiting rivers between the Rapel and Itata Basins, is given a new specific name (Diplomystes incognitus) and is diagnosed. An identification key to the Chilean species, including the new species, is presented. All specific diagnoses are based on external morphological characters, such as aspects of the skin, neuromast lines, and main lateral line, and the position of the anus and urogenital pore, as well as certain osteological characters, to facilitate the identification of these species, which previously was based on many internal characters. Diplomystids below 150 mm standard length (SL) share a similar external morphology and body proportions that make identification difficult; however, specimens over 150 mm SL can be diagnosed by the position of the urogenital pore and anus, and a combination of external and internal morphological characters. According to current knowledge, diplomystid species have an allopatric distribution, with each species apparently endemic to particular basins in continental Chile and one species (O. viedmensis) known only from one river in Chilean Patagonia but distributed extensively in southern Argentina.

  4. Inlet-engine matching for SCAR including application of a bicone variable geometry inlet

    Wasserbauer, J. F.; Gerstenmaier, W. H.

    1978-01-01

    Airflow characteristics of variable cycle engines (VCE) designed for Mach 2.32 can have transonic airflow requirements as high as 1.6 times the cruise airflow. This is a formidable requirement for conventional, high performance, axisymmetric, translating centerbody mixed compression inlets. An alternate inlet is defined, where the second cone of a two cone center body collapses to the initial cone angle to provide a large off-design airflow capability, and incorporates modest centerbody translation to minimize spillage drag. Estimates of transonic spillage drag are competitive with those of conventional translating centerbody inlets. The inlet's cruise performance exhibits very low bleed requirements with good recovery and high angle of attack capability.

  5. Spatial modelling of marine organisms in Forsmark and Oskarshamn. Including calculation of physical predictor variables

    Carlen, Ida; Nikolopoulos, Anna; Isaeus, Martin (AquaBiota Water Research, Stockholm (SE))

    2007-06-15

    GIS grids (maps) of marine parameters were created using point data from previous site investigations in the Forsmark and Oskarshamn areas. The proportion of global radiation reaching the sea bottom in Forsmark and Oskarshamn was calculated in ArcView, using Secchi depth measurements and the digital elevation models for the respective area. The number of days per year when the incoming light exceeds 5 MJ/m2 at the bottom was then calculated using the result of the previous calculations together with measured global radiation. Existing modelled grid-point data on bottom and pelagic temperature for Forsmark were interpolated to create surface covering grids. Bottom and pelagic temperature grids for Oskarshamn were calculated using point measurements to achieve yearly averages for a few points and then using regressions with existing grids to create new maps. Phytoplankton primary production in Forsmark was calculated using point measurements of chlorophyll and irradiance, and a regression with a modelled grid of Secchi depth. Distribution of biomass of macrophyte communities in Forsmark and Oskarshamn was calculated using spatial modelling in GRASP, based on field data from previous surveys. Physical parameters such as those described above were used as predictor variables. Distribution of biomass of different functional groups of fish in Forsmark was calculated using spatial modelling based on previous surveys and with predictor variables such as physical parameters and results from macrophyte modelling. All results are presented as maps in the report. The quality of the modelled predictions varies as a consequence of the quality and amount of the input data, the ecology and knowledge of the predicted phenomena, and by the modelling technique used. A substantial part of the variation is not described by the models, which should be expected for biological modelling. Therefore, the resulting grids should be used with caution and with this uncertainty kept in mind. 
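    The light-at-bottom calculation can be sketched with Beer-Lambert attenuation and the common empirical rule k ≈ 1.7 / Secchi depth. The 1.7 factor is an assumption here (it varies by water type), and the report's exact conversion may differ:

```python
import math

def attenuation_from_secchi(secchi_m, k_factor=1.7):
    """Diffuse attenuation coefficient k from Secchi depth.
    k = 1.7 / Secchi is a common empirical rule, assumed for this sketch."""
    return k_factor / secchi_m

def fraction_at_bottom(secchi_m, depth_m):
    """Beer-Lambert fraction of surface irradiance reaching the bottom."""
    k = attenuation_from_secchi(secchi_m)
    return math.exp(-k * depth_m)

# Hypothetical point: Secchi depth 4 m, bottom at 6 m
f = fraction_at_bottom(4.0, 6.0)
print(f, f * 20.0)  # fraction, and MJ/m2 at bottom on a 20 MJ/m2 day
```

    At this hypothetical point under 8% of surface irradiance arrives, so a 20 MJ/m2 day delivers well under a 5 MJ/m2 threshold; counting days above the threshold then only requires the daily global radiation series.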
All

  6. Spatial modelling of marine organisms in Forsmark and Oskarshamn. Including calculation of physical predictor variables

    Carlen, Ida; Nikolopoulos, Anna; Isaeus, Martin

    2007-06-01

    GIS grids (maps) of marine parameters were created using point data from previous site investigations in the Forsmark and Oskarshamn areas. The proportion of global radiation reaching the sea bottom in Forsmark and Oskarshamn was calculated in ArcView, using Secchi depth measurements and the digital elevation models for the respective area. The number of days per year when the incoming light exceeds 5 MJ/m2 at the bottom was then calculated using the result of the previous calculations together with measured global radiation. Existing modelled grid-point data on bottom and pelagic temperature for Forsmark were interpolated to create surface covering grids. Bottom and pelagic temperature grids for Oskarshamn were calculated using point measurements to achieve yearly averages for a few points and then using regressions with existing grids to create new maps. Phytoplankton primary production in Forsmark was calculated using point measurements of chlorophyll and irradiance, and a regression with a modelled grid of Secchi depth. Distribution of biomass of macrophyte communities in Forsmark and Oskarshamn was calculated using spatial modelling in GRASP, based on field data from previous surveys. Physical parameters such as those described above were used as predictor variables. Distribution of biomass of different functional groups of fish in Forsmark was calculated using spatial modelling based on previous surveys and with predictor variables such as physical parameters and results from macrophyte modelling. All results are presented as maps in the report. The quality of the modelled predictions varies as a consequence of the quality and amount of the input data, the ecology and knowledge of the predicted phenomena, and by the modelling technique used. A substantial part of the variation is not described by the models, which should be expected for biological modelling. Therefore, the resulting grids should be used with caution and with this uncertainty kept in mind. 
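The light-at-the-bottom step in this record can be sketched numerically. This is a minimal illustration, not the report's ArcView workflow; it assumes Beer-Lambert attenuation with the common empirical estimate k ≈ 1.7 / Secchi depth for the diffuse attenuation coefficient, and uses the 5 MJ/m2 threshold quoted above.

```python
import math

def fraction_of_surface_light(secchi_depth_m, bottom_depth_m):
    """Fraction of surface global radiation reaching the sea bottom,
    assuming Beer-Lambert attenuation with the empirical estimate
    k = 1.7 / Secchi depth for the diffuse attenuation coefficient."""
    k = 1.7 / secchi_depth_m
    return math.exp(-k * bottom_depth_m)

def light_at_bottom(global_radiation_mj_m2, secchi_depth_m, bottom_depth_m):
    """Daily incoming light at the bottom (MJ/m2)."""
    return global_radiation_mj_m2 * fraction_of_surface_light(
        secchi_depth_m, bottom_depth_m)

def days_above_threshold(daily_radiation, secchi_depth_m, bottom_depth_m,
                         threshold_mj_m2=5.0):
    """Number of days when bottom light exceeds the threshold."""
    return sum(1 for r in daily_radiation
               if light_at_bottom(r, secchi_depth_m, bottom_depth_m) > threshold_mj_m2)
```

The attenuation coefficient estimate and the example depths are assumptions for illustration; the report combines measured Secchi depths with the digital elevation model per grid cell.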

  7. Fatigue Behavior under Multiaxial Stress States Including Notch Effects and Variable Amplitude Loading

    Gates, Nicholas R.

The central objective of the research performed in this study was to better understand and predict fatigue crack initiation and growth from stress concentrations subjected to complex service loading histories. As such, major areas of focus were related to the understanding and modeling of material deformation behavior, fatigue damage quantification, notch effects, cycle counting, damage accumulation, and crack growth behavior under multiaxial nominal loading conditions. To support the analytical work, a wide variety of deformation and fatigue tests were also performed using tubular and plate specimens made from 2024-T3 aluminum alloy, with and without the inclusion of a circular through-thickness hole. However, the analysis procedures implemented were meant to be general in nature, and applicable to a wide variety of materials and component geometries. As a result, experimental data from the literature were also used, when appropriate, to supplement the findings of various analyses. Popular approaches currently used for multiaxial fatigue life analysis are based on the idea of computing an equivalent stress/strain quantity through the extension of static yield criteria. This equivalent stress/strain is then considered to be equal, in terms of fatigue damage, to a uniaxial loading of the same magnitude. However, it has often been shown, and was shown again in this study, that although equivalent stress- and strain-based analysis approaches may work well in certain situations, they lack a general robustness and offer little room for improvement. More advanced analysis techniques, on the other hand, provide an opportunity to more accurately account for various aspects of the fatigue failure process under both constant and variable amplitude loading conditions. As a result, such techniques were of primary interest in the investigations performed. 
By implementing more advanced life prediction methodologies, both the overall accuracy and the correlation of fatigue
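The equivalent-stress idea the abstract critiques can be made concrete with the von Mises criterion, the most common static yield criterion extended to fatigue analysis. This is a sketch of the general approach, not the author's analysis code; the input values in the usage note are hypothetical.

```python
import math

def von_mises_equivalent_stress(sx, sy, sz, txy, tyz, tzx):
    """Von Mises equivalent stress from the six components of a 3D
    stress state. Equivalent-stress fatigue approaches treat this
    single scalar as if it were a uniaxial stress causing the same
    fatigue damage, which is the simplification criticized above."""
    return math.sqrt(0.5 * ((sx - sy)**2 + (sy - sz)**2 + (sz - sx)**2)
                     + 3.0 * (txy**2 + tyz**2 + tzx**2))
```

For a uniaxial stress of 100 MPa the function returns 100 MPa, while a pure shear stress of 50 MPa maps to about 86.6 MPa; treating those two states as equally damaging per unit equivalent stress is exactly the assumption that breaks down under non-proportional multiaxial loading.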

  8. Standard recommended practice for examination of fuel element cladding including the determination of the mechanical properties

    Anon.

    1975-01-01

Guidelines are provided for the post-irradiation examination of fuel cladding and for achieving better correlation and interpretation of data in the field of radiation effects. The recommended practice is applicable to metal cladding of all types of fuel elements. The tests cited are suitable for determining mechanical properties of the fuel element cladding. Various ASTM standards and test methods are cited.

  9. An Assessment of the Need for Standard Variable Names for Airborne Field Campaigns

    Beach, A. L., III; Chen, G.; Northup, E. A.; Kusterer, J.; Quam, B. M.

    2017-12-01

The NASA Earth Venture Program has led to a dramatic increase in airborne observations, requiring updated data management practices with clearly defined data standards and protocols for metadata. An airborne field campaign can involve multiple aircraft and a variety of instruments. It is quite common to have different instruments/techniques measure the same parameter on one or more aircraft platforms. This creates a need to allow instrument Principal Investigators (PIs) to name their variables in a way that would distinguish them across various data sets. A lack of standardization of variable names presents a challenge for data search tools in enabling discovery of similar data across airborne studies, aircraft platforms, and instruments. This was also identified by data users as one of the top issues in data use. One effective approach for mitigating this problem is to enforce variable name standardization, which can effectively map the unique PI variable names to fixed standard names. In order to ensure consistency amongst the standard names, it will be necessary to choose them from a controlled list. However, no such list currently exists despite a number of previous efforts to establish a sufficient list of atmospheric variable names. The Atmospheric Composition Variable Standard Name Working Group was established under the auspices of NASA's Earth Science Data Systems Working Group (ESDSWG) to solicit research community feedback to create a list of standard names that are acceptable to data providers and data users. This presentation will discuss the challenges and recommendations of standard variable names in an effort to demonstrate how airborne metadata curation/management can be improved to streamline data ingest and improve interoperability and discoverability for a broader user community.
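The mapping of PI variable names to controlled standard names can be sketched as a simple lookup. The variable names and standard names below are hypothetical illustrations, not entries from the ESDSWG list.

```python
# Hypothetical mapping from PI-chosen variable names to controlled
# standard names; illustrates the lookup a standard-name list enables.
PI_TO_STANDARD = {
    "O3_ppbv_NOAA": "ozone_mixing_ratio",
    "Ozone_TECO49": "ozone_mixing_ratio",
    "CO_QCLS": "carbon_monoxide_mixing_ratio",
}

def standardize(pi_name):
    """Map a PI variable name to its standard name; keep unknown names as-is."""
    return PI_TO_STANDARD.get(pi_name, pi_name)

def group_by_standard_name(pi_names):
    """Group variable names that measure the same parameter, so a search
    for one standard name discovers all matching PI-named variables."""
    groups = {}
    for name in pi_names:
        groups.setdefault(standardize(name), []).append(name)
    return groups
```

With such a table, two ozone measurements named differently by their PIs fall into the same search bucket, which is the discoverability gain the working group is after.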

  10. Impact of a standardized nurse observation protocol including MEWS after Intensive Care Unit discharge.

    De Meester, K; Das, T; Hellemans, K; Verbrugghe, W; Jorens, P G; Verpooten, G A; Van Bogaert, P

    2013-02-01

Analysis of in-hospital mortality after serious adverse events (SAEs) in our hospital showed the need for more frequent observation in medical and surgical wards. We hypothesized that the incidence of SAEs could be decreased by introducing a standard nurse observation protocol. To investigate the effect of a standard nurse observation protocol implementing the Modified Early Warning Score (MEWS) and a color graphic observation chart. Pre- and post-intervention study by analysis of patient records for a 5-day period after Intensive Care Unit (ICU) discharge to 14 medical and surgical wards before (n=530) and after (n=509) the intervention. For the total study population the mean Patient Observation Frequency Per Nursing Shift (POFPNS) during the 5-day period after ICU discharge increased from .9993 (95% C.I. .9637-1.0350) in the pre-intervention period to 1.0732 (95% C.I. 1.0362-1.1101) (p=.005) in the post-intervention period. There was an increased risk of an SAE in patients with MEWS 4 or higher in the present nursing shift (HR 8.25; 95% C.I. 2.88-23.62) and the previous nursing shift (HR 12.83; 95% C.I. 4.45-36.99). There was an absolute risk reduction for SAEs within 120 h after ICU discharge of 2.2% (95% C.I. -0.4-4.67%) from 5.7% to 3.5%. The intervention had a positive impact on the observation frequency. MEWS had a predictive value for SAEs in patients after ICU discharge. The drop in SAEs was substantial but did not reach statistical significance. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
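The absolute risk reduction quoted above follows directly from the pre- and post-intervention event rates; a minimal check (the implied number needed to treat is a derived figure, not reported in the abstract):

```python
def absolute_risk_reduction(risk_pre, risk_post):
    """Absolute risk reduction between two event rates given as fractions."""
    return risk_pre - risk_post

# SAE risk within 120 h after ICU discharge: 5.7% pre vs 3.5% post.
arr = absolute_risk_reduction(0.057, 0.035)   # 2.2 percentage points
nnt = 1.0 / arr                               # roughly 45 patients observed per SAE avoided
```

The wide confidence interval crossing zero (-0.4 to 4.67%) is why the abstract reports the drop as substantial but not statistically significant.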

  11. International Spinal Cord Injury Core Data Set (version 2.0)-including standardization of reporting

    Biering-Sorensen, F.; DeVivo, M. J.; Charlifue, S.; Chen, Y.; New, P. W.; Noonan, V.; Post, M. W. M.; Vogel, L.

    Study design: The study design includes expert opinion, feedback, revisions and final consensus. Objectives: The objective of the study was to present the new knowledge obtained since the International Spinal Cord Injury (SCI) Core Data Set (Version 1.0) published in 2006, and describe the

  12. International Spinal Cord Injury Core Data Set (version 2.0)-including standardization of reporting

    Biering-Sørensen, F; DeVivo, M J; Charlifue, Susan; Chen, Y; New, P.W.; Noonan, V.; Post, M W M; Vogel, L.

    STUDY DESIGN: The study design includes expert opinion, feedback, revisions and final consensus. OBJECTIVES: The objective of the study was to present the new knowledge obtained since the International Spinal Cord Injury (SCI) Core Data Set (Version 1.0) published in 2006, and describe the

  13. Lyral has been included in the patch test standard series in Germany.

    Geier, Johannes; Brasch, Jochen; Schnuch, Axel; Lessmann, Holger; Pirker, Claudia; Frosch, Peter J

    2002-05-01

Lyral 5% pet. was tested in 3245 consecutive patch test patients in 20 departments of dermatology in order (i) to check the diagnostic quality of this patch test preparation, (ii) to examine concomitant reactions to Lyral and fragrance mix (FM), and (iii) to assess the frequency of contact allergy to Lyral in an unselected patch test population of German dermatological clinics. 62 patients reacted to Lyral, i.e. 1.9%. One third of the positive reactions were + + and + + +. The reaction index was 0.27. Thus, the test preparation can be regarded as a good diagnostic tool. Lyral and fragrance mix (FM) were tested in parallel in 3185 patients. Of these, 300 (9.4%) reacted to FM, and 59 (1.9%) to Lyral. In 40 patients, positive reactions to both occurred, which is 13.3% of those reacting to FM, and 67.8% of those reacting to Lyral. So the concordance of positive test reactions to Lyral and FM was only slight. Based on these results, the German Contact Dermatitis Research Group (DKG) decided to add Lyral 5% pet. to the standard series.

  14. Chromospheric activity of periodic variable stars (including eclipsing binaries) observed in DR2 LAMOST stellar spectral survey

    Zhang, Liyun; Lu, Hongpeng; Han, Xianming L.; Jiang, Linyan; Li, Zhongmu; Zhang, Yong; Hou, Yonghui; Wang, Yuefei; Cao, Zihuang

    2018-05-01

The LAMOST spectral survey provides a rich database for studying stellar spectroscopic properties and chromospheric activity. We cross-matched a total of 105,287 periodic variable stars from several photometric surveys and databases (CSS, LINEAR, Kepler, a recently updated eclipsing star catalogue, ASAS, NSVS, part of the SuperWASP survey, variable stars from the Tsinghua University-NAOC Transient Survey, and other objects from some new references) with four million stellar spectra published in the LAMOST data release 2 (DR2). We found 15,955 spectra for 11,469 stars (including 5398 eclipsing binaries). We calculated the equivalent widths (EWs) of their Hα, Hβ, Hγ, Hδ and Ca II H lines. Using the Hα line EW, we found 447 spectra with emission above the continuum for a total of 316 stars (178 eclipsing binaries). We identified 86 active stars (including 44 eclipsing binaries) with repeated LAMOST spectra. A total of 68 stars (including 34 eclipsing binaries) show chromospheric activity variability. We also found LAMOST spectra of 12 cataclysmic variables, five of which show chromospheric activity variability. We also made photometric follow-up studies of three short-period targets (DY CVn, HAT-192-0001481, and LAMOST J164933.24+141255.0) using the Xinglong 60-cm telescope and the SARA 90-cm and 1-m telescopes, and obtained new BVRI CCD light curves. We analyzed these light curves and obtained orbital and starspot parameters. We detected the first flare event of LAMOST J164933.24+141255.0, with a huge brightness increase of more than about 1.5 magnitudes in the R filter.
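The equivalent-width measurement behind the activity indicators above can be sketched with the trapezoidal rule. The values in the test are synthetic, not LAMOST spectra.

```python
def equivalent_width(wavelength, flux, continuum):
    """Equivalent width EW = integral of (1 - F/Fc) d(lambda), computed
    with the trapezoidal rule. Absorption gives EW > 0; emission above
    the continuum gives EW < 0, the signature used to flag
    chromospherically active stars."""
    depth = [1.0 - f / continuum for f in flux]
    return sum(0.5 * (depth[i] + depth[i + 1]) * (wavelength[i + 1] - wavelength[i])
               for i in range(len(wavelength) - 1))
```

A rectangular absorption dip one continuum-unit deep and one wavelength-unit wide yields EW = 1, while the mirror-image emission feature yields EW = -1, which is the sign convention that separates the 447 emission spectra from the rest.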

  15. Outlining precision boundaries among areas with different variability standards using magnetic susceptibility and geomorphic surfaces

Matias, Sammy S. R.; Marques Júnior, José; Siqueira, Diego S.; Pereira, Gener T.

    2014-01-01

There is an increasing demand for detailed maps that represent, in a simplified way, knowledge of the variability of a particular area or region. The objective was to outline precision boundaries among areas with different variability standards using magnetic susceptibility and geomorphic surfaces. The study was conducted in an area of 110 ha, in which three landscape compartments were identified based on the geomorphic surfaces model. To determine pH, organic matter, phosphorus, po...

  16. How novice, skilled and advanced clinical researchers include variables in a case report form for clinical research: a qualitative study.

    Chu, Hongling; Zeng, Lin; Fetters, Micheal D; Li, Nan; Tao, Liyuan; Shi, Yanyan; Zhang, Hua; Wang, Xiaoxiao; Li, Fengwei; Zhao, Yiming

    2017-09-18

Despite varying degrees of research training, most academic clinicians are expected to conduct clinical research. The objective of this research was to understand how clinical researchers of different skill levels include variables in a case report form for their clinical research. The setting for this research was a major academic institution in Beijing, China. The target population was clinical researchers with three levels of experience, namely, clinicians with limited clinical research experience, clinicians with rich clinical research experience and clinical research experts. Using a qualitative approach, we conducted 13 individual interviews (face to face) and one group interview (n=4) with clinical researchers from June to September 2016. Maximum variation sampling was used to identify researchers at three levels of research experience: eight clinicians with limited clinical research experience, five clinicians with rich clinical research experience and four clinical research experts. These 17 researchers had diverse hospital-based medical specialties and/or specialisation in clinical research. Our analysis yields a typology of three processes for developing a case report form, varying according to research experience level. Novice clinician researchers often have an incomplete protocol or none at all, and conduct data collection and publication based on a general framework. Experienced clinician researchers include variables in the case report form based on previous experience, with attention to including domains or items at risk of omission and to eliminating unnecessary variables. Expert researchers comprehensively consider data collection and implementation needs in advance and plan accordingly. These results illustrate increasing levels of sophistication in research planning and, correspondingly, in the selection of variables for the case report form. 
These findings suggest that novice and intermediate-level researchers could benefit by emulating the comprehensive

  17. Standard Errors of Estimated Latent Variable Scores with Estimated Structural Parameters

    Hoshino, Takahiro; Shigemasu, Kazuo

    2008-01-01

    The authors propose a concise formula to evaluate the standard error of the estimated latent variable score when the true values of the structural parameters are not known and must be estimated. The formula can be applied to factor scores in factor analysis or ability parameters in item response theory, without bootstrap or Markov chain Monte…

  18. Accounting for human variability and sensitivity in setting standards for electromagnetic fields.

    Bailey, William H; Erdreich, Linda S

    2007-06-01

    Biological sensitivity and variability are key issues for risk assessment and standard setting. Variability encompasses general inter-individual variations in population responses, while sensitivity relates to unusual or extreme responses based on genetic, congenital, medical, or environmental conditions. For risk assessment and standard setting, these factors affect estimates of thresholds for effects and dose-response relationships and inform efforts to protect the more sensitive members of the population, not just the typical or average person. While issues of variability and sensitivity can be addressed by experimental and clinical studies of electromagnetic fields, investigators have paid little attention to these important issues. This paper provides examples that illustrate how default assumptions regarding variability can be incorporated into estimates of 60-Hz magnetic field exposures with no risk of cardiac stimulation and how population thresholds and variability of peripheral nerve stimulation responses at 60-Hz can be estimated from studies of pulsed gradient magnetic fields in magnetic resonance imaging studies. In the setting of standards for radiofrequency exposures, the International Commission for Non-Ionizing Radiation Protection uses inter-individual differences in thermal sensitivity as one of the considerations in the development of "safety factors." However, neither the range of sensitivity nor the sufficiency or excess of the 10-fold and the additional 5-fold safety factors have been assessed quantitatively. Data on the range of responses between median and sensitive individuals regarding heat stress and cognitive function should be evaluated to inform a reassessment of these safety factors and to identify data gaps.

  19. 32 CFR 37.615 - What standards do I include for financial systems of for-profit firms?

    2010-07-01

    ... 32 National Defense 1 2010-07-01 2010-07-01 false What standards do I include for financial... SECRETARY OF DEFENSE DoD GRANT AND AGREEMENT REGULATIONS TECHNOLOGY INVESTMENT AGREEMENTS Award Terms Affecting Participants' Financial, Property, and Purchasing Systems Financial Matters § 37.615 What...

  20. The Impact of Approved Accounting Standard AASB 1024 “Consolidated Accounts” on the Information Included in Consolidated Financial Statements

    Pramuka, Bambang Agus

    1995-01-01

    The intent of consolidated financial statements is to provide meaningful, relevant, useful, and reliable information about the operations of a group of companies. In compliance with AASB 1024 'Consolidated Accounts', and AAS 24 Consolidated Financial Reports', a parent entity now has to include in its consolidated financial statements all controlled entities, regardless of their legal form or the ownership interest held. The new Standard also pr...

  1. Variability of assay methods for total and free PSA after WHO standardization.

    Foj, L; Filella, X; Alcover, J; Augé, J M; Escudero, J M; Molina, R

    2014-03-01

The variability of total PSA (tPSA) and free PSA (fPSA) results among commercial assays has been suggested to be decreased by calibration to World Health Organization (WHO) reference materials. To characterize the current situation, it is necessary to know its impact on the critical cutoffs used in clinical practice. In the present study, we tested 167 samples with tPSA concentrations of 0 to 20 μg/L using seven PSA and six fPSA commercial assays, including Access, ARCHITECT i2000, ADVIA Centaur XP, IMMULITE 2000, Elecsys, and Lumipulse G1200, in which we only measured tPSA. tPSA and fPSA were measured in Access using the Hybritech and WHO calibrators. Passing-Bablok analysis was performed for PSA and percentage of fPSA, with the Hybritech-calibrated Access as the comparison assay. For tPSA, relative differences were more than 10 % at 0.2 μg/L for ARCHITECT i2000, and at critical concentrations of 3, 4, and 10 μg/L, the 10 % limit was exceeded by ADVIA Centaur XP and the WHO-calibrated Access. For percent fPSA, at a critical concentration of 10 %, the 10 % relative difference limit was exceeded by the IMMULITE 2000 assay. At critical concentrations of 20 and 25 %, the ADVIA Centaur XP, ARCHITECT i2000, and IMMULITE 2000 assays exceeded the 10 % relative difference limit. We have shown significant discordances between the assays included in this study despite advances in standardization in recent years. Further harmonization efforts are required in order to obtain full clinical concordance.
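The 10 % relative-difference criterion used above can be sketched as follows; the numbers in the test are illustrative, not the study's measurements.

```python
def percent_relative_difference(assay_value, reference_value):
    """Signed relative difference (%) of an assay result against the
    comparison-assay value at the same concentration."""
    return 100.0 * (assay_value - reference_value) / reference_value

def exceeds_limit(assay_value, reference_value, limit_pct=10.0):
    """Flag an assay whose deviation from the comparator exceeds the
    10% relative-difference limit used in the study."""
    return abs(percent_relative_difference(assay_value, reference_value)) > limit_pct
```

For example, an assay reading 4.5 μg/L where the comparator reads 4.0 μg/L deviates by 12.5 % and would be flagged at the 4 μg/L biopsy cutoff, whereas 4.2 μg/L (5 %) would not.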

  2. Variability of gastric emptying measurements in man employing standardized radiolabeled meals

    Brophy, C.M.; Moore, J.G.; Christian, P.E.; Egger, M.J.; Taylor, A.T.

    1986-01-01

    Radiolabeled liquid and solid portions of standardized 300-g meals were administered on four different study days to eight healthy subjects in an attempt to define the range of inter- and intrasubject variability in gastric emptying. Meal half emptying times, analysis of variance, and intraclass correlations were computed and compared within and between subjects. The mean solid half emptying time was 58 +/- 17 min (range 29-92), while the mean liquid half emptying time was 24 +/- 8 min (range 12-37). A nested random effects analysis of variance showed moderate intrasubject variability for solid emptying and high intrasubject variability for liquid emptying. The variability of solid and liquid emptying was comparable and relatively large when compared with other reports in the literature. The isotopic method for measuring gastric emptying is a valuable tool for investigating problems in gastric pathophysiology, particularly when differences between groups of subjects are sought. However, meal emptying time is a variable phenomenon in healthy subjects with significant inter- and intraindividual day-to-day differences. These day-to-day variations in gastric emptying must be considered in interpreting individual study results
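Half emptying times like those reported above are typically derived from the curve of radiolabel retained in the stomach over time. A minimal sketch assuming mono-exponential emptying, R(t) = exp(-λt), fitted through the origin on log-transformed retention; the abstract does not state the study's actual fitting method, so this is an illustration only.

```python
import math

def half_emptying_time(times_min, fraction_retained):
    """Estimate the gastric half emptying time (min) from scintigraphic
    counts by fitting R(t) = exp(-lam * t) via least squares on
    ln R(t) constrained through the origin; T1/2 = ln 2 / lam."""
    num = sum(t * (-math.log(r)) for t, r in zip(times_min, fraction_retained))
    den = sum(t * t for t in times_min)
    lam = num / den
    return math.log(2.0) / lam
```

On synthetic data generated with a true half time of 58 min (the mean solid half emptying time reported above), the fit recovers 58 min.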

  3. Variability of gastric emptying measurements in man employing standardized radiolabeled meals

    Brophy, C.M.; Moore, J.G.; Christian, P.E.; Egger, M.J.; Taylor, A.T.

    1986-08-01

    Radiolabeled liquid and solid portions of standardized 300-g meals were administered on four different study days to eight healthy subjects in an attempt to define the range of inter- and intrasubject variability in gastric emptying. Meal half emptying times, analysis of variance, and intraclass correlations were computed and compared within and between subjects. The mean solid half emptying time was 58 +/- 17 min (range 29-92), while the mean liquid half emptying time was 24 +/- 8 min (range 12-37). A nested random effects analysis of variance showed moderate intrasubject variability for solid emptying and high intrasubject variability for liquid emptying. The variability of solid and liquid emptying was comparable and relatively large when compared with other reports in the literature. The isotopic method for measuring gastric emptying is a valuable tool for investigating problems in gastric pathophysiology, particularly when differences between groups of subjects are sought. However, meal emptying time is a variable phenomenon in healthy subjects with significant inter- and intraindividual day-to-day differences. These day-to-day variations in gastric emptying must be considered in interpreting individual study results.

  4. Multiplicative surrogate standard deviation: a group metric for the glycemic variability of individual hospitalized patients.

    Braithwaite, Susan S; Umpierrez, Guillermo E; Chase, J Geoffrey

    2013-09-01

    Group metrics are described to quantify blood glucose (BG) variability of hospitalized patients. The "multiplicative surrogate standard deviation" (MSSD) is the reverse-transformed group mean of the standard deviations (SDs) of the logarithmically transformed BG data set of each patient. The "geometric group mean" (GGM) is the reverse-transformed group mean of the means of the logarithmically transformed BG data set of each patient. Before reverse transformation is performed, the mean of means and mean of SDs each has its own SD, which becomes a multiplicative standard deviation (MSD) after reverse transformation. Statistical predictions and comparisons of parametric or nonparametric tests remain valid after reverse transformation. A subset of a previously published BG data set of 20 critically ill patients from the first 72 h of treatment under the SPRINT protocol was transformed logarithmically. After rank ordering according to the SD of the logarithmically transformed BG data of each patient, the cohort was divided into two equal groups, those having lower or higher variability. For the entire cohort, the GGM was 106 (÷/× 1.07) mg/dl, and MSSD was 1.24 (÷/× 1.07). For the subgroups having lower and higher variability, respectively, the GGM did not differ, 104 (÷/× 1.07) versus 109 (÷/× 1.07) mg/dl, but the MSSD differed, 1.17 (÷/× 1.03) versus 1.31 (÷/× 1.05), p = .00004. By using the MSSD with its MSD, groups can be characterized and compared according to glycemic variability of individual patient members. © 2013 Diabetes Technology Society.
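The GGM and MSSD definitions quoted above reduce to a few lines: log-transform each patient's BG series, take the per-patient mean and SD, average each across the group, and reverse-transform with exp(). A sketch with synthetic BG values, not the SPRINT data set:

```python
import math
import statistics

def glycemic_group_metrics(patients_bg):
    """Geometric group mean (GGM) and multiplicative surrogate standard
    deviation (MSSD) for a group of patients' BG series: per-patient
    mean and SD of log-transformed BG, averaged over the group, then
    reverse-transformed with exp()."""
    log_means = [statistics.mean(math.log(v) for v in bg) for bg in patients_bg]
    log_sds = [statistics.stdev(math.log(v) for v in bg) for bg in patients_bg]
    ggm = math.exp(statistics.mean(log_means))
    mssd = math.exp(statistics.mean(log_sds))
    return ggm, mssd
```

Because the reverse transformation is applied last, the MSSD is multiplicative: a value of 1.24, as for the cohort above, means a typical patient's BG varies by a factor of ÷/× 1.24 around that patient's geometric mean.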

  5. A standardized approach to study human variability in isometric thermogenesis during low-intensity physical activity

Delphine Sarafian

    2013-07-01

Limitations of current methods: The assessment of human variability in various compartments of daily energy expenditure (EE) under standardized conditions is well defined at rest (as basal metabolic rate and thermic effect of feeding), and currently under validation for assessing the energy cost of low-intensity dynamic work. However, because physical activities of daily life consist of a combination of both dynamic and isometric work, there is also a need to develop standardized tests for assessing human variability in the energy cost of low-intensity isometric work. Experimental objectives: Development of an approach to study human variability in isometric thermogenesis by incorporating a protocol of intermittent leg press exercise of varying low-intensity isometric loads with measurements of EE by indirect calorimetry. Results: EE was measured in the seated position with the subject at rest or while intermittently pressing both legs against a press-platform at 5 low-intensity isometric loads (+5, +10, +15, +20 and +25 kg force), each consisting of a succession of 8 cycles of press (30 s) and rest (30 s). EE, integrated over each 8-min period of the intermittent leg press exercise, was found to increase linearly across the 5 isometric loads with a correlation coefficient (r) > 0.9 for each individual. The slope of this EE-Load relationship, which provides the energy cost of this standardized isometric exercise expressed per kg force applied intermittently (30 s in every min), was found to show good repeatability when assessed in subjects who repeated the same experimental protocol on 3 separate days: its low intra-individual coefficient of variation (CV) of ~10% contrasted with its much higher inter-individual CV of 35%; the latter being mass-independent but partly explained by height. Conclusion: This standardized approach to study isometric thermogenesis opens up a new avenue for research in EE phenotyping and metabolic predisposition to obesity.
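The slope of the EE-Load relationship described above is an ordinary least-squares fit of EE against the applied isometric load; a minimal sketch with hypothetical load and EE values (the study's actual measurements are not reproduced here):

```python
def ee_load_slope(loads_kg, ee_values):
    """Ordinary least-squares slope of EE against isometric load: the
    energy cost of the standardized exercise per kg of force applied
    intermittently (30 s in every min)."""
    n = len(loads_kg)
    mx = sum(loads_kg) / n
    my = sum(ee_values) / n
    num = sum((x - mx) * (y - my) for x, y in zip(loads_kg, ee_values))
    den = sum((x - mx) ** 2 for x in loads_kg)
    return num / den
```

Fed the five loads from the protocol (+5 to +25 kg) and EE values lying exactly on a line, the function returns that line's slope; it is this per-subject slope whose intra-individual CV of ~10% and inter-individual CV of 35% are compared above.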

  6. Statistical methodology for discrete fracture model - including fracture size, orientation uncertainty together with intensity uncertainty and variability

Darcel, C. (Itasca Consultants SAS (France)); Davy, P.; Le Goc, R.; Dreuzy, J.R. de; Bour, O. (Geosciences Rennes, UMR 6118 CNRS, Univ. de Rennes, Rennes (France))

    2009-11-15

the lineament scale (k_t = 2) on the other, addresses the issue of the nature of the transition. We develop a new 'mechanistic' model that could help in modeling why and where this transition can occur. The transition between both regimes would occur for a fracture length of 1-10 m and even at a smaller scale for the few outcrops that follow the self-similar density model. A consequence for the disposal issue is that the model that is likely to apply in the 'blind' scale window between 10-100 m is the self-similar model as it is defined for large-scale lineaments. The self-similar model, as it is measured for some outcrops and most lineament maps, is definitely worth being investigated as a reference for scales above 1-10 m. In the rest of the report, we develop a methodology for incorporating uncertainty and variability into the DFN modeling. Fracturing properties arise from complex processes which produce an intrinsic variability; characterizing this variability as an admissible variation of model parameter or as the division of the site into subdomains with distinct DFN models is a critical point of the modeling effort. Moreover, the DFN model encompasses a part of uncertainty, due to data inherent uncertainties and sampling limits. Both effects must be quantified and incorporated into the DFN site model definition process. In that context, all available borehole data including recording of fracture intercept positions, pole orientation and relative uncertainties are used as the basis for the methodological development and further site model assessment. An elementary dataset contains a set of discrete fracture intercepts from which a parent orientation/density distribution can be computed. The elementary bricks of the site, from which these initial parent density distributions are computed, rely on the former Single Hole Interpretation division of the boreholes into sections whose local boundaries are expected to reflect - locally - geology

  7. Statistical methodology for discrete fracture model - including fracture size, orientation uncertainty together with intensity uncertainty and variability

    Darcel, C.; Davy, P.; Le Goc, R.; Dreuzy, J.R. de; Bour, O.

    2009-11-01

    the other, addresses the issue of the nature of the transition. We develop a new 'mechanistic' model that could help in modeling why and where this transition can occur. The transition between both regimes would occur for a fracture length of 1-10 m and even at a smaller scale for the few outcrops that follow the self-similar density model. A consequence for the disposal issue is that the model that is likely to apply in the 'blind' scale window between 10-100 m is the self-similar model as it is defined for large-scale lineaments. The self-similar model, as it is measured for some outcrops and most lineament maps, is definitely worth investigating as a reference for scales above 1-10 m. In the rest of the report, we develop a methodology for incorporating uncertainty and variability into the DFN modeling. Fracturing properties arise from complex processes which produce an intrinsic variability; characterizing this variability as an admissible variation of model parameters or as the division of the site into subdomains with distinct DFN models is a critical point of the modeling effort. Moreover, the DFN model encompasses a part of uncertainty, due to inherent data uncertainties and sampling limits. Both effects must be quantified and incorporated into the DFN site model definition process. In that context, all available borehole data, including recordings of fracture intercept positions, pole orientations and relative uncertainties, are used as the basis for the methodological development and further site model assessment. An elementary dataset contains a set of discrete fracture intercepts from which a parent orientation/density distribution can be computed. The elementary bricks of the site, from which these initial parent density distributions are computed, rely on the former Single Hole Interpretation division of the boreholes into sections whose local boundaries are expected to reflect - locally - the main characteristics of the geology and fracturing properties.

  8. The variability of standard artificial soils: Behaviour, extractability and bioavailability of organic pollutants

    Hofman, Jakub; Hovorková, Ivana; Semple, Kirk T.

    2014-01-01

    Highlights: • Artificial soils from different laboratories revealed different fates, behaviour and bioavailability of lindane and phenanthrene. • Lindane behaviour was related to organic carbon. • Phenanthrene behaviour was significantly affected by degrading microorganisms from peat. • Sterilization of artificial soils might reduce unwanted variability. -- Abstract: Artificial soil is an important standard medium and reference material for soil ecotoxicity bioassays. Recent studies have documented the significant variability of their basic properties among different laboratories. Our study investigated (i) the variability of ten artificial soils from different laboratories by means of the fate, extractability and bioavailability of phenanthrene and lindane, and (ii) the relationships of these results to soil properties and ageing. Soils were spiked with 14C-phenanthrene and 14C-lindane, and the total residues, fractions extractable by hydroxypropyl-β-cyclodextrin, and the fractions of phenanthrene mineralizable by bacteria were determined after 1, 14, 28 and 56 days. Significant temporal changes in total residues and extractable and mineralizable fractions were observed for phenanthrene, resulting in large differences between soils after 56 days. Phenanthrene mineralization by indigenous peat microorganisms was suggested as the main driver of these changes, outweighing the effects of organic matter. Lindane total residues and extractability displayed much smaller changes over time and smaller differences between soils related to organic matter. Roughly estimated, the variability between the artificial soils was comparable to that of natural soils. The implications of such variability for the results of toxicity tests and risk assessment decisions should be identified. We also suggest that the sterilization of artificial soils might reduce unwanted variability.

  10. Variability of linezolid concentrations after standard dosing in critically ill patients: a prospective observational study

    2014-01-01

    Introduction Severe infections in intensive care patients show high morbidity and mortality rates. Linezolid is an antimicrobial drug frequently used in critically ill patients. Recent data indicate that there might be high variability of linezolid serum concentrations in intensive care patients receiving standard doses. This study aimed to evaluate whether standard dosing of linezolid leads to therapeutic serum concentrations in critically ill patients. Methods In this prospective observational study, 30 critically ill adult patients with suspected infections received standard dosing of 600 mg linezolid intravenously twice a day. Over 4 days, multiple serum samples were obtained from each patient in order to determine the linezolid concentrations by liquid chromatography tandem mass spectrometry. Results A high variability of serum linezolid concentrations was observed (range of the area under the linezolid concentration-time curve over 24 hours (AUC24): 50.1 to 453.9 mg*h/L, median 143.3 mg*h/L), together with a wide range of trough concentrations (Cmin). Potentially supratherapeutic linezolid concentrations over 24 hours and at single time points (defined according to the literature as AUC24 > 400 mg*h/L and Cmin > 10 mg/L) were observed for 7 of the patients. Conclusions A high variability of linezolid serum concentrations with a substantial percentage of potentially subtherapeutic levels was observed in intensive care patients. The findings suggest that therapeutic drug monitoring of linezolid might be helpful for adequate dosing of linezolid in critically ill patients. Trial registration Clinicaltrials.gov NCT01793012. Registered 24 January 2013. PMID:25011656
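    The AUC24 figures quoted above are typically obtained by numerically integrating the measured concentration-time profile. A minimal sketch using the trapezoidal rule; the sampling times and concentrations below are purely illustrative, not the study's data:

```python
# Trapezoidal-rule estimate of the area under the concentration-time curve.
def auc_trapezoid(times_h, conc_mg_per_l):
    """Approximate AUC (mg*h/L) from serial serum samples."""
    auc = 0.0
    for i in range(1, len(times_h)):
        dt = times_h[i] - times_h[i - 1]
        auc += dt * (conc_mg_per_l[i] + conc_mg_per_l[i - 1]) / 2.0
    return auc

times = [0, 2, 4, 8, 12, 16, 20, 24]                  # sampling times (h)
conc = [12.0, 10.5, 9.0, 7.0, 11.5, 9.5, 8.0, 12.5]   # mg/L (hypothetical)
auc24 = auc_trapezoid(times, conc)                    # mg*h/L over 24 h
```

    Denser sampling around peaks and troughs tightens the approximation; a constant 10 mg/L over 24 h gives exactly 240 mg*h/L, a useful sanity check.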

  11. Auxiliary variables in multiple imputation in regression with missing X: a warning against including too many in small sample research

    Hardt Jochen

    2012-12-01

    Abstract Background Multiple imputation is becoming increasingly popular. Theoretical considerations as well as simulation studies have shown that the inclusion of auxiliary variables is generally of benefit. Methods A simulation study of a linear regression with a response Y and two predictors X1 and X2 was performed on data with n = 50, 100 and 200 using complete cases or multiple imputation with 0, 10, 20, 40 and 80 auxiliary variables. Mechanisms of missingness were either 100% MCAR or 50% MAR + 50% MCAR. Auxiliary variables had low (r = .10) vs. moderate (r = .50) correlations with the X's and Y. Results The inclusion of auxiliary variables can improve a multiple imputation model. However, inclusion of too many variables leads to downward bias of regression coefficients and decreases precision. When the correlations are low, inclusion of auxiliary variables is not useful. Conclusion More research on auxiliary variables in multiple imputation should be performed. A preliminary rule of thumb could be that the ratio of variables to cases with complete data should not go below 1 : 3.
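    The data-generating step of such a simulation can be sketched in a few lines; the coefficients, the 30% missingness rate, and the way auxiliaries are constructed below are illustrative assumptions, not the paper's exact setup:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(n=100, n_aux=10, r=0.5):
    """One dataset in the spirit of the study's design: response Y,
    predictors X1/X2, and auxiliary variables correlating roughly r
    with X1, plus MCAR missingness in X1 (illustrative sketch)."""
    x = rng.standard_normal((n, 2))
    y = 0.5 * x[:, 0] + 0.5 * x[:, 1] + rng.standard_normal(n)
    # each auxiliary = r * X1 + noise, giving approximately correlation r
    aux = r * x[:, [0]] + np.sqrt(1 - r**2) * rng.standard_normal((n, n_aux))
    # impose 30% MCAR missingness on X1
    x1 = x[:, 0].copy()
    x1[rng.random(n) < 0.3] = np.nan
    return y, x1, x[:, 1], aux

y, x1, x2, aux = simulate()
```

    An imputation model would then be fit on (Y, X1, X2, auxiliaries); varying `n_aux` and `r` reproduces the design dimensions the abstract describes.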

  12. Technical standards and guidelines: prenatal screening for Down syndrome that includes first-trimester biochemistry and/or ultrasound measurements.

    Palomaki, Glenn E; Lee, Jo Ellen S; Canick, Jacob A; McDowell, Geraldine A; Donnenfeld, Alan E

    2009-09-01

    This statement is intended to augment the current general ACMG Standards and Guidelines for Clinical Genetics Laboratories and to address guidelines specific to first-trimester screening for Down syndrome. The aim is to provide the laboratory the necessary information to ensure accurate and reliable Down syndrome screening results given a screening protocol (e.g., combined first trimester and integrated testing). Information about various test combinations and their expected performance is provided, but other issues such as availability of reagents, patient interest in early test results, access to open neural tube defect screening, and availability of chorionic villus sampling are all contextual factors in deciding which screening protocol(s) will be selected by individual health care providers. Individual laboratories are responsible for meeting the quality assurance standards described by the Clinical Laboratory Improvement Act, the College of American Pathologists, and other regulatory agencies, with respect to appropriate sample documentation, assay validation, general proficiency, and quality control measures. These guidelines address first-trimester screening that includes ultrasound measurement and interpretation of nuchal translucency thickness and protocols that combine markers from both the first and second trimesters. Laboratories can use their professional judgment to make modifications or additions.

  13. Iwamoto-Harada coalescence/pickup model for cluster emission: state density approach including angular momentum variables

    Běták Emil

    2014-04-01

    For low-energy nuclear reactions well above the resonance region, but still below the pion threshold, statistical pre-equilibrium models (e.g., the exciton and the hybrid ones) are a frequent tool for analysis of energy spectra and the cross sections of cluster emission. For α's, two essentially distinct approaches are popular, namely the preformed one and the different versions of coalescence approaches, whereas only the latter group of models can be used for other types of cluster ejectiles. The original Iwamoto-Harada model of pre-equilibrium cluster emission was formulated using the overlap of the cluster and its constituent nucleons in momentum space. Transforming it into level or state densities is not a straightforward task; however, physically the same model was presented at a conference on reaction models five years earlier. At that time, only the densities without spin were used. The introduction of spin variables into the exciton model enabled detailed calculation of the γ emission and its competition with nucleon channels, and – at the same time – it stimulated further developments of the model. However – to the best of our knowledge – no spin formulation had been presented for cluster emission until recently, when the first attempts were reported, but restricted to the first emission only. We have now updated this effort and are able to handle (using the same simplifications as in our previous work) pre-equilibrium cluster emission with spin, including all nuclei in the reaction chain.

  14. BioInfra.Prot: A comprehensive proteomics workflow including data standardization, protein inference, expression analysis and data publication.

    Turewicz, Michael; Kohl, Michael; Ahrens, Maike; Mayer, Gerhard; Uszkoreit, Julian; Naboulsi, Wael; Bracht, Thilo; Megger, Dominik A; Sitek, Barbara; Marcus, Katrin; Eisenacher, Martin

    2017-11-10

    The analysis of high-throughput mass spectrometry-based proteomics data must address the specific challenges of this technology. To this end, the comprehensive proteomics workflow offered by the de.NBI service center BioInfra.Prot provides indispensable components for the computational and statistical analysis of this kind of data. These components include tools and methods for spectrum identification and protein inference, protein quantification, expression analysis as well as data standardization and data publication. All particular methods of the workflow which address these tasks are state-of-the-art or cutting edge. As has been shown in previous publications, each of these methods is adequate to solve its specific task and gives competitive results. However, the methods included in the workflow are continuously reviewed, updated and improved to adapt to new scientific developments. All of these particular components and methods are available as stand-alone BioInfra.Prot services or as a complete workflow. Since BioInfra.Prot provides manifold fast communication channels to get access to all components of the workflow (e.g., via the BioInfra.Prot ticket system: bioinfraprot@rub.de) users can easily benefit from this service and get support from experts. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.

  15. Variations in Carabidae assemblages across the farmland habitats in relation to selected environmental variables including soil properties

    Beáta Baranová

    2018-03-01

    The variations in ground beetle (Coleoptera: Carabidae) assemblages across three types of farmland habitats, arable land, meadows and woody vegetation, were studied in relation to vegetation cover structure, intensity of agrotechnical interventions and selected soil properties. Material was pitfall-trapped in 2010 and 2011 on twelve sites of the agricultural landscape in the Prešov town and its near vicinity, Eastern Slovakia. A total of 14,763 ground beetle individuals were entrapped. Material collection resulted in 92 Carabidae species, with the following six species dominating: Poecilus cupreus, Pterostichus melanarius, Pseudoophonus rufipes, Brachinus crepitans, Anchomenus dorsalis and Poecilus versicolor. Studied habitats differed significantly in the number of entrapped individuals, activity abundance as well as representation of the carabids according to their habitat preferences and ability to fly. However, no significant distinction was observed in diversity, evenness or dominance. The environmental variables most significantly affecting the species variability of Carabidae assemblages were soil moisture and herb layer 0-20 cm. Other important variables selected by the forward selection were intensity of agrotechnical interventions, humus content and shrub vegetation. The remaining soil properties appear to be of only secondary importance for the adult carabids. Environmental variables have the strongest effect on habitat specialists, whereas ground beetles without special requirements for habitat quality seem to be only slightly affected by the studied environmental variables.

  16. Standardizing effect size from linear regression models with log-transformed variables for meta-analysis.

    Rodríguez-Barranco, Miguel; Tobías, Aurelio; Redondo, Daniel; Molina-Portillo, Elena; Sánchez, María José

    2017-03-17

    Meta-analysis is very useful to summarize the effect of a treatment or a risk factor for a given disease. Often studies report results based on log-transformed variables in order to achieve the principal assumptions of a linear regression model. If this is the case for some, but not all studies, the effects need to be homogenized. We derived a set of formulae to transform absolute changes into relative ones, and vice versa, to allow including all results in a meta-analysis. We applied our procedure to all possible combinations of log-transformed independent or dependent variables. We also evaluated it in a simulation based on two variables either normally or asymmetrically distributed. In all the scenarios, and based on different change criteria, the effect size estimated by the derived set of formulae was equivalent to the real effect size. To avoid biased estimates of the effect, this procedure should be used with caution in the case of independent variables with asymmetric distributions that significantly differ from the normal distribution. We illustrate this procedure with an application to a meta-analysis on the potential effects on neurodevelopment in children exposed to arsenic and manganese. The procedure proposed has been shown to be valid and capable of expressing the effect size of a linear regression model based on different change criteria in the variables. Homogenizing the results from different studies beforehand allows them to be combined in a meta-analysis, independently of whether the transformations had been performed on the dependent and/or independent variables.
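    For the common cases of a log-transformed dependent or independent variable, the back-transformations that such a homogenization procedure relies on can be sketched as follows; these are the textbook interpretations of log-scale regression coefficients, not necessarily the paper's exact formulae:

```python
import math

def pct_change_log_outcome(beta, delta_x=1.0):
    """Log-transformed dependent variable: the percent change in Y per
    delta_x-unit change in X implied by coefficient beta is
    100 * (exp(beta * delta_x) - 1)."""
    return 100.0 * (math.exp(beta * delta_x) - 1.0)

def abs_change_log_predictor(beta, pct=10.0):
    """Log-transformed independent variable: the absolute change in Y
    for a pct% increase in X is beta * ln(1 + pct/100)."""
    return beta * math.log(1.0 + pct / 100.0)
```

    For example, a coefficient of ln(1.1) on a log-outcome corresponds to a 10% relative change per unit of X, which lets absolute and relative effect sizes from different studies be expressed on a common scale.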

  17. SU-F-R-30: Interscanner Variability of Radiomics Features in Computed Tomography (CT) Using a Standard ACR Phantom

    Shafiq ul Hassan, M; Zhang, G; Moros, E [H Lee Moffitt Cancer Center and Research Institute, Tampa, FL (United States); Department of Physics, University of South Florida, Tampa, FL (United States); Budzevich, M; Latifi, K; Hunt, D; Gillies, R [H Lee Moffitt Cancer Center and Research Institute, Tampa, FL (United States)

    2016-06-15

    Purpose: A simple approach to investigate interscanner variability of Radiomics features in computed tomography (CT) using a standard ACR phantom. Methods: The standard ACR phantom was scanned on CT scanners from three different manufacturers. Scanning parameters of 120 kVp and 200 mA were used, with a slice thickness of 3.0 mm on two scanners and 3.27 mm on the third. Three spherical regions of interest (ROI) from the water, medium density and high density inserts were contoured. Ninety-four Radiomics features were extracted using an in-house program. These features include shape (11), intensity (22), GLCM (26), GLZSM (11), RLM (11), NGTDM (5) and 8 fractal-dimension features. To evaluate the interscanner variability across the three scanners, a coefficient of variation (COV) was calculated for each feature group. Each group was further classified according to the COV by calculating the percentage of features in each of the following categories: COV less than 2%, between 2% and 10%, and greater than 10%. Results: For all feature groups, a similar trend was observed for the three different inserts. Shape features were the most robust for all scanners, as expected. 70% of the shape features had COV <2%. For the intensity feature group, the percentage of features with COV <2% varied from 9% to 32% across the three scanners. All features in the four groups GLCM, GLZSM, RLM and NGTDM were found to have interscanner variability ≥2%. The fractal-dimension dependence for the medium and high density inserts was similar, while it was different for the water insert. Conclusion: We concluded that even for similar scanning conditions, interscanner variability across different scanners was significant. The texture features based on GLCM, GLZSM, RLM and NGTDM are highly scanner dependent. Since the inserts of the ACR phantom are not heterogeneous in HU values, this suggests that matrix-based 2nd-order features are highly affected by variation in noise. Research partly funded by NIH/NCI R01CA190105-01.
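    The COV-based grouping described above can be sketched directly; the feature matrix below is a made-up toy (rows = features, columns = scanners), not phantom data:

```python
import numpy as np

def cov_percent(values):
    """Coefficient of variation (%) of one feature across scanners."""
    values = np.asarray(values, dtype=float)
    return 100.0 * values.std(ddof=1) / values.mean()

def bin_features(feature_matrix):
    """Classify features by COV across scanners.
    Returns the percentage of features with COV <2%, 2-10%, >10%."""
    covs = np.array([cov_percent(row) for row in feature_matrix])
    n = len(covs)
    return (100.0 * (covs < 2).sum() / n,
            100.0 * ((covs >= 2) & (covs <= 10)).sum() / n,
            100.0 * (covs > 10).sum() / n)
```

    A feature measured identically on all scanners falls in the <2% bin; noisy, scanner-dependent texture features land in the >10% bin.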

  18. Using Copulas in the Estimation of the Economic Project Value in the Mining Industry, Including Geological Variability

    Krysa, Zbigniew; Pactwa, Katarzyna; Wozniak, Justyna; Dudek, Michal

    2017-12-01

    Geological variability is one of the main factors that has an influence on the viability of mining investment projects and on the technical risk of geology projects. In the current scenario, analyses of economic viability of new extraction fields have been performed for the KGHM Polska Miedź S.A. underground copper mine at the Fore Sudetic Monocline with the assumption of a constant, averaged content of useful elements. Research presented in this article is aimed at verifying the value of production from copper and silver ore for the same economic background with the use of variable cash flows resulting from the local variability of useful elements. Furthermore, the ore economic model is investigated for a significant difference in model value estimated with the use of a linear correlation between useful element content and the height of the mine face, and the approach in which the correlation of model parameters is based upon the copula best matching an information capacity criterion. The use of a copula allows the simulation to take multivariable dependencies into account at the same time, thereby giving a better reflection of the dependency structure than linear correlation does. Calculation results of the economic model used for deposit value estimation indicate that the correlation between copper and silver estimated with the use of a copula generates a higher variation of possible project values, as compared to modelling the correlation with a linear correlation coefficient. The average deposit value remains unchanged.
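    One common way to realize such a copula-based dependency structure is a Gaussian copula: sample correlated normals, push them through the normal CDF to obtain dependent uniforms, then map each uniform through the desired marginal. A sketch; the unit-exponential grade marginals are stand-ins, not the deposit model:

```python
import math
import numpy as np

rng = np.random.default_rng(42)

def gaussian_copula_uniforms(rho, n):
    """Draw n pairs of uniforms whose dependence follows a Gaussian
    copula with latent correlation rho."""
    cov = [[1.0, rho], [rho, 1.0]]
    z = rng.multivariate_normal([0.0, 0.0], cov, size=n)
    # standard normal CDF applied elementwise -> uniform marginals
    return 0.5 * (1.0 + np.vectorize(math.erf)(z / math.sqrt(2.0)))

u = gaussian_copula_uniforms(0.7, 5000)
# map each uniform through an inverse CDF for the grade marginals;
# a unit-exponential marginal is used here purely as a stand-in
cu_grade = -np.log(1.0 - u[:, 0])
ag_grade = -np.log(1.0 - u[:, 1])
```

    Because the dependence is carried by the copula rather than a single Pearson coefficient, the same construction works unchanged for skewed grade distributions.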

  19. A comparison of important international and national standards for limiting exposure to EMF including the scientific rationale.

    Roy, Colin R; Martin, Lindsay J

    2007-06-01

    A comparison of Eastern (from Russia, Hungary, Bulgaria, Poland, and the Czech Republic) and Western (represented by the International Commission on Non-Ionizing Radiation Protection guidelines and the Institute of Electrical and Electronic Engineers standards) radiofrequency standards reveals key differences. The Eastern approach is to protect against non-thermal effects caused by chronic exposure to low level exposure, and the occupational basic restriction is power load (the product of intensity and exposure duration). In contrast, the Western approach is to protect against established acute biological effects that could signal an adverse health effect, and the principal basic restriction is the specific absorption rate to protect against thermal effects. All of the standards are science-based, but a fundamental difference arises from a lack of agreement on the composition of the reference scientific database and of which adverse effect needs to be protected against. However, differences also exist between the ICNIRP and IEEE standards. An additional complication arises when standards are derived or modified using a precautionary approach. For ELF the differences between ICNIRP and IEEE are more fundamental; namely, differences in the basic restriction used (induced current; in-situ electric field) and the location of breakpoints in the strength-frequency curves result in large differences. In 2006, ICNIRP will initiate the review of their ELF and radiofrequency guidelines, and this will provide an opportunity to address differences in standards and the move towards harmonization of EMF standards and guidelines.

  20. Inlet-engine matching for SCAR including application of a bicone variable geometry inlet. [Supersonic Cruise Aircraft Research

    Wasserbauer, J. F.; Gerstenmaier, W. H.

    1978-01-01

    Airflow characteristics of variable cycle engines (VCE) designed for Mach 2.32 can have transonic airflow requirements as high as 1.6 times the cruise airflow. This is a formidable requirement for conventional, high performance, axisymmetric, translating centerbody mixed compression inlets. An alternate inlet is defined where the second cone of a two cone centerbody collapses to the initial cone angle to provide a large off-design airflow capability, and incorporates modest centerbody translation to minimize spillage drag. Estimates of transonic spillage drag are competitive with those of conventional translating centerbody inlets. The inlet's cruise performance exhibits very low bleed requirements with good recovery and high angle of attack capability.

  1. Major histocompatibility complex harbors widespread genotypic variability of non-additive risk of rheumatoid arthritis including epistasis.

    Wei, Wen-Hua; Bowes, John; Plant, Darren; Viatte, Sebastien; Yarwood, Annie; Massey, Jonathan; Worthington, Jane; Eyre, Stephen

    2016-04-25

    Genotypic variability based genome-wide association studies (vGWASs) can identify potentially interacting loci without prior knowledge of the interacting factors. We report a two-stage approach to make vGWAS applicable to diseases: firstly using a mixed model approach to partition dichotomous phenotypes into additive risk and non-additive environmental residuals on the liability scale and secondly using the Levene's (Brown-Forsythe) test to assess equality of the residual variances across genotype groups per marker. We found widespread significant (P < 5e-05) vGWAS signals within the major histocompatibility complex (MHC) across all three study cohorts of rheumatoid arthritis. We further identified 10 epistatic interactions between the vGWAS signals independent of the MHC additive effects, each with a weak effect but jointly explaining 1.9% of phenotypic variance. PTPN22 was also identified in the discovery cohort but replicated in only one independent cohort. Combining the three cohorts boosted the power of vGWAS and additionally identified TYK2 and ANKRD55. Both PTPN22 and TYK2 had evidence of interactions reported elsewhere. We conclude that vGWAS can help discover interacting loci for complex diseases but requires large samples to find additional signals.
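    The second stage, the Brown-Forsythe (median-centered Levene) test of equal residual variances across genotype groups, can be sketched directly: compute absolute deviations from each group's median and run a one-way ANOVA F on them. The simulated genotype groups below are illustrative, not study data:

```python
import numpy as np

def brown_forsythe(*groups):
    """Brown-Forsythe statistic: one-way ANOVA F computed on absolute
    deviations from each group's median. Large F suggests unequal
    residual variances across genotype groups."""
    devs = [np.abs(np.asarray(g, dtype=float) - np.median(g)) for g in groups]
    n = np.array([len(d) for d in devs])
    means = np.array([d.mean() for d in devs])
    grand = np.concatenate(devs).mean()
    k, N = len(devs), n.sum()
    ss_between = (n * (means - grand) ** 2).sum()
    ss_within = sum(((d - m) ** 2).sum() for d, m in zip(devs, means))
    return (ss_between / (k - 1)) / (ss_within / (N - k))

rng = np.random.default_rng(1)
# residuals per genotype group (0/1/2 copies of an allele); one group
# is simulated with inflated variance, mimicking a vGWAS signal
f_alt = brown_forsythe(rng.normal(0, 1, 300),
                       rng.normal(0, 1, 300),
                       rng.normal(0, 2, 300))
```

    The F statistic would then be referred to an F(k-1, N-k) distribution per marker; using medians rather than means makes the test robust to skewed residuals.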

  2. Slit-scanning technique using standard cell sorter instruments for analyzing and sorting nonacrocentric human chromosomes, including small ones

    Rens, W.; van Oven, C. H.; Stap, J.; Jakobs, M. E.; Aten, J. A.

    1994-01-01

    We have investigated the performance of two types of standard flow cell sorter instruments, a System 50 Cytofluorograph and a FACSTar PLUS cell sorter, for the on-line centromeric index (CI) analysis of human chromosomes. To optimize the results, we improved the detection efficiency for centromeres

  3. Deriving Daytime Variables From the AmeriFlux Standard Eddy Covariance Data Set

    van Ingen, Catharine [Berkeley Water Center. Berkeley, CA (United States); Microsoft. San Francisco, CA (United States); Agarwal, Deborah A. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Berkeley Water Center. Berkeley, CA (United States); Univ. of California, Berkeley, CA (United States); Humphrey, Marty [Univ. of Virginia, Charlottesville, VA (United States); Li, Jie [Univ. of Virginia, Charlottesville, VA (United States)

    2008-12-06

    A gap-filled, quality assessed eddy covariance dataset has recently become available for the AmeriFlux network. This dataset uses standard processing and produces commonly used science variables. This shared dataset enables robust comparisons across different analyses. Of course, there are many remaining questions. One of those is how to define 'during the day' which is an important concept for many analyses. Some studies have used local time — for example 9am to 5pm; others have used thresholds on photosynthetic active radiation (PAR). A related question is how to derive quantities such as the Bowen ratio. Most studies compute the ratio of the averages of the latent heat (LE) and sensible heat (H). In this study, we use different methods of defining 'during the day' for GPP, LE, and H. We evaluate the differences between methods in two ways. First, we look at a number of statistics of GPP. Second, we look at differences in the derived Bowen ratio. Our goal is not science per se, but rather informatics in support of the science.
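    The two definitions of 'during the day' can be sketched side by side; the records and the PAR threshold below are illustrative stand-ins, not AmeriFlux data:

```python
# Each record is (local hour, PAR, H, LE); values are invented for
# illustration. Two daytime filters are applied, then the Bowen ratio
# is computed as the ratio of the mean H to the mean LE.
records = [
    (6.0, 50.0, 5.0, 20.0),
    (9.5, 800.0, 120.0, 200.0),
    (12.0, 1400.0, 180.0, 300.0),
    (15.0, 900.0, 140.0, 250.0),
    (18.5, 150.0, 10.0, 30.0),
]

def bowen_ratio(rows):
    """Ratio of average sensible heat to average latent heat."""
    h = sum(r[2] for r in rows) / len(rows)
    le = sum(r[3] for r in rows) / len(rows)
    return h / le

clock = [r for r in records if 9.0 <= r[0] <= 17.0]   # 9am-5pm rule
par = [r for r in records if r[1] > 100.0]            # PAR-threshold rule

b_clock = bowen_ratio(clock)
b_par = bowen_ratio(par)
```

    Here the evening record passes the (hypothetical) PAR threshold but not the clock window, so the two definitions yield different Bowen ratios, which is exactly the kind of method sensitivity the study examines.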

  4. Large Variability in the Diversity of Physiologically Complex Surgical Procedures Exists Nationwide Among All Hospitals Including Among Large Teaching Hospitals.

    Dexter, Franklin; Epstein, Richard H; Thenuwara, Kokila; Lubarsky, David A

    2017-11-22

    Multiple previous studies have shown that having a large diversity of procedures has a substantial impact on quality management of hospital surgical suites. At hospitals with substantial diversity, unless sophisticated statistical methods suitable for rare events are used, anesthesiologists working in surgical suites will have inaccurate predictions of surgical blood usage, case durations, cost accounting and price transparency, times remaining in late running cases, and use of intraoperative equipment. What is unknown is whether large diversity is a feature of only a few very unique hospitals nationwide (eg, the largest hospitals in each state or province). The 2013 United States Nationwide Readmissions Database was used to study heterogeneity among 1981 hospitals in their diversities of physiologically complex surgical procedures (ie, the procedure codes). The diversity of surgical procedures performed at each hospital was quantified using a summary measure, the number of different physiologically complex surgical procedures commonly performed at the hospital (ie, 1/Herfindahl). A total of 53.9% of all hospitals commonly performed a 3-fold larger diversity (ie, >30 commonly performed physiologically complex procedures). Larger hospitals had greater diversity than the small- and medium-sized hospitals; most large teaching hospitals commonly performed >30 procedures (lower 99% CL, 71.9% of hospitals). However, there was considerable variability among the large teaching hospitals in their diversity (interquartile range of the numbers of commonly performed physiologically complex procedures = 19.3; lower 99% CL, 12.8 procedures). The diversity of procedures represents a substantive differentiator among hospitals. Thus, the usefulness of statistical methods for operating room management should be expected to be heterogeneous among hospitals. Our results also show that "large teaching hospital" alone is an insufficient description for accurate prediction of the extent to which a hospital sustains the
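    The 1/Herfindahl summary measure mentioned above is the inverse of the sum of squared caseload shares, i.e. the 'effective number' of commonly performed procedures. A minimal sketch with invented caseload counts:

```python
def commonly_performed(procedure_counts):
    """Inverse Herfindahl index: the effective number of commonly
    performed procedures given per-procedure caseload counts."""
    total = sum(procedure_counts)
    shares = [c / total for c in procedure_counts]
    return 1.0 / sum(s * s for s in shares)

# 30 procedures performed equally often -> effective number is 30
even_hospital = commonly_performed([10] * 30)

# one dominant procedure -> effective number collapses toward 1
focused_hospital = commonly_performed([97, 1, 1, 1])
```

    Equal caseloads across k procedures give exactly k, while a caseload dominated by one procedure pushes the measure toward 1, which is why the index works as a diversity summary.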

  5. Nonlinear method for including the mass uncertainty of standards and the system measurement errors in the fitting of calibration curves

    Pickles, W.L.; McClure, J.W.; Howell, R.H.

    1978-01-01

    A sophisticated nonlinear multiparameter fitting program was used to produce a best-fit calibration curve for the response of an x-ray fluorescence analyzer to uranium nitrate, freeze-dried, 0.2%-accurate gravimetric standards. The program is based on the unconstrained minimization subroutine VA02A. The program considers the mass values of the gravimetric standards as parameters to be fit along with the normal calibration curve parameters. The fitting procedure weights both the system errors and the mass errors in a consistent way. The resulting best-fit calibration curve parameters reflect the fact that the masses of the standard samples are measured quantities with a known error. Error estimates for the calibration curve parameters can be obtained from the curvature of the 'Chi-Squared Matrix' or from error relaxation techniques. It was shown that nondispersive XRFA of 0.1 to 1 mg of freeze-dried UNO3 can have an accuracy of 0.2% in 1000 s. 5 figures
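    The core idea, letting the standards' masses float as fit parameters while weighting residuals by both the system error and the 0.2% mass uncertainty, can be sketched with a generic least-squares routine. Everything below is invented for illustration: the data are toy numbers, a linear model stands in for the actual calibration curve, and scipy's `least_squares` replaces the original VA02A subroutine:

```python
import numpy as np
from scipy.optimize import least_squares

m_nominal = np.array([0.1, 0.25, 0.5, 0.75, 1.0])    # mg, gravimetric values
sigma_mass = 0.002 * m_nominal                        # 0.2% mass uncertainty
response = np.array([10.3, 25.1, 50.9, 74.8, 101.2])  # detector counts (toy)
sigma_resp = 0.5 * np.ones_like(response)             # system error (toy)

def residuals(p):
    a, b = p[:2]   # calibration-curve parameters (linear stand-in model)
    m = p[2:]      # fitted masses of the standards
    r_fit = (response - (a + b * m)) / sigma_resp   # weighted curve residuals
    r_mass = (m - m_nominal) / sigma_mass           # pull of masses to nominal
    return np.concatenate([r_fit, r_mass])

p0 = np.concatenate([[0.0, 100.0], m_nominal])
fit = least_squares(residuals, p0)
a_hat, b_hat = fit.x[:2]
```

    Because the mass pulls are weighted by their own uncertainty, the fitted masses stay close to the gravimetric values unless the curve residuals strongly demand otherwise, which is the consistent treatment of both error sources the abstract describes.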

  6. ENDF/B-5 Standards Data Library (including modifications made in 1986). Summary of contents and documentation

    DayDay, N.; Lemmel, H.D.

    1986-01-01

    This document summarizes the contents and documentation of the ENDF/B-5 Standards Data Library (EN5-ST) released in September 1979. The library contains complete evaluations for all significant neutron reactions in the energy range 10^-5 eV to 20 MeV for H-1, He-3, Li-6, B-10, C-12, Au-197 and U-235 isotopes. In 1986 the files for C-12, Au-197 and U-235 were slightly modified. The entire library or selective retrievals from it can be obtained free of charge from the IAEA Nuclear Data Section. (author)

  8. Proposed Standards for Variable Harmonization Documentation and Referencing: A Case Study Using QuickCharmStats 1.1

    Winters, Kristi; Netscher, Sebastian

    2016-01-01

    Comparative statistical analyses often require data harmonization, yet the social sciences do not have clear operationalization frameworks that guide and homogenize variable coding decisions across disciplines. When faced with a need to harmonize variables, researchers often look for guidance from various international studies that employ output harmonization, such as the Comparative Survey of Election Studies, which offer recoding structures for the same variable (e.g. marital status). More problematically, there are no agreed documentation standards or journal requirements for reporting variable harmonization to facilitate a transparent replication process. We propose a conceptual and data-driven digital solution that creates harmonization documentation standards for publication and scholarly citation: QuickCharmStats 1.1. It is free and open-source software that allows for the organizing, documenting and publishing of data harmonization projects. QuickCharmStats starts at the conceptual level and its workflow ends with a variable recoding syntax. It is therefore flexible enough to reflect a variety of theoretical justifications for variable harmonization. Using the socio-demographic variable ‘marital status’, we demonstrate how the CharmStats workflow collates metadata while being guided by the scientific standards of transparency and replication. It encourages researchers to publish their harmonization work by providing those who complete the peer review process with a permanent identifier. Those who contribute original data harmonization work to their discipline can now be credited through citations. Finally, we propose peer-review standards for harmonization documentation, describe a route to online publishing, and provide a referencing format to cite harmonization projects. Although CharmStats products are designed for social scientists, our adherence to the scientific method ensures our products can be used by researchers across the sciences.

  9. Review of neutron activation analysis in the standardization and study of reference materials, including its application to radionuclide reference materials

    Byrne, A.R.

    1993-01-01

    Neutron activation analysis (NAA) plays a very important role in the certification of reference materials (RMs) and their characterization, including homogeneity testing. The features of the method are briefly reviewed, particularly aspects relating to its completely independent nuclear basis, its virtual freedom from blank problems, and its capacity for self-verification. This last aspect, arising from the essentially isotopic character of NAA, can be exploited by using different nuclear reactions and induced nuclides, and by the possibility of employing two modes, one instrumental (nondestructive), the other radiochemical (destructive). This enables the derivation of essentially independent analytical information and gives NAA a unique capacity for self-validation. The application of NAA to quantify natural or man-made radionuclides such as uranium, thorium, 237 Np, 129 I and 230 Th is discussed, including its advantages over conventional radiometric methods and its usefulness in providing independent data for nuclides where other confirmatory analyses are impossible, or are only recently becoming available through newer 'atom counting' techniques. Certain additional, prospective uses of NAA in the study of RMs and potential RMs are mentioned, including transmutation reactions, creation of endogenously radiolabelled matrices for the production and study of RMs (such as dissolution and leaching tests, or use as incorporated radiotracers for chemical recovery correction), and the possibility of molecular activation analysis for speciation. (orig.)

  10. MRI screening for silicone breast implant rupture: accuracy, inter- and intraobserver variability using explantation results as reference standard

    Maijers, M.C.; Ritt, M.J.P.F. [VU University Medical Centre, Department of Plastic, Reconstructive and Hand Surgery, De Boelelaan 1117, PO Box 7057, Amsterdam (Netherlands); Niessen, F.B. [VU University Medical Centre, Department of Plastic, Reconstructive and Hand Surgery, De Boelelaan 1117, PO Box 7057, Amsterdam (Netherlands); Jan van Goyen Clinic, Department of Plastic Surgery, Amsterdam (Netherlands); Veldhuizen, J.F.H. [MRI Centre, Amsterdam (Netherlands); Manoliu, R.A. [MRI Centre, Amsterdam (Netherlands); VU University Medical Centre, Department of Radiology, Amsterdam (Netherlands)

    2014-06-15

    The recall of Poly Implant Prothese (PIP) silicone breast implants in 2010 resulted in large numbers of asymptomatic women with implants who underwent magnetic resonance imaging (MRI) screening. This study's aim was to assess the accuracy and interobserver variability of MRI screening in the detection of rupture and extracapsular silicone leakage. A prospective study included 107 women with 214 PIP implants who underwent explantation preceded by MRI. In 2013, two radiologists blinded for previous MRI findings or outcome at surgery independently re-evaluated all MRI examinations. A structured protocol described the MRI findings. The ex vivo findings served as reference standard. In 208 of the 214 explanted prostheses, radiologists agreed independently about the condition of the implants. In five of the six cases they disagreed (2.6 %), but subsequently reached consensus. A sensitivity of 93 %, specificity of 93 %, positive predictive value of 77 % and negative predictive value of 98 % was found. The interobserver agreement was excellent (kappa value of 0.92). MRI has a high accuracy in diagnosing rupture in silicone breast implants. Considering the high kappa value of interobserver agreement, MRI appears to be a consistent diagnostic test. A simple, uniform classification may improve communication between radiologist and plastic surgeon. (orig.)
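
    The four accuracy figures reported above all come from a standard 2x2 confusion table of MRI result versus explantation finding. The sketch below shows the definitions; the counts are hypothetical, chosen only so that the outputs land near the reported values (the study does not publish this table here).

```python
# Diagnostic accuracy metrics for a 2x2 table of test result vs. reference
# standard. Counts are hypothetical (they sum to the study's 214 implants
# and roughly reproduce the reported percentages, but are not study data).
def diagnostic_metrics(tp, fp, fn, tn):
    """Return sensitivity, specificity, PPV and NPV."""
    return {
        "sensitivity": tp / (tp + fn),  # ruptured implants correctly flagged
        "specificity": tn / (tn + fp),  # intact implants correctly cleared
        "ppv": tp / (tp + fp),          # P(rupture | MRI positive)
        "npv": tn / (tn + fn),          # P(intact  | MRI negative)
    }

m = diagnostic_metrics(tp=40, fp=12, fn=3, tn=159)  # 214 implants total
for name, value in m.items():
    print(f"{name}: {value:.0%}")
```

    Note how a modest PPV can coexist with high sensitivity and specificity when ruptures are relatively rare: the false positives are drawn from the much larger pool of intact implants.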

  12. Quantification of the islet product: presentation of a standardized current good manufacturing practices compliant system with minimal variability.

    Friberg, Andrew S; Brandhorst, Heide; Buchwald, Peter; Goto, Masafumi; Ricordi, Camillo; Brandhorst, Daniel; Korsgren, Olle

    2011-03-27

    Accurate islet quantification has proven difficult to standardize in a good manufacturing practices (GMP) approved manner. The influence of assessment variables from both manual and computer-assisted digital image analysis (DIA) methods were compared using calibrated, standardized microspheres or islets alone. Additionally, a mixture of microspheres and exocrine tissue was used to evaluate the variability of both the current, internationally recognized, manual method and a novel GMP-friendly purity- and volume-based method (PV) evaluated by DIA in a semiclosed, culture bag system. Computer-assisted DIA recorded known microsphere size distribution and quantities accurately. By using DIA to evaluate islets, the interindividual manually evaluated percent coefficients of variation (CV%; n=14) were reduced by almost half for both islet equivalents (IEs; 31% vs. 17%, P=0.002) and purity (20% vs. 13%, P=0.033). The microsphere pool mixed with exocrine tissue did not differ from expected IE with either method. However, manual IE resulted in a total CV% of 44.3% and a range spanning 258 k IE, whereas PV resulted in CV% of 10.7% and range of 60 k IE. Purity CV% for each method were similar approximating 10.5% and differed from expected by +7% for the manual method and +3% for PV. The variability of standard counting methods for islet samples and clinical quantities of microspheres mixed with exocrine tissue were reduced with DIA. They were reduced even further by use of a semiclosed bag system compared with standard manual counting, thereby facilitating the standardization of islet evaluation according to GMP standards.
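
    The variability measure used throughout the abstract above is the percent coefficient of variation, CV% = 100 · SD / mean. A minimal sketch, using made-up islet-equivalent (IE) estimates rather than study data, shows how a tighter spread of repeated counts translates into a lower CV%:

```python
import statistics

# Percent coefficient of variation: standard deviation as a percentage of
# the mean. The IE values below are illustrative only, not study data.
def cv_percent(values):
    return 100 * statistics.stdev(values) / statistics.mean(values)

manual_ie = [310_000, 420_000, 250_000, 390_000, 480_000]  # wide spread
dia_ie    = [345_000, 372_000, 338_000, 361_000, 350_000]  # tighter spread

print(f"manual CV%: {cv_percent(manual_ie):.1f}")
print(f"DIA CV%:    {cv_percent(dia_ie):.1f}")
```

    `statistics.stdev` is the sample standard deviation (n−1 denominator), which is the usual choice when the repeated counts are treated as a sample of possible assessments.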

  13. A method to standardize gait and balance variables for gait velocity.

    Iersel, M.B. van; Olde Rikkert, M.G.M.; Borm, G.F.

    2007-01-01

    Many gait and balance variables depend on gait velocity, which seriously hinders the interpretation of gait and balance data derived from walks at different velocities. However, as far as we know there is no widely accepted method to correct for effects of gait velocity on other gait and balance
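
    The abstract above breaks off before describing the proposed method, so the following is only a generic illustration of one common way to remove a velocity dependence: regress the gait variable on gait velocity and analyze the residuals, which are uncorrelated with velocity by construction. All data and names are hypothetical.

```python
# Ordinary least squares via the normal equations, then velocity-adjusted
# residuals of a gait variable. Purely illustrative; this is NOT necessarily
# the method of the study above, whose abstract is truncated.
def ols_fit(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
            sum((xi - mx) ** 2 for xi in x)
    return slope, my - slope * mx  # (slope, intercept)

velocity   = [0.8, 1.0, 1.1, 1.3, 1.4]           # gait velocity, m/s
step_width = [0.140, 0.120, 0.115, 0.100, 0.095]  # gait variable, m

b, a = ols_fit(velocity, step_width)
adjusted = [y - (a + b * x) for x, y in zip(velocity, step_width)]
print("velocity-adjusted residuals:", [round(r, 4) for r in adjusted])
```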

  14. Variability of inter-team distances associated with match events in elite-standard soccer

    Frencken, Wouter; De Poel, Harjo; Visscher, Chris; Lemmink, Koen

    2012-01-01

    In soccer, critical match events like goal attempts can be preceded by periods of instability in the balance between the two teams' behaviours. Therefore, we determined periods of high variability in the distance between the teams' centroid positions longitudinally and laterally in an

  15. Evaluation of standardized and applied variables in predicting treatment outcomes of polytrauma patients.

    Aksamija, Goran; Mulabdic, Adi; Rasic, Ismar; Muhovic, Samir; Gavric, Igor

    2011-01-01

    Polytrauma is defined as an injury that affects at least two organ systems or body regions, with at least one life-threatening injury. Given the multilevel model of care for polytrauma patients within KCUS, weaknesses in the management of this category of patients are inevitable. The aims were to determine the dynamics of the existing procedures in the treatment of polytrauma patients on admission to KCUS and, based on statistical analysis of the applied variables, to determine and define the factors that influence the final outcome of treatment and their mutual relationship, which may help eliminate the flaws in the current approach. The study was based on 263 polytrauma patients. Parametric and non-parametric statistical methods were used. Basic statistics were calculated and, building on the calculated parameters, multiple correlation analysis, image analysis, discriminant analysis and multifactorial analysis were used to achieve the research objectives. From the universe of variables we selected a sample of n = 25 variables, of which the first two are modular; the others belong to the common measurement space (n = 23) and are defined in this paper as the system of variables for methods, procedures and assessment of polytrauma patients. Since the image analysis gave reliable measurement results, after the multiple correlation analysis we proceeded to the analysis of eigenvalues, that is, to defining the factors that provide information for resolving the shortcomings of the existing model and its correlation with treatment outcome. The study singled out the essential factors that determine the current organizational model of care, which may affect treatment and support a better outcome for polytrauma patients. The analysis showed the maximum correlative relationships between these practices and contributed to the development of guidelines defined by the isolated factors.

  16. ASSESSMENT OF THE CHANGES IN BLOOD PRESSURE CIRCADIAN PROFILE AND VARIABILITY IN PATIENTS WITH CHRONIC HEART FAILURE AND ARTERIAL HYPERTENSION DURING COMBINED THERAPY INCLUDING IVABRADINE

    M. V. Surovtseva

    2012-01-01

    Aim. To assess the changes in blood pressure (BP) circadian profile and variability in patients with chronic heart failure (CHF) of ischemic etiology and arterial hypertension (HT) due to complex therapy including ivabradine. Material and methods. Patients (n=90) with CHF class II-III NYHA associated with stable angina class II-III and HT were examined. The patients were randomized into 3 groups depending on the drugs received: perindopril and ivabradine (group 1); perindopril, bisoprolol and ivabradine (group 2); perindopril and bisoprolol (group 3). The duration of therapy was 6 months. Ambulatory BP monitoring (ABPM) was assessed at baseline and after treatment. Results. A more significant reduction in average 24-hour systolic BP was found in groups 1 and 2 compared to group 3 (Δ%: -19.4±0.4, -21.1±0.4 and -11.8±0.6, respectively), as well as in diastolic BP (Δ%: -10.6±0.6, -12.9±0.4 and -4.3±0.3, respectively) and other ABPM indicators. Improvement of the BP circadian rhythm was found due to an increase in the number of «dipper» patients (p=0.016). A more significant reduction in average daily and night systolic and diastolic BP (p=0.001), as well as in daily and night BP variability (p=0.001), was also found in patients of group 2 compared to those of group 1. Conclusion. A moderate antihypertensive effect (in respect of both diastolic and systolic BP) was shown when ivabradine was included in the complex therapy of patients with ischemic CHF and HT. The effect was more pronounced when ivabradine was combined with perindopril and bisoprolol. This was accompanied by a reduction in high BP daily variability and improvement of the BP circadian rhythm.

  18. On the use of Standardized Drought Indices under decadal climate variability: Critical assessment and drought policy implications

    Núñez, J.; Rivera, D.; Oyarzún, R.; Arumí, J. L.

    2014-09-01

    Since the recent High Level Meeting on National Drought Policy held in Geneva in 2013, a greater concern about the creation and adaptation of national drought monitoring systems is expected. Consequently, backed by international recommendations, the use of Standardized Drought Indices (SDI), such as the Standardized Precipitation Index (SPI), as an operational basis of drought monitoring systems has been increasing in many parts of the world. Recommendations for the use of the SPI, and consequently, those indices that share its properties, do not take into account the limitations that this type of index can exhibit under the influence of multidecadal climate variability. These limitations are fundamentally related to the lack of consistency among the operational definition expressed by this type of index, the conceptual definition with which it is associated and the political definition it supports. Furthermore, the limitations found are not overcome by the recommendations for their application. This conclusion is supported by the long-term study of the Standardized Streamflow Index (SSI) in the arid north-central region of Chile, under the influence of multidecadal climate variability. The implications of the findings of the study are discussed with regard to their link to aspects of drought policy in the cases of Australia, the United States and Chile.
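
    The core mechanic shared by the SDI family (SPI, SSI) is transforming a hydrological series into standardized anomalies so that one threshold means "drought" everywhere. The operational SPI first fits a gamma distribution and maps through the normal quantile function; the sketch below uses a plain z-score only to keep the illustration short, and all precipitation values are hypothetical.

```python
import statistics

# Simplified standardized index: zero-mean, unit-variance transform of a
# series. (Real SPI/SSI fit a probability distribution first; a z-score is
# used here purely for illustration, on made-up annual precipitation data.)
def standardize(series):
    mu = statistics.mean(series)
    sigma = statistics.stdev(series)
    return [(x - mu) / sigma for x in series]

annual_precip = [312, 280, 405, 150, 220, 390, 95, 330]  # mm, hypothetical
sdi = standardize(annual_precip)
drought_years = [i for i, z in enumerate(sdi) if z <= -1.0]  # common cutoff
print("index:", [round(z, 2) for z in sdi])
print("years below -1:", drought_years)
```

    The article's caveat is visible even in this toy version: the mean and standard deviation are estimated from the calibration record, so under multidecadal climate variability the same physical deficit can yield different index values depending on which decades the record spans.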

  19. The Frontlines of Medicine Project: a proposal for the standardized communication of emergency department data for public health uses including syndromic surveillance for biological and chemical terrorism.

    Barthell, Edward N; Cordell, William H; Moorhead, John C; Handler, Jonathan; Feied, Craig; Smith, Mark S; Cochrane, Dennis G; Felton, Christopher W; Collins, Michael A

    2002-04-01

    The Frontlines of Medicine Project is a collaborative effort of emergency medicine (including emergency medical services and clinical toxicology), public health, emergency government, law enforcement, and informatics. This collaboration proposes to develop a nonproprietary, "open systems" approach for reporting emergency department patient data. The common element is a standard approach to sending messages from individual EDs to regional oversight entities that could then analyze the data received. ED encounter data could be used for various public health initiatives, including syndromic surveillance for chemical and biological terrorism. The interlinking of these regional systems could also permit public health surveillance at a national level based on ED patient encounter data. Advancements in the Internet and Web-based technologies could allow the deployment of these standardized tools in a rapid time frame.

  20. Including indigestible carbohydrates in the evening meal of healthy subjects improves glucose tolerance, lowers inflammatory markers, and increases satiety after a subsequent standardized breakfast

    Nilsson, A.C.; Ostman, E.M.; Holst, Jens Juul

    2008-01-01

    Low-glycemic index (GI) foods and foods rich in whole grain are associated with reduced risk of type 2 diabetes and cardiovascular disease. We studied the effect of cereal-based bread evening meals (50 g available starch), varying in GI and content of indigestible carbohydrates, on glucose tolerance and related variables after a subsequent standardized breakfast in healthy subjects (n = 15). At breakfast, blood was sampled for 3 h for analysis of blood glucose, serum insulin, serum FFA, serum triacylglycerides, plasma glucagon, plasma gastric-inhibitory peptide, plasma glucagon-like peptide-1 … based bread (ordinary, high-amylose- or beta-glucan-rich genotypes) or an evening meal with white wheat flour bread (WWB) enriched with a mixture of barley fiber and resistant starch improved glucose tolerance at the subsequent breakfast compared with unsupplemented WWB (P

  1. Variability of standard artificial soils: Physico-chemical properties and phenanthrene desorption measured by means of supercritical fluid extraction

    Bielská, Lucie; Hovorková, Ivana; Komprdová, Klára; Hofman, Jakub

    2012-01-01

    The study is focused on artificial soil, which is supposed to be a standardized "soil-like" medium. We compared physico-chemical properties and extractability of phenanthrene (Phe) from 25 artificial soils prepared according to OECD standardized procedures at different laboratories. A substantial range of soil properties was found, also for parameters which should be standardized because they have an important influence on the bioavailability of pollutants (e.g. total organic carbon ranged from 1.4 to 6.1%). The extractability of Phe was measured by supercritical fluid extraction (SFE) at harsh and mild conditions. Highly variable Phe extractability from different soils (3–89%) was observed. The extractability was strongly related (R² = 0.87) to total organic carbon content, 0.1–2 mm particle size, and humic/fulvic acid ratio in the following multiple regression model: SFE (%) = 1.35 * sand (%) − 0.77 * TOC (%)² + 0.27 * HA/FA. - Highlights: ► We compared properties and extractability of Phe from 25 different artificial soils. ► A substantial range of soil properties was found, also for important parameters. ► Phe extractability was measured by supercritical fluid extraction (SFE) in 2 modes. ► Phe extractability was highly variable between soils (3–89%). ► Extractability was strongly related to TOC, 0.1–2 mm particles, and HA/FA. - Significant variability in physico-chemical properties exists between artificial soils prepared at different laboratories and affects the behavior of contaminants in these soils.
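
    The regression model quoted in the abstract is a simple closed form, so it can be evaluated directly. The coefficients below are the reported ones; the input soil values are made up for illustration only.

```python
# Multiple regression model reported in the abstract above:
#   SFE (%) = 1.35 * sand(%) - 0.77 * TOC(%)^2 + 0.27 * HA/FA
# Coefficients are from the abstract; the example inputs are hypothetical.
def sfe_percent(sand_pct, toc_pct, ha_fa_ratio):
    return 1.35 * sand_pct - 0.77 * toc_pct ** 2 + 0.27 * ha_fa_ratio

# e.g. a soil with 50% in the 0.1-2 mm fraction, 3% TOC, HA/FA ratio of 2
print(f"predicted Phe extractability: {sfe_percent(50, 3.0, 2.0):.1f}%")
```

    The quadratic TOC term captures the finding that organic carbon suppresses extractability increasingly strongly, which is why the TOC range of 1.4 to 6.1% across laboratories translates into such variable Phe recovery.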

  2. Progress Report on the Airborne Composition Standard Variable Name and Time Series Working Groups of the 2017 ESDSWG

    Evans, K. D.; Early, A. B.; Northup, E. A.; Ames, D. P.; Teng, W. L.; Olding, S. W.; Krotkov, N. A.; Arctur, D. K.; Beach, A. L., III; Silverman, M. L.

    2017-12-01

    The role of NASA's Earth Science Data Systems Working Groups (ESDSWG) is to make recommendations relevant to NASA's Earth science data systems from users' experiences and community insight. Each group works independently, focusing on a unique topic. Progress of two of the 2017 Working Groups will be presented. In a single airborne field campaign, there can be several different instruments and techniques that measure the same parameter on one or more aircraft platforms. Many of these same parameters are measured during different airborne campaigns using similar or different instruments and techniques. The Airborne Composition Standard Variable Name Working Group is working to create a list of variable standard names that can be used across all airborne field campaigns in order to assist in the transition to the ICARTT Version 2.0 file format. The overall goal is to enhance the usability of ICARTT files and the search ability of airborne field campaign data. The Time Series Working Group (TSWG) is a continuation of the 2015 and 2016 Time Series Working Groups. In 2015, we started TSWG with the intention of exploring the new OGC (Open Geospatial Consortium) WaterML 2 standards as a means for encoding point-based time series data from NASA satellites. In this working group, we realized that WaterML 2 might not be the best solution for this type of data, for a number of reasons. Our discussion with experts from other agencies, who have worked on similar issues, identified several challenges that we would need to address. As a result, we made the recommendation to study the new TimeseriesML 1.0 standard of OGC as a potential NASA time series standard. The 2016 TSWG examined closely the TimeseriesML 1.0 and, in coordination with the OGC TimeseriesML Standards Working Group, identified certain gaps in TimeseriesML 1.0 that would need to be addressed for the standard to be applicable to NASA time series data. An engineering report was drafted based on the OGC Engineering

  3. Prediction of bleeding and thrombosis by standard biochemical coagulation variables in haematological intensive care patients

    Russell, L.; Madsen, M. B.; Dahl, M.

    2018-01-01

    -dimer and fibrinogen, and markers of infection (C-reactive protein, pro-calcitonin), kidney function (creatinine) and tissue damage (lactate dehydrogenase (LDH)). Results: We included 116 patients; 66 (57%) had at least one bleeding episode and 11 (9%) patients had at least one thrombotic event. The differences...

  4. A SEARCH FOR L/T TRANSITION DWARFS WITH Pan-STARRS1 AND WISE: DISCOVERY OF SEVEN NEARBY OBJECTS INCLUDING TWO CANDIDATE SPECTROSCOPIC VARIABLES

    Best, William M. J.; Liu, Michael C.; Magnier, Eugene A.; Aller, Kimberly M.; Burgett, W. S.; Chambers, K. C.; Hodapp, K. W.; Kaiser, N.; Kudritzki, R.-P.; Morgan, J. S.; Tonry, J. L.; Wainscoat, R. J.; Deacon, Niall R.; Dupuy, Trent J.; Redstone, Joshua; Price, P. A.

    2013-01-01

    We present initial results from a wide-field (30,000 deg²) search for L/T transition brown dwarfs within 25 pc using the Pan-STARRS1 and Wide-field Infrared Survey Explorer (WISE) surveys. Previous large-area searches have been incomplete for L/T transition dwarfs, because these objects are faint in optical bands and have near-infrared (near-IR) colors that are difficult to distinguish from background stars. To overcome these obstacles, we have cross-matched the Pan-STARRS1 (optical) and WISE (mid-IR) catalogs to produce a unique multi-wavelength database for finding ultracool dwarfs. As part of our initial discoveries, we have identified seven brown dwarfs in the L/T transition within 9-15 pc of the Sun. The L9.5 dwarf PSO J140.2308+45.6487 and the T1.5 dwarf PSO J307.6784+07.8263 (both independently discovered by Mace et al.) show possible spectroscopic variability at the Y and J bands. Two more objects in our sample show evidence of photometric J-band variability, and two others are candidate unresolved binaries based on their spectra. We expect our full search to yield a well-defined, volume-limited sample of L/T transition dwarfs that will include many new targets for study of this complex regime. PSO J307.6784+07.8263 in particular may be an excellent candidate for in-depth study of variability, given its brightness (J = 14.2 mag) and proximity (11 pc).

  6. Technical support document: Energy efficiency standards for consumer products: Refrigerators, refrigerator-freezers, and freezers including draft environmental assessment, regulatory impact analysis

    NONE

    1995-07-01

    The Energy Policy and Conservation Act (P.L. 94-163), as amended by the National Appliance Energy Conservation Act of 1987 (P.L. 100-12), by the National Appliance Energy Conservation Amendments of 1988 (P.L. 100-357), and by the Energy Policy Act of 1992 (P.L. 102-486), provides energy conservation standards for 12 of the 13 types of consumer products covered by the Act, and authorizes the Secretary of Energy to prescribe amended or new energy standards for each type (or class) of covered product. The assessment of the proposed standards for refrigerators, refrigerator-freezers, and freezers presented in this document is designed to evaluate their economic impacts according to the criteria in the Act. It includes an engineering analysis of the cost and performance of design options to improve the efficiency of the products; forecasts of the number and average efficiency of products sold, the amount of energy the products will consume, and their prices and operating expenses; a determination of change in investment, revenues, and costs to manufacturers of the products; a calculation of the costs and benefits to consumers, electric utilities, and the nation as a whole; and an assessment of the environmental impacts of the proposed standards.

  7. Technical support document: Energy conservation standards for consumer products: Dishwashers, clothes washers, and clothes dryers including: Environmental impacts; regulatory impact analysis

    1990-12-01

    The Energy Policy and Conservation Act as amended (P.L. 94-163) establishes energy conservation standards for 12 of the 13 types of consumer products specifically covered by the Act. The legislation requires the Department of Energy (DOE) to consider new or amended standards for these and other types of products at specified times. This Technical Support Document presents the methodology, data and results from the analysis of the energy and economic impacts of standards on dishwashers, clothes washers, and clothes dryers. The economic impact analysis is performed in several major areas. An Engineering Analysis, which establishes technical feasibility and product attributes including costs of design options to improve appliance efficiency. A Consumer Analysis at two levels: national aggregate impacts, and impacts on individuals. The national aggregate impacts include forecasts of appliance sales, efficiencies, energy use, and consumer expenditures. The individual impacts are analyzed by Life-Cycle Cost (LCC), Payback Periods, and Cost of Conserved Energy (CCE), which evaluate the savings in operating expenses relative to increases in purchase price. A Manufacturer Analysis, which provides an estimate of manufacturers' response to the proposed standards; their response is quantified by changes in several measures of financial performance for a firm. An Industry Impact Analysis, which shows financial and competitive impacts on the appliance industry. A Utility Analysis, which measures the impacts of the altered energy-consumption patterns on electric utilities. An Environmental Effects analysis, which estimates changes in emissions of carbon dioxide, sulfur oxides, and nitrogen oxides, due to reduced energy consumption in the home and at the power plant. A Regulatory Impact Analysis, which collects the results of all the analyses into the net benefits and costs from a national perspective. 47 figs., 171 tabs. (JF)

  8. Can consistent benchmarking within a standardized pain management concept decrease postoperative pain after total hip arthroplasty? A prospective cohort study including 367 patients.

    Benditz, Achim; Greimel, Felix; Auer, Patrick; Zeman, Florian; Göttermann, Antje; Grifka, Joachim; Meissner, Winfried; von Kunow, Frederik

    2016-01-01

The number of total hip replacement surgeries has steadily increased over recent years. Reduction in postoperative pain increases patient satisfaction and enables better mobilization. Thus, pain management needs to be continuously improved. Problems are often caused not only by medical issues but also by organization and hospital structure. The present study shows how the quality of pain management can be increased by implementing a standardized pain concept and simple, consistent benchmarking. All patients included in the study had undergone total hip arthroplasty (THA). Outcome parameters were analyzed 24 hours after surgery by means of the questionnaires from the German-wide project "Quality Improvement in Postoperative Pain Management" (QUIPS). A pain nurse interviewed patients and continuously assessed outcome quality parameters. A multidisciplinary team of anesthetists, orthopedic surgeons, and nurses implemented a regular procedure of data analysis and internal benchmarking. The health care team was informed of any results and suggested improvements. Every staff member involved in pain management participated in educational lessons, and a special pain nurse was trained in each ward. From 2014 to 2015, 367 patients were included. The mean maximal pain score 24 hours after surgery was 4.0 (±3.0) on an 11-point numeric rating scale, and patient satisfaction was 9.0 (±1.2). Over time, the maximum pain score decreased (mean 3.0, ±2.0), whereas patient satisfaction significantly increased (mean 9.8, ±0.4; p …). Good results were already achieved by implementing a standardized pain management concept, but regular benchmarking, implementation of feedback mechanisms, and staff education made the pain management concept even more successful. Multidisciplinary teamwork and flexibility in adapting processes seem to be highly important for successful pain management.

  9. Logic Learning Machine and standard supervised methods for Hodgkin's lymphoma prognosis using gene expression data and clinical variables.

    Parodi, Stefano; Manneschi, Chiara; Verda, Damiano; Ferrari, Enrico; Muselli, Marco

    2018-03-01

    This study evaluates the performance of a set of machine learning techniques in predicting the prognosis of Hodgkin's lymphoma using clinical factors and gene expression data. Analysed samples from 130 Hodgkin's lymphoma patients included a small set of clinical variables and more than 54,000 gene features. Machine learning classifiers included three black-box algorithms (k-nearest neighbour, Artificial Neural Network, and Support Vector Machine) and two methods based on intelligible rules (Decision Tree and the innovative Logic Learning Machine method). Support Vector Machine clearly outperformed any of the other methods. Among the two rule-based algorithms, Logic Learning Machine performed better and identified a set of simple intelligible rules based on a combination of clinical variables and gene expressions. Decision Tree identified a non-coding gene (XIST) involved in the early phases of X chromosome inactivation that was overexpressed in females and in non-relapsed patients. XIST expression might be responsible for the better prognosis of female Hodgkin's lymphoma patients.

  10. Can consistent benchmarking within a standardized pain management concept decrease postoperative pain after total hip arthroplasty? A prospective cohort study including 367 patients

    Benditz A

    2016-12-01

    Full Text Available Achim Benditz,1 Felix Greimel,1 Patrick Auer,2 Florian Zeman,3 Antje Göttermann,4 Joachim Grifka,1 Winfried Meissner,4 Frederik von Kunow1 1Department of Orthopedics, University Medical Center Regensburg, 2Clinic for Anesthesia, Asklepios Klinikum Bad Abbach, Bad Abbach, 3Centre for Clinical Studies, University Medical Center Regensburg, Regensburg, 4Department of Anesthesiology and Intensive Care, Jena University Hospital, Jena, Germany Background: The number of total hip replacement surgeries has steadily increased over recent years. Reduction in postoperative pain increases patient satisfaction and enables better mobilization. Thus, pain management needs to be continuously improved. Problems are often caused not only by medical issues but also by organization and hospital structure. The present study shows how the quality of pain management can be increased by implementing a standardized pain concept and simple, consistent benchmarking. Methods: All patients included in the study had undergone total hip arthroplasty (THA). Outcome parameters were analyzed 24 hours after surgery by means of the questionnaires from the German-wide project “Quality Improvement in Postoperative Pain Management” (QUIPS). A pain nurse interviewed patients and continuously assessed outcome quality parameters. A multidisciplinary team of anesthetists, orthopedic surgeons, and nurses implemented a regular procedure of data analysis and internal benchmarking. The health care team was informed of any results and suggested improvements. Every staff member involved in pain management participated in educational lessons, and a special pain nurse was trained in each ward. Results: From 2014 to 2015, 367 patients were included. The mean maximal pain score 24 hours after surgery was 4.0 (±3.0) on an 11-point numeric rating scale, and patient satisfaction was 9.0 (±1.2). Over time, the maximum pain score decreased (mean 3.0, ±2.0), whereas patient satisfaction …

  11. A novel synthetic quantification standard including virus and internal report targets: application for the detection and quantification of emerging begomoviruses on tomato.

    Péréfarres, Frédéric; Hoareau, Murielle; Chiroleu, Frédéric; Reynaud, Bernard; Dintinger, Jacques; Lett, Jean-Michel

    2011-08-05

    Begomovirus is a genus of phytopathogenic single-stranded DNA viruses, transmitted by the whitefly Bemisia tabaci. This genus includes emerging and economically significant viruses such as those associated with Tomato Yellow Leaf Curl Disease, for which diagnostic tools are needed to prevent dispersion and new introductions. Five real-time PCRs with an internal tomato reporter gene were developed for accurate detection and quantification of monopartite begomoviruses, including two strains of the Tomato yellow leaf curl virus (TYLCV; Mld and IL strains), the Tomato leaf curl Comoros virus-like viruses (ToLCKMV-like viruses) and the two molecules of the bipartite Potato yellow mosaic virus. These diagnostic tools have a unique standard quantification, comprising the targeted viral and internal report amplicons. These duplex real-time PCRs were applied to artificially inoculated plants to monitor and compare their viral development. Real-time PCRs were optimized for accurate detection and quantification over a range of 2 × 10⁹ to 2 × 10³ copies of genomic viral DNA/μL for TYLCV-Mld, TYLCV-IL and PYMV-B and 2 × 10⁸ to 2 × 10³ copies of genomic viral DNA/μL for PYMV-A and ToLCKMV-like viruses. These real-time PCRs were applied to artificially inoculated plants and viral loads were compared at 10, 20 and 30 days post-inoculation. Different patterns of viral accumulation were observed between the bipartite and the monopartite begomoviruses. Interestingly, PYMV accumulated more viral DNA at each date for both genomic components compared to all the monopartite viruses. Also, PYMV reached its highest viral load at 10 dpi, contrary to the other viruses (20 dpi). The accumulation kinetics of the two strains of emergent TYLCV differed from the ToLCKMV-like viruses in the higher quantities of viral DNA produced in the early phase of the infection and in the shorter time to reach this peak viral load. To detect and quantify a wide range of begomoviruses, five duplex …
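Absolute quantification against a synthetic standard of known copy number is conventionally done by fitting the quantification cycle (Cq) against log10(copies) and inverting the fit for unknowns; a minimal sketch with idealized, made-up Cq values (a slope near -3.32 corresponds to roughly 100% PCR efficiency):

```python
import math

# Hypothetical standard curve: ten-fold dilutions from 2e9 down to 2e3
# copies/uL, with idealized Cq values (slope -3.32, i.e. ~100% efficiency).
standards = [(2e9, 10.0), (2e8, 13.32), (2e7, 16.64), (2e6, 19.96),
             (2e5, 23.28), (2e4, 26.60), (2e3, 29.92)]

# Least-squares fit of Cq = a * log10(copies) + b.
xs = [math.log10(c) for c, _ in standards]
ys = [cq for _, cq in standards]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
    sum((x - mx) ** 2 for x in xs)
b = my - a * mx
efficiency = 10 ** (-1 / a) - 1   # ~1.0 means 100% amplification efficiency

def copies_from_cq(cq):
    """Invert the standard curve to estimate copies/uL in an unknown sample."""
    return 10 ** ((cq - b) / a)
```

Outside the fitted 2 × 10³ to 2 × 10⁹ range the inversion extrapolates and should not be trusted, which is why the record reports the validated dynamic range explicitly.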

  12. A novel synthetic quantification standard including virus and internal report targets: application for the detection and quantification of emerging begomoviruses on tomato

    Lett Jean-Michel

    2011-08-01

    Full Text Available Abstract Background Begomovirus is a genus of phytopathogenic single-stranded DNA viruses, transmitted by the whitefly Bemisia tabaci. This genus includes emerging and economically significant viruses such as those associated with Tomato Yellow Leaf Curl Disease, for which diagnostic tools are needed to prevent dispersion and new introductions. Five real-time PCRs with an internal tomato reporter gene were developed for accurate detection and quantification of monopartite begomoviruses, including two strains of the Tomato yellow leaf curl virus (TYLCV; Mld and IL strains), the Tomato leaf curl Comoros virus-like viruses (ToLCKMV-like viruses) and the two molecules of the bipartite Potato yellow mosaic virus. These diagnostic tools have a unique standard quantification, comprising the targeted viral and internal report amplicons. These duplex real-time PCRs were applied to artificially inoculated plants to monitor and compare their viral development. Results Real-time PCRs were optimized for accurate detection and quantification over a range of 2 × 10⁹ to 2 × 10³ copies of genomic viral DNA/μL for TYLCV-Mld, TYLCV-IL and PYMV-B and 2 × 10⁸ to 2 × 10³ copies of genomic viral DNA/μL for PYMV-A and ToLCKMV-like viruses. These real-time PCRs were applied to artificially inoculated plants and viral loads were compared at 10, 20 and 30 days post-inoculation. Different patterns of viral accumulation were observed between the bipartite and the monopartite begomoviruses. Interestingly, PYMV accumulated more viral DNA at each date for both genomic components compared to all the monopartite viruses. Also, PYMV reached its highest viral load at 10 dpi, contrary to the other viruses (20 dpi). The accumulation kinetics of the two strains of emergent TYLCV differed from the ToLCKMV-like viruses in the higher quantities of viral DNA produced in the early phase of the infection and in the shorter time to reach this peak viral load. Conclusions To detect and …

  13. Models of simulation and prediction of the behavior of dengue in four Colombian cities, including climate like modulating variable of the disease

    Garcia Giraldo, Jairo A; Boshell, Jose Francisco

    2004-01-01

    ARIMA-type models are proposed to simulate the behavior of dengue and to make apparent its relations with climatic variability in four localities of Colombia. The climatic variable was introduced into the models as an index that modulates the behavior of the disease; it was obtained by means of a multivariate principal component analysis. The investigation was carried out with information corresponding to the epidemiological weeks from January 1997 to December 2000, for both the number of disease cases and the meteorological variables. The study shows that climate variations 9 to 14 weeks earlier influence the appearance of new cases of dengue. In particular, precipitation in those weeks was greater when the disease later showed epidemic characteristics than when it remained within endemic limits
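A transfer-function regression of this kind, autoregressive case counts plus a lagged exogenous climate index, can be sketched in miniature on synthetic data; all series and parameters below are illustrative, not the Colombian data:

```python
import math
import random

random.seed(1)

# Synthetic weekly series: a climate index (e.g. a principal component of
# rainfall and temperature) raises dengue cases LAG weeks later, on top of
# week-to-week persistence. All parameters are illustrative.
LAG = 12                     # within the 9-14 week window reported
n = 260                      # five years of epidemiological weeks
climate = [math.sin(2 * math.pi * t / 52) + random.gauss(0, 0.3)
           for t in range(n)]
cases = [25.0]
for t in range(1, n):
    x = climate[t - LAG] if t >= LAG else 0.0
    cases.append(10 + 0.6 * cases[t - 1] + 5.0 * x + random.gauss(0, 1.0))

# Fit cases[t] = c + phi*cases[t-1] + beta*climate[t-LAG] by least squares,
# i.e. an ARIMAX-style transfer-function regression in miniature.
rows = [(1.0, cases[t - 1], climate[t - LAG]) for t in range(LAG, n)]
y = cases[LAG:]
XtX = [[sum(r[i] * r[j] for r in rows) for j in range(3)] for i in range(3)]
Xty = [sum(r[i] * yt for r, yt in zip(rows, y)) for i in range(3)]

def solve3(A, b):
    """Gaussian elimination with partial pivoting for a 3x3 system."""
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for i in range(3):
        p = max(range(i, 3), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        for r in range(i + 1, 3):
            f = M[r][i] / M[i][i]
            for c in range(i, 4):
                M[r][c] -= f * M[i][c]
    out = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):
        out[i] = (M[i][3] -
                  sum(M[i][c] * out[c] for c in range(i + 1, 3))) / M[i][i]
    return out

c_hat, phi_hat, beta_hat = solve3(XtX, Xty)  # recovers ~10, ~0.6, ~5
```

In practice one would scan candidate lags (here fixed at 12 weeks) and keep the lag that best explains the case series, which is essentially how a 9-14 week climate influence window is identified.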

  14. Variability and accuracy of coronary CT angiography including use of iterative reconstruction algorithms for plaque burden assessment as compared with intravascular ultrasound - an ex vivo study

    Stolzmann, Paul [Massachusetts General Hospital and Harvard Medical School, Cardiac MR PET CT Program, Boston, MA (United States); University Hospital Zurich, Institute of Diagnostic and Interventional Radiology, Zurich (Switzerland); Schlett, Christopher L.; Maurovich-Horvat, Pal; Scheffel, Hans; Engel, Leif-Christopher; Karolyi, Mihaly; Hoffmann, Udo [Massachusetts General Hospital and Harvard Medical School, Cardiac MR PET CT Program, Boston, MA (United States); Maehara, Akiko; Ma, Shixin; Mintz, Gary S. [Columbia University Medical Center, Cardiovascular Research Foundation, New York, NY (United States)

    2012-10-15

    To systematically assess inter-technique and inter-/intra-reader variability of coronary CT angiography (CTA) to measure plaque burden compared with intravascular ultrasound (IVUS) and to determine whether iterative reconstruction algorithms affect variability. IVUS and CTA data were acquired from nine human coronary arteries ex vivo. CT images were reconstructed using filtered back projection (FBPR) and iterative reconstruction algorithms: adaptive-statistical (ASIR) and model-based (MBIR). After co-registration of 284 cross-sections between IVUS and CTA, two readers manually delineated the cross-sectional plaque area in all images presented in random order. Average plaque burden by IVUS was 63.7 ± 10.7% and correlated significantly with all CTA measurements (r = 0.45-0.52; P < 0.001), while CTA overestimated the burden by 10 ± 10%. There were no significant differences among FBPR, ASIR and MBIR (P > 0.05). Increased overestimation was associated with smaller plaques, eccentricity and calcification (P < 0.001). Reproducibility of plaque burden by CTA and IVUS datasets was excellent with a low mean intra-/inter-reader variability of <1/<4% for CTA and <0.5/<1% for IVUS respectively (P < 0.05) with no significant difference between CT reconstruction algorithms (P > 0.05). In ex vivo coronary arteries, plaque burden by coronary CTA had extremely low inter-/intra-reader variability and correlated significantly with IVUS measurements. Accuracy as well as reader reliability were independent of CT image reconstruction algorithm. (orig.)

  15. Effects of benchmarking on the quality of type 2 diabetes care: results of the OPTIMISE (Optimal Type 2 Diabetes Management Including Benchmarking and Standard Treatment) study in Greece

    Tsimihodimos, Vasilis; Kostapanos, Michael S.; Moulis, Alexandros; Nikas, Nikos; Elisaf, Moses S.

    2015-01-01

    Objectives: To investigate the effect of benchmarking on the quality of type 2 diabetes (T2DM) care in Greece. Methods: The OPTIMISE (Optimal Type 2 Diabetes Management Including Benchmarking and Standard Treatment) study [ClinicalTrials.gov identifier: NCT00681850] was an international multicenter, prospective cohort study. It included physicians randomized 3:1 to either receive benchmarking for glycated hemoglobin (HbA1c), systolic blood pressure (SBP) and low-density lipoprotein cholesterol (LDL-C) treatment targets (benchmarking group) or not (control group). The proportions of patients achieving the targets for the above-mentioned parameters were compared between groups after 12 months of treatment. Also, the proportions of patients achieving those targets at 12 months were compared with baseline in the benchmarking group. Results: In the Greek region, the OPTIMISE study included 797 adults with T2DM (570 in the benchmarking group). At month 12 the proportion of patients within the predefined targets for SBP and LDL-C was greater in the benchmarking than in the control group (50.6 versus 35.8%, and 45.3 versus 36.1%, respectively). However, these differences were not statistically significant. No difference between groups was noted in the percentage of patients achieving the predefined target for HbA1c. At month 12 the increase in the percentage of patients achieving all three targets was greater in the benchmarking (5.9–15.0%) than in the control group (2.7–8.1%). In the benchmarking group more patients were on target regarding SBP (50.6% versus 29.8%), LDL-C (45.3% versus 31.3%) and HbA1c (63.8% versus 51.2%) at 12 months compared with baseline (p …). Benchmarking may comprise a promising tool for improving the quality of T2DM care. Nevertheless, target achievement rates of each, and of all three, quality indicators were suboptimal, indicating there are still unmet needs in the management of T2DM. PMID:26445642

  16. 12 YEARS OF X-RAY VARIABILITY IN M31 GLOBULAR CLUSTERS, INCLUDING 8 BLACK HOLE CANDIDATES, AS SEEN BY CHANDRA

    Barnard, R.; Garcia, M.; Murray, S. S.

    2012-01-01

    We examined 134 Chandra observations of the population of X-ray sources associated with globular clusters (GCs) in the central region of M31. These are expected to be X-ray binary systems (XBs), consisting of a neutron star or black hole accreting material from a close companion. We created long-term light curves for these sources, correcting for background, interstellar absorption, and instrumental effects. We tested for variability by examining the goodness of fit for the best-fit constant intensity. We also created structure functions (SFs) for every object in our sample, the first time this technique has been applied to XBs. We found significant variability in 28 out of 34 GCs and GC candidates; the other 6 sources had 0.3-10 keV luminosities fainter than ∼2 × 10³⁶ erg s⁻¹, limiting our ability to detect similar variability. The SFs of XBs with 0.3-10 keV luminosities ∼2–50 × 10³⁶ erg s⁻¹ generally showed considerably more variability than the published ensemble SF of active galactic nuclei (AGNs). Our brightest XBs were mostly consistent with the AGN SF; however, their 2-10 keV fluxes could be matched by <1 AGN per square degree. These encouraging results suggest that examining the long-term light curves of other X-ray sources in the field may provide an important distinction between X-ray binaries and background galaxies, as the X-ray emission spectra from these two classes of X-ray sources are similar. Additionally, we identify 3 new black hole candidates (BHCs) using additional XMM-Newton data, bringing the total number of M31 GC BHCs to 9, with 8 covered in this survey.
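A first-order structure function for an irregularly sampled light curve can be computed by binning all point pairs by time lag; a minimal sketch on synthetic data (the light curve below is made up, not Chandra data):

```python
import math
import random

random.seed(0)

# Irregularly sampled synthetic light curve: a slow sinusoid plus noise,
# mimicking sparse monitoring visits over ~11 years (times in days).
times = sorted(random.uniform(0, 4000) for _ in range(120))
flux = [1.0 + 0.3 * math.sin(2 * math.pi * t / 1000) + random.gauss(0, 0.02)
        for t in times]

def structure_function(t, f, bin_edges):
    """First-order SF: mean |f(ti)-f(tj)| over all pairs, binned by lag."""
    sums = [0.0] * (len(bin_edges) - 1)
    counts = [0] * (len(bin_edges) - 1)
    for i in range(len(t)):
        for j in range(i + 1, len(t)):
            lag = t[j] - t[i]
            for k in range(len(bin_edges) - 1):
                if bin_edges[k] <= lag < bin_edges[k + 1]:
                    sums[k] += abs(f[i] - f[j])
                    counts[k] += 1
                    break
    return [s / c if c else float("nan") for s, c in zip(sums, counts)]

sf = structure_function(times, flux, [0, 50, 200, 500, 1000])
# For this source the SF rises with lag, as variability power accumulates
# on timescales up to ~half the sinusoid period.
```

The attraction of the SF for sparse X-ray monitoring is exactly this pair-wise construction: it needs no evenly spaced sampling, unlike a periodogram.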

  17. Characterization of SiO2/SiC interface states and channel mobility from MOSFET characteristics including variable-range hopping at cryogenic temperature

    Hironori Yoshioka

    2018-04-01

    Full Text Available The characteristics of SiC MOSFETs (drain current vs. gate voltage) were measured at 0.14−350 K and analyzed considering variable-range hopping conduction through interface states. The total interface state density was determined to be 5.4×10¹² cm⁻² from the additional shift in the threshold gate voltage with a temperature change. The wave-function size of interface states was determined from the temperature dependence of the measured hopping current and was comparable to the theoretical value. The channel mobility was approximately 100 cm²V⁻¹s⁻¹ and was almost independent of temperature.

  18. Characterization of SiO2/SiC interface states and channel mobility from MOSFET characteristics including variable-range hopping at cryogenic temperature

    Yoshioka, Hironori; Hirata, Kazuto

    2018-04-01

    The characteristics of SiC MOSFETs (drain current vs. gate voltage) were measured at 0.14-350 K and analyzed considering variable-range hopping conduction through interface states. The total interface state density was determined to be 5.4×10¹² cm⁻² from the additional shift in the threshold gate voltage with a temperature change. The wave-function size of interface states was determined from the temperature dependence of the measured hopping current and was comparable to the theoretical value. The channel mobility was approximately 100 cm²V⁻¹s⁻¹ and was almost independent of temperature.

  19. Clinical variability of Waardenburg-Shah syndrome in patients with proximal 13q deletion syndrome including the endothelin-B receptor locus.

    Tüysüz, Beyhan; Collin, Anna; Arapoğlu, Müjde; Suyugül, Nezir

    2009-10-01

    Waardenburg-Shah syndrome (Waardenburg syndrome type IV-WS4) is an auditory-pigmentary disorder that combines clinical features of pigmentary abnormalities of the skin, hair and irides, sensorineural hearing loss, and Hirschsprung disease (HSCR). Mutations in the endothelin-B receptor (EDNRB) gene on 13q22 have been found to cause this syndrome. Mutations in both alleles cause the full phenotype, while heterozygous mutations cause isolated HSCR or HSCR with minor pigmentary anomalies and/or sensorineural deafness. We investigated the status of the EDNRB gene, by FISH analysis, in three patients with de novo proximal 13q deletions detected at cytogenetic analysis and examined the clinical variability of WS4 among these patients. Chromosome 13q was screened with locus specific FISH probes and breakpoints were determined at 13q22.1q31.3 in Patients 1 and 3, and at 13q21.1q31.3 in Patient 2. An EDNRB specific FISH probe was deleted in all three patients. All patients had common facial features seen in proximal 13q deletion syndrome and mild mental retardation. However, findings related to WS4 were variable; Patient 1 had hypopigmentation of the irides and HSCR, Patient 2 had prominent bicolored irides and mild bilateral hearing loss, and Patient 3 had only mild unilateral hearing loss. These data contribute new insights into the pathogenesis of WS4.

  20. Including climate variability in determination of the optimum rate of N fertilizer application using a crop model: A case study for rainfed corn in eastern Canada

    Mesbah, M.; Pattey, E.; Jégo, G.; Geng, X.; Tremblay, N.; Didier, A.

    2017-12-01

    Identifying the optimum nitrogen (N) application rate is essential for increasing agricultural production while limiting potential environmental contamination caused by the release of reactive N, especially for crops with high N demand such as corn. The central question of N management is then how the optimum N rate is affected by climate variability for a given soil. The experimental determination of optimum N rates involves analyses of variance on the mean value of crop yield response to the various N application rates used in factorial plot-based experiments over a few years in several regions. This traditional approach has limitations in capturing 1) the non-linear response of yield to N application rates, due to large incremental N rates (often more than 40 kg N ha⁻¹), and 2) the ecophysiological response of the crop to climate variability, because of the limited number of growing seasons considered. Modeling, on the other hand, does not have such limitations, and hence we use a crop model and propose a model-based methodology called Finding NEMO (N Ecophysiologically Modelled Optimum) to identify the optimum N rates for variable agro-climatic conditions and given soil properties. The performance of the methodology is illustrated using the STICS crop model adapted for rainfed corn in the Mixedwood Plains ecozone of eastern Canada (42.3°N 83°W to 46.8°N 71°W), where more than 90% of Canadian corn is produced. The simulations were performed using a small increment of preplant N application rate (10 kg N ha⁻¹), long time series of daily climatic data (48 to 61 years) for 5 regions along the ecozone, and three contrasting soils per region. The results show that N recommendations should be region and soil specific. Soils with lower available water capacity required more N compared to soils with higher available water capacity. When N rates were at their ecophysiologically optimum level, a 10 to 17 kg increase in dry yield could be achieved by adding 1 kg N. Expected yield also affected the optimum …
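The search over 10 kg N ha⁻¹ increments can be sketched with a generic quadratic-plateau yield response; the response parameters and the 12 kg-yield-per-kg-N stopping threshold below are illustrative assumptions, not STICS output:

```python
# Hypothetical quadratic-plateau yield response to preplant N
# (kg dry yield per ha); parameters are illustrative only.
def yield_response(n_rate, ymax=11000.0, n_opt=160.0):
    if n_rate >= n_opt:
        return ymax                      # yield plateaus beyond n_opt
    x = n_rate / n_opt
    return ymax * (2 * x - x * x)        # quadratic rise to the plateau

# Scan N in 10 kg/ha increments, as in the study, and stop when the
# marginal gain drops below a chosen threshold (kg yield per kg N).
def optimum_n(threshold=12.0, step=10.0):
    n = 0.0
    while True:
        marginal = (yield_response(n + step) - yield_response(n)) / step
        if marginal < threshold:
            return n
        n += step

n_star = optimum_n()   # 150.0 kg N/ha for these illustrative parameters
```

The fine 10 kg increment is what makes this stopping rule meaningful; with the 40+ kg steps of traditional field trials, the marginal-gain curve is too coarse to locate the optimum.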

  1. Delayed Gadolinium-Enhanced MRI of Cartilage (dGEMRIC): Intra- and Interobserver Variability in Standardized Drawing of Regions of Interest

    Tiderius, C.J.; Tjoernstrand, J.; Aakeson, P.; Soedersten, K.; Dahlberg, L.; Leander, P.

    2004-01-01

    Purpose: To establish the reproducibility of a standardized region of interest (ROI) drawing procedure in delayed gadolinium-enhanced magnetic resonance imaging (MRI) of cartilage (dGEMRIC). Material and Methods: A large ROI in lateral and medial femoral weight-bearing cartilage was drawn in images of 12 healthy male volunteers by 6 investigators with different skills in MRI. The procedure was done twice, with a 1-week interval. Calculated T1-values were evaluated for intra- and interobserver variability. Results: The mean interobserver variability for both compartments ranged between 1.3% and 2.3% for the 6 different investigators without correlation to their experience in MRI. Post-contrast intra-observer variability was low in both the lateral and the medial femoral cartilage, 2.6% and 1.5%, respectively. The larger variability in lateral than in medial cartilage was related to slightly longer and thinner ROIs. Conclusion: Intra-observer variability and interobserver variability are both low when a large standardized ROI is used in dGEMRIC. The experience of the investigator does not affect the variability, which further supports a clinical applicability of the method
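Intra- and interobserver variability of this kind is typically reported as a coefficient of variation (CV); a minimal sketch with hypothetical T1 values, not the study's data:

```python
import statistics

# Hypothetical T1 values (ms) for one volunteer's lateral ROI: 6
# investigators, each drawing the ROI twice a week apart (made-up numbers).
t1 = {
    "inv1": (402.0, 395.0), "inv2": (410.0, 405.0), "inv3": (398.0, 404.0),
    "inv4": (407.0, 399.0), "inv5": (401.0, 408.0), "inv6": (396.0, 402.0),
}

def cv(values):
    """Coefficient of variation in percent."""
    return statistics.stdev(values) / statistics.mean(values) * 100

# Intra-observer variability: CV of each investigator's repeats, averaged.
intra_cv = statistics.mean(cv(pair) for pair in t1.values())

# Interobserver variability: CV across the investigators' mean T1 values.
inter_cv = cv([statistics.mean(pair) for pair in t1.values()])
```

Values of 1-3%, as in the record, indicate that a large standardized ROI makes the T1 readout essentially investigator-independent.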

  2. A 1D constitutive model for shape memory alloy using strain and temperature as control variables and including martensite reorientation and asymmetric behaviors

    Jaber, M Ben; Mehrez, S; Ghazouani, O

    2014-01-01

    In this paper, a new 1D constitutive model for shape memory alloy using strain and temperature as control variables is presented. The new formulation is restricted to the 1D stress case and takes into account the martensite reorientation and the asymmetry of the SMA behavior in tension and compression. Numerical implementation of the new model in a finite element code was conducted. The numerical results for superelastic behavior in tension and compression tests are presented and were compared to experimental data taken from the literature. Other numerical tests are presented, showing the model’s ability to reproduce the main aspects of SMA behavior such as the shape memory effect and the martensite reorientation under cyclic loading. Finally, to demonstrate the utility of the new constitutive model, a dynamic test of a bi-clamped SMA bending beam under forced oscillation is described. (paper)

  3. Internal state variable plasticity-damage modeling of AISI 4140 steel including microstructure-property relations: temperature and strain rate effects

    Nacif el Alaoui, Reda

    Mechanical structure-property relations have been quantified for AISI 4140 steel under different strain rates and temperatures. The structure-property relations were used to calibrate a microstructure-based internal state variable plasticity-damage model for monotonic tension, compression and torsion plasticity, as well as damage evolution. Strong stress-state and temperature dependences were observed for the AISI 4140 steel. Tension tests on three different notched Bridgman specimens were undertaken to study the damage-triaxiality dependence for model validation purposes. Fracture surface analysis was performed using Scanning Electron Microscopy (SEM) to quantify the void nucleation and void sizes in the different specimens. The stress-strain behavior exhibited a fairly large stress-state dependence (tension, compression, and torsion), a moderate temperature dependence, and a relatively small strain rate dependence.

  4. Use of CTX-I and PINP as bone turnover markers: National Bone Health Alliance recommendations to standardize sample handling and patient preparation to reduce pre-analytical variability.

    Szulc, P; Naylor, K; Hoyle, N R; Eastell, R; Leary, E T

    2017-09-01

    The National Bone Health Alliance (NBHA) recommends standardized sample handling and patient preparation for C-terminal telopeptide of type I collagen (CTX-I) and N-terminal propeptide of type I procollagen (PINP) measurements to reduce pre-analytical variability. Controllable and uncontrollable patient-related factors are reviewed to facilitate interpretation and minimize pre-analytical variability. The IOF and the International Federation of Clinical Chemistry (IFCC) Bone Marker Standards Working Group have identified PINP and CTX-I in blood to be the reference markers of bone turnover for fracture risk prediction and the monitoring of osteoporosis treatment. Although used in clinical research for many years, bone turnover markers (BTM) have not been widely adopted in clinical practice, primarily due to their poor within-subject and between-lab reproducibility. The NBHA Bone Turnover Marker Project team aims to reduce pre-analytical variability of CTX-I and PINP measurements through standardized sample handling and patient preparation. Recommendations for sample handling and patient preparation were made based on a review of available publications and pragmatic considerations to reduce pre-analytical variability. Controllable and uncontrollable patient-related factors were reviewed to facilitate interpretation and sample collection. Samples for CTX-I must be collected consistently in the morning hours in the fasted state. EDTA plasma is preferred for CTX-I for its greater sample stability. Sample collection conditions for PINP are less critical as PINP has minimal circadian variability and is not affected by food intake. Sample stability limits should be observed. The uncontrollable aspects (age, sex, pregnancy, immobility, recent fracture, co-morbidities, anti-osteoporotic drugs, other medications) should be considered in BTM interpretation. Adopting standardized sample handling and patient preparation procedures will significantly reduce controllable pre-analytical variability.

  5. Optimization of Standard In-House 24-Locus Variable-Number Tandem-Repeat Typing for Mycobacterium tuberculosis and Its Direct Application to Clinical Material

    de Beer, Jessica L.; Akkerman, Onno W.; Schurch, Anita C.; Mulder, Arnout; van der Werf, Tjip S.; van der Zanden, Adri G. M.; van Ingen, Jakko; van Soolingen, Dick

    Variable-number tandem-repeat (VNTR) typing with a panel of 24 loci is the current gold standard in the molecular typing of Mycobacterium tuberculosis complex isolates. However, because of technical problems, some loci often cannot be amplified by the multiplex PCRs. Therefore, a considerable …

  6. LB02.03: EVALUATION OF DAY-BY-DAY BLOOD PRESSURE VARIABILITY IN CLINIC (DO WE STILL NEED STANDARD DEVIATION?).

    Ryuzaki, M; Nakamoto, H; Hosoya, K; Komatsu, M; Hibino, Y

    2015-06-01

    Blood pressure (BP) variability correlates with cardiovascular disease, as does the BP level itself. There is no known easy way to evaluate BP variability in the clinic. We evaluated the usefulness of the maximum-minimum difference (MMD) of BP in a month, compared to the standard deviation (SD), as an index of BP variability. Study-1: Twelve patients (age 65.9 ± 12.1 y/o) were enrolled. Measurements of home systolic BP (SBP) were required in the morning. Twelve consecutive months of data, with at least 3 measurements a month, were required for inclusion (mean 29.0 ± 4.5 measurements/month in the morning). We checked the correlation between MMD and SD. Study-2: Six hemodialyzed patients monitored with the i-TECHO system (J of Hypertens 2007: 25: 2353-2358) for longer than one year were analyzed (17.4 ± 11.9 measurements per month). As in Study-1, we analyzed the correlation between SD and MMD of SBP. Study-3: The data from our previous study (FUJIYAM study, Clin. Exp Hypertens 2014: 36:508-16) were extracted. 1524 patient-month morning BP data were calculated as in Study-1. Selecting data with more than 24 measurements a month, 517 patient-month BP data were analyzed. For 5, 10, 15, and 20 measurements per month, we compared SD and MMD with the values obtained from 25 measurements, as ratios. Study-1: For SBP, MMD correlated very well with SD (p …) when measured more than 2 times. If data were extracted (measurements >24 times), the correlation was 0.927 (P < 0.0001). The regression equation was SBP SD = 1.520 + 0.201 × MMD. The ratios of SD to the 25-measurement value were as follows: 0.956 at 5 measurements, 0.956 at 10, 0.979 at 15, and 0.991 at 20. The ratios of MMD to the 25-measurement value were 0.558 at 5, 0.761 at 10, 0.874 at 15, and 0.944 at 20. We can easily estimate SD by measuring MMD as an index of day-by-day BP variability over a month. The regression equations were very similar even though the patient groups were different. However, care must be taken regarding how many times patients measure in a month.
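The proposed shortcut can be checked on a month of readings: compute SD directly, then estimate it from MMD with the abstract's regression; the readings below are hypothetical:

```python
import statistics

# One month of hypothetical morning home SBP readings (mmHg).
sbp = [128, 134, 131, 140, 126, 133, 137, 129, 135, 142,
       130, 127, 138, 132, 136, 129, 141, 133, 128, 134,
       139, 131, 127, 135, 130]

sd = statistics.stdev(sbp)
mmd = max(sbp) - min(sbp)        # maximum-minimum difference for the month

# Regression reported in the abstract for morning SBP:
# SD = 1.520 + 0.201 * MMD.
sd_estimated = 1.520 + 0.201 * mmd
```

For these readings MMD is 16 mmHg, so the regression predicts an SD of about 4.7 mmHg, close to the directly computed value; the clinical appeal is that MMD needs only the month's highest and lowest readings.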

  7. Characterization of Genotoxic Response to 15 Multiwalled Carbon Nanotubes with Variable Physicochemical Properties Including Surface Functionalizations in the FE1-Muta(TM) Mouse Lung Epithelial Cell Line

    Jackson, Petra; Kling, Kirsten; Jensen, Keld Alstrup

    2015-01-01

    Carbon nanotubes vary greatly in physicochemical properties. We compared the cytotoxic and genotoxic responses to 15 multiwalled carbon nanotubes (MWCNT) with varying physicochemical properties to identify the drivers of toxic responses. The studied MWCNT included OECD Working Party on Manufactured Nanomaterials (WPMN) materials (NM-401, NM-402, and NM-403), materials NRCWE-026 and MWCNT-XNRI-7, and three sets of surface-modified MWCNT grouped by physical characteristics (thin, thick, and short; Groups I-III, respectively). Groups I-III each included pristine, hydroxylated, and carboxylated MWCNT; Group III also included an amino-functionalized MWCNT. The level of surface functionalization of the MWCNT was low. The level and type of elemental impurities of the MWCNT varied by...

  8. The use of a xylosylated plant glycoprotein as an internal standard accounting for N-linked glycan cleavage and sample preparation variability.

    Walker, S Hunter; Taylor, Amber D; Muddiman, David C

    2013-06-30

    Traditionally, free oligosaccharide internal standards are used to account for variability in glycan relative quantification experiments by mass spectrometry. However, a more suitable internal standard would be a glycoprotein, which could also control for enzymatic cleavage efficiency, allowing for more accurate quantitative experiments. Hydrophobic, hydrazide N-linked glycan reagents (both native and stable-isotope labeled) are used to derivatize and differentially label N-linked glycan samples for relative quantification, and the samples are analyzed by a reversed-phase liquid chromatography chip system coupled online to a Q-Exactive mass spectrometer. The inclusion of two internal standards, maltoheptaose (previously used) and horseradish peroxidase (HRP) (novel), is studied to demonstrate the effectiveness of using a glycoprotein as an internal standard in glycan relative quantification experiments. HRP is a glycoprotein containing a xylosylated N-linked glycan, which is unique from mammalian N-linked glycans. Thus, the internal standard xylosylated glycan could be detected without interference to the sample. Additionally, it was shown that differences in cleavage efficiency can be detected by monitoring the HRP glycan. In a sample where cleavage efficiency variation is minimal, the HRP glycan performs as well as maltoheptaose. Because the HRP glycan performs as well as maltoheptaose but is also capable of correcting and accounting for cleavage variability, it is a more versatile internal standard and will be used in all subsequent biological studies. Because of the possible lot-to-lot variation of an enzyme, differences in biological matrix, and variable enzyme activity over time, it is a necessity to account for glycan cleavage variability in glycan relative quantification experiments. Copyright © 2013 John Wiley & Sons, Ltd.

  9. Mercury Concentrations in Fish and Sediment within Streams are Influenced by Watershed and Landscape Variables including Historical Gold Mining in the Sierra Nevada, California

    Alpers, C. N.; Yee, J. L.; Ackerman, J. T.; Orlando, J. L.; Slotton, D. G.; Marvin-DiPasquale, M. C.

    2015-12-01

    We compiled available data on total mercury (THg) and methylmercury (MeHg) concentrations in fish tissue and streambed sediment from stream sites in the Sierra Nevada, California, to assess whether spatial data, including information on historical mining, can be used to make robust predictions of fish fillet THg concentrations. A total of 1,271 fish from five species collected at 103 sites during 1980-2012 were used for the modeling effort: 210 brown trout, 710 rainbow trout, 79 Sacramento pikeminnow, 93 Sacramento sucker, and 179 smallmouth bass. Sediment data were used from 73 sites, including 106 analyses of THg and 77 analyses of MeHg. The dataset included 391 fish (mostly rainbow trout) and 28 sediment samples collected explicitly for this study during 2011-12. Spatial data on historical mining included the USGS Mineral Resources Data System and publicly available maps and satellite photos showing the areas of hydraulic mine pits and other placer mines. Modeling was done using multivariate linear regression and multi-model inference based on the Akaike Information Criterion. Results indicate that fish THg, accounting for species and length, can be predicted using geospatial data on mining history together with other landscape characteristics, including land use/land cover. A model requiring only geospatial data, with an R2 value of 0.61, predicted fish THg correctly with respect to being over or under 0.2 μg/g wet weight (a California regulatory threshold) for 108 of 121 (89%) size-species combinations tested. Data for THg in streambed sediment did not improve the geospatial-only model. However, data for sediment MeHg, loss on ignition (organic content), and the percentage of sediment finer than 0.063 mm resulted in a slightly improved model, with an R2 value of 0.63. It is anticipated that these models will be useful to the State of California and others to predict areas where mercury concentrations in fish are likely to exceed regulatory criteria.
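The model-selection step, ranking candidate linear regressions by AIC, can be sketched as follows; the covariates and data here are synthetic stand-ins, not the study's geospatial predictors:

```python
import numpy as np

def ols_aic(y, X):
    """AIC of an ordinary least-squares fit under Gaussian errors:
    AIC = n*ln(RSS/n) + 2k, with k = regression columns + 1 (error variance)."""
    n = len(y)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = float(np.sum((y - X @ beta) ** 2))
    return n * np.log(rss / n) + 2 * (X.shape[1] + 1)

rng = np.random.default_rng(0)
n = 200
mining = rng.normal(size=n)      # hypothetical mining-intensity covariate
landcover = rng.normal(size=n)   # hypothetical land-cover covariate
log_thg = 0.5 + 0.8 * mining + 0.4 * landcover + rng.normal(scale=0.3, size=n)

ones = np.ones(n)
aic_without = ols_aic(log_thg, np.column_stack([ones, landcover]))
aic_with = ols_aic(log_thg, np.column_stack([ones, landcover, mining]))
# The candidate including the informative mining covariate gets the lower
# (better) AIC, so multi-model inference would favor it.
```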

  10. Use of a non-linear method for including the mass uncertainty of gravimetric standards and system measurement errors in the fitting of calibration curves for XRFA freeze-dried UNO3 standards

    Pickles, W.L.; McClure, J.W.; Howell, R.H.

    1978-05-01

    A sophisticated nonlinear multiparameter fitting program was used to produce a best-fit calibration curve for the response of an x-ray fluorescence analyzer to freeze-dried uranium nitrate gravimetric standards accurate to 0.2%. The program is based on the unconstrained minimization subroutine VA02A. It treats the mass values of the gravimetric standards as parameters to be fit along with the normal calibration curve parameters, and the fitting procedure weights the system errors and the mass errors in a consistent way. The resulting best-fit calibration curve parameters reflect the fact that the masses of the standard samples are measured quantities with a known error. Error estimates for the calibration curve parameters can be obtained from the curvature of the "chi-squared matrix" or from error-relaxation techniques. It was shown that nondispersive XRFA of 0.1 to 1 mg of freeze-dried UNO3 can achieve an accuracy of 0.2% in 1000 s.
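The fitting idea, treating the standard masses themselves as weighted fit parameters alongside the calibration coefficients, can be sketched as follows. This is a minimal errors-in-variables example with synthetic data, using SciPy's generic least_squares in place of VA02A; all names and numbers are illustrative assumptions:

```python
import numpy as np
from scipy.optimize import least_squares

# Synthetic calibration: response = a + b * mass, with both the analyzer
# responses and the gravimetric standard masses carrying known errors.
rng = np.random.default_rng(1)
true_a, true_b = 0.5, 2.0
mass_true = np.linspace(0.1, 1.0, 8)          # mg
sigma_m = 0.002 * mass_true                   # 0.2% mass accuracy
sigma_y = 0.01 * np.ones_like(mass_true)      # system (response) error
mass_obs = mass_true + rng.normal(scale=sigma_m)
y_obs = true_a + true_b * mass_true + rng.normal(scale=sigma_y)

def residuals(p):
    """The calibration parameters AND the standard masses are all free
    parameters; each residual is weighted by its own known error."""
    a, b = p[0], p[1]
    m = p[2:]
    return np.concatenate([(y_obs - (a + b * m)) / sigma_y,
                           (mass_obs - m) / sigma_m])

p0 = np.concatenate([[0.0, 1.0], mass_obs])
fit = least_squares(residuals, p0)
a_fit, b_fit = fit.x[0], fit.x[1]
```

Because the mass residuals are divided by their own uncertainties, the fit pulls each standard's mass only as far as its 0.2% accuracy warrants, which is the consistent weighting the abstract describes.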

  11. A novel synthetic quantification standard including virus and internal report targets: application for the detection and quantification of emerging begomoviruses on tomato

    Péréfarres, Frédéric; Hoareau, Murielle; Chiroleu, Frédéric; Reynaud, Bernard; Dintinger, Jacques; Lett, Jean-Michel

    2011-01-01

    Background: Begomovirus is a genus of phytopathogenic single-stranded DNA viruses, transmitted by the whitefly Bemisia tabaci. This genus includes emerging and economically significant viruses, such as those associated with Tomato Yellow Leaf Curl Disease, for which diagnostic tools are needed to prevent dispersion and new introductions. Five real-time PCRs with an internal tomato reporter gene were developed for accurate detection and quantification of monopartite begomoviruses, inclu...

  12. Optimal Selective Harmonic Mitigation Technique on Variable DC Link Cascaded H-Bridge Converter to Meet Power Quality Standards

    Najjar, Mohammad; Moeini, Amirhossein; Dowlatabadi, Mohammadkazem Bakhshizadeh

    2016-01-01

    In this paper, power quality standards such as IEC 61000-3-6, IEC 61000-2-12, EN 50160, and CIGRE WG 36-05 are fulfilled for single- and three-phase medium-voltage applications by using Selective Harmonic Mitigation-PWM (SHM-PWM) in a Cascaded H-Bridge (CHB) converter. Furthermore, the ER G5/...

  13. Preface of "The Second Symposium on Border Zones Between Experimental and Numerical Application Including Solution Approaches By Extensions of Standard Numerical Methods"

    Ortleb, Sigrun; Seidel, Christian

    2017-07-01

    In this second symposium at the limits of experimental and numerical methods, recent research is presented on practically relevant problems. Presentations discuss experimental investigation as well as numerical methods with a strong focus on application. In addition, problems are identified which require a hybrid experimental-numerical approach. Topics include fast explicit diffusion applied to a geothermal energy storage tank, noise in experimental measurements of electrical quantities, thermal fluid structure interaction, tensegrity structures, experimental and numerical methods for Chladni figures, optimized construction of hydroelectric power stations, experimental and numerical limits in the investigation of rain-wind induced vibrations as well as the application of exponential integrators in a domain-based IMEX setting.

  14. Variability in recording and scoring of respiratory events during sleep in Europe: a need for uniform standards.

    Arnardottir, Erna S; Verbraecken, Johan; Gonçalves, Marta; Gjerstad, Michaela D; Grote, Ludger; Puertas, Francisco Javier; Mihaicuta, Stefan; McNicholas, Walter T; Parrino, Liborio

    2016-04-01

    Uniform standards for the recording and scoring of respiratory events during sleep are lacking in Europe, although many centres follow the published recommendations of the American Academy of Sleep Medicine. The aim of this study was to assess the practice for the diagnosis of sleep-disordered breathing throughout Europe. A specially developed questionnaire was sent to representatives of the 31 national sleep societies in the Assembly of National Sleep Societies of the European Sleep Research Society, and a total of 29 countries completed the questionnaire. Polysomnography was considered the primary diagnostic method for sleep apnea diagnosis in 10 (34.5%), whereas polygraphy was used primarily in six (20.7%) European countries. In the remaining 13 countries (44.8%), no preferred methodology was used. Fifteen countries (51.7%) had developed some type of national uniform standards, but these standards varied significantly in terms of scoring criteria, device specifications and quality assurance procedures between countries. Only five countries (17.2%) had published these standards. Most respondents supported the development of uniform recording and scoring criteria for Europe, which might be based partly on the existing American Academy of Sleep Medicine rules, but also take into account differences in European practice when compared to North America. This survey highlights the current varying approaches to the assessment of patients with sleep-disordered breathing throughout Europe and supports the need for the development of practice parameters in the assessment of such patients that would be suited to European clinical practice. © 2015 European Sleep Research Society.

  15. Standardized FDG uptake as a prognostic variable and as a predictor of incomplete cytoreduction in primary advanced ovarian cancer

    Risum, Signe; Jakobsen, Annika Loft; Høgdall, Claus

    2011-01-01

    Introduction: In patients with advanced ovarian cancer undergoing preoperative PET/CT, we investigated the prognostic value of SUV in the primary tumor and evaluated the value of SUV for predicting incomplete primary cytoreduction (macroscopic residual tumor). Material and methods: From... debulking (no macroscopic residual tumor); median SUV(max) was 13.5 (range 2.5-39.0). Median follow-up was 30.2 months. At follow-up, 57% (34/60) of patients were alive and 43% (26/60) had died from ovarian cancer. SUV(max) in patients who were alive was not statistically different from SUV(max) in those who had died (p = 0.69), and SUV(max) was not correlated with the amount of residual tumor after surgery (p = 0.19). Using univariate Cox regression analysis, residual tumor was a significant prognostic variable (p = 0.001); SUV(max) was not a statistically significant prognostic variable (p = 0.86). Discussion: FDG uptake (SUV...

  16. Cytomegalovirus sequence variability, amplicon length, and DNase-sensitive non-encapsidated genomes are obstacles to standardization and commutability of plasma viral load results.

    Naegele, Klaudia; Lautenschlager, Irmeli; Gosert, Rainer; Loginov, Raisa; Bir, Katia; Helanterä, Ilkka; Schaub, Stefan; Khanna, Nina; Hirsch, Hans H

    2018-04-22

    Cytomegalovirus (CMV) management post-transplantation relies on quantification in blood, but inter-laboratory and inter-assay variability impairs commutability. An international multicenter study demonstrated that variability is mitigated by standardizing plasma volumes, automating DNA extraction and amplification, and calibrating to the 1st CMV WHO International Standard, as in the FDA-approved Roche-CAP/CTM-CMV. However, Roche-CAP/CTM-CMV showed under-quantification and false-negative results in a quality assurance program (UK-NEQAS-2014). Our aims were to evaluate factors contributing to quantification variability of CMV viral load and to develop optimized CMV-UL54-QNAT assays. The UL54 target of the UK-NEQAS-2014 variant was sequenced and compared to 329 available CMV GenBank sequences. Four Basel-CMV-UL54-QNAT assays with 361 bp, 254 bp, 151 bp, and 95 bp amplicons were developed that differed only in reverse primer positions. The assays were validated using plasmid dilutions, the UK-NEQAS-2014 sample, and 107 frozen and 69 prospectively collected plasma samples from transplant patients submitted for CMV QNAT, with and without DNase digestion prior to nucleic acid extraction. Eight of 43 mutations were identified as relevant in the UK-NEQAS-2014 target. All Basel-CMV-UL54 QNATs quantified the UK-NEQAS-2014 sample but revealed 10-fold increasing CMV loads as amplicon size decreased. The inverse correlation of amplicon size and viral load was confirmed using the 1st WHO International Standard and patient samples. DNase pre-treatment reduced plasma CMV loads by >90%, indicating the presence of unprotected CMV genomic DNA. Sequence variability, amplicon length, and non-encapsidated genomes obstruct the standardization and commutability of CMV loads needed to develop thresholds for clinical research and management. Besides regular sequence surveys and standardization of matrix and extraction, we propose developing reference calibrators using 100 bp amplicons. Copyright © 2018 Elsevier B.V. All rights reserved.

  17. A composite model including visfatin, tissue polypeptide-specific antigen, hyaluronic acid, and hematological variables for the diagnosis of moderate-to-severe fibrosis in nonalcoholic fatty liver disease: a preliminary study.

    Chwist, Alina; Hartleb, Marek; Lekstan, Andrzej; Kukla, Michał; Gutkowski, Krzysztof; Kajor, Maciej

    2014-01-01

    Histopathological risk factors for end-stage liver failure in patients with nonalcoholic fatty liver disease (NAFLD) include nonalcoholic steatohepatitis (NASH) and advanced liver fibrosis, and there is a need for noninvasive diagnostic methods for these 2 conditions. The aim of this study was to investigate new laboratory variables with a predictive potential to detect advanced fibrosis (stages 2 and 3) in NAFLD. The study involved 70 patients with histologically proven NAFLD of varied severity. Additional laboratory variables included zonulin, haptoglobin, visfatin, adiponectin, leptin, tissue polypeptide-specific antigen (TPSA), hyaluronic acid, and interleukin 6. Patients with NASH (NAFLD activity score of ≥5) had significantly higher HOMA-IR values and serum levels of visfatin, haptoglobin, and zonulin than those without NASH on histological examination. Advanced fibrosis was found in 16 patients (22.9%), and the risk factors associated with its prevalence were age, the ratio of erythrocyte count to red blood cell distribution width, platelet count, and serum levels of visfatin and TPSA. Based on these variables, we constructed a scoring system that differentiated between NAFLD patients with and without advanced fibrosis with a sensitivity of 75% and a specificity of 100% (area under the receiver operating characteristic curve, 0.93). The scoring system based on the above variables allows advanced fibrosis to be predicted with high sensitivity and specificity; however, its clinical utility should be verified in further studies involving a larger number of patients.

  18. Introducing a true internal standard for the Comet assay to minimize intra- and inter-experiment variability in measures of DNA damage and repair

    Zainol, Murizal; Stoute, Julia; Almeida, Gabriela M.; Rapp, Alexander; Bowman, Karen J.; Jones, George D. D.

    2009-01-01

    The Comet assay (CA) is a sensitive/simple measure of genotoxicity. However, many features of CA contribute variability. To minimize these, we have introduced internal standard materials consisting of ‘reference’ cells which have their DNA substituted with BrdU. Using a fluorescent anti-BrdU antibody, plus an additional barrier filter, comets derived from these cells could be readily distinguished from the ‘test’-cell comets, present in the same gel. In experiments to evaluate the reference cell comets as external and internal standards, the reference and test cells were present in separate gels on the same slide or mixed together in the same gel, respectively, before their co-exposure to X-irradiation. Using the reference cell comets as internal standards led to substantial reductions in the coefficient of variation (CoV) for intra- and inter-experimental measures of comet formation and DNA damage repair; only minor reductions in CoV were noted when the reference and test cell comets were in separate gels. These studies indicate that differences between individual gels appreciably contribute to CA variation. Further studies using the reference cells as internal standards allowed greater significance to be obtained between groups of replicate samples. Ultimately, we anticipate that development will deliver robust quality assurance materials for CA. PMID:19828597

  19. Discrete wavelet transform-based investigation into the variability of standardized precipitation index in Northwest China during 1960-2014

    Yang, Peng; Xia, Jun; Zhan, Chesheng; Zhang, Yongyong; Hu, Sheng

    2018-04-01

    In this study, temporal variations of the standardized precipitation index (SPI) were analyzed at different scales in Northwest China (NWC). Discrete wavelet transform (DWT) was used in conjunction with the Mann-Kendall (MK) test. This study also investigated the relationships between original precipitation and different periodic components of the SPI series, with datasets spanning 55 years (1960-2014). The results showed that, with the exception of the annual and summer SPI in the Inner Mongolia Inland Rivers Basin (IMIRB), the spring SPI in the Qinghai Lake Rivers Basin (QLRB), and the spring SPI in the Central Asia Rivers Basin (CARB), the SPI showed an increasing trend in the other regions and time series. In the spring, summer, and autumn series, although the MK trend test was at an insignificant level in most areas, precipitation showed an increasing trend. Meanwhile, the SPI series in most subbasins of NWC displayed a turning point in 1980-1990, with significant increases after 2000. Additionally, there was a significant difference between the trend of the original SPI series and that of the largest approximations. The annual and seasonal SPI series were composed of short periodicities of less than a decade. The MK value increased when multiple detail (D) components (and approximations) were added, and the MK value of the combined series was in harmony with that of the original series. Additionally, the major trend of the annual SPI in NWC was related to four climate indices (the Arctic Oscillation [AO], North Atlantic Oscillation [NAO], Pacific Decadal Oscillation [PDO], and El Niño-Southern Oscillation [ENSO/NINO]), especially ENSO.
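The two ingredients of the analysis, a discrete wavelet decomposition and the Mann-Kendall trend statistic, can be sketched as follows; this uses a single-level Haar transform and an invented SPI-like series, not the study's data or its particular wavelet:

```python
import numpy as np

def haar_dwt(x):
    """One level of a Haar discrete wavelet transform: returns the
    approximation (smoothed trend) and detail (short-period) parts."""
    x = np.asarray(x, dtype=float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2.0)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2.0)
    return approx, detail

def mann_kendall_s(x):
    """Mann-Kendall S statistic: the sum of signs of all pairwise
    differences; positive S indicates an upward trend."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    return sum(np.sign(x[j] - x[i]) for i in range(n) for j in range(i + 1, n))

# Invented SPI-like monthly series: a slow wetting trend plus a short cycle.
t = np.arange(48)
spi = 0.05 * t + 0.3 * np.sin(2.0 * np.pi * t / 6.0)

approx, detail = haar_dwt(spi)
s_original = mann_kendall_s(spi)
s_approx = mann_kendall_s(approx)  # the trend survives in the approximation
```

Because the Haar step is orthonormal, the approximation and detail coefficients partition the energy of the series; applying `haar_dwt` repeatedly to `approx` would give the multi-level D components combined in the study.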

  20. Standardised Radon Index (SRI): a normalisation of radon data-sets in terms of standard normal variables

    R. G. M. Crockett

    2011-07-01

    During the second half of 2002, from late June to mid-December, the University of Northampton Radon Research Group operated two continuous hourly-sampling radon detectors 2.25 km apart in the English East Midlands. This period included the Dudley earthquake (ML = 5, 22 September 2002) and also a smaller earthquake in the English Channel (ML = 3, 26 August 2002). Rolling/sliding windowed cross-correlation of the paired radon time-series revealed periods of simultaneous similar radon anomalies which occurred at the times of these earthquakes but at no other times during the overall radon monitoring period. Standardising the radon data in terms of probability of magnitude, analogous to the Standardised Precipitation Indices (SPIs) used in drought modelling, effectively equalises different non-linear responses and reveals that the dissimilar relative magnitudes of the anomalies are in fact closely equiprobabilistic. Such methods could help in identifying anomalous signals in radon and other time-series and in evaluating their statistical significance in terms of earthquake precursory behaviour.
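The standardisation idea, mapping each observation to a standard normal variable through its empirical probability, can be sketched as follows. This is a rank-based analogue with invented counts; a real SRI would fit a distribution to the radon data rather than use plotting positions:

```python
from statistics import NormalDist

def standard_index(values):
    """Map each value to a standard normal variable through its empirical
    non-exceedance probability (Weibull plotting position rank/(n+1))."""
    n = len(values)
    order = sorted(range(n), key=lambda i: values[i])
    nd = NormalDist()
    z = [0.0] * n
    for rank, i in enumerate(order, start=1):
        z[i] = nd.inv_cdf(rank / (n + 1))
    return z

radon = [12.0, 15.0, 9.0, 30.0, 11.0, 14.0, 10.0, 80.0]  # invented counts
sri = standard_index(radon)  # 80 maps to the largest z, 9 to the smallest
```

Because both detectors' series end up on the same standard normal scale, anomalies of very different raw magnitude can be compared directly as probabilities, which is the equalisation the abstract describes.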

  1. Uganda; Financial System Stability Assessment, including Reports on the Observance of Standards and Codes on the following topics: Monetary and Financial Policy Transparency, Banking Supervision, Securities Regulation, and Payment Systems

    International Monetary Fund

    2003-01-01

    This paper presents findings of Uganda’s Financial System Stability Assessment, including Reports on the Observance of Standards and Codes on Monetary and Financial Policy Transparency, Banking Supervision, Securities Regulation, Insurance Regulation, Corporate Governance, and Payment Systems. The banking system in Uganda, which dominates the financial system, is fundamentally sound, more resilient than in the past, and currently poses no threat to macroeconomic stability. A major disruption ...

  2. Effects of central nervous system drugs on driving: speed variability versus standard deviation of lateral position as outcome measure of the on-the-road driving test.

    Verster, Joris C; Roth, Thomas

    2014-01-01

    The on-the-road driving test in normal traffic is used to examine the impact of drugs on driving performance. This paper compares the sensitivity of standard deviation of lateral position (SDLP) and SD speed in detecting driving impairment. A literature search was conducted to identify studies applying the on-the-road driving test, examining the effects of anxiolytics, antidepressants, antihistamines, and hypnotics. The proportion of comparisons (treatment versus placebo) where a significant impairment was detected with SDLP and SD speed was compared. About 40% of 53 relevant papers did not report data on SD speed and/or SDLP. After placebo administration, the correlation between SDLP and SD speed was significant but did not explain much variance (r = 0.253, p = 0.0001). A significant correlation was found between ΔSDLP and ΔSD speed (treatment-placebo), explaining 48% of variance. When using SDLP as outcome measure, 67 significant treatment-placebo comparisons were found. Only 17 (25.4%) were significant when SD speed was used as outcome measure. Alternatively, for five treatment-placebo comparisons, a significant difference was found for SD speed but not for SDLP. Standard deviation of lateral position is a more sensitive outcome measure to detect driving impairment than speed variability.

  3. Variability in a three-generation family with Pierre Robin sequence, acampomelic campomelic dysplasia, and intellectual disability due to a novel ∼1 Mb deletion upstream of SOX9, and including KCNJ2 and KCNJ16.

    Castori, Marco; Bottillo, Irene; Morlino, Silvia; Barone, Chiara; Cascone, Piero; Grammatico, Paola; Laino, Luigi

    2016-01-01

    Campomelic dysplasia and acampomelic campomelic dysplasia (ACD) are allelic disorders due to heterozygous mutations in or around SOX9. Translocations and deletions involving the SOX9 5' regulatory region are rare causes of these disorders, as well as Pierre Robin sequence (PRS) and 46,XY gonadal dysgenesis. Genotype-phenotype correlations are not straightforward due to the complex epigenetic regulation of SOX9 expression during development. We report a three-generation pedigree with a novel ∼1 Mb deletion upstream of SOX9 and including KCNJ2 and KCNJ16, and ascertained for dominant transmission of PRS. Further characterization of the family identified subtle appendicular anomalies and a variable constellation of axial skeletal features evocative of ACD in several members. Affected males showed learning disability. The identified deletion was smaller than all other chromosome rearrangements associated with ACD. Comparison with other reported translocations and deletions involving this region allowed further refining of genotype-phenotype correlations and an update of the smallest regions of overlap associated with the different phenotypes. Intrafamilial variability in this pedigree suggests a phenotypic continuity between ACD and PRS in patients carrying mutations in the SOX9 5' regulatory region. © 2015 Wiley Periodicals, Inc.

  4. Effects of heat loss as percentage of fuel's energy, friction and variable specific heats of working fluid on performance of air standard Otto cycle

    Lin, J.-C.; Hou, S.-S.

    2008-01-01

    The objective of this study is to analyze the effects of heat loss characterized as a percentage of the fuel's energy, friction, and variable specific heats of the working fluid on the performance of an air-standard Otto cycle with a restriction on maximum cycle temperature. A more realistic and precise relationship between the fuel's chemical energy and the heat leakage, based on a pair of inequalities, is derived through the resulting temperature. The variations in power output and thermal efficiency with compression ratio, and the relations between power output and thermal efficiency, are presented. The results show that both the power output and the efficiency at which maximum power output occurs increase with the maximum cycle temperature. The temperature-dependent specific heats of the working fluid have a significant influence on performance: the power output and the working range of the cycle increase with increasing specific heats of the working fluid, while the efficiency decreases. Friction loss has a negative effect on performance, so the power output and efficiency of the cycle decrease with increasing friction loss. It is noteworthy that the effects of heat loss as a percentage of the fuel's energy, friction, and variable specific heats of the working fluid on the performance of an Otto cycle engine are significant and should be considered in practical cycle analysis. The results obtained in the present study provide good guidance for the performance evaluation and improvement of practical Otto engines.
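The baseline against which these effects are measured is the ideal air-standard Otto efficiency, eta = 1 - r**(1 - gamma). The sketch below illustrates the direction of the specific-heat effect by treating higher specific heats at high temperature as a lower effective gamma; this is an illustrative assumption, not the paper's full temperature-dependent model:

```python
def otto_efficiency(r, gamma=1.4):
    """Ideal air-standard Otto cycle thermal efficiency,
    eta = 1 - r**(1 - gamma), for compression ratio r."""
    return 1.0 - r ** (1.0 - gamma)

# Constant specific heats (gamma = 1.4) versus a lower effective gamma,
# standing in for the rise of cp and cv at high temperature.
eta_constant_cp = otto_efficiency(8.0, gamma=1.4)
eta_variable_cp = otto_efficiency(8.0, gamma=1.3)
# The lower effective gamma gives a lower efficiency, matching the
# abstract's finding that efficiency falls as specific heats increase.
```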

  5. Impact of pretreatment variables on the outcome of 131I therapy with a standardized dose of 150 Gray in Graves' disease

    Pfeilschifter, J.; Elser, H.; Haufe, S.; Ziegler, R.; Georgi, P.

    1997-01-01

    Aim: We examined the impact of several pretreatment variables on thyroid size and function in 61 patients with Graves' disease one year after a standardized 131I treatment with 150 Gray. Methods: FT3, FT4, and TSH serum concentrations were determined before and 1.5, 3, 6, and 12 months after therapy. Thyroid size was measured by ultrasound and scintigraphy before and one year after therapy. Results: One year after therapy, 30% of the patients had latent or manifest hyperthyroidism, 24% were euthyroid, and 46% had developed latent or manifest hypothyroidism. Age and initial thyroid volume were major predictors of post-therapeutic thyroid function: persistent hyperthyroidism was observed in 70% of the patients aged 50 years and older with a thyroid size of more than 50 ml. With few exceptions, thyroid size markedly decreased after therapy. Initial thyroid size and age were also major predictors of post-therapeutic thyroid volume; thyroid size normalized in all patients younger than 50 years of age, independent of initial thyroid size. Conclusion: Radioiodine treatment with 150 Gray causes a considerable decrease in thyroid size in most patients with Graves' disease. Age and initial thyroid volume are important determinants of thyroid function and size after therapy and should be considered in dose calculation.

  6. Variable effects of high-dose adrenaline relative to standard-dose adrenaline on resuscitation outcomes according to cardiac arrest duration.

    Jeung, Kyung Woon; Ryu, Hyun Ho; Song, Kyung Hwan; Lee, Byung Kook; Lee, Hyoung Youn; Heo, Tag; Min, Yong Il

    2011-07-01

    Adjustment of adrenaline (epinephrine) dosage according to cardiac arrest (CA) duration, rather than administering the same dose, may theoretically improve resuscitation outcomes. We evaluated variable effects of high-dose adrenaline (HDA) relative to standard-dose adrenaline (SDA) on resuscitation outcomes according to CA duration. Twenty-eight male domestic pigs were randomised to the following 4 groups according to the dosage of adrenaline (SDA 0.02 mg/kg vs. HDA 0.2mg/kg) and duration of CA before beginning cardiopulmonary resuscitation (CPR): 6 min SDA, 6 min HDA, 13 min SDA, or 13 min HDA. After the predetermined duration of untreated ventricular fibrillation, CPR was provided. All animals in the 6 min SDA, 6 min HDA, and 13 min HDA groups were successfully resuscitated, while only 4 of 7 pigs in the 13 min SDA group were successfully resuscitated (p=0.043). HDA groups showed higher right atrial pressure, more frequent ventricular ectopic beats, higher blood glucose, higher troponin-I, and more severe metabolic acidosis than SDA groups. Animals of 13 min groups showed more severe metabolic acidosis and higher troponin-I than animals of 6 min groups. All successfully resuscitated animals, except two animals in the 13 min HDA group, survived for 7 days (p=0.121). Neurologic deficit score was not affected by the dose of adrenaline. HDA showed benefit in achieving restoration of spontaneous circulation in 13 min CA, when compared with 6 min CA. However, this benefit did not translate into improved long-term survival or neurologic outcome. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  7. Decommissioning standards

    Crofford, W.N.

    1980-01-01

    EPA has agreed to establish a series of environmental standards for the safe disposal of radioactive waste through participation in the Interagency Review Group on Nuclear Waste Management (IRG). One of the standards required under the IRG is the standard for decommissioning of radioactive contaminated sites, facilities, and materials. This standard is to be proposed by December 1980 and promulgated by December 1981. Several considerations are important in establishing these standards. This study includes discussions of some of these considerations and attempts to evaluate their relative importance. Items covered include: the form of the standards, timing for decommissioning, occupational radiation protection, costs and financial provisions. 4 refs

  8. Linear versus non-linear measures of temporal variability in finger tapping and their relation to performance on open- versus closed-loop motor tasks: comparing standard deviations to Lyapunov exponents.

    Christman, Stephen D; Weaver, Ryan

    2008-05-01

    The nature of temporal variability during speeded finger tapping was examined using linear (standard deviation) and non-linear (Lyapunov exponent) measures. Experiment 1 found that right hand tapping was characterised by lower amounts of both linear and non-linear measures of variability than left hand tapping, and that linear and non-linear measures of variability were often negatively correlated with one another. Experiment 2 found that increased non-linear variability was associated with relatively enhanced performance on a closed-loop motor task (mirror tracing) and relatively impaired performance on an open-loop motor task (pointing in a dark room), especially for left hand performance. The potential uses and significance of measures of non-linear variability are discussed.
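A Lyapunov exponent as a non-linear measure of variability can be illustrated on the logistic map, whose largest exponent at r = 4 is known analytically to be ln 2. This is a generic sketch of the estimator, not the method applied to the tapping data:

```python
import math

def lyapunov_logistic(r, x0=0.2, n=100_000, burn=1_000):
    """Largest Lyapunov exponent of the logistic map x -> r*x*(1-x),
    estimated as the orbit average of log|f'(x)| = log|r*(1 - 2x)|."""
    x = x0
    for _ in range(burn):          # discard the transient
        x = r * x * (1.0 - x)
    total = 0.0
    for _ in range(n):
        x = r * x * (1.0 - x)
        total += math.log(abs(r * (1.0 - 2.0 * x)))
    return total / n

lam = lyapunov_logistic(4.0)  # analytic value is ln 2
```

Unlike the standard deviation, which only measures the spread of the intervals, a positive exponent of this kind quantifies how quickly nearby trajectories diverge, which is why the two measures can dissociate across tasks.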

  9. Toward the standard population synthesis model of the X-ray background: Evolution of X-ray luminosity and absorption functions of active galactic nuclei including Compton-thick populations

    Ueda, Yoshihiro [Department of Astronomy, Kyoto University, Kitashirakawa-Oiwake-cho, Sakyo-ku, Kyoto 606-8502 (Japan); Akiyama, Masayuki [Astronomical Institute, Tohoku University, 6-3 Aramaki, Aoba-ku, Sendai 980-8578 (Japan); Hasinger, Günther [Institute for Astronomy, 2680 Woodlawn Drive Honolulu, HI 96822-1839 (United States); Miyaji, Takamitsu [Instituto de Astronomía, Universidad Nacional Autónoma de México, Ensenada, Baja California (Mexico); Watson, Michael G. [Department of Physics and Astronomy, University of Leicester, University Road, Leicester LE1 7RH (United Kingdom)

    2014-05-10

    We present the most up-to-date X-ray luminosity function (XLF) and absorption function of active galactic nuclei (AGNs) over the redshift range from 0 to 5, utilizing the largest, highly complete sample ever available, obtained from surveys performed with Swift/BAT, MAXI, ASCA, XMM-Newton, Chandra, and ROSAT. The combined sample, including that of the Subaru/XMM-Newton Deep Survey, consists of 4039 detections in the soft (0.5-2 keV) and/or hard (>2 keV) band. We utilize a maximum likelihood method to reproduce the count rate versus redshift distribution for each survey, by taking into account the evolution of the absorbed fraction, the contribution from Compton-thick (CTK) AGNs, and broadband spectra of AGNs, including reflection components from tori based on the luminosity- and redshift-dependent unified scheme. We find that the shape of the XLF at z ∼ 1-3 is significantly different from that in the local universe, for which the luminosity-dependent density evolution model gives a much better description than the luminosity and density evolution model. These results establish the standard population synthesis model of the X-ray background (XRB), which well reproduces the source counts, the observed fractions of CTK AGNs, and the spectrum of the hard XRB. The number ratio of CTK AGNs to the absorbed Compton-thin (CTN) AGNs is constrained to be ≈0.5-1.6 to produce the 20-50 keV XRB intensity within present uncertainties, by assuming that they follow the same evolution as CTN AGNs. The growth history of supermassive black holes is discussed based on the new AGN bolometric luminosity function.

  10. Is Heart Period Variability Associated with the Administration of Lifesaving Interventions in Individual Prehospital Trauma Patients with Normal Standard Vital Signs?

    2010-08-01

    heart period variability as an indicator of mortality in intensive care unit patients many hours before death. Similarly, recent studies using data...the Department of Health and Kinesiology (CAR), The University of Texas at San Antonio, San Antonio, TX; U.S. Army Institute of Surgical Research (CAR

  11. Diagnostic screening identifies a wide range of mutations involving the SHOX gene, including a common 47.5 kb deletion 160 kb downstream with a variable phenotypic effect.

    Bunyan, David J; Baker, Kevin R; Harvey, John F; Thomas, N Simon

    2013-06-01

    Léri-Weill dyschondrosteosis (LWD) results from heterozygous mutations of the SHOX gene, with homozygosity or compound heterozygosity resulting in the more severe form, Langer mesomelic dysplasia (LMD). These mutations typically take the form of whole or partial gene deletions, point mutations within the coding sequence, or large (>100 kb) 3' deletions of downstream regulatory elements. We have analyzed the coding sequence of the SHOX gene and its downstream regulatory regions in a cohort of 377 individuals referred with symptoms of LWD, LMD or short stature. A causative mutation was identified in 68% of the probands with LWD or LMD (91/134). In addition, a 47.5 kb deletion was found 160 kb downstream of the SHOX gene in 17 of the 377 patients (12% of the LWD referrals, 4.5% of all referrals). In 14 of these 17 patients, this was the only potentially causative abnormality detected (13 had symptoms consistent with LWD and one had short stature only), but the other three 47.5 kb deletions were found in patients with an additional causative SHOX mutation (with symptoms of LWD rather than LMD). Parental samples were available on 14/17 of these families, and analysis of these showed a more variable phenotype ranging from apparently unaffected to LWD. Breakpoint sequence analysis has shown that the 47.5 kb deletion is identical in all 17 patients, most likely due to an ancient founder mutation rather than recurrence. This deletion was not seen in 471 normal controls (P<0.0001), providing further evidence for a phenotypic effect, albeit one with variable penetrance.

  12. An examination of psychosocial variables moderating the relationship between life stress and injury time-loss among athletes of a high standard.

    Ford, I W; Eklund, R C; Gordon, S

    2000-05-01

    Based on Williams and Andersen's model of stress and athletic injury, six psychosocial variables were assessed as possible moderators of the relationship between life stress and injury among 121 athletes (65 males, 56 females) competing in a variety of sports at state, national or international level. No significant effects of the sex of the participants were evident. Correlational analyses revealed moderator effects of several variables. Specifically, dispositional optimism and hardiness were related to decreased injury time-loss in athletes when positive life change increased, and global self-esteem was associated with decreased injury time-loss when both negative life change and total life change increased. The results indicate that athletes with more optimism, hardiness or global self-esteem may cope more effectively with life change stress, resulting in reduced injury vulnerability and recovery rates.

  13. Complex variables

    Flanigan, Francis J

    2010-01-01

    A caution to mathematics professors: Complex Variables does not follow conventional outlines of course material. One reviewer noting its originality wrote: "A standard text is often preferred [to a superior text like this] because the professor knows the order of topics and the problems, and doesn't really have to pay attention to the text. He can go to class without preparation." Not so here: Dr. Flanigan treats this most important field of contemporary mathematics in a most unusual way. While all the material for an advanced undergraduate or first-year graduate course is covered, discussion

  14. High variability of the subjective visual vertical test of vertical perception, in some people with neck pain - Should this be a standard measure of cervical proprioception?

    Treleaven, Julia; Takasaki, Hiroshi

    2015-02-01

    Subjective visual vertical (SVV) assesses visual dependence for spatial orientation, via vertical perception testing. Using the computerized rod-and-frame test (CRFT), SVV is thought to be an important measure of cervical proprioception and might be greater in those with whiplash associated disorder (WAD), but to date research findings are inconsistent. The aim of this study was to investigate the most sensitive SVV error measurement to detect group differences between no neck pain control, idiopathic neck pain (INP) and WAD subjects. Cross sectional study. Neck Disability Index (NDI), Dizziness Handicap Inventory short form (DHIsf) and the average constant error (CE), absolute error (AE), root mean square error (RMSE), and variable error (VE) of the SVV were obtained from 142 subjects (48 asymptomatic, 36 INP, 42 WAD). The INP group had significantly (p pain or dizziness handicap. These findings are inconsistent with other measures of cervical proprioception in neck pain and more research is required before the SVV can be considered an important measure and utilized clinically.

  15. Standard practice for prediction of the long-term behavior of materials, including waste forms, used in engineered barrier systems (EBS) for geological disposal of high-level radioactive waste

    American Society for Testing and Materials. Philadelphia

    2007-01-01

    1.1 This practice describes test methods and data analyses used to develop models for the prediction of the long-term behavior of materials, such as engineered barrier system (EBS) materials and waste forms, used in the geologic disposal of spent nuclear fuel (SNF) and other high-level nuclear waste in a geologic repository. The alteration behavior of waste form and EBS materials is important because it affects the retention of radionuclides by the disposal system. The waste form and EBS materials provide a barrier to release either directly (as in the case of waste forms in which the radionuclides are initially immobilized), or indirectly (as in the case of containment materials that restrict the ingress of groundwater or the egress of radionuclides that are released as the waste forms and EBS materials degrade). 1.1.1 Steps involved in making such predictions include problem definition, testing, modeling, and model confirmation. 1.1.2 The predictions are based on models derived from theoretical considerat...

  16. Clinical Implications of Glucose Variability: Chronic Complications of Diabetes

    Hye Seung Jung

    2015-06-01

    Glucose variability has been identified as a potential risk factor for diabetic complications; oxidative stress is widely regarded as the mechanism by which glycemic variability induces diabetic complications. However, there remains no generally accepted gold standard for assessing glucose variability. Representative indices for measuring intraday variability include calculation of the standard deviation along with the mean amplitude of glycemic excursions (MAGE). MAGE is used to measure major intraday excursions and is easily measured using continuous glucose monitoring systems. Despite a lack of randomized controlled trials, recent clinical data suggest that long-term glycemic variability, as determined by variability in hemoglobin A1c, may contribute to the development of microvascular complications. Intraday glycemic variability is also suggested to accelerate coronary artery disease in high-risk patients.
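As a concrete illustration of the indices mentioned, the sketch below computes the standard deviation and a simplified MAGE from a hypothetical series of CGM readings: turning points of the glucose profile are located, and excursions exceeding one standard deviation are averaged. Published MAGE implementations differ in detail (e.g., counting excursions in only one direction), so this is a schematic, not a clinical algorithm.

```python
import numpy as np

def mage(glucose):
    """Simplified MAGE: mean of peak-to-nadir excursions exceeding 1 SD."""
    g = np.asarray(glucose, float)
    sd = float(g.std(ddof=1))
    # locate turning points (local maxima/minima) of the profile
    turns = [0] + [i for i in range(1, len(g) - 1)
                   if (g[i] - g[i - 1]) * (g[i + 1] - g[i]) < 0] + [len(g) - 1]
    # amplitudes between successive turning points
    amps = [abs(g[turns[k + 1]] - g[turns[k]]) for k in range(len(turns) - 1)]
    qualifying = [a for a in amps if a > sd]  # keep excursions > 1 SD
    return (float(np.mean(qualifying)) if qualifying else 0.0), sd

# hypothetical glucose profile in mg/dL
m, sd = mage([100, 150, 100, 160, 100])
```

For this toy profile every excursion exceeds one standard deviation, so the simplified MAGE is just the mean excursion amplitude.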

  17. Standardization of biodosimetry operations

    Dainiak, Nicholas

    2016-01-01

    Methods and procedures for generating, interpreting and scoring the frequency of dicentric chromosomes vary among cytogenetic biodosimetry laboratories (CBLs). This variation adds to the already considerable lack of precision inherent in the dicentric chromosome assay (DCA). Although variability in sample collection, cell preparation, equipment and dicentric frequency scoring can never be eliminated with certainty, it can be substantially minimized, resulting in reduced scatter and improved precision. Use of standard operating procedures and technician exchange may help to mitigate variation. Although the development and adoption of international standards (ISO 21243 and ISO 19238) has helped to reduce variation in standard operating procedures (SOPs), all CBLs must maintain process improvement, and those with challenges may require additional assistance. Sources of variation that may not be readily apparent in the SOPs for sample collection and processing include variability in ambient laboratory conditions, media, serum lot and quantity and the use of particular combinations of cytokines. Variability in maintenance and calibration of Metafer equipment, and in scoring criteria, reader proficiency and personal factors may need to be addressed. The calibration curve itself is a source of variation that requires control, using the same known-dose samples among CBLs, measurement of central tendency, and generation of common curves with periodic reassessment to detect drifts in dicentric yield. Finally, the dose estimate should be based on common scoring criteria, using the z-statistic. Although theoretically possible, it is practically impossible to propagate uncertainty over the entire calibration curve due to the many factors contributing to variance. Periodic re-evaluation of the curve is needed by comparison with newly published curves (using statistical analysis of differences) and determining their potential causes. (author)
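The calibration-curve step described above can be illustrated with a short sketch. Dicentric yields are conventionally fit to a linear-quadratic curve Y = c + αD + βD²; inverting it gives a dose estimate from an observed yield. The coefficient values below are invented for illustration only, not laboratory calibration data.

```python
import math

def dose_from_yield(y, c, alpha, beta):
    """Invert the linear-quadratic calibration Y = c + alpha*D + beta*D**2
    for the non-negative dose D (Gy), via the quadratic formula."""
    disc = alpha ** 2 + 4.0 * beta * (y - c)
    return (-alpha + math.sqrt(disc)) / (2.0 * beta)

# hypothetical coefficients; yield y = dicentrics per scored cell
d = dose_from_yield(0.281, c=0.001, alpha=0.02, beta=0.06)  # ≈ 2.0 Gy
```

Uncertainty propagation across the whole curve, as the record notes, is a separate and much harder problem than this point inversion.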

  18. Biological Sampling Variability Study

    Amidan, Brett G. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hutchison, Janine R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-11-08

    There are many sources of variability that exist in the sample collection and analysis process. This paper addresses many, but not all, sources of variability. The main focus of this paper was to better understand and estimate variability due to differences between samplers. Variability between days was also studied, as well as random variability within each sampler. Experiments were performed using multiple surface materials (ceramic and stainless steel), multiple contaminant concentrations (10 spores and 100 spores), and with and without the presence of interfering material. All testing was done with sponge sticks using 10-inch by 10-inch coupons. Bacillus atrophaeus was used as the BA surrogate. Spores were deposited using wet deposition. Grime was coated on the coupons which were planned to include the interfering material (Section 3.3). Samples were prepared and analyzed at PNNL using CDC protocol (Section 3.4) and then cultured and counted. Five samplers were trained so that samples were taken using the same protocol. Each sampler randomly sampled eight coupons each day, four coupons with 10 spores deposited and four coupons with 100 spores deposited. Each day consisted of one material being tested. The clean samples (no interfering materials) were run first, followed by the dirty samples (coated with interfering material). There was a significant difference in recovery efficiency between the coupons with 10 spores deposited (mean of 48.9%) and those with 100 spores deposited (mean of 59.8%). There was no general significant difference between the clean and dirty (containing interfering material) coupons or between the two surface materials; however, there was a significant interaction between concentration amount and presence of interfering material. The recovery efficiency was close to the same for coupons with 10 spores deposited, but for the coupons with 100 spores deposited, the recovery efficiency for the dirty samples was significantly larger (65

  19. Neoclassical transport including collisional nonlinearity.

    Candy, J; Belli, E A

    2011-06-10

    In the standard δf theory of neoclassical transport, the zeroth-order (Maxwellian) solution is obtained analytically via the solution of a nonlinear equation. The first-order correction δf is subsequently computed as the solution of a linear, inhomogeneous equation that includes the linearized Fokker-Planck collision operator. This equation admits analytic solutions only in extreme asymptotic limits (banana, plateau, Pfirsch-Schlüter), and so must be solved numerically for realistic plasma parameters. Recently, numerical codes have appeared which attempt to compute the total distribution f more accurately than in the standard ordering by retaining some nonlinear terms related to finite-orbit width, while simultaneously reusing some form of the linearized collision operator. In this work we show that higher-order corrections to the distribution function may be unphysical if collisional nonlinearities are ignored.

  20. New seismograph includes filters

    1979-11-02

    The new Nimbus ES-1210 multichannel signal enhancement seismograph from EG&G Geometrics has recently been redesigned to include multimode signal filters on each amplifier. The ES-1210F is a shallow exploration seismograph for near-subsurface exploration such as depth-to-bedrock, geological hazard location, mineral exploration, and landslide investigations.

  1. Standard leach tests for nuclear waste materials

    Strachan, D.M.; Barnes, B.O.; Turcotte, R.P.

    1980-01-01

    Five leach tests were conducted to study time-dependent leaching of waste forms (glass). The first four tests include temperature as a variable and the use of three standard leachants. Three of the tests are static and two are dynamic (flow). This paper discusses the waste-form leach tests and presents some representative data. 4 figures

  2. Variability Bugs:

    Melo, Jean

    Although many researchers suggest that preprocessor-based variability amplifies maintenance problems, there is little to no hard evidence on how variability actually affects programs and programmers. Specifically, how does variability affect programmers during maintenance tasks (bug finding in particular)? How much harder is it to debug a program as variability increases? How do developers debug programs with variability? In what ways does variability affect bugs? In this Ph.D. thesis, I set off to address such issues through different perspectives using empirical research (based on controlled experiments) in order to understand quantitatively and qualitatively the impact of variability on programmers at bug finding and on buggy programs. From the program (and bug) perspective, the results show that variability is ubiquitous. There appears to be no specific nature of variability bugs that could...

  3. Analytic device including nanostructures

    Di Fabrizio, Enzo M.; Fratalocchi, Andrea; Totero Gongora, Juan Sebastian; Coluccio, Maria Laura; Candeloro, Patrizio; Cuda, Gianni

    2015-01-01

    A device for detecting an analyte in a sample comprising: an array including a plurality of pixels, each pixel including a nanochain comprising: a first nanostructure, a second nanostructure, and a third nanostructure, wherein size of the first nanostructure is larger than that of the second nanostructure, and size of the second nanostructure is larger than that of the third nanostructure, and wherein the first nanostructure, the second nanostructure, and the third nanostructure are positioned on a substrate such that when the nanochain is excited by an energy, an optical field between the second nanostructure and the third nanostructure is stronger than an optical field between the first nanostructure and the second nanostructure, wherein the array is configured to receive a sample; and a detector arranged to collect spectral data from a plurality of pixels of the array.

  4. Saskatchewan resources. [including uranium

    1979-09-01

    The production of chemicals and minerals for the chemical industry in Saskatchewan are featured, with some discussion of resource taxation. The commodities mentioned include potash, fatty amines, uranium, heavy oil, sodium sulfate, chlorine, sodium hydroxide, sodium chlorate and bentonite. Following the successful outcome of the Cluff Lake inquiry, the uranium industry is booming. Some developments and production figures for Gulf Minerals, Amok, Cenex and Eldorado are mentioned.

  5. Being Included and Excluded

    Korzenevica, Marina

    2016-01-01

    Following the civil war of 1996–2006, there was a dramatic increase in the labor mobility of young men and the inclusion of young women in formal education, which led to the transformation of the political landscape of rural Nepal. Mobility and schooling represent a level of prestige that rural... politics. It analyzes how formal education and mobility either challenge or reinforce traditional gendered norms which dictate a lowly position for young married women in the household and their absence from community politics. The article concludes that women are simultaneously excluded and included from... community politics. On the one hand, their mobility and decision-making powers decrease with the increase in the labor mobility of men and their newly gained education is politically devalued when compared to the informal education that men gain through mobility, but on the other hand, schooling strengthens...

  6. Standard Errors for Matrix Correlations.

    Ogasawara, Haruhiko

    1999-01-01

    Derives the asymptotic standard errors and intercorrelations for several matrix correlations assuming multivariate normality for manifest variables and derives the asymptotic standard errors of the matrix correlations for two factor-loading matrices. (SLD)

  7. THE INFLUENCE OF BIOFEEDBACK SESSIONS IN CLOSED LOOP OF HEART RATE VARIABILITY AND PACED BREATHING ON SYSTOLIC BLOOD PRESSURE CONTROL DURING STANDARD DRUG THERAPY IN PATIENTS WITH ARTERIAL HYPERTENSION

    S. A. S. Belal

    2015-06-01

    Changes of systolic blood pressure (SBP) during biofeedback (BFB) sessions with a closed loop of paced breathing (PB) and heart rate variability (HRV) during standard drug therapy of arterial hypertension (AH) were studied. 275 patients with AH of degrees 1-3 (143 men and 132 women, mean age 58.55 ± 7.99 years) were divided into two comparable groups: 1) BFB (139 patients) with the investigated PB loop; 2) control group (136 patients) with BFB without PB. In both groups, 10 sessions of BFB were performed. Changes of SBP depending on the stage and degree of AH, gender, and age were assessed. BP was measured by Korotkov's method with a Microlife BP AG1-20 manometer under the same conditions. Data were processed by parametric and nonparametric statistics. It is shown that the use of BFB in the loop of PB and HRV significantly (p < 0.01) exceeds isolated drug therapy in efficiency of SBP control at any stage and degree of AH in patients of both sexes in all age groups. The extent of the effect increases with the stage and degree of the disease and is not related to the sex or age of the patient. These findings allow this technique to be recommended in clinical practice.

  8. TEC variability over Havana

    Lazo, B.; Alazo, K.; Rodriguez, M.; Calzadilla, A.

    2003-01-01

    The variability of total electron content (TEC) measured over Havana using ATS-6, SMS-1 and GOES-3 geosynchronous satellite signals has been investigated for low, middle and high solar activity periods from 1974 to 1982. The results show that the standard deviation is smooth during nighttime hours and maximal at noon or post-noon hours. A strong dependence of the standard deviation on solar activity, with maximum values during high solar activity (HSA), has been found. (author)

  9. Pulsating variables

    1989-01-01

    The study of stellar pulsations is a major route to the understanding of stellar structure and evolution. At the South African Astronomical Observatory (SAAO) the following stellar pulsation studies were undertaken: rapidly oscillating Ap stars; solar-like oscillations in stars; δ Scuti-type variability in a classical Am star; Beta Cephei variables; a pulsating white dwarf and its companion; RR Lyrae variables and galactic Cepheids. 4 figs

  10. Variable mechanical ventilation.

    Fontela, Paula Caitano; Prestes, Renata Bernardy; Forgiarini, Luiz Alberto; Friedman, Gilberto

    2017-01-01

    To review the literature on the use of variable mechanical ventilation and the main outcomes of this technique. Search, selection, and analysis of all original articles on variable ventilation, without restriction on the period of publication and language, available in the electronic databases LILACS, MEDLINE®, and PubMed, by searching the terms "variable ventilation" OR "noisy ventilation" OR "biologically variable ventilation". A total of 36 studies were selected. Of these, 24 were original studies, including 21 experimental studies and three clinical studies. Several experimental studies reported the beneficial effects of distinct variable ventilation strategies on lung function using different models of lung injury and healthy lungs. Variable ventilation seems to be a viable strategy for improving gas exchange and respiratory mechanics and preventing lung injury associated with mechanical ventilation. However, further clinical studies are necessary to assess the potential of variable ventilation strategies for the clinical improvement of patients undergoing mechanical ventilation.

  11. Cognitive Variability

    Siegler, Robert S.

    2007-01-01

    Children's thinking is highly variable at every level of analysis, from neural and associative levels to the level of strategies, theories, and other aspects of high-level cognition. This variability exists within people as well as between them; individual children often rely on different strategies or representations on closely related problems…

  12. Multiple variables data sets visualization in ROOT

    Couet, O

    2008-01-01

    The ROOT graphical framework provides support for many different functions including basic graphics, high-level visualization techniques, output on files, 3D viewing etc. They use well-known world standards to render graphics on screen, to produce high-quality output files, and to generate images for Web publishing. Many techniques allow visualization of all the basic ROOT data types, but the graphical framework was still a bit weak in the visualization of multiple variables data sets. This paper presents latest developments done in the ROOT framework to visualize multiple variables (>4) data sets

  13. A comparison of the hourly output between the Ambu® Smart-Infuser™ Pain Pump and the On-Q Pump® with Select-A-Flow™ Variable Rate Controller with standard and overfill volumes.

    Iliev, Peter; Bhalla, Tarun; Tobias, Joseph D

    2016-04-01

    The Ambu Smart-Infuser Pain Pump and the On-Q Pump with Select-a-Flow Variable Rate Controller are elastomeric devices with a flow regulator that controls the rate of infusion of a local anesthetic agent through a peripheral catheter. As a safety evaluation, we evaluated the infusion characteristics of these two devices when filled with manufacturer recommended standard volumes and when overfilled with a volume 50% in excess of that which is recommended. Nineteen disposable devices from the two manufacturers were used in this study. Nine were filled with 0.9% normal saline according to the respective manufacturers' recommendations (four Ambu pumps were filled with 650 ml and five On-Q pumps were filled with 550 ml) and 10 devices were 150% overfilled (five Ambu pumps were filled with 975 ml and five On-Q pumps were filled with 825 ml). All of the devices were set to infuse at 10 ml·h⁻¹ at room temperature (21°C) for 12 h. The fluid delivered during each 2-h period was measured using a graduated column. The On-Q pump (in the settings of normal fill and 150% overfill) delivered a significantly higher output per hour than the set rate during the first 8 h, while the Ambu pump delivered a value close to the set rate of 10 ml·h⁻¹. No significant difference in the hourly delivered output was noted for either device when comparing the normal fill to the 150% overfill groups. This investigation demonstrates that no change in the hourly output occurs with overfilling of these home infusion devices. However, as noted previously, the hourly output from the On-Q device is significantly higher than the set rate during the initial 8 h of infusion which could have potential clinical implications.

  14. Variable volume combustor

    Ostebee, Heath Michael; Ziminsky, Willy Steve; Johnson, Thomas Edward; Keener, Christopher Paul

    2017-01-17

    The present application provides a variable volume combustor for use with a gas turbine engine. The variable volume combustor may include a liner, a number of micro-mixer fuel nozzles positioned within the liner, and a linear actuator so as to maneuver the micro-mixer fuel nozzles axially along the liner.

  15. The nebular variables

    Glasby, John S

    1974-01-01

    The Nebular Variables focuses on the nebular variables and their characteristics. Discussions are organized by type of nebular variable, namely, RW Aurigae stars, T Orionis stars, T Tauri stars, and peculiar nebular objects. Topics range from light variations of the stars to their spectroscopic and physical characteristics, spatial distribution, interaction with nebulosity, and evolutionary features. This volume is divided into four sections and consists of 25 chapters, the first of which provides general information on nebular variables, including their stellar associations and their classification.

  16. Several complex variables

    Field, M.J.

    1976-01-01

    Topics discussed include the elementary theory of holomorphic functions of several complex variables; the Weierstrass preparation theorem; meromorphic functions, holomorphic line bundles and divisors; elliptic operators on compact manifolds; hermitian connections; the Hodge decomposition theorem. (author)

  17. Linear latent variable models: the lava-package

    Holst, Klaus Kähler; Budtz-Jørgensen, Esben

    2013-01-01

    An R package for specifying and estimating linear latent variable models is presented. The philosophy of the implementation is to separate the model specification from the actual data, which leads to a dynamic and easy way of modeling complex hierarchical structures. Several advanced features are implemented, including robust standard errors for clustered correlated data, multigroup analyses, non-linear parameter constraints, inference with incomplete data, maximum likelihood estimation with censored and binary observations, and instrumental variable estimators. In addition an extensive simulation...

  18. The WFCAM multiwavelength Variable Star Catalog

    Ferreira Lopes, C. E.; Dékány, I.; Catelan, M.; Cross, N. J. G.; Angeloni, R.; Leão, I. C.; De Medeiros, J. R.

    2015-01-01

    Context. Stellar variability in the near-infrared (NIR) remains largely unexplored. The exploitation of public science archives with data-mining methods offers a perspective for a time-domain exploration of the NIR sky. Aims: We perform a comprehensive search for stellar variability using the optical-NIR multiband photometric data in the public Calibration Database of the WFCAM Science Archive (WSA), with the aim of contributing to the general census of variable stars and of extending the current scarce inventory of accurate NIR light curves for a number of variable star classes. Methods: Standard data-mining methods were applied to extract and fine-tune time-series data from the WSA. We introduced new variability indices designed for multiband data with correlated sampling, and applied them for preselecting variable star candidates, i.e., light curves that are dominated by correlated variations, from noise-dominated ones. Preselection criteria were established by robust numerical tests for evaluating the response of variability indices to the colored noise characteristic of the data. We performed a period search using the string-length minimization method on an initial catalog of 6551 variable star candidates preselected by variability indices. Further frequency analysis was performed on positive candidates using three additional methods in combination, in order to cope with aliasing. Results: We find 275 periodic variable stars and an additional 44 objects with suspected variability with uncertain periods or apparently aperiodic variation. Only 44 of these objects had been previously known, including 11 RR Lyrae stars on the outskirts of the globular cluster M 3 (NGC 5272). We provide a preliminary classification of the new variable stars that have well-measured light curves, but the variability types of a large number of objects remain ambiguous. We classify most of the new variables as contact binary stars, but we also find several pulsating stars, among which
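The string-length minimization mentioned above admits a compact sketch: for each trial period, fold the time series into phase, sort by phase, and sum the segment lengths of the resulting phase-magnitude polyline; the true period tends to minimize this "string length". The synthetic sinusoid below is an assumption for demonstration, not WSA data.

```python
import numpy as np

def string_length(phase, mag):
    """Total polyline length of the phase-folded light curve."""
    order = np.argsort(phase)
    p, m = phase[order], mag[order]
    dp = np.diff(np.append(p, p[0] + 1.0))  # close the loop in phase
    dm = np.diff(np.append(m, m[0]))
    return float(np.sum(np.hypot(dp, dm)))

def best_period(t, mag, periods):
    """Return the trial period that minimizes the string length."""
    lengths = [string_length((t / P) % 1.0, mag) for P in periods]
    return float(periods[int(np.argmin(lengths))])

# irregularly sampled sinusoid with a 2.5-day period (synthetic)
rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0.0, 30.0, 200))
mag = np.sin(2.0 * np.pi * t / 2.5)
P = best_period(t, mag, np.arange(1.0, 5.0, 0.01))
```

A wrong trial period scrambles the phase ordering and inflates the string length, which is why a brute-force grid search over candidate periods works; in practice it is combined with other period-search methods, as the record notes, to guard against aliases.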

  19. Quality of semantic standards

    Folmer, Erwin Johan Albert

    2012-01-01

    Little scientific literature addresses the quality of semantic standards, despite the issue's high economic and social impact. Our problem survey, covering 34 semantic Standard Setting Organizations (SSOs), gives evidence that the quality of standards can be improved, but for improvement a

  20. Intra-patient variability of FDG standardized uptake values in mediastinal blood pool, liver, and myocardium during R-CHOP chemotherapy in patients with diffuse large B- cell lymphoma

    Kim, Soo Jeong; Yi, Hyun Kyung; Lim, Chae Hong; Cho, Young Seok; Choi, Joon Young; Choe, Yeam Seong; Lee, Kyung Han; Moon, Seung Hwan [Dept. of Nuclear Medicine, Samsung Medical Center, Sungkyunkwan University School of Medicine, Seoul (Korea, Republic of)]

    2016-12-15

    {sup 18}F-fluorodeoxyglucose (FDG) PET/CT is useful for staging and evaluating treatment response in patients with diffuse large B-cell lymphoma (DLBCL). A five-point scale model using the mediastinal blood pool (MBP) and liver as references is a recommended method for interpreting treatment response. We evaluated the variability in standardized uptake values (SUVs) of the MBP, liver, and myocardium during chemotherapy in patients with DLBCL. We analyzed 60 patients with DLBCL who received rituximab, cyclophosphamide, doxorubicin, vincristine, and prednisolone (R-CHOP) treatment and underwent baseline, interim, and final FDG PET/CT scans. The FDG uptakes of lymphoma lesions, MBP, liver, and myocardium were assessed, and changes in the MBP and liver SUV and possible associated factors were evaluated. The SUV of the liver did not change significantly during the chemotherapy. However, the SUV{sub mean} of MBP showed a significant change though the difference was small (p = 0.019). SUV{sub mean} of MBP and liver at baseline and interim scans was significantly lower in patients with advanced Ann Arbor stage on diagnosis. The SUV{sub mean} of the MBP and liver was negatively correlated with the volumetric index of lymphoma lesions in baseline scans (r = -0.547, p < 0.001; r = -0.502, p < 0.001). Positive myocardial FDG uptake was more frequently observed in interim and final scans than in the baseline scan, but there was no significant association between the MBP and liver uptake and myocardial uptake. The SUV of the liver was not significantly changed during R-CHOP chemotherapy in patients with DLBCL, whereas the MBP SUV of the interim scan decreased slightly. However, the SUV of the reference organs may be affected by tumor burden, and this should be considered when assessing follow-up scans. Although myocardial FDG uptake was more frequently observed after R-CHOP chemotherapy, it did not affect the SUV of the MBP and liver.

  1. ['Gold standard', not 'golden standard']

    Claassen, J.A.H.R.

    2005-01-01

    In medical literature, both 'gold standard' and 'golden standard' are employed to describe a reference test used for comparison with a novel method. The term 'gold standard' in its current sense in medical research was coined by Rudd in 1979, in reference to the monetary gold standard. In the same

  2. Complex variables

    Fisher, Stephen D

    1999-01-01

    The most important topics in the theory and application of complex variables receive a thorough, coherent treatment in this introductory text. Intended for undergraduates or graduate students in science, mathematics, and engineering, this volume features hundreds of solved examples, exercises, and applications designed to foster a complete understanding of complex variables as well as an appreciation of their mathematical beauty and elegance. Prerequisites are minimal; a three-semester course in calculus will suffice to prepare students for discussions of these topics: the complex plane, basic

  3. Variable stars

    Feast, M.W.; Wenzel, W.; Fernie, J.D.; Percy, J.R.; Smak, J.; Gascoigne, S.C.B.; Grindley, J.E.; Lovell, B.; Sawyer Hogg, H.B.; Baker, N.; Fitch, W.S.; Rosino, L.; Gursky, H.

    1976-01-01

    A critical review of variable stars is presented. A fairly complete summary of major developments and discoveries during the period 1973-1975 is given. The broad developments and new trends are outlined. Essential problems for future research are identified. (B.R.H.)

  4. Accounting standards

    Stellinga, B.; Mügge, D.

    2014-01-01

    The European and global regulation of accounting standards have witnessed remarkable changes over the past twenty years. In the early 1990s, EU accounting practices were fragmented along national lines and US accounting standards were the de facto global standards. Since 2005, all EU listed

  5. Standardization Documents

    2011-08-01

    Specifications and Standards; Guide Specifications; CIDs; and NGSs. Learn. Perform. Succeed. STANDARDIZATION DOCUMENTS: Federal Specifications; Commercial ... national or international standardization document developed by a private sector association, organization, or technical society that plans ... Maintain lessons learned • Examples: Guidance for application of a technology; Lists of options. DEFENSE HANDBOOK

  6. Air Force standards for nickel hydrogen battery

    Hwang, Warren; Milden, Martin

    1994-01-01

    The topics discussed are presented in viewgraph form and include Air Force nickel hydrogen standardization goals, philosophy, project outline, cell level standardization, battery level standardization, and schedule.

  7. Eutrophication Modeling Using Variable Chlorophyll Approach

    Abdolabadi, H.; Sarang, A.; Ardestani, M.; Mahjoobi, E.

    2016-01-01

    In this study, eutrophication was investigated in Lake Ontario to identify the interactions among effective drivers. The complexity of such a phenomenon was modeled using a system dynamics approach based on a consideration of constant and variable stoichiometric ratios. The system dynamics approach is a powerful tool for developing object-oriented models to simulate complex phenomena that involve feedback effects. Utilizing stoichiometric ratios is a method for converting the concentrations of state variables. During the physical segmentation of the model, Lake Ontario was divided into two layers, i.e., the epilimnion and hypolimnion, and differential equations were developed for each layer. The model structure included 16 state variables related to phytoplankton, herbivorous zooplankton, carnivorous zooplankton, ammonium, nitrate, dissolved phosphorus, and particulate and dissolved carbon in the epilimnion and hypolimnion during a time horizon of one year. Several verification tests, including a Nash-Sutcliffe coefficient close to 1 (0.98), a data correlation coefficient of 0.98, and low standard errors (0.96), indicated that the model performs well. The results revealed that there were significant differences in the concentrations of the state variables in constant and variable stoichiometry simulations. Consequently, the consideration of variable stoichiometric ratios in algae and nutrient concentration simulations may be applied in future modeling studies to enhance the accuracy of the results and reduce the likelihood of inefficient control policies.
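    For reference, the Nash-Sutcliffe efficiency used in such verification tests can be computed as below; the observed/simulated series here are made-up illustrative values, not data from the Lake Ontario model.

```python
import numpy as np

def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe model efficiency: 1 minus the ratio of the residual
    sum of squares to the variance of the observations about their mean.
    NSE = 1 is a perfect fit; NSE <= 0 means the model predicts no better
    than the observed mean."""
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    residual = np.sum((observed - simulated) ** 2)
    variance = np.sum((observed - observed.mean()) ** 2)
    return 1.0 - residual / variance

obs = np.array([2.1, 3.4, 4.0, 3.2, 2.5, 2.0])
sim = np.array([2.0, 3.5, 3.9, 3.3, 2.6, 2.1])
print(round(nash_sutcliffe(obs, sim), 3))  # → 0.981
```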

  8. Calculus of one variable

    Grossman, Stanley I

    1986-01-01

    Calculus of One Variable, Second Edition presents the essential topics in the study of the techniques and theorems of calculus.The book provides a comprehensive introduction to calculus. It contains examples, exercises, the history and development of calculus, and various applications. Some of the topics discussed in the text include the concept of limits, one-variable theory, the derivatives of all six trigonometric functions, exponential and logarithmic functions, and infinite series.This textbook is intended for use by college students.

  9. Statistical variability of hydro-meteorological variables as indicators ...

    Statistical variability of hydro-meteorological variables as indicators of climate change in north-east Sokoto-Rima basin, Nigeria. ... water resources development including water supply projects, agriculture and tourism in the study area. Keywords: Climate change, Climatic variability, Actual evapotranspiration, Global warming ...

  10. Communications standards

    Stokes, A V

    1986-01-01

    Communications Standards deals with the standardization of computer communication networks. This book examines the types of local area networks (LANs) that have been developed and looks at some of the relevant protocols in more detail. The work of Project 802 is briefly discussed, along with a protocol which has developed from one of the LAN standards and is now a de facto standard in one particular area, namely the Manufacturing Automation Protocol (MAP). Factors that affect the usage of networks, such as network management and security, are also considered. This book is divided into three se

  11. Training Standardization

    Agnihotri, Newal

    2003-01-01

    The article describes the benefits of and required process and recommendations for implementing the standardization of training in the nuclear power industry in the United States and abroad. Current Information and Communication Technologies (ICT) enable training standardization in the nuclear power industry. The delivery of training through the Internet, Intranet and video over IP will facilitate this standardization and bring multiple benefits to the nuclear power industry worldwide. As the amount of available qualified and experienced professionals decreases because of retirements and fewer nuclear engineering institutions, standardized training will help increase the number of available professionals in the industry. Technology will make it possible to use the experience of retired professionals who may be interested in working part-time from a remote location. Well-planned standardized training will prevent a fragmented approach among utilities, and it will save the industry considerable resources in the long run. It will also ensure cost-effective and safe nuclear power plant operation

  12. Effluent standards

    Geisler, G C [Pennsylvania State University (United States)]

    1974-07-01

    At the conference there was a considerable interest in research reactor standards and effluent standards in particular. On the program, this is demonstrated by the panel discussion on effluents, the paper on argon 41 measured by Sims, and the summary paper by Ringle, et al. on the activities of ANS research reactor standards committee (ANS-15). As a result, a meeting was organized to discuss the proposed ANS standard on research reactor effluents (15.9). This was held on Tuesday evening, was attended by members of the ANS-15 committee who were present at the conference, participants in the panel discussion on the subject, and others interested. Out of this meeting came a number of excellent suggestions for changes which will increase the utility of the standard, and a strong recommendation that the effluent standard (15.9) be combined with the effluent monitoring standard. It is expected that these suggestions and recommendations will be incorporated and a revised draft issued for comment early this summer. (author)

  13. Nuclear standards

    Fichtner, N.; Becker, K.; Bashir, M.

    1981-01-01

    This compilation of all nuclear standards available to the authors by mid 1980 represents the third, carefully revised edition of a catalogue which was first published in 1975 as EUR 5362. In this third edition several changes have been made. The title has been condensed. The information has again been carefully updated, covering all changes regarding status, withdrawal of old standards, new projects, amendments, revisions, splitting of standards into several parts, combination of several standards into one, etc., as available to the authors by mid 1980. The speed with which information travels varies and requires in many cases rather tedious and cumbersome inquiries. Also, the classification scheme has been revised with the goal of better adjustment to changing situations and priorities. Whenever it turned out to be difficult to attribute a standard to a single subject category, multiple listings in all relevant categories have been made. As in previous editions, within the subcategories the standards are arranged by organization (in Category 2.1 by country) alphabetically and in ascending numerical order. It covers all relevant areas of power reactors, the fuel cycle, radiation protection, etc., from the basic laws and governmental regulations, regulatory guides, etc., all the way to voluntary industrial standards and codes of practice. (orig./HP)

  14. Standards and Professional Development

    Zengler, Cynthia J.

    2017-01-01

    The purpose of this paper is to describe the professional development that has taken place in conjunction with Ohio adopting the College and Career Readiness (CCR) Standards. The professional development (PD) has changed over time to include not only training on the new standards and lesson plans but training on the concepts defined in the…

  15. MATE standardization

    Farmer, R. E.

    1982-11-01

    The MATE (Modular Automatic Test Equipment) program was developed to combat the proliferation of unique, expensive ATE within the Air Force. MATE incorporates a standard management approach and a standard architecture designed to implement a cradle-to-grave approach to the acquisition of ATE and to significantly reduce the life cycle cost of weapons systems support. These standards are detailed in the MATE Guides. The MATE Guides assist both the Air Force and Industry in implementing the MATE concept, and provide the necessary tools and guidance required for successful acquisition of ATE. The guides also provide the necessary specifications for industry to build MATE-qualifiable equipment. The MATE architecture provides standards for all key interfaces of an ATE system. The MATE approach to the acquisition and management of ATE has been jointly endorsed by the commanders of Air Force Systems Command and Air Force Logistics Command as the way of doing business in the future.

  16. Standard NIM instrumentation system

    1990-05-01

    NIM is a standard modular instrumentation system that is in wide use throughout the world. As the NIM system developed and accommodations were made to a dynamic instrumentation field and a rapidly advancing technology, additions, revisions and clarifications were made. These were incorporated into the standard in the form of addenda and errata. This standard is a revision of the NIM document, AEC Report TID-20893 (Rev. 4) dated July 1974. It includes all the addenda and errata items that were previously issued as well as numerous additional items to make the standard current with modern technology and manufacturing practice

  17. European standards for composite construction

    Stark, J.W.B.

    2000-01-01

    The European Standards Organisation (CEN) has planned to develop a complete set of harmonized European building standards. This set includes standards for composite steel and concrete buildings and bridges. The Eurocodes, being the design standards, form part of this total system of European

  18. Surfing wave climate variability

    Espejo, Antonio; Losada, Iñigo J.; Méndez, Fernando J.

    2014-10-01

    International surfing destinations are highly dependent on specific combinations of wind-wave formation, thermal conditions and local bathymetry. Surf quality depends on a vast number of geophysical variables, and analyses of surf quality require the consideration of the seasonal, interannual and long-term variability of surf conditions on a global scale. A multivariable standardized index based on expert judgment is proposed for this purpose. This index makes it possible to analyze surf conditions objectively over a global domain. A summary of global surf resources based on a new index integrating existing wave, wind, tides and sea surface temperature databases is presented. According to general atmospheric circulation and swell propagation patterns, results show that west-facing low to middle-latitude coasts are more suitable for surfing, especially those in the Southern Hemisphere. Month-to-month analysis reveals strong seasonal variations in the occurrence of surfable events, enhancing the frequency of such events in the North Atlantic and the North Pacific. Interannual variability was investigated by comparing occurrence values with global and regional modes of low-frequency climate variability such as El Niño and the North Atlantic Oscillation, revealing their strong influence at both the global and the regional scale. Results of the long-term trends demonstrate an increase in the probability of surfable events on west-facing coasts around the world in recent years. The resulting maps provide useful information for surfers, the surf tourism industry and surf-related coastal planners and stakeholders.

  19. Variable collimator

    Richey, J.B.; McBride, T.R.; Covic, J.

    1979-01-01

    This invention describes an automatic variable collimator which controls the width and thickness of X-ray beams in X-ray diagnostic medical equipment, and which is particularly adapted for use with computerized axial tomographic scanners. A two-part collimator is provided which shapes an X-ray beam both prior to its entering an object subject to radiographic analysis and after the attenuated beam has passed through the object. Interposed between a source of radiation and the object subject to radiographic analysis is a first or source collimator. The source collimator causes the X-ray beam emitted by the source of radiation to be split into a plurality of generally rectangular shaped beams. Disposed within the source collimator is a movable aperture plate which may be used to selectively vary the thickness of the plurality of generally rectangular shaped beams transmitted through the source collimator. A second or receiver collimator is interposed between the object subject to radiographic analysis and a series of radiation detectors. The receiver collimator is disposed to receive the attenuated X-ray beams passing through the object subject to radiographic analysis. Located within the receiver collimator are a plurality of movable aperture plates adapted to be displaced relative to a plurality of fixed aperture plates for the purpose of varying the width and thickness of the attenuated X-ray beams transmitted through the object subject to radiographic analysis. The movable aperture plates of the source and receiver collimators are automatically controlled by circuitry which is provided to allow remote operation of the movable aperture plates

  20. Evaluating Living Standard Indicators

    Birčiaková Naďa

    2015-09-01

    This paper deals with the evaluation of selected available indicators of living standards, divided into three groups, namely economic, environmental, and social. We have selected six countries of the European Union for analysis: Bulgaria, the Czech Republic, Hungary, Luxembourg, France, and Great Britain. The aim of this paper is to evaluate indicators measuring living standards and suggest the most important factors which should be included in the final measurement. We have tried to determine what factors influence each indicator and what factors affect living standards. We have chosen regression analysis as our main method. From the study of factors, we can deduce their impact on living standards, and thus the value of indicators of living standards. Indicators with a high degree of reliability include the following factors: size and density of population, health care and spending on education. Emissions of carbon dioxide into the atmosphere are also relevant, though with a somewhat lower degree of reliability.

  1. Consistency Across Standards or Standards in a New Business Model

    Russo, Dane M.

    2010-01-01

    Presentation topics include: standards in a changing business model, the new National Space Policy is driving change, a new paradigm for human spaceflight, consistency across standards, the purpose of standards, danger of over-prescriptive standards, a balance is needed (between prescriptive and general standards), enabling versus inhibiting, characteristics of success-oriented standards, and conclusions. Additional slides cover NASA Procedural Requirements 8705.2B (which identifies human rating standards and requirements), draft health and medical standards for human rating, what's been done, government oversight models, examples of consistency from anthropometry, examples of inconsistency from air quality, and appendices of government and non-governmental human factors standards.

  2. Internal variables in thermoelasticity

    Berezovski, Arkadi

    2017-01-01

    This book describes an effective method for modeling advanced materials like polymers, composite materials and biomaterials, which are, as a rule, inhomogeneous. The thermoelastic theory with internal variables presented here provides a general framework for predicting a material’s reaction to external loading. The basic physical principles provide the primary theoretical information, including the evolution equations of the internal variables. The cornerstones of this framework are the material representation of continuum mechanics, a weak nonlocality, a non-zero extra entropy flux, and a consecutive employment of the dissipation inequality. Examples of thermoelastic phenomena are provided, accompanied by detailed procedures demonstrating how to simulate them.

  3. Frequency standards

    Riehle, Fritz

    2006-01-01

    Of all measurement units, frequency is the one that may be determined with the highest degree of accuracy. It equally allows precise measurements of other physical and technical quantities, whenever they can be measured in terms of frequency.This volume covers the central methods and techniques relevant for frequency standards developed in physics, electronics, quantum electronics, and statistics. After a review of the basic principles, the book looks at the realisation of commonly used components. It then continues with the description and characterisation of important frequency standards

  4. [Roaming through methodology. XXXVIII. Common misconceptions involving standard deviation and standard error]

    Mokkink, H.G.A.

    2002-01-01

    Standard deviation and standard error have a clear mutual relationship, but at the same time they differ strongly in the type of information they supply. This can lead to confusion and misunderstandings. Standard deviation describes the variability in a sample of measures of a variable, for instance
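    The distinction the abstract draws can be made concrete with a short sketch (the sample values are made up): the standard deviation measures the spread of individual measurements, while the standard error SE = SD/sqrt(n) measures the precision of the sample mean.

```python
import statistics
import math

sample = [4.8, 5.1, 5.0, 4.9, 5.3, 5.2, 4.7, 5.0]
n = len(sample)

sd = statistics.stdev(sample)  # variability of individual measurements
se = sd / math.sqrt(n)         # uncertainty of the sample mean

print(f"SD = {sd:.3f}, SE = {se:.3f}")  # SD = 0.200, SE = 0.071
```

    Quadrupling the sample size halves the standard error but leaves the standard deviation essentially unchanged, which is why the two are so often confused when reporting results.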

  5. Relevant Standards

    X.86: Ethernet over LAPS. Standard in China and India. G.7041: Generic Framing Procedure (GFP). Supports Ethernet as well as other data formats (e.g., Fibre Channel); Protocol of ... IEEE 802.3x for flow control of incoming Ethernet data ...

  6. Achieving Standardization

    Henningsson, Stefan

    2014-01-01

    International e-Customs is going through a standardization process. Driven by the need to increase control in the trade process to address security challenges stemming from threats of terrorists, diseases, and counterfeit products, and to lower the administrative burdens on traders to stay...

  7. Achieving Standardization

    Henningsson, Stefan

    2016-01-01

    International e-Customs is going through a standardization process. Driven by the need to increase control in the trade process to address security challenges stemming from threats of terrorists, diseases, and counterfeit products, and to lower the administrative burdens on traders to stay...

  8. Standard Fortran

    Marshall, N.H.

    1981-01-01

    Because of its vast software investment in Fortran programs, the nuclear community has an inherent interest in the evolution of Fortran. This paper reviews the impact of the new Fortran 77 standard and discusses the projected changes which can be expected in the future

  9. Generation of gaseous methanol reference standards

    Geib, R.C.

    1991-01-01

    Methanol has been proposed as an automotive fuel component. Reliable, accurate methanol standards are essential to support widespread monitoring programs. The monitoring programs may include quantification of methanol from tailpipe emissions, evaporative emissions, plus ambient air methanol measurements. This paper will present approaches and results in the author's investigation to develop high accuracy methanol standards. The variables upon which the authors will report results are as follows: (1) stability of methanol gas standards, the studies will focus on preparation requirements and stability results from 10 to 1,000 ppmv; (2) cylinder to instrument delivery system components and purge technique, these studies have dealt with materials in contact with the sample stream plus static versus flow injection; (3) optimization of gas chromatographic analytical system will be discussed; (4) gas chromatography and process analyzer results and utility for methanol analysis will be presented; (5) the accuracy of the methanol standards will be qualified using data from multiple studies including: (a) gravimetric preparation; (b) linearity studies; (c) independent standards sources such as low pressure containers and diffusion tubes. The accuracy will be provided as a propagation of error from multiple sources. The methanol target concentrations will be 10 to 500 ppmv

  10. Regional regression models of percentile flows for the contiguous United States: Expert versus data-driven independent variable selection

    Geoffrey Fouad

    2018-06-01

    New hydrological insights for the region: A set of three variables selected based on an expert assessment of factors that influence percentile flows performed similarly to larger sets of variables selected using a data-driven method. Expert assessment variables included mean annual precipitation, potential evapotranspiration, and baseflow index. Larger sets of up to 37 variables contributed little, if any, additional predictive information. Variables used to describe the distribution of basin data (e.g., standard deviation) were not useful, and average values were sufficient to characterize physical and climatic basin conditions. Effectiveness of the expert assessment variables may be due to the high degree of multicollinearity (i.e., cross-correlation) among the additional variables. A tool is provided in the Supplementary material to predict percentile flows based on the three expert assessment variables. Future work should develop new variables with a strong understanding of the processes related to percentile flows.
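    A regression of a percentile flow on the three expert-assessment variables can be sketched as follows. The basin values and flows below are entirely made up for illustration, and NumPy's ordinary least squares stands in for whatever regional regression method the study actually used.

```python
import numpy as np

# Hypothetical basin records: mean annual precipitation (mm),
# potential evapotranspiration (mm), and baseflow index (0-1).
X = np.array([
    [ 800.0, 600.0, 0.45],
    [1200.0, 550.0, 0.60],
    [ 400.0, 900.0, 0.30],
    [1500.0, 500.0, 0.70],
    [ 700.0, 750.0, 0.40],
    [1000.0, 650.0, 0.55],
])
# Hypothetical 50th-percentile flows (mm/yr) for the same basins.
y = np.array([180.0, 420.0, 40.0, 610.0, 120.0, 290.0])

# Add an intercept column and fit by ordinary least squares.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

predicted = A @ coef
print(np.round(predicted, 1))
```

    With only three predictors chosen for their physical relevance, the fit is easy to inspect; the abstract's point is that piling on dozens of cross-correlated basin descriptors adds little beyond this.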

  11. Trueness verification of actual creatinine assays in the European market demonstrates a disappointing variability that needs substantial improvement. An international study in the framework of the EC4 creatinine standardization working group.

    Delanghe, Joris R; Cobbaert, Christa; Galteau, Marie-Madeleine; Harmoinen, Aimo; Jansen, Rob; Kruse, Rolf; Laitinen, Päivi; Thienpont, Linda M; Wuyts, Birgitte; Weykamp, Cas; Panteghini, Mauro

    2008-01-01

    The European In Vitro Diagnostics (IVD) directive requires traceability to reference methods and materials of analytes. It is a task of the profession to verify the trueness of results and IVD compatibility. The results of a trueness verification study by the European Communities Confederation of Clinical Chemistry (EC4) working group on creatinine standardization are described, in which 189 European laboratories analyzed serum creatinine in a commutable serum-based material, using analytical systems from seven companies. Values were targeted using isotope dilution gas chromatography/mass spectrometry. Results were tested for compliance with a set of three criteria: trueness, i.e., no significant bias relative to the target value, between-laboratory variation and within-laboratory variation relative to the maximum allowable error. For the lower and intermediate level, values differed significantly from the target value in the Jaffe and the dry chemistry methods. At the high level, dry chemistry yielded higher results. Between-laboratory coefficients of variation ranged from 4.37% to 8.74%. Total error budget was mainly consumed by the bias. Non-compensated Jaffe methods largely exceeded the total error budget. Best results were obtained for the enzymatic method. The dry chemistry method consumed a large part of its error budget due to calibration bias. Despite the European IVD directive and the growing needs for creatinine standardization, an unacceptable inter-laboratory variation was observed, which was mainly due to calibration differences. The calibration variation has major clinical consequences, in particular in pediatrics, where reference ranges for serum and plasma creatinine are low, and in the estimation of glomerular filtration rate.

  12. Wavelength standards in the infrared

    Rao, KN

    2012-01-01

    Wavelength Standards in the Infrared is a compilation of wavelength standards suitable for use with high-resolution infrared spectrographs, including both emission and absorption standards. The book presents atomic line emission standards of argon, krypton, neon, and xenon. These atomic line emission standards are from the deliberations of Commission 14 of the International Astronomical Union, which is the recognized authority for such standards. The text also explains the techniques employed in determining spectral positions in the infrared. One of the techniques used includes the grating con

  13. Musculoskeletal ultrasound including definitions for ultrasonographic pathology

    Wakefield, RJ; Balint, PV; Szkudlarek, Marcin

    2005-01-01

    Ultrasound (US) has great potential as an outcome in rheumatoid arthritis trials for detecting bone erosions, synovitis, tendon disease, and enthesopathy. It has a number of distinct advantages over magnetic resonance imaging, including good patient tolerability and the ability to scan multiple joints in a short period of time. However, there are scarce data regarding its validity, reproducibility, and responsiveness to change, making interpretation and comparison of studies difficult. In particular, there are limited data describing standardized scanning methodology and standardized definitions of US pathologies. This article presents the first report from the OMERACT ultrasound special interest group, which has compared US against the criteria of the OMERACT filter. Also proposed for the first time are consensus US definitions for common pathological lesions seen in patients with inflammatory arthritis.

  14. (including travel dates) Proposed itinerary

    Ashok

    31 July to 22 August 2012 (including travel dates). Proposed itinerary: Arrival in Bangalore on 1 August. 1-5 August: Bangalore, Karnataka. Suggested institutions: Indian Institute of Science, Bangalore. St Johns Medical College & Hospital, Bangalore. Jawaharlal Nehru Centre, Bangalore. 6-8 August: Chennai, TN.

  15. Theory including future not excluded

    Nagao, K.; Nielsen, H.B.

    2013-01-01

    We study a complex action theory (CAT) whose path runs over not only the past but also the future. We show that, if we regard a matrix element defined in terms of the future state at time T and the past state at time TA as an expectation value in the CAT, then we are allowed to have the Heisenberg equation, Ehrenfest's theorem, and the conserved probability current density. In addition, we show that the expectation value at the present time t of a future-included theory for large T - t and large t - TA corresponds to that of a future-not-included theory with a proper inner product for large t - TA. Hence, the CAT...

  16. FINDING STANDARD DEVIATION OF A FUZZY NUMBER

    Fokrul Alom Mazarbhuiya

    2017-01-01

    Two probability laws can be the root of a possibility law. Considering two probability densities over two disjoint ranges, we can define the fuzzy standard deviation of a fuzzy variable with the help of the standard deviations of two random variables in two disjoint spaces.
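    The construction above can be sketched numerically. This is only an illustrative reading of the abstract, with uniform densities on disjoint intervals and an interval-valued result chosen by us for concreteness, not taken from the paper:

    ```python
    import math

    # Illustrative sketch: take two probability densities on disjoint ranges
    # (here, uniform densities -- our assumption), compute each one's ordinary
    # standard deviation, and report the pair as an interval-valued ("fuzzy")
    # standard deviation spanning the two crisp values.

    def uniform_std(a, b):
        """Standard deviation of a uniform density on [a, b]: (b - a) / sqrt(12)."""
        return (b - a) / math.sqrt(12.0)

    std_left = uniform_std(0.0, 2.0)    # density on the first disjoint range
    std_right = uniform_std(5.0, 9.0)   # density on the second disjoint range

    fuzzy_std = (min(std_left, std_right), max(std_left, std_right))
    print(fuzzy_std)
    ```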

  17. Implementation of IEC standard models for power system stability studies

    Margaris, Ioannis D.; Hansen, Anca D.; Soerensen, Poul [Technical Univ. of Denmark, Roskilde (Denmark). Dept. of Wind Energy; Bech, John; Andresen, Bjoern [Siemens Wind Power A/S, Brande (Denmark)

    2012-07-01

    This paper presents the implementation of the generic wind turbine generator (WTG) electrical simulation models proposed in the IEC 61400-27 standard which is currently in preparation. A general overview of the different WTG types is given while the main focus is on the Type 4B WTG standard model, namely a model for a variable speed wind turbine with a full scale power converter, including a 2-mass mechanical model. The generic models for fixed and variable speed WTGs are suitable for fundamental frequency positive sequence response simulations during short events in the power system such as voltage dips. The general configuration of the models is presented and discussed; model implementation in the simulation software platform DIgSILENT PowerFactory is presented in order to illustrate the range of applicability of the generic models under discussion. A typical voltage dip is simulated and results from the basic electrical variables of the WTG are presented and discussed. (orig.)

  18. Position paper on standardization

    1991-04-01

    The "NPOC Strategic Plan for Building New Nuclear Plants" creates a framework within which new standardized nuclear plants may be built. The Strategic Plan is an expression of the nuclear energy industry's serious intent to create the necessary conditions for new plant construction and operation. One of the key elements of the Strategic Plan is a comprehensive industry commitment to standardization: through design certification, combined license, first-of-a-kind engineering, construction, operation and maintenance of nuclear power plants. The NPOC plan proposes four stages of standardization in advanced light water reactors (ALWRs). The first stage is established by the ALWR Utility Requirements Document which specifies owner/operator requirements at a functional level covering all elements of plant design and construction, and many aspects of operations and maintenance. The second stage of standardization is that achieved in the NRC design certification. This certification level includes requirements, design criteria and bases, functional descriptions and performance requirements for systems to assure plant safety. The third stage of standardization, commercial standardization, carries the design to a level of completion beyond that required for design certification to enable the industry to achieve potential increases in efficiency and economy. The final stage of standardization is enhanced standardization beyond design. A standardized approach is being developed in construction practices, operating, maintenance training, and procurement practices. This comprehensive standardization program enables the NRC to proceed with design certification with the confidence that standardization beyond the regulations will be achieved. This confidence should answer the question of design detail required for design certification, and demonstrate that the NRC should require no further regulatory review beyond that required by 10 CFR Part 52.

  19. Soil variability in mountain areas

    Zanini, E.; Freppaz, M.; Stanchi, S.; Bonifacio, E.; Egli, M.

    2015-01-01

    The high spatial variability of soils is a relevant issue at local and global scales, and determines the complexity of soil ecosystem functions and services. This variability derives from strong dependencies of soil ecosystems on parent materials, climate, relief and biosphere, including human impact. Although present in all environments, the interactions of soils with these forming factors are particularly striking in mountain areas.

  20. Variability in human body size

    Annis, J. F.

    1978-01-01

    The range of variability found among homogeneous groups is described and illustrated. Those trends that show significantly marked differences between sexes and among a number of racial/ethnic groups are also presented. Causes of human-body size variability discussed include genetic endowment, aging, nutrition, protective garments, and occupation. The information is presented to aid design engineers of space flight hardware and equipment.

  1. An evaluation of FIA's stand age variable

    John D. Shaw

    2015-01-01

    The Forest Inventory and Analysis Database (FIADB) includes a large number of measured and computed variables. The definitions of measured variables are usually well-documented in FIA field and database manuals. Some computed variables, such as live basal area of the condition, are equally straightforward. Other computed variables, such as individual tree volume,...

  2. Device including a contact detector

    2011-01-01

    The present invention relates to a probe for determining an electrical property of an area of a surface of a test sample; the probe is intended to be in a specific orientation relative to the test sample. The probe may comprise a supporting body defining a first surface. A plurality of cantilever arms (12) may extend from the supporting body in co-planar relationship with the first surface. The plurality of cantilever arms (12) may extend substantially parallel to each other and each of the plurality of cantilever arms (12) may include an electrical conductive tip for contacting the area of the test sample by movement of the probe relative to the surface of the test sample into the specific orientation. The probe may further comprise a contact detector (14) extending from the supporting body arranged so as to contact the surface of the test sample prior to any one of the plurality...

  3. Several real variables

    Kantorovitz, Shmuel

    2016-01-01

    This undergraduate textbook is based on lectures given by the author on the differential and integral calculus of functions of several real variables. The book has a modern approach and includes topics such as: •The p-norms on vector space and their equivalence •The Weierstrass and Stone-Weierstrass approximation theorems •The differential as a linear functional; Jacobians, Hessians, and Taylor's theorem in several variables •The Implicit Function Theorem for a system of equations, proved via Banach’s Fixed Point Theorem •Applications to Ordinary Differential Equations •Line integrals and an introduction to surface integrals This book features numerous examples, detailed proofs, as well as exercises at the end of sections. Many of the exercises have detailed solutions, making the book suitable for self-study. Several Real Variables will be useful for undergraduate students in mathematics who have completed first courses in linear algebra and analysis of one real variable.

  4. The quest to standardize hemodialysis care.

    Hegbrant, Jörgen; Gentile, Giorgio; Strippoli, Giovanni F M

    2011-01-01

    A large global dialysis provider's core activities include providing dialysis care with excellent quality, ensuring a low variability across the clinic network and ensuring strong focus on patient safety. In this article, we summarize the pertinent components of the quality assurance and safety program of the Diaverum Renal Services Group. Concerning medical performance, the key components of a successful quality program are setting treatment targets; implementing evidence-based guidelines and clinical protocols; consistently, regularly, prospectively and accurately collecting data from all clinics in the network; processing collected data to provide feedback to clinics in a timely manner, incorporating information on interclinic and intercountry variations; and revising targets, guidelines and clinical protocols based on sound scientific data. The key activities for ensuring patient safety include a standardized approach to education, i.e. a uniform education program including control of theoretical knowledge and clinical competencies; implementation of clinical policies and procedures in the organization in order to reduce variability and potential defects in clinic practice; and auditing of clinical practice on a regular basis. By applying a standardized and systematic continuous quality improvement approach throughout the entire organization, it has been possible for Diaverum to progressively improve medical performance and ensure patient safety. Copyright © 2011 S. Karger AG, Basel.

  5. Manipulating continuous variable photonic entanglement

    Plenio, M.B.

    2005-01-01

    I will review our work on photonic entanglement in the continuous variable regime including both Gaussian and non-Gaussian states. The feasibility and efficiency of various entanglement purification protocols are discussed in this context. (author)

  6. SOFG: Standards requirements

    Gerganov, T.; Grigorov, S.; Kozhukharov, V.; Brashkova, N.

    2005-01-01

    It is well-known that Solid Oxide Fuel Cells will have industrial application in the near future. In this context, the problem of standardization of SOFC materials and SOFC systems is of high priority. In the present study the attention is focused on the methods for physical and chemical characterization of the materials for SOFC component fabrication and on requirements for single SOFC cell tests. The status of the CEN, ISO, ASTM (ANSI, ASSN) and JIS classes of standards has been verified. Standards regarding the test methods for physical-chemical characterization of vitreous materials (as the sealing SOFC component), ceramic materials (as electrode and electrolyte components, including alternative materials used) and metallic materials (interconnect components) are the subject of this overview. It is established that electrical, mechanical, surface and interfacial phenomena, chemical durability and thermal corrosion behaviour are the key areas for standardization of the materials for SOFC components.

  7. Comparing Standard Deviation Effects across Contexts

    Ost, Ben; Gangopadhyaya, Anuj; Schiman, Jeffrey C.

    2017-01-01

    Studies using tests scores as the dependent variable often report point estimates in student standard deviation units. We note that a standard deviation is not a standard unit of measurement since the distribution of test scores can vary across contexts. As such, researchers should be cautious when interpreting differences in the numerical size of…
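    The caution raised in this record can be made concrete with a toy calculation (numbers are ours, not from the study): the same raw score gain yields different "standard deviation unit" effect sizes when the score distribution differs across contexts.

    ```python
    # Illustrative sketch of why SD units are not a standard unit of measurement:
    # identical raw gains divided by context-specific standard deviations give
    # different effect sizes. All numbers are hypothetical.

    raw_gain = 5.0                      # same raw test-score gain in both contexts
    sd_context_a = 10.0                 # tightly clustered score distribution
    sd_context_b = 20.0                 # widely dispersed score distribution

    effect_a = raw_gain / sd_context_a  # effect size in context A
    effect_b = raw_gain / sd_context_b  # effect size in context B

    print(effect_a, effect_b)           # 0.5 vs 0.25 "student SD units"
    ```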

  8. An Integrated Biochemistry Laboratory, Including Molecular Modeling

    Wolfson, Adele J.; Hall, Mona L.; Branham, Thomas R.

    1996-11-01

    for lysozyme activity and a colorimetric one for protein concentration. Familiarity with the assays is reinforced by an independently designed project to modify a variable in one of these assays. The assay for lysozyme activity is that of Shugar (6), based on hydrolysis of a cell-wall suspension from the bacterium Micrococcus lysodeikticus, a substrate that is particularly sensitive to lysozyme. As the cell walls are broken down by the enzyme, the turbidity of the sample decreases. This decrease can be conveniently measured by following the decrease in absorbance at a wavelength of 450 nm, using a spectrophotometer or other device for measuring light scattering. The Bradford method (7), a standard assay, is used to determine protein concentration. Using the data from both lysozyme activity assays and protein concentration assays, students can calculate the specific activity for commercial lysozyme and an egg-white solution. These calculations clearly demonstrate the increase in specific activity with increasing purity, since the purified (commercial) preparation has a specific activity approximately 20-fold higher than that of the crude egg-white solution. Lysozyme Purification by Ion-Exchange Chromatography (5 weeks) As suggested by Strang (8), students can design a rational purification of lysozyme using ion-exchange chromatography when presented with information on the isoelectric point of the enzyme and the properties of ion-exchange resins. One week is spent discussing protein purification and the relative advantages and disadvantages of different resins. Each group has a choice of anion-exchange (DEAE) or cation-exchange (CM) resins. Because lysozyme is positively charged below a pH of 11, it will not be adsorbed to an anion-exchange resin, but will be adsorbed to the cation-exchange resin. Therefore, for the cation-exchange protocols, there are further options for methods of collecting and eluting the desired protein. A purification table, including
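    The specific-activity calculation the students perform can be sketched as follows. The activity and protein values are hypothetical stand-ins, chosen only so that the purified preparation comes out roughly 20-fold higher, matching the figure cited in the record:

    ```python
    # Sketch of a specific-activity calculation: specific activity is enzyme
    # activity per milligram of protein, and purification shows up as an
    # increase in specific activity. Numbers below are hypothetical.

    def specific_activity(activity_units, protein_mg):
        """Enzyme activity (units) divided by total protein (mg)."""
        return activity_units / protein_mg

    crude = specific_activity(activity_units=400.0, protein_mg=20.0)      # egg-white solution
    purified = specific_activity(activity_units=4000.0, protein_mg=10.0)  # commercial lysozyme

    fold_purification = purified / crude
    print(fold_purification)
    ```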

  9. Relationship of suicide rates with climate and economic variables in Europe during 2000-2012

    Fountoulakis, Konstantinos N; Chatzikosta, Isaia; Pastiadis, Konstantinos

    2016-01-01

    BACKGROUND: It is well known that suicide rates vary considerably among European countries and the reasons for this are unknown, although several theories have been proposed. The effect of economic variables has been extensively studied but not that of climate. METHODS: Data from 29 European countries covering the years 2000-2012 and concerning male and female standardized suicide rates (according to WHO), economic variables (according to the World Bank) and climate variables were gathered. The statistical analysis included cluster and principal component analysis and categorical regression. RESULTS: The derived models explained 62.4 % of the variability of male suicide rates. Economic variables alone explained 26.9 % and climate variables 37.6 %. For females, the respective figures were 41.7, 11.5 and 28.1 %. Male suicides correlated with high unemployment rate in the frame of high growth rate and high...

  10. Classification of decays involving variable decay chains with convolutional architectures

    CERN. Geneva

    2018-01-01

    We present a technique to perform classification of decays that exhibit decay chains involving a variable number of particles, which include a broad class of $B$ meson decays sensitive to new physics. The utility of such decays as a probe of the Standard Model is dependent upon accurate determination of the decay rate, which is challenged by the combinatorial background arising in high-multiplicity decay modes. In our model, each particle in the decay event is represented as a fixed-dimensional vector of feature attributes, forming an $n \times k$ representation of the event, where $n$ is the number of particles in the event and $k$ is the dimensionality of the feature vector. A convolutional architecture is used to capture dependencies between the embedded particle representations and perform the final classification. The proposed model outperforms standard machine learning approaches based on Monte Carlo studies across a range of variable final-state decays with the Belle II det...
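    The n x k event representation and the convolution over the particle axis can be sketched in plain NumPy. This is not the authors' model: the filter weights are random placeholders, and the widths and feature counts are our assumptions, chosen only to show how pooling yields a fixed-size vector regardless of the number of particles n:

    ```python
    import numpy as np

    # Minimal sketch: an event is an (n, k) matrix of particle feature vectors.
    # A 1-D convolution slides width-w filters along the particle axis, mixing
    # neighbouring particle embeddings; global max pooling then produces a
    # fixed-size vector for a downstream classifier, independent of n.

    rng = np.random.default_rng(0)

    def conv1d_event(event, filters):
        """event: (n, k); filters: (f, w, k) -> feature map of shape (n - w + 1, f)."""
        n, k = event.shape
        f, w, _ = filters.shape
        out = np.empty((n - w + 1, f))
        for i in range(n - w + 1):
            window = event[i:i + w]   # (w, k) slice of consecutive particles
            # Contract each filter against the window over both axes.
            out[i] = np.tensordot(filters, window, axes=([1, 2], [0, 1]))
        return out

    event = rng.normal(size=(7, 4))        # 7 particles, 4 features each (hypothetical)
    filters = rng.normal(size=(8, 3, 4))   # 8 random filters of width 3

    feature_map = conv1d_event(event, filters)
    pooled = feature_map.max(axis=0)       # global max pool -> fixed-size vector
    print(feature_map.shape, pooled.shape)
    ```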

  11. Beyond the standard model

    Cuypers, F.

    1997-05-01

    These lecture notes are intended as a pedagogical introduction to several popular extensions of the standard model of strong and electroweak interactions. The topics include the Higgs sector, the left-right symmetric model, grand unification and supersymmetry. Phenomenological consequences and search procedures are emphasized. (author) figs., tabs., 18 refs

  12. Standardized terminology in electronic

    Pallas, R.

    1985-01-01

    The correct definitions of the most usual terms on standardization and homologation are given. Then the factors concerning the safety of electrical equipment are reviewed, as they are considered in most present legislation. Finally, the status of homologations in Spain and Europe is included. (author)

  13. Investigation of load reduction for a variable speed, variable pitch, and variable coning wind turbine

    Pierce, K. [Univ. of Utah, Salt Lake City, UT (United States)

    1997-12-31

    A two bladed, variable speed and variable pitch wind turbine was modeled using ADAMS® to evaluate load reduction abilities of a variable coning configuration as compared to a teetered rotor, and also to evaluate control methods. The basic dynamic behavior of the variable coning turbine was investigated and compared to the teetered rotor under constant wind conditions as well as turbulent wind conditions. Results indicate the variable coning rotor has larger flap oscillation amplitudes and much lower root flap bending moments than the teetered rotor. Three methods of control were evaluated for turbulent wind simulations. These were a standard IPD control method, a generalized predictive control method, and a bias estimate control method. Each control method was evaluated for both the variable coning configuration and the teetered configuration. The ability of the different control methods to maintain the rotor speed near the desired set point is evaluated from the RMS error of rotor speed. The activity of the control system is evaluated from cycles per second of the blade pitch angle. All three of the methods were found to produce similar results for the variable coning rotor and the teetered rotor, as well as similar results to each other.
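    The two evaluation metrics named in this record can be sketched on synthetic signals. The signals, sample rate, and set point below are stand-ins, and counting pitch "cycles" as sign changes of the pitch rate is our assumption about how such activity is tallied:

    ```python
    import numpy as np

    # Sketch of the two metrics: RMS error of rotor speed about its set point,
    # and control activity as pitch-angle cycles per second. All signals are
    # synthetic stand-ins for simulation output.

    dt = 0.05                                        # 20 Hz samples (hypothetical)
    t = np.arange(0.0, 60.0, dt)
    set_point = 4.0                                  # rotor speed set point [rad/s]
    rotor_speed = set_point + 0.1 * np.sin(0.5 * t)  # stand-in for simulated speed
    pitch = 2.0 + 0.5 * np.sin(2.0 * t)              # stand-in pitch angle [deg]

    # Metric 1: RMS error of rotor speed about the set point.
    rms_error = np.sqrt(np.mean((rotor_speed - set_point) ** 2))

    # Metric 2: pitch activity, counting two sign changes of the pitch rate
    # (one peak plus one trough) as one full cycle.
    pitch_rate = np.diff(pitch)
    sign_changes = np.sum(np.diff(np.sign(pitch_rate)) != 0)
    cycles_per_second = sign_changes / 2.0 / t[-1]

    print(rms_error, cycles_per_second)
    ```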

  14. Imaging Variable Stars with HST

    Karovska, M.

    2012-06-01

    (Abstract only) The Hubble Space Telescope (HST) observations of astronomical sources, ranging from objects in our solar system to objects in the early Universe, have revolutionized our knowledge of the Universe, its origins and contents. I highlight results from HST observations of variable stars obtained during the past twenty or so years. Multiwavelength observations of numerous variable stars and stellar systems were obtained using the superb HST imaging capabilities and its unprecedented angular resolution, especially in the UV and optical. The HST provided the first detailed images probing the structure of variable stars including their atmospheres and circumstellar environments. AAVSO observations and light curves have been critical for scheduling of many of these observations and provided important information and context for understanding of the imaging results of many variable sources. I describe the scientific results from the imaging observations of variable stars including AGBs, Miras, Cepheids, semiregular variables (including supergiants and giants), YSOs and interacting stellar systems with variable stellar components. These results have led to an unprecedented understanding of the spatial and temporal characteristics of these objects and their place in the stellar evolutionary chains, and in the larger context of the dynamic evolving Universe.

  15. ISO radiation sterilization standards

    Lambert, Byron J.; Hansen, Joyce M.

    1998-01-01

    This presentation provides an overview of the current status of the ISO radiation sterilization standards. The ISO standards are voluntary standards which detail both the validation and routine control of the sterilization process. ISO 11137 was approved in 1994 and published in 1995. When reviewing the standard you will note that less than 20% of the standard is devoted to requirements and the remainder is guidance on how to comply with the requirements. Future standards developments in radiation sterilization are being focused on providing additional guidance. The guidance that is currently provided in informative annexes of ISO 11137 includes: device/packaging materials, dose setting methods, and dosimeters and dose measurement. Currently, four Technical Reports are being developed to provide additional guidance: 1. AAMI Draft TIR, 'Radiation Sterilization Material Qualification' 2. ISO TR 13409-1996, 'Sterilization of health care products - Radiation sterilization - Substantiation of 25 kGy as a sterilization dose for small or infrequent production batches' 3. ISO Draft TR, 'Sterilization of health care products - Radiation sterilization - Selection of a sterilization dose for a single production batch' 4. ISO Draft TR, 'Sterilization of health care products - Radiation sterilization - Product Families, Plans for Sampling and Frequency of Dose Audits'

  16. Grand unified models including extra Z bosons

    Li Tiezhong

    1989-01-01

    The grand unified theories (GUT) of simple Lie groups including extra Z bosons are discussed. Under the author's hypothesis there are only the SU(5+m), SO(6+4n) and E6 groups. A general discussion of SU(5+m) is given, then SU(6) and SU(7) are considered. In SU(6) the 15+6*+6* fermion representations are used, which are not the same as others in fermion content, Yukawa couplings and breaking scales. A concept of clans of particles, which are not families, is suggested. These clans consist of extra Z bosons and the corresponding fermions at that scale. All of the fermions in the clans are down quarks, except for the standard model clan, which consists of Z bosons and 15 fermions; therefore, the spectrum of the hadrons composed of these down quarks differs from that of hadrons known at present.

  17. Evaluation of and quality assurance in HER2 analysis in breast carcinomas from patients registered in Danish Breast Cancer Group (DBCG) in the period of 2002-2006. A nationwide study including correlation between HER-2 status and other prognostic variables.

    Rasmussen, Birgitte Bruun; Andersson, Michael; Christensen, Ib J; Møller, Susanne

    2008-01-01

    In Denmark, analysis for HER2 is situated in the pathology laboratories dealing with breast pathology. The analysis was introduced during the late 1990s, and was gradually intensified in the following years up to now. The present study deals with the experience with the analysis during the last 5 years, from 2002 to 2006. All patients registered in DBCG (Danish Breast Cancer Group) and with a HER2 test were included. The analysis followed international recommendations, with an initial immunohistochemical (IHC) analysis with a semiquantitative grading of the reaction in four grades: 0 and 1+, defined as HER2-negative; 2+, equivocal; and 3+, HER2-positive. In the 2+ group, a FISH test was applied to identify the presence of gene amplification, defined as a ratio ≥2. We investigated the number of analyses performed, the number of positive cases and the relation between the result of IHC and the result of FISH. Furthermore we looked at the relation to other prognostic factors. The number of analyses gradually increased during the years of investigation, from 30% of patients in 2002 to 71% in 2006. The increase was seen in all laboratories, resulting in all laboratories but one having a substantial number of analyses. Sixty-two percent of all cases were HER2-negative, 18% were equivocal and 21% positive in the IHC analysis. Of the 2+, equivocal cases, 23% had gene amplification. Thus, 23% of patients were defined as HER2-positive and eligible for treatment with trastuzumab. There was a significant correlation to other prognostic factors. The results are in accordance with what is found elsewhere. The quality of the test is further assured by all laboratories participating in external quality assurance schemes.
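    The two-stage testing algorithm described in this record can be written out as a small decision function. The function name and grade encoding are ours; the logic (0/1+ negative, 3+ positive, 2+ resolved by FISH with amplification at a ratio of 2 or above) follows the record:

    ```python
    # Sketch of the HER2 testing algorithm: IHC grading first, with equivocal
    # 2+ cases reflex-tested by FISH for gene amplification (ratio >= 2).

    def her2_status(ihc_grade, fish_ratio=None):
        """Return 'positive' or 'negative' from an IHC grade and optional FISH ratio."""
        if ihc_grade in ("0", "1+"):
            return "negative"
        if ihc_grade == "3+":
            return "positive"
        if ihc_grade == "2+":
            if fish_ratio is None:
                raise ValueError("2+ (equivocal) cases require a FISH ratio")
            return "positive" if fish_ratio >= 2.0 else "negative"
        raise ValueError("unknown IHC grade: %r" % (ihc_grade,))

    print(her2_status("2+", fish_ratio=2.4))  # amplified equivocal case
    ```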

  18. Groundwater level responses to precipitation variability in Mediterranean insular aquifers

    Lorenzo-Lacruz, Jorge; Garcia, Celso; Morán-Tejeda, Enrique

    2017-09-01

    Groundwater is one of the largest and most important sources of fresh water on many regions under Mediterranean climate conditions, which are exposed to large precipitation variability that includes frequent meteorological drought episodes, and present high evapotranspiration rates and water demand during the dry season. The dependence on groundwater increases in those areas with predominant permeable lithologies, contributing to aquifer recharge and the abundance of ephemeral streams. The increasing pressure of tourism on water resources in many Mediterranean coastal areas, and uncertainty related to future precipitation and water availability, make it urgent to understand the spatio-temporal response of groundwater bodies to precipitation variability, if sustainable use of the resource is to be achieved. We present an assessment of the response of aquifers to precipitation variability based on correlations between the Standardized Precipitation Index (SPI) at various time scales and the Standardized Groundwater Index (SGI) across a Mediterranean island. We detected three main responses of aquifers to accumulated precipitation anomalies: (i) at short time scales of the SPI (24 months). The differing responses were mainly explained by differences in lithology and the percentage of highly permeable rock strata in the aquifer recharge areas. We also identified differences in the months and seasons when aquifer storages are more dependent on precipitation; these were related to climate seasonality and the degree of aquifer exploitation or underground water extraction. The recharge of some aquifers, especially in mountainous areas, is related to precipitation variability within a limited spatial extent, whereas for aquifers located in the plains, precipitation variability influence much larger areas; the topography and geological structure of the island explain these differences. 
Results indicate large spatial variability in the response of aquifers to precipitation in
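    The correlation approach between standardized precipitation and groundwater indices can be sketched on synthetic series. Both indices are reduced here to simple z-scores (a simplification of the actual SPI/SGI fitting procedures), and the groundwater series is built as a lagged, noisy copy of precipitation so that the correlation peaks at a nonzero lag; all data and the 6-month lag are our assumptions:

    ```python
    import numpy as np

    # Sketch: standardize monthly precipitation and groundwater-level series,
    # then scan lagged correlations to find the delay at which the aquifer
    # responds most strongly to precipitation anomalies. Data are synthetic.

    rng = np.random.default_rng(42)

    def standardize(x):
        """Z-score stand-in for the SPI/SGI standardization step."""
        return (x - x.mean()) / x.std()

    months = 240
    precip_anomaly = rng.normal(size=months)
    lag = 6                                             # hypothetical response lag [months]
    groundwater = np.roll(precip_anomaly, lag) + 0.3 * rng.normal(size=months)

    spi = standardize(precip_anomaly)
    sgi = standardize(groundwater)

    # Correlate SGI against SPI shifted by 1..12 months and pick the best lag.
    corr_at = {k: np.corrcoef(spi[:-k], sgi[k:])[0, 1] for k in range(1, 13)}
    best_lag = max(corr_at, key=corr_at.get)
    print(best_lag)
    ```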

  19. Variable geometry Darrieus wind machine

    Pytlinski, J. T.; Serrano, D.

    1983-08-01

    A variable geometry Darrieus wind machine is proposed. The lower attachment of the blades to the rotor can move freely up and down the axle, allowing the blades to change shape during rotation. Experimental data for a 17 m diameter Darrieus rotor and a theoretical model for multiple streamtube performance prediction were used to develop a computer simulation program for studying parameters that affect the machine's performance. This new variable geometry concept is described and interrelated with multiple streamtube theory through aerodynamic parameters. The computer simulation study shows that governor behavior of a Darrieus turbine cannot be attained by a standard turbine operating within normally occurring rotational velocity limits. A second generation variable geometry Darrieus wind turbine which uses a telescopic blade is proposed as a potential improvement on the studied concept.

  20. Standardization of depression measurement

    Wahl, Inka; Löwe, Bernd; Bjørner, Jakob

    2014-01-01

    OBJECTIVES: To provide a standardized metric for the assessment of depression severity to enable comparability among results of established depression measures. STUDY DESIGN AND SETTING: A common metric for 11 depression questionnaires was developed applying item response theory (IRT) methods. Data of 33,844 adults were used for secondary analysis, including routine assessments of 23,817 in- and outpatients with mental and/or medical conditions (46% with depressive disorders) and a general population sample of 10,027 randomly selected participants from three representative German household surveys. RESULTS: A standardized metric for depression severity was defined by 143 items, and scores were normed to a general population mean of 50 (standard deviation = 10) for easy interpretability. It covers the entire range of depression severity assessed by established instruments. The metric allows...
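    The norming step described here (general population mean 50, standard deviation 10) is a T-score rescaling and can be sketched directly. The latent severity scores below are random stand-ins, not IRT estimates from the study:

    ```python
    import numpy as np

    # Sketch of norming a severity score to the T-score metric: rescale so the
    # reference (general population) sample has mean 50 and SD 10. The latent
    # scores here are synthetic stand-ins for IRT trait estimates.

    rng = np.random.default_rng(7)
    theta = rng.normal(loc=0.0, scale=1.0, size=1000)   # stand-in latent scores

    pop_mean, pop_sd = theta.mean(), theta.std()
    t_scores = 50.0 + 10.0 * (theta - pop_mean) / pop_sd

    print(round(t_scores.mean(), 3), round(t_scores.std(), 3))
    ```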

  1. The Dynamics of Standardization

    Brunsson, Nils; Rasche, Andreas; Seidl, David

    2012-01-01

    This paper suggests that when the phenomenon of standards and standardization is examined from the perspective of organization studies, three aspects stand out: the standardization of organizations, standardization by organizations and standardization as (a form of) organization. Following a comp...

  2. IAEA Safety Standards

    2016-09-01

    The IAEA Safety Standards Series comprises publications of a regulatory nature covering nuclear safety, radiation protection, radioactive waste management, the transport of radioactive material, the safety of nuclear fuel cycle facilities and management systems. These publications are issued under the terms of Article III of the IAEA’s Statute, which authorizes the IAEA to establish “standards of safety for protection of health and minimization of danger to life and property”. Safety standards are categorized into: • Safety Fundamentals, stating the basic objective, concepts and principles of safety; • Safety Requirements, establishing the requirements that must be fulfilled to ensure safety; and • Safety Guides, recommending measures for complying with these requirements for safety. For numbering purposes, the IAEA Safety Standards Series is subdivided into General Safety Requirements and General Safety Guides (GSR and GSG), which are applicable to all types of facilities and activities, and Specific Safety Requirements and Specific Safety Guides (SSR and SSG), which are for application in particular thematic areas. This booklet lists all current IAEA Safety Standards, including those forthcoming.

  3. Including gauge corrections to thermal leptogenesis

    Huetig, Janine

    2013-05-17

    This thesis provides the first approach of a systematic inclusion of gauge corrections to leading order to the ansatz of thermal leptogenesis. We have derived a complete expression for the integrated lepton number matrix including all resummations needed. For this purpose, a new class of diagram has been invented, namely the cylindrical diagram, which allows diverse investigations into the topic of leptogenesis such as the case of resonant leptogenesis. After a brief introduction of the topic of the baryon asymmetry in the universe and a discussion of its most promising solutions as well as their advantages and disadvantages, we have presented our framework of thermal leptogenesis. An effective model was described as well as the associated Feynman rules. The basis for using nonequilibrium quantum field theory has been built in chapter 3. At first, the main definitions have been presented for equilibrium thermal field theory, afterwards we have discussed the Kadanoff-Baym equations for systems out of equilibrium using the example of the Majorana neutrino. The equations have also been solved in the context of leptogenesis in chapter 4. Since gauge corrections play a crucial role throughout this thesis, we have also repeated the naive ansatz by replacing the free equilibrium propagator by propagators including thermal damping rates due to the Standard Model damping widths for lepton and Higgs fields. It is shown that this leads to a comparable result to the solutions of the Boltzmann equations for thermal leptogenesis. Thus it becomes obvious that Standard Model corrections are not negligible for thermal leptogenesis and therefore need to be included systematically from first principles. In order to achieve this we have started discussing the calculation of ladder rung diagrams for Majorana neutrinos using the HTL and the CTL approach in chapter 5. All gauge corrections are included in this framework and thus it has become the basis for the following considerations

  5. Variable importance in latent variable regression models

    Kvalheim, O.M.; Arneberg, R.; Bleie, O.; Rajalahti, T.; Smilde, A.K.; Westerhuis, J.A.

    2014-01-01

    The quality and practical usefulness of a regression model are a function of both interpretability and prediction performance. This work presents some new graphical tools for improved interpretation of latent variable regression models that can also assist in improved algorithms for variable

  6. Variability in Measured Space Temperatures in 60 Homes

    Roberts, D.; Lay, K.

    2013-03-01

    This report discusses the observed variability in indoor space temperature in a set of 60 homes located in Florida, New York, Oregon, and Washington. Temperature data were collected at 15-minute intervals for an entire year, including living room, master bedroom, and outdoor air temperature (Arena et al.). The data were examined to establish the average living room temperature for the set of homes for the heating and cooling seasons, the variability of living room temperature depending on climate, and the variability of indoor space temperature within the homes. The accuracy of software-based energy analysis depends on the accuracy of input values. Thermostat set point is one of the most influential inputs for building energy simulation. Several industry standards exist that recommend differing default thermostat settings for heating and cooling seasons. These standards were compared to the values calculated for this analysis. The data examined for this report show that there is a definite difference between the climates and that the data do not agree well with any particular standard.

  7. An Efficient Method for Synthesis of Planar Multibody Systems including Shape of Bodies as Design Variables

    Hansen, Michael R.; Hansen, John Michael

    1998-01-01

    A point contact joint has been developed and implemented in a joint coordinate based planar multibody dynamics analysis program that also supports revolute and translational joints. Further, a segment library for the definition of the contours of the point contact joints has been integrated...

  8. Enhancing the efficacy of treatment for temporomandibular patients with muscular diagnosis through cognitive-behavioral intervention, including hypnosis: a randomized study.

    Ferrando, Maite; Galdón, María José; Durá, Estrella; Andreu, Yolanda; Jiménez, Yolanda; Poveda, Rafael

    2012-01-01

    This study evaluated the efficacy of a cognitive-behavioral therapy (CBT), including hypnosis, in patients with temporomandibular disorders (TMDs) with muscular diagnosis. Seventy-two patients (65 women and 7 men with an average age of 39 years) were selected according to the Research Diagnostic Criteria for TMD, and assigned to the experimental group (n = 41), receiving the 6-session CBT program, and the control group (n = 31). All patients received conservative standard treatment for TMD. The assessment included pain variables and psychologic distress. There were significant differences between the groups, the experimental group showing a higher improvement in the variables evaluated. Specifically, 90% of the patients under CBT reported a significant reduction in frequency of pain and 70% in emotional distress. The improvement was stable over time, with no significant differences between posttreatment and 9-month follow-up. CBT, including hypnosis, significantly improved conservative standard treatment outcome in TMD patients. Copyright © 2012 Elsevier Inc. All rights reserved.

  9. A Framework for Categorizing Important Project Variables

    Parsons, Vickie S.

    2003-01-01

    While substantial research has led to theories concerning the variables that affect project success, no universal set of such variables has been acknowledged as the standard. The identification of a specific set of controllable variables is needed to minimize project failure. Much has been hypothesized about the need to match project controls and management processes to individual projects in order to increase the chance for success. However, an accepted taxonomy for facilitating this matching process does not exist. This paper surveyed existing literature on classification of project variables. After an analysis of those proposals, a simplified categorization is offered to encourage further research.

  10. Classification and prediction of port variables

    Molina Serrano, B.

    2016-07-01

    Many variables are involved in the planning and management of port terminals: economic, social, environmental and institutional. Agents need to know the relationships between these variables in order to modify planning conditions. The use of Bayesian networks allows these variables to be classified, predicted and diagnosed. Bayesian networks allow the posterior probability of unknown variables to be estimated from known variables. At the planning level, this means that it is not necessary to know all variables, because their relationships are known. The agent can thus learn how port variables are connected, which can be interpreted as cause-effect relationships. Bayesian networks can also be used to make optimal decisions by introducing possible actions and the utility of their results. In the proposed methodology, a database of more than 40 port variables was generated. They were classified as economic, social, environmental and institutional variables, in the same way as the smart-port studies of the Spanish port system. From this database, a network was generated using a directed acyclic graph, which reveals the relationships between port variables (parent-child relationships). The resulting network shows that, in cause-effect terms, economic variables are the cause of the other variable typologies; economic variables play the parent role in most cases. Moreover, when environmental variables are known, the network allows the posterior probability of the social variables to be estimated. It is concluded that Bayesian networks make it possible to model uncertainty in a probabilistic way, even when the number of variables is high, as occurs in the planning and management of port terminals. (Author)
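The inference step the abstract describes, estimating the posterior probability of unknown variables from known ones, can be sketched on a toy network. The chain below and all of its probability numbers are hypothetical, chosen only to illustrate the mechanics; the paper's actual network has over 40 variables.

```python
import numpy as np

# Toy illustration (hypothetical numbers, not the paper's network):
# a chain  Economic -> Environmental -> Social,  each variable binary.
# Knowing the Environmental state lets us update the probability of the
# Social state, i.e. "estimate the posterior probability of unknown
# variables from known variables".
p_econ = np.array([0.6, 0.4])                 # P(E): good / bad economy
p_env_given_econ = np.array([[0.8, 0.2],      # P(V | E)
                             [0.3, 0.7]])
p_soc_given_env = np.array([[0.9, 0.1],       # P(S | V)
                            [0.4, 0.6]])

# Joint P(E, V, S) by the chain rule, then condition on V = good (index 0).
joint = (p_econ[:, None, None] *
         p_env_given_econ[:, :, None] *
         p_soc_given_env[None, :, :])
p_soc_given_good_env = joint[:, 0, :].sum(axis=0) / joint[:, 0, :].sum()
print(p_soc_given_good_env)  # here S depends only on V, so this is P(S | V=good)
```

Because the toy chain makes Social depend only on Environmental, the posterior collapses to the conditional table row; in a real network with converging arcs, the same enumerate-and-normalize step mixes evidence from several parents.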

  11. Spontaneous temporal changes and variability of peripheral nerve conduction analyzed using a random effects model

    Krøigård, Thomas; Gaist, David; Otto, Marit

    2014-01-01

    SUMMARY: The reproducibility of variables commonly included in studies of peripheral nerve conduction in healthy individuals has not previously been analyzed using a random effects regression model. We examined the temporal changes and variability of standard nerve conduction measures in the leg...... reexamined after 2 and 26 weeks. There was no change in the variables except for a minor decrease in sural nerve sensory action potential amplitude and a minor increase in tibial nerve minimal F-wave latency. Reproducibility was best for peroneal nerve distal motor latency and motor conduction velocity......, sural nerve sensory conduction velocity, and tibial nerve minimal F-wave latency. Between-subject variability was greater than within-subject variability. Sample sizes ranging from 21 to 128 would be required to show changes twice the magnitude of the spontaneous changes observed in this study. Nerve...

  12. Efficiency of a new internal combustion engine concept with variable piston motion

    Dorić Jovan Ž.

    2014-01-01

    Full Text Available This paper presents a simulation of the working process in a new IC engine concept. The main feature of this new IC engine concept is the realization of variable movement of the piston. With this unconventional piston movement it is easy to provide variable compression ratio, variable displacement and combustion at constant volume. These advantages over the standard piston mechanism are achieved through synthesis of the two pairs of non-circular gears. The presented mechanism is designed to obtain a specific motion law which provides lower fuel consumption of IC engines. For this paper the Ricardo/WAVE software was used, which provides a fully integrated treatment of time-dependent fluid dynamics and thermodynamics by means of a one-dimensional formulation. The results obtained herein include the efficiency characteristic of this new heat engine concept. The results show that combustion at constant volume, variable compression ratio and variable displacement have a significant impact on the improvement of fuel consumption.

  13. Understanding Solar Cycle Variability

    Cameron, R. H.; Schüssler, M., E-mail: cameron@mps.mpg.de [Max-Planck-Institut für Sonnensystemforschung, Justus-von-Liebig-Weg 3, D-37077 Göttingen (Germany)

    2017-07-10

    The level of solar magnetic activity, as exemplified by the number of sunspots and by energetic events in the corona, varies on a wide range of timescales. Most prominent is the 11-year solar cycle, which is significantly modulated on longer timescales. Drawing from dynamo theory, together with the empirical results of past solar activity and similar phenomena for solar-like stars, we show that the variability of the solar cycle can be essentially understood in terms of a weakly nonlinear limit cycle affected by random noise. In contrast to ad hoc “toy models” for the solar cycle, this leads to a generic normal-form model, whose parameters are all constrained by observations. The model reproduces the characteristics of the variable solar activity on timescales between decades and millennia, including the occurrence and statistics of extended periods of very low activity (grand minima). Comparison with results obtained with a Babcock–Leighton-type dynamo model confirms the validity of the normal-form approach.
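A "weakly nonlinear limit cycle affected by random noise" can be illustrated with the generic Hopf normal form. The sketch below is a minimal toy integration, not the authors' observationally constrained model; all parameter values (gamma, omega, sigma) are arbitrary choices for illustration.

```python
import numpy as np

# Minimal sketch (not the paper's calibrated model): a weakly nonlinear
# limit cycle driven by random noise, written as the generic normal form
#   dz = [(gamma + i*omega) z - |z|^2 z] dt + sigma dW.
# Deterministically, the amplitude |z| settles at sqrt(gamma); the noise
# makes the "activity level" wander around that value, modulating the cycle.
rng = np.random.default_rng(42)
gamma, omega, sigma = 1.0, 2 * np.pi / 11.0, 0.1   # toy 11-"year" cycle
dt, nsteps = 0.02, 50_000

z = 0.1 + 0.0j
amps = np.empty(nsteps)
for k in range(nsteps):
    noise = sigma * np.sqrt(dt) * (rng.standard_normal() +
                                   1j * rng.standard_normal())
    z += ((gamma + 1j * omega) * z - abs(z) ** 2 * z) * dt + noise
    amps[k] = abs(z)

# Late-time amplitude fluctuates around sqrt(gamma) = 1.
print(round(float(amps[nsteps // 2:].mean()), 1))
```

Varying sigma shows the qualitative point of the abstract: stronger noise produces stronger cycle-to-cycle modulation and occasional excursions to very low amplitude, the analogue of grand minima.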

  14. Zγ production at NNLO including anomalous couplings

    Campbell, John M.; Neumann, Tobias; Williams, Ciaran

    2017-11-01

    In this paper we present a next-to-next-to-leading order (NNLO) QCD calculation of the processes pp → l⁺l⁻γ and pp → νν̄γ that we have implemented in MCFM. Our calculation includes QCD corrections at NNLO both for the Standard Model (SM) and additionally in the presence of Zγγ and ZZγ anomalous couplings. We compare our implementation, obtained using the jettiness slicing approach, with a previous SM calculation and find broad agreement. Focusing on the sensitivity of our results to the slicing parameter, we show that using our setup we are able to compute NNLO cross sections with numerical uncertainties of about 0.1%, which is small compared to residual scale uncertainties of a few percent. We study potential improvements using two different jettiness definitions and the inclusion of power corrections. At √s = 13 TeV we present phenomenological results and consider Zγ as a background to H → Zγ production. We find that, with typical cuts, the inclusion of NNLO corrections represents a small effect and loosens the extraction of limits on anomalous couplings by about 10%.

  15. Mechanics of deformations in terms of scalar variables

    Ryabov, Valeriy A.

    2017-05-01

    Theory of particle and continuous mechanics is developed which allows a treatment of pure deformation in terms of the set of variables "coordinate-momentum-force" instead of the standard treatment in terms of the tensor-valued variables "strain-stress." This approach is quite natural for a microscopic description of an atomic system, according to which only pointwise forces caused by the stress act on atoms, making a body deform. The new concept starts from an affine transformation of spatial to material coordinates in terms of the stretch tensor or its analogs. Thus, three principal stretches and three angles related to their orientation form a set of six scalar variables describing the deformation. Instead of the volume-dependent potential used in the standard theory, which requires conditions of equilibrium for surface and body forces acting on a volume element, a potential dependent on the scalar variables is introduced. A consistent introduction of the generalized force associated with this potential becomes possible if a deformed body is considered to be confined to the surface of a torus having six genuine dimensions. Strain, constitutive equations and other fundamental laws of continuum and particle mechanics may be neatly rewritten in terms of the scalar variables. Giving a new presentation of finite deformation, the new approach provides a full treatment of hyperelasticity, including the anisotropic case. The derived equations of motion generate a new kind of thermodynamical ensemble in terms of constant tension forces. In this ensemble, six internal deformation forces proportional to the components of the Irving-Kirkwood stress are controlled by applied external forces. In the thermodynamical limit, instead of the pressure and volume as state variables, this ensemble employs a deformation force measured in kelvin units and the stretch ratio.

  16. Electroweak interaction: Standard and beyond

    Harari, H.

    1987-02-01

    Several important topics within the standard model raise questions which are likely to be answered only by further theoretical understanding which goes beyond the standard model. In these lectures we present a discussion of some of these problems, including the quark masses and angles, the Higgs sector, neutrino masses, W and Z properties and possible deviations from a pointlike structure. 44 refs

  17. Variable cycle engine

    Adamson, A.P.; Sprunger, E.V.

    1980-09-16

    A variable cycle turboshaft engine includes a remote fan system and respective high and low pressure systems for selectively driving the fan system in such a manner as to provide VTOL takeoff capability and minimum specific fuel consumption (SFC) at cruise and loiter conditions. For takeoff the fan system is primarily driven by the relatively large low pressure system whose combustor receives the motive fluid from a core bypass duct and, for cruise and loiter conditions, the fan system is driven by both a relatively small high pressure core and the low pressure system with its combustor inoperative. A mixer is disposed downstream of the high pressure system for mixing the relatively cold air from the bypass duct and the relatively hot air from the core prior to its flow to the low pressure turbine.

  18. Focus on variability : New tools to study intra-individual variability in developmental data

    van Geert, P; van Dijk, M

    2002-01-01

    In accordance with dynamic systems theory, we assume that variability is an important developmental phenomenon. However, the standard methodological toolkit of the developmental psychologist offers few instruments for the study of variability. In this article we will present several new methods that

  19. Acoustic response variability in automotive vehicles

    Hills, E.; Mace, B. R.; Ferguson, N. S.

    2009-03-01

    A statistical analysis of a series of measurements of the audio-frequency response of a large set of automotive vehicles is presented: a small hatchback model with both a three-door (411 vehicles) and five-door (403 vehicles) derivative and a mid-sized family five-door car (316 vehicles). The sets included vehicles of various specifications, engines, gearboxes, interior trim, wheels and tyres. The tests were performed in a hemianechoic chamber with the temperature and humidity recorded. Two tests were performed on each vehicle and the interior cabin noise measured. In the first, the excitation was acoustically induced by sets of external loudspeakers. In the second test, predominantly structure-borne noise was induced by running the vehicle at a steady speed on a rough roller. For both types of excitation, it is seen that the effects of temperature are small, indicating that manufacturing variability is larger than that due to temperature for the tests conducted. It is also observed that there are no significant outlying vehicles, i.e. there are at most only a few vehicles that consistently have the lowest or highest noise levels over the whole spectrum. For the acoustically excited tests, measured 1/3-octave noise reduction levels typically have a spread of 5 dB or so and the normalised standard deviation of the linear data is typically 0.1 or higher. Regarding the statistical distribution of the linear data, a lognormal distribution is a somewhat better fit than a Gaussian distribution for lower 1/3-octave bands, while the reverse is true at higher frequencies. For the distribution of the overall linear levels, a Gaussian distribution is generally the most representative. As a simple description of the response variability, it is sufficient for this series of measurements to assume that the acoustically induced airborne cabin noise is best described by a Gaussian distribution with a normalised standard deviation between 0.09 and 0.145. There is generally
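The Gaussian-versus-lognormal comparison reported above can be sketched by fitting both families and comparing log-likelihoods. The data below are synthetic (not the vehicle measurements), generated skewed and positive so the lognormal fit should win, as the abstract reports for the lower-frequency bands.

```python
import numpy as np

# Sketch of the distribution comparison: given positive response data,
# compare a maximum-likelihood Gaussian fit against a lognormal fit via
# their log-likelihoods (synthetic data, not the measured cabin noise).
rng = np.random.default_rng(7)
data = rng.lognormal(mean=0.0, sigma=0.3, size=2000)   # skewed, positive

def gauss_loglik(x):
    mu, sd = x.mean(), x.std()
    return np.sum(-0.5 * np.log(2 * np.pi * sd**2)
                  - (x - mu)**2 / (2 * sd**2))

def lognorm_loglik(x):
    lx = np.log(x)
    mu, sd = lx.mean(), lx.std()
    # lognormal pdf = Gaussian pdf of log(x), divided by x (Jacobian term)
    return np.sum(-0.5 * np.log(2 * np.pi * sd**2)
                  - (lx - mu)**2 / (2 * sd**2) - lx)

print(lognorm_loglik(data) > gauss_loglik(data))  # True for skewed data
```

For nearly symmetric data (small sigma) the two log-likelihoods converge, which mirrors the abstract's finding that the better-fitting family depends on the frequency band.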

  20. Non-standard patch test

    Astri Adelia

    2018-06-01

    Full Text Available In managing contact dermatitis, identification of the causative agent is essential to prevent recurrent complaints. The patch test is the gold standard for identifying the causative agent. Nowadays, many standard patch test materials are available on the market, but they do not include all the materials that can potentially cause contact dermatitis. Patch testing using the patient's own products, which we refer to below as non-standard materials, is very helpful in identifying the causative agents of contact dermatitis. Guidance is needed in preparing non-standard patch test materials in order to avoid discrepancies in test results.

  1. SEEPAGE MODEL FOR PA INCLUDING DRIFT COLLAPSE

    C. Tsang

    2004-01-01

    The purpose of this report is to document the predictions and analyses performed using the seepage model for performance assessment (SMPA) for both the Topopah Spring middle nonlithophysal (Tptpmn) and lower lithophysal (Tptpll) lithostratigraphic units at Yucca Mountain, Nevada. Look-up tables of seepage flow rates into a drift (and their uncertainty) are generated by performing numerical simulations with the seepage model for many combinations of the three most important seepage-relevant parameters: the fracture permeability, the capillary-strength parameter 1/α, and the percolation flux. The percolation flux values chosen take into account flow-focusing effects, which are evaluated based on a flow-focusing model. Moreover, multiple realizations of the underlying stochastic permeability field are conducted. Selected sensitivity studies are performed, including the effects of an alternative drift geometry representing a partially collapsed drift from an independent drift-degradation analysis (BSC 2004 [DIRS 166107]). The intended purpose of the seepage model is to provide results of drift-scale seepage rates under a series of parameters and scenarios in support of the Total System Performance Assessment for License Application (TSPA-LA). The SMPA is intended for the evaluation of drift-scale seepage rates under the full range of parameter values for the three parameters found to be key (fracture permeability, the van Genuchten 1/α parameter, and percolation flux) and drift degradation shape scenarios in support of the TSPA-LA during the period of compliance for postclosure performance [Technical Work Plan for: Performance Assessment Unsaturated Zone (BSC 2002 [DIRS 160819], Section I-4-2-1)]. The flow-focusing model in the Topopah Spring welded (TSw) unit is intended to provide an estimate of flow focusing factors (FFFs) that (1) bridge the gap between the mountain-scale and drift-scale models, and (2) account for variability in local percolation flux due to

  2. Why We Should Establish a National System of Standards.

    Hennen, Thomas J., Jr.

    2000-01-01

    Explains the need to establish a national system of standards for public libraries. Discusses local standards, state standards, and international standards, and suggests adopting a tiered approach including three levels: minimum standards; target standards; and benchmarking standards, as found in total quality management. (LRW)

  3. Influence of attrition variables on iron ore flotation

    Fabiana Fonseca Fortes

    Full Text Available The presence of slimes is harmful to the flotation process: the performance and consumption of reagents are negatively affected. Traditionally, the desliming stage has been responsible for removing slimes. However, depending on the porosity of the mineral particles, desliming may not be sufficient to maximize the concentration results. An attrition process before the desliming operation can improve the removal of slime, especially when slimes cover the surface and/or are confined to the cavities/pores of the mineral particles. Attrition is present in the flowcharts of the beneficiation process of phosphate and industrial sand (silica sand). Research has been undertaken for its application to produce pre-concentrates of zircon and iron ore. However, there is still little knowledge of the influence of the attrition variables on the beneficiation process of iron ore. This study presents a factorial design and analysis of the effects of these variables on the reverse flotation of iron ore. The standard experimental procedure for all tests included attrition of the pulp under dispersion conditions, desliming and flotation. The parameter analysed (response variable) was the metallurgical recovery in the reverse flotation tests. The planning and analysis of the full factorial experiment indicated, with 95% reliability, that the rotation speed of the attrition cell impeller was the main variable in the attrition process of the iron ore. The percentage of solids in the pulp and the attrition time, as well as their interactions, were not found to be significant.

  4. Variability in carbon exchange of European croplands

    Eddy J, Moors; Jacobs, Cor; Jans, Wilma

    2010-01-01

    The estimated net ecosystem exchange (NEE) of CO2 based on measurements at 17 flux sites in Europe for 45 cropping periods showed an average loss of -38 gC m-2 per cropping period. The cropping period is defined as the period after sowing or planting until harvest. The variability, taken as the standard deviation over these cropping periods, was 251 gC m-2. These numbers do not include lateral inputs such as the carbon content of applied manure, nor the carbon exchange outside the cropping period. Both are expected to have a major effect on the C budget of high-energy summer crops such as maize. NEE and gross primary production (GPP) can be estimated from crop net primary production based on inventories of biomass at these sites, independent of species and regions. NEE can also be estimated as the product of photosynthetic capacity and the number of days with an average air temperature >5 °C. Yield...

  5. Amplification factor variable amplifier

    Akitsugu, Oshita; Nauta, Bram

    2007-01-01

    PROBLEM TO BE SOLVED: To provide an amplification factor variable amplifier capable of achieving temperature compensation of an amplification factor over a wide variable amplification factor range. ; SOLUTION: A Gilbert type amplification factor variable amplifier 11 amplifies an input signal and

  6. Amplification factor variable amplifier

    Akitsugu, Oshita; Nauta, Bram

    2010-01-01

    PROBLEM TO BE SOLVED: To provide an amplification factor variable amplifier capable of achieving temperature compensation of an amplification factor over a wide variable amplification factor range. ;SOLUTION: A Gilbert type amplification factor variable amplifier 11 amplifies an input signal and can

  7. PaCAL: A Python Package for Arithmetic Computations with Random Variables

    Marcin Korzeń

    2014-05-01

    Full Text Available In this paper we present PaCAL, a Python package for arithmetical computations on random variables. The package is capable of performing the four arithmetic operations: addition, subtraction, multiplication and division, as well as computing many standard functions of random variables. Summary statistics, random number generation, plots, and histograms of the resulting distributions can easily be obtained, and distribution parameter fitting is also available. The operations are performed numerically and their results interpolated, allowing for arbitrary arithmetic operations on random variables following practically any probability distribution encountered in practice. The package is easy to use, as operations on random variables are performed just as they are on standard Python variables. Independence of random variables is, by default, assumed on each step, but some computations on dependent random variables are also possible. We demonstrate on several examples that the results are very accurate, often close to machine precision. Practical applications include statistics, physical measurements and estimation of error distributions in scientific computations.
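The core operation such a package performs numerically, deriving the distribution of a sum of independent random variables, can be sketched without PaCAL itself: the density of the sum is the convolution of the two densities. The grid-based sketch below illustrates the idea only; it is not PaCAL's actual implementation or API.

```python
import numpy as np

# Illustrative sketch (not PaCAL itself): the density of the sum of two
# independent random variables is the convolution of their densities.
# Here we add two independent U(0, 1) variables on a uniform grid.
dx = 0.001
x = np.arange(0, 1, dx)
f = np.ones_like(x)          # density of U(0, 1)

# Density of the sum, supported on [0, 2]: numerical convolution.
g = np.convolve(f, f) * dx
xs = np.arange(len(g)) * dx

# The result is the triangular density peaking at 1, with total mass 1.
peak = xs[np.argmax(g)]
mass = g.sum() * dx
print(round(peak, 2), round(mass, 2))  # → 1.0 1.0
```

Products and quotients work the same way after a change of variables, which is why a numeric-density representation supports all four arithmetic operations.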

  8. Modeling dynamic procurement auctions of standardized supply contracts in electricity markets including bidders adaptation

    Henry Camilo Torres-Valderrama

    2015-01-01

    Full Text Available Descending clock auctions are increasingly used in electricity markets. Traditional approaches to these auctions have focused on finding the bidders' best response while ignoring how bidders adapt over the course of the auction. This article presents an algorithm based on decision theory to estimate bidder behaviour throughout the auction. The proposed model uses concepts from financial portfolios and historical data on the electricity spot market to estimate a generator's contract supply curve. The model was used to evaluate the Organized Market (MOR) in Colombia. The parameters of the demand curve and the size of each round were varied to evaluate their impact on the auction outcome. The results show that the demand curve has a small impact on bidder adaptation and that the round size is useful for preventing non-competitive behaviour. Additionally, it is shown that the auction's starting prices strongly influence the closing prices. The results presented here are useful for the design of market structures in the electricity sector.

  9. Modification of ASTM Standard E1681 on Environmental Cracking to Include Bolt-Load Specimen Testing

    Underwood, Jean D. M

    1997-01-01

    Benet Laboratories experience with environmental cracking of cannon components has been combined with the technical expertise of various participants at ASTM technical meetings and symposia to develop...

  10. Voluntary Consensus Organization Standards for Nondestructive Evaluation of Aerospace Materials (including Additive Manufactured Parts)

    National Aeronautics and Space Administration — This NASA-industry effort accomplishes the following: 1) Lead collaboration between NASA Centers, other government agencies, industry, academia, and voluntary consensus...

  11. Variable Selection via Partial Correlation.

    Li, Runze; Liu, Jingyuan; Lou, Lejia

    2017-07-01

    A partial-correlation-based variable selection method was proposed for normal linear regression models by Bühlmann, Kalisch and Maathuis (2010) as a comparable alternative to regularization methods for variable selection. This paper addresses two important issues related to partial-correlation-based variable selection: (a) whether the method is sensitive to the normality assumption, and (b) whether the method is valid when the dimension of the predictor increases at an exponential rate of the sample size. To address issue (a), we systematically study the method for elliptical linear regression models. Our finding indicates that the original proposal may lead to inferior performance when the marginal kurtosis of the predictor is not close to that of the normal distribution. Our simulation results further confirm this finding. To ensure the superior performance of partial-correlation-based variable selection, we propose a thresholded partial correlation (TPC) approach to select significant variables in linear regression models. We establish the selection consistency of the TPC in the presence of ultrahigh-dimensional predictors. Since the TPC procedure includes the original proposal as a special case, our theoretical results address issue (b) directly. As a by-product, the sure screening property of the first step of the TPC is obtained. The numerical examples also illustrate that the TPC is competitively comparable to the commonly used regularization methods for variable selection.
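The idea behind thresholded partial correlation can be sketched as follows: estimate each predictor's partial correlation with the response given the remaining predictors (by correlating least-squares residuals) and keep the predictors whose partial correlation exceeds a threshold. This is a simplified illustration with an arbitrary threshold and synthetic data, not the authors' exact estimator or tuning rule.

```python
import numpy as np

def partial_corr(y, xj, Z):
    """Sample partial correlation of y and xj given covariates Z,
    computed by correlating least-squares residuals (a simplified
    sketch of the quantity thresholded by TPC-style selection)."""
    Z1 = np.column_stack([np.ones(len(y)), Z])
    ry = y - Z1 @ np.linalg.lstsq(Z1, y, rcond=None)[0]
    rx = xj - Z1 @ np.linalg.lstsq(Z1, xj, rcond=None)[0]
    return np.corrcoef(ry, rx)[0, 1]

rng = np.random.default_rng(0)
n, p = 500, 4
X = rng.standard_normal((n, p))
y = 2.0 * X[:, 0] + 0.5 * X[:, 1] + rng.standard_normal(n)  # X2, X3 inactive

tau = 0.2  # illustrative threshold; the paper derives it from theory
selected = [j for j in range(p)
            if abs(partial_corr(y, X[:, j], np.delete(X, j, axis=1))) > tau]
print(selected)  # the two active predictors should survive thresholding
```

Regressing out the other predictors is what distinguishes this from marginal correlation screening: a predictor correlated with the response only through other predictors gets a small partial correlation and is dropped.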

  12. Environmental standards provide competitive advantage

    Chynoweth, E.; Kirshner, E.

    1993-01-01

    Quality organizations are breaking new ground with the development of international standards for environmental management. These promise to provide the platform for chemical companies wanting to establish their environmental credibility with a global audience. "It will be similar to auditing our customers to ISO 9000," says the environmental manager for a European chemical firm. "We will only want to deal with people who have got their environmental act together. And we'll be in a better competitive position." The International Organization for Standardization (ISO; Geneva) has set up a task force to develop an environmental management standard, which is expected to be completed by the mid-1990s. Observers think the ISO standard will draw heavily on the British Standards Institution's (BSI; London) environmental management standard, BS7750, which will likely be the first such system adopted in the world. Published last year, BS7750 has been extensively piloted in the UK (CW, Sept. 30, 1992, p. 62) and is now set to be revised before being officially adopted by BSI. The UK's Chemical Industries Association (CIA; London) is anxious to prevent a proliferation of standards, and its report on BS7750 pilot projects calls for an approach integrating quality, environment, and health and safety. But standard setters, including ISO, appear to be moving in the opposite direction. In the US, the American National Standards Institute (ANSI; Washington) has started work on an environmental management standard

  13. Quantum interference of probabilities and hidden variable theories

    Srinivas, M.D.

    1984-01-01

    One of the fundamental contributions of Louis de Broglie, which does not get cited often, has been his analysis of the basic difference between the calculus of the probabilities as predicted by quantum theory and the usual calculus of probabilities - the one employed by most mathematicians, in its standard axiomatised version due to Kolmogorov. This paper is basically devoted to a discussion of the 'quantum interference of probabilities', discovered by de Broglie. In particular, it is shown that it is this feature of the quantum theoretic probabilities which leads to some serious constraints on the possible 'hidden-variable formulations' of quantum mechanics, including the celebrated theorem of Bell. (Auth.)

  14. Decomposing global crop yield variability

    Ben-Ari, Tamara; Makowski, David

    2014-11-01

    Recent food crises have highlighted the need to better understand the between-year variability of agricultural production. Although increasing future production seems necessary, the globalization of commodity markets suggests that the food system would also benefit from enhanced supply stability through a reduction in the year-to-year variability. Here, we develop an analytical expression decomposing global crop yield interannual variability into three informative components that quantify how evenly croplands are distributed across the world, the proportion of cultivated areas allocated to regions of above- or below-average variability, and the covariation between yields in distinct world regions. This decomposition is used to identify drivers of interannual yield variations for four major crops (i.e., maize, rice, soybean and wheat) over the period 1961-2012. We show that maize production is fairly spread out but marked by one prominent region with high levels of crop yield interannual variability (which encompasses the North American corn belt in the USA and Canada). In contrast, global rice yields have a small variability because, although spatially concentrated, much of the production is located in regions of below-average variability (i.e., South, Eastern and South Eastern Asia). Because of these contrasting land-use allocations, an even cultivated-land distribution across regions would reduce global maize yield variance but increase the variance of global rice yield. Intermediate results are obtained for soybean and wheat, for which croplands are mainly located in regions with close-to-average variability. At the scale of large world regions, we find that covariances of regional yields have a negligible contribution to global yield variance. The proposed decomposition could be applied at any spatial and temporal scale, including the yearly time step. By addressing global crop production stability (or lack thereof) our results contribute to the understanding of a key
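The decomposition rests on the identity that the variance of an area-weighted sum of regional yields splits into own-variance terms and cross-region covariance terms. A minimal numerical check of that identity, with made-up regional yields and area shares (not the paper's data):

```python
import numpy as np

# Sketch: global yield is an area-weighted sum of regional yields, so its
# interannual variance splits into own-variance contributions and
# cross-region covariance contributions — the quantities the published
# decomposition reasons about.
rng = np.random.default_rng(1)
years, regions = 30, 3
yields = rng.normal(loc=[5.0, 3.0, 4.0], scale=[0.6, 0.2, 0.4],
                    size=(years, regions))
w = np.array([0.5, 0.3, 0.2])        # cropland area shares (sum to 1)

global_yield = yields @ w
cov = np.cov(yields, rowvar=False)   # sample covariance across years

var_direct = np.var(global_yield, ddof=1)
var_decomposed = w @ cov @ w          # sum_ij w_i w_j Cov(y_i, y_j)
own = np.sum(w**2 * np.diag(cov))     # own-variance contribution
cross = var_decomposed - own          # covariance contribution

print(np.isclose(var_direct, var_decomposed))  # → True
```

The identity is exact; the empirical finding quoted above is that for real regional yields the `cross` term is small at the scale of large world regions, so the weighting of high- versus low-variability regions dominates.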

  15. Standards for Standardized Logistic Regression Coefficients

    Menard, Scott

    2011-01-01

    Standardized coefficients in logistic regression analysis have the same utility as standardized coefficients in linear regression analysis. Although there has been no consensus on the best way to construct standardized logistic regression coefficients, there is now sufficient evidence to suggest a single best approach to the construction of a…
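    One frequently cited construction scales the raw coefficient by the predictor's standard deviation and by the standard deviation of the fitted logit. The sketch below is illustrative only (names and data are invented) and is not necessarily the single best approach the abstract alludes to:

```python
import numpy as np

def standardized_logit_coef(b, x, linear_predictor):
    """One proposed standardized logistic-regression coefficient:
    scale b by SD(x) and by the SD of the fitted logit (linear predictor).
    Several variants exist; this is illustrative only."""
    return b * np.std(x, ddof=1) / np.std(linear_predictor, ddof=1)

# toy one-predictor model (coefficients and data invented)
rng = np.random.default_rng(1)
x = rng.normal(2.0, 3.0, size=200)
b0, b1 = -1.0, 0.5
eta = b0 + b1 * x  # fitted linear predictor on the logit scale
# with a single predictor this collapses to sign(b1) * 1.0, one reason
# Menard-style variants also fold in a measure of model fit (R)
b_std = standardized_logit_coef(b1, x, eta)
```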

  16. Instant standard concept for data standards development

    Folmer, Erwin Johan Albert; Kulcsor, Istvan Zsolt; Roes, Jasper

    2013-01-01

    This paper presents the current results of ongoing research into a new data standards development concept. The concept is called the Instant Standard, referring to the pressure generated by shrinking the length of the standardization process. Based on this concept it is estimated that the

  17. Telerobotic Control Architecture Including Force-Reflection

    Murphy, Mark

    1998-01-01

    This report describes the implementation of a telerobotic control architecture to manipulate a standard six-degree-of-freedom robot via a unique seven-degree-of-freedom force-reflecting exoskeleton...

  18. Standardized nomenclatures: keys to continuity of care, nursing accountability and nursing effectiveness.

    Keenan, G; Aquilino, M L

    1998-01-01

    Standardized nursing nomenclatures must be included in clinical documentation systems to generate data that more accurately represent nursing practice than outcomes-related measures currently used to support important policy decisions. NANDA, NIC, and NOC--comprehensive nomenclatures for the needed variables of nursing diagnoses, interventions, and outcomes--are described. Added benefits of using NANDA, NIC, and NOC in everyday practice are outlined, including facilitation of the continuity of care of patients in integrated health systems.

  19. The Future of Geospatial Standards

    Bermudez, L. E.; Simonis, I.

    2016-12-01

    The OGC is an international not-for-profit standards development organization (SDO) committed to making quality standards for the geospatial community. A community of more than 500 member organizations with more than 6,000 people registered at the OGC communication platform drives the development of standards that are freely available for anyone to use and to improve sharing of the world's geospatial data. OGC standards are applied in a variety of application domains including Environment, Defense and Intelligence, Smart Cities, Aviation, Disaster Management, Agriculture, Business Development and Decision Support, and Meteorology. Profiles help to apply information models to different communities, thus adapting to particular needs of that community while ensuring interoperability by using common base models and appropriate support services. Other standards address orthogonal aspects such as handling of Big Data, Crowd-sourced information, Geosemantics, or container for offline data usage. Like most SDOs, the OGC develops and maintains standards through a formal consensus process under the OGC Standards Program (OGC-SP) wherein requirements and use cases are discussed in forums generally open to the public (Domain Working Groups, or DWGs), and Standards Working Groups (SWGs) are established to create standards. However, OGC is unique among SDOs in that it also operates the OGC Interoperability Program (OGC-IP) to provide real-world testing of existing and proposed standards. The OGC-IP is considered the experimental playground, where new technologies are researched and developed in a user-driven process. Its goal is to prototype, test, demonstrate, and promote OGC Standards in a structured environment. Results from the OGC-IP often become requirements for new OGC standards or identify deficiencies in existing OGC standards that can be addressed. This presentation will provide an analysis of the work advanced in the OGC consortium including standards and testbeds

  20. Extent of, and variables associated with, blood pressure variability among older subjects.

    Morano, Arianna; Ravera, Agnese; Agosta, Luca; Sappa, Matteo; Falcone, Yolanda; Fonte, Gianfranco; Isaia, Gianluca; Isaia, Giovanni Carlo; Bo, Mario

    2018-02-23

    Blood pressure variability (BPV) may have prognostic implications for cardiovascular risk and cognitive decline; however, BPV has yet to be studied in old and very old people. The aim of the present study was to evaluate the extent of BPV and to identify variables associated with BPV among older subjects. A retrospective study of patients aged ≥ 65 years who underwent 24-h ambulatory blood pressure monitoring (ABPM) was carried out. Three different BPV indexes were calculated for systolic and diastolic blood pressure (SBP and DBP): standard deviation (SD), coefficient of variation (CV), and average real variability (ARV). Demographic variables and use of antihypertensive medications were considered. The study included 738 patients. Mean age was 74.8 ± 6.8 years. Mean SBP and DBP SD were 20.5 ± 4.4 and 14.6 ± 3.4 mmHg. Mean SBP and DBP CV were 16 ± 3 and 20 ± 5%. Mean SBP and DBP ARV were 15.7 ± 3.9 and 11.8 ± 3.6 mmHg. At multivariate analysis, older age, female sex and uncontrolled mean blood pressure were associated with both systolic and diastolic BPV indexes. The use of calcium channel blockers and alpha-adrenergic antagonists was associated with lower systolic and diastolic BPV indexes, respectively. Among elderly subjects undergoing 24-h ABPM, we observed remarkably high indexes of BPV, which were associated with older age, female sex, and uncontrolled blood pressure values.
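    The three indexes can be computed from a series of ABPM readings roughly as follows; the helper name and the toy reading series are assumptions for illustration, not data or software from the study:

```python
import numpy as np

def bpv_indexes(bp_mmhg):
    """SD, CV and ARV of a series of blood pressure readings (illustrative;
    clinical indexes are usually computed with day/night weighting)."""
    bp = np.asarray(bp_mmhg, dtype=float)
    sd = bp.std(ddof=1)                 # standard deviation, mmHg
    cv = 100.0 * sd / bp.mean()         # coefficient of variation, %
    arv = np.mean(np.abs(np.diff(bp)))  # average real variability, mmHg
    return sd, cv, arv

# toy SBP series (invented readings, not study data)
sd, cv, arv = bpv_indexes([120, 135, 128, 142, 125, 131])
```

Note that ARV, unlike SD, is sensitive to the ordering of the readings, which is why it is reported separately.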

  1. Malaysian NDT standards

    Khazali Mohd Zin

    2001-01-01

    In order to become a developed country, Malaysia needs to develop her own national standards. It has been projected that by the year 2020 Malaysia will require about 8,000 standards (Department of Standards Malaysia). Currently more than 2,000 Malaysian Standards have been gazetted by the government, which is considerably too low ahead of the year 2020. NDT standards have been identified by the standards working group as one of the areas in which to promote our national standards. In this paper the author describes the steps taken to establish Malaysia's very own NDT standards. The project starts with the establishment of radiographic standards. (Author)

  2. Standard biological parts knowledgebase.

    Michal Galdzicki

    2011-02-01

    We have created the Knowledgebase of Standard Biological Parts (SBPkb) as a publicly accessible Semantic Web resource for synthetic biology (sbolstandard.org). The SBPkb allows researchers to query and retrieve standard biological parts for research and use in synthetic biology. Its initial version includes all of the information about parts stored in the Registry of Standard Biological Parts (partsregistry.org). SBPkb transforms this information so that it is computable, using our semantic framework for synthetic biology parts. This framework, known as SBOL-semantic, was built as part of the Synthetic Biology Open Language (SBOL), a project of the Synthetic Biology Data Exchange Group. SBOL-semantic represents commonly used synthetic biology entities, and its purpose is to improve the distribution and exchange of descriptions of biological parts. In this paper, we describe the data, our methods for transformation to SBPkb, and finally, we demonstrate the value of our knowledgebase with a set of sample queries. We use RDF technology and SPARQL queries to retrieve candidate "promoter" parts that are known to be both negatively and positively regulated. This method provides new web-based data access to perform searches for parts that are not currently possible.

  3. Standard Biological Parts Knowledgebase

    Galdzicki, Michal; Rodriguez, Cesar; Chandran, Deepak; Sauro, Herbert M.; Gennari, John H.

    2011-01-01

    We have created the Knowledgebase of Standard Biological Parts (SBPkb) as a publicly accessible Semantic Web resource for synthetic biology (sbolstandard.org). The SBPkb allows researchers to query and retrieve standard biological parts for research and use in synthetic biology. Its initial version includes all of the information about parts stored in the Registry of Standard Biological Parts (partsregistry.org). SBPkb transforms this information so that it is computable, using our semantic framework for synthetic biology parts. This framework, known as SBOL-semantic, was built as part of the Synthetic Biology Open Language (SBOL), a project of the Synthetic Biology Data Exchange Group. SBOL-semantic represents commonly used synthetic biology entities, and its purpose is to improve the distribution and exchange of descriptions of biological parts. In this paper, we describe the data, our methods for transformation to SBPkb, and finally, we demonstrate the value of our knowledgebase with a set of sample queries. We use RDF technology and SPARQL queries to retrieve candidate “promoter” parts that are known to be both negatively and positively regulated. This method provides new web-based data access to perform searches for parts that are not currently possible. PMID:21390321

  4. Standard biological parts knowledgebase.

    Galdzicki, Michal; Rodriguez, Cesar; Chandran, Deepak; Sauro, Herbert M; Gennari, John H

    2011-02-24

    We have created the Knowledgebase of Standard Biological Parts (SBPkb) as a publicly accessible Semantic Web resource for synthetic biology (sbolstandard.org). The SBPkb allows researchers to query and retrieve standard biological parts for research and use in synthetic biology. Its initial version includes all of the information about parts stored in the Registry of Standard Biological Parts (partsregistry.org). SBPkb transforms this information so that it is computable, using our semantic framework for synthetic biology parts. This framework, known as SBOL-semantic, was built as part of the Synthetic Biology Open Language (SBOL), a project of the Synthetic Biology Data Exchange Group. SBOL-semantic represents commonly used synthetic biology entities, and its purpose is to improve the distribution and exchange of descriptions of biological parts. In this paper, we describe the data, our methods for transformation to SBPkb, and finally, we demonstrate the value of our knowledgebase with a set of sample queries. We use RDF technology and SPARQL queries to retrieve candidate "promoter" parts that are known to be both negatively and positively regulated. This method provides new web-based data access to perform searches for parts that are not currently possible.

  5. US Topo Product Standard

    Cooley, Michael J.; Davis, Larry R.; Fishburn, Kristin A.; Lestinsky, Helmut; Moore, Laurence R.

    2011-01-01

    This document defines a U.S. Geological Survey (USGS) digital topographic map. This map series, named “US Topo,” is modeled on what is referred to as the standard USGS 7.5-minute (1:24,000-scale) topographic map series that was created during the period from 1947 to approximately 1992. The US Topo map product has the same extent, scale, and general layout as the older standard topographic maps. However, unlike the previous maps, US Topo maps are published using Adobe Systems Inc. Portable Document Format (PDF) with a geospatial extension that is called Georeferenced PDF (GeoPDF), patented by TerraGo Technologies. In addition, the US Topo map products incorporate an orthorectified image along with data that was included in the standard 7.5-minute topographic maps. US Topo maps are intended to serve conventional map users by providing Geographic Information System (GIS) information in symbolized form in the customary topographic map layout. The maps are not intended for GIS analysis applications.

  6. Implementing PAT with Standards

    Chandramohan, Laakshmana Sabari; Doolla, Suryanarayana; Khaparde, S. A.

    2016-02-01

    Perform Achieve Trade (PAT) is a market-based incentive mechanism to promote energy efficiency. The purpose of this work is to address the challenges inherent in inconsistent representation of business processes, and the interoperability issues in PAT-like cap-and-trade mechanisms, especially when scaled. Studies by various agencies have highlighted that as the mechanism evolves to include more industrial sectors and industries in its ambit, implementation will become more challenging. This paper analyses the major needs of PAT (namely tracking, monitoring, auditing and verifying energy-saving reports, and providing technical support and guidance to stakeholders) and how the aforesaid challenges affect them. Though current technologies can handle these challenges to an extent, standardization activities for PAT implementation have been scanty, and this work attempts to evolve them. The inconsistent modification of business processes, rules, and procedures across stakeholders, and interoperability among heterogeneous systems, are addressed. This paper proposes the adoption of two standards into PAT: Business Process Model and Notation, for maintaining consistency in business process modelling, and the Common Information Model (IEC 61970, 61968 and 62325 combined), for information exchange. The detailed architecture and organization of these adoptions are reported. The work can be used by PAT implementing agencies, stakeholders, and standardization bodies.

  7. Cataclysmic Variable Stars

    Hellier, Coel

    2001-01-01

    Cataclysmic variable stars are the most variable stars in the night sky, fluctuating in brightness continually on timescales from seconds to hours to weeks to years. The changes can be recorded using amateur telescopes, yet are also the subject of intensive study by professional astronomers. That study has led to an understanding of cataclysmic variables as binary stars, orbiting so closely that material transfers from one star to the other. The resulting process of accretion is one of the most important in astrophysics. This book presents the first account of cataclysmic variables at an introductory level. Assuming no previous knowledge of the field, it explains the basic principles underlying the variability, while providing an extensive compilation of cataclysmic variable light curves. Aimed at amateur astronomers, undergraduates, and researchers, the main text is accessible to those with no mathematical background, while supplementary boxes present technical details and equations.

  8. Understanding Brown Dwarf Variability

    Marley, Mark S.

    2013-01-01

    Surveys of brown dwarf variability continue to find that roughly half of all brown dwarfs are variable. While variability is observed amongst all types of brown dwarfs, amplitudes are typically greatest for L-T transition objects. In my talk I will discuss the possible physical mechanisms that are responsible for the observed variability. I will particularly focus on comparing and contrasting the effects of changes in atmospheric thermal profile and cloud opacity. The two different mechanisms will produce different variability signatures and I will discuss the extent to which the current datasets constrain both mechanisms. By combining constraints from studies of variability with existing spectral and photometric datasets we can begin to construct and test self-consistent models of brown dwarf atmospheres. These models not only aid in the interpretation of existing objects but also inform studies of directly imaged giant planets.

  9. hmF2 variability over Havana

    Lazo, B.; Alazo, K.; Rodriguez, M.; Calzadilla, A.

    2003-01-01

    The hmF2 variability over Havana station (Geo. Latitude 23 deg. N, Geo. Longitude 278 deg. E; Dip 54.6 deg. N; Modip 44.8 deg. N) is presented. In this study different solar and seasonal conditions are considered. The results show that, in general, the standard deviation of hmF2 is quite irregular and reaches its highest values at nighttime hours. The lower- and upper-quartile variability behaves similarly to the interquartile (IQ) variability, also showing its higher values at nighttime. (author)

  10. Variability, Predictability, and Race Factors Affecting Performance in Elite Biathlon.

    Skattebo, Øyvind; Losnegard, Thomas

    2018-03-01

    To investigate variability, predictability, and smallest worthwhile performance enhancement in elite biathlon sprint events. In addition, the effects of race factors on performance were assessed. Data from 2005 to 2015 including >10,000 and >1000 observations for each sex for all athletes and annual top-10 athletes, respectively, were included. Generalized linear mixed models were constructed based on total race time, skiing time, shooting time, and proportions of targets hit. Within-athlete race-to-race variability was expressed as coefficient of variation of performance times and standard deviation (SD) in proportion units (%) of targets hit. The models were adjusted for random and fixed effects of subject identity, season, event identity, and race factors. The within-athlete variability was independent of sex and performance standard of athletes: 2.5-3.2% for total race time, 1.5-1.8% for skiing time, and 11-15% for shooting times. The SD of the proportion of hits was ∼10% in both shootings combined (meaning ±1 hit in 10 shots). The predictability in total race time was very high to extremely high for all athletes (ICC .78-.84) but trivial for top-10 athletes (ICC .05). Race times during World Championships and Olympics were ∼2-3% faster than in World Cups. Moreover, race time increased by ∼2% per 1000 m of altitude, by ∼5% per 1% of gradient, by 1-2% per 1 m/s of wind speed, and by ∼2-4% on soft vs hard tracks. Researchers and practitioners should focus on strategies that improve biathletes' performance by at least 0.8-0.9%, corresponding to the smallest worthwhile enhancement (0.3 × within-athlete variability).
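    Taking the abstract's definition at face value, the smallest worthwhile enhancement is 0.3 times the within-athlete race-to-race coefficient of variation; a minimal sketch (function name invented):

```python
def smallest_worthwhile_change(cv_percent, factor=0.3):
    """Smallest worthwhile enhancement as a fraction of the within-athlete
    race-to-race coefficient of variation (factor of 0.3 per the abstract)."""
    return factor * cv_percent

# a total-race-time CV of 2.5-3.2% gives an SWC of roughly 0.75-0.96%,
# consistent with the 0.8-0.9% range quoted in the abstract
low = smallest_worthwhile_change(2.5)
high = smallest_worthwhile_change(3.2)
```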

  11. Including estimates of the future in today's financial statements

    Mary Barth

    2006-01-01

    This paper explains why the question is how, not if, today's financial statements should include estimates of the future. Including such estimates is not new, but their use is increasing. This increase results primarily because standard setters believe asset and liability measures that reflect current economic conditions and up-to-date expectations of the future will result in more useful information for making economic decisions, which is the objective of financial reporting. This is why sta...

  12. Ultrasonic variables affecting inspection

    Lautzenheiser, C.E.; Whiting, A.R.; McElroy, J.T.

    1977-01-01

    There are many variables which affect the detection of defects and the reproducibility of results when utilizing ultrasonic techniques. The most important variable is the procedure, as this document specifies, to a great extent, the controls exercised over the other variables. The next most important variable is personnel, with regard to training, qualification, integrity, data recording, and data analysis. Although the data are very limited, they indicate that, if the procedure is carefully controlled, reliability of defect detection and reproducibility of results are both approximately 90 percent. For reliability of detection, this applies to relatively small defects, as reliability increases substantially as defect size increases above the recording limit. (author)

  13. Heart rate variability in healthy population

    Alamgir, M.; Hussain, M.M.

    2010-01-01

    Background: Heart rate variability has been considered an indicator of autonomic status. Little work has been done on heart rate variability in normal healthy volunteers. We aimed to establish reference values of heart rate variability in our healthy population. Methods: Twenty-four hour Holter monitoring of 37 healthy individuals was done using the Holter ECG recorder 'Life card CF' from 'Reynolds Medical'. Heart rate variability in both time and frequency domains was analysed with the 'Reynolds Medical Pathfinder Digital/700'. Results: The heart rate variability in normal healthy volunteers of our population was found in the time domain using the standard deviation of R-R intervals (SDNN), the standard deviation of average NN intervals (SDANN), and the square root of the mean squared differences of successive NN intervals (RMSSD). Variation in heart rate variability indices was observed between local and foreign volunteers, and RMSSD was found to be significantly increased (p<0.05) in the local population. Conclusions: The values of heart rate variability (RMSSD) in healthy Pakistani volunteers were increased compared to the foreign data, reflecting parasympathetic dominance in our population. (author)
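    The three time-domain indices mentioned can be sketched as follows; the function and the toy RR series are illustrative assumptions, not the study's data or the Reynolds Medical software:

```python
import numpy as np

def hrv_time_domain(rr_ms, segment_len_ms=300_000):
    """Time-domain HRV indices from RR (NN) intervals in milliseconds.
    SDANN conventionally uses 5-minute (300 000 ms) segments; here the
    segmentation by cumulative time is a simplification."""
    rr = np.asarray(rr_ms, dtype=float)
    sdnn = rr.std(ddof=1)                       # SD of all NN intervals
    rmssd = np.sqrt(np.mean(np.diff(rr) ** 2))  # RMS of successive differences
    seg_ids = (np.cumsum(rr) // segment_len_ms).astype(int)
    seg_means = [rr[seg_ids == i].mean() for i in np.unique(seg_ids)]
    sdann = np.std(seg_means, ddof=1) if len(seg_means) > 1 else 0.0
    return sdnn, sdann, rmssd

# toy 5-beat series (invented; far too short for a real SDANN)
sdnn, sdann, rmssd = hrv_time_domain([800, 810, 790, 805, 795])
```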

  14. Development of fusion safety standards

    Longhurst, G.R.; Petti, D.A.; Dinneen, G.A.; Herring, J.S.; DeLooper, J.; Levine, J.D.; Gouge, M.J.

    1996-01-01

    Two new U.S. Department of Energy (DOE) standards have been prepared to assist in the design and regulation of magnetic fusion facilities. They are DOE-STD-6002-96, 'Safety of Magnetic Fusion Facilities - Requirements,' and DOE-STD-6003-96 'Safety of Magnetic Fusion Facilities - Guidance.' The first standard sets forth requirements, mostly based on the Code of Federal Regulations, deemed necessary for the safe design and operation of fusion facilities and a set of safety principles to use in the design. The second standard provides guidance on how to meet the requirements identified in DOE-STD-6002-96. It is written specifically for a facility such as the International Thermonuclear Experimental Reactor (ITER) in the DOE regulatory environment. As technical standards, they are applicable only to the extent that compliance with these standards is included in the contracts of the developers. 7 refs., 1 fig

  15. The International Standards Organisation offshore structures standard

    Snell, R.O.

    1994-01-01

    The International Standards Organisation has initiated a program to develop a suite of ISO Codes and Standards for the Oil Industry. The Offshore Structures Standard is one of seven topics being addressed. The scope of the standard will encompass fixed steel and concrete structures, floating structures, Arctic structures and the site specific assessment of mobile drilling and accommodation units. The standard will use as base documents the existing recommended practices and standards most frequently used for each type of structure, and will develop them to incorporate best published and recognized practice and knowledge where it provides a significant improvement on the base document. Work on the Code has commenced under the direction of an internationally constituted sub-committee comprising representatives from most of the countries with a substantial offshore oil and gas industry. This paper outlines the background to the code and the format, content and work program

  16. Cosmological N -body simulations including radiation perturbations

    Brandbyge, Jacob; Rampf, Cornelius; Tram, Thomas

    2017-01-01

    Cosmological N-body simulations are the standard tools to study the emergence of the observed large-scale structure of the Universe. Such simulations usually solve for the gravitational dynamics of matter within the Newtonian approximation, thus discarding general relativistic effects such as the...

  17. Standard Industry Fare Level

    Department of Transportation — Standard Industry Fare Level was established after airline deregulation to serve as the standard against which a statutory zone of operating expense reasonableness was...

  18. Standard Reference Tables -

    Department of Transportation — The Standard Reference Tables (SRT) provide consistent reference data for the various applications that support Flight Standards Service (AFS) business processes and...

  19. Process variables in organizational stress management intervention evaluation research: a systematic review.

    Havermans, Bo M; Schlevis, Roosmarijn Mc; Boot, Cécile Rl; Brouwers, Evelien Pm; Anema, Johannes; van der Beek, Allard J

    2016-09-01

    This systematic review aimed to explore which process variables are used in stress management intervention (SMI) evaluation research. A systematic review was conducted using seven electronic databases. Studies were included if they reported on an SMI aimed at primary or secondary stress prevention, were directed at paid employees, and reported process data. Two independent researchers checked all records and selected the articles for inclusion. Nielsen and Randall's model for process evaluation was used to cluster the process variables. The three main clusters were context, intervention, and mental models. In the 44 articles included, 47 process variables were found, clustered into three main categories: context (two variables), intervention (31 variables), and mental models (14 variables). Half of the articles contained no reference to process evaluation literature. The collection of process evaluation data mostly took place after the intervention and at the level of the employee. The findings suggest that there is great heterogeneity in methods and process variables used in process evaluations of SMI. This, together with the lack of use of a standardized framework for evaluation, hinders the advancement of process evaluation theory development.

  20. Nuclear Data Verification and Standardization

    Karam, Lisa R.; Arif, Muhammad; Thompson, Alan K.

    2011-10-01

    The objective of this interagency program is to provide accurate neutron interaction verification and standardization data for the U.S. Department of Energy Division of Nuclear Physics programs which include astrophysics, radioactive beam studies, and heavy-ion reactions. The measurements made in this program are also useful to other programs that indirectly use the unique properties of the neutron for diagnostic and analytical purposes. These include homeland security, personnel health and safety, nuclear waste disposal, treaty verification, national defense, and nuclear based energy production. The work includes the verification of reference standard cross sections and related neutron data employing the unique facilities and capabilities at NIST and other laboratories as required; leadership and participation in international intercomparisons and collaborations; and the preservation of standard reference deposits. An essential element of the program is critical evaluation of neutron interaction data standards including international coordinations. Data testing of critical data for important applications is included. The program is jointly supported by the Department of Energy and the National Institute of Standards and Technology.

  1. 1998 federal technical standards workshop: Proceedings

    NONE

    1998-10-01

    The theme for the 1998 workshop was Standards Management -- A World of Change and Opportunities. The workshop's goal was to further the implementation of the National Technology Transfer and Advancement Act of 1995 (Public Law 104-113) through the sharing of standards management success stories, lessons learned, and emerging initiatives within the Executive Branch of the Federal Government. The target audience for this workshop included agency/department and contractor personnel and representatives of standards developing organizations that either used technical standards in their work for the Federal Government or participated in standards writing/management activities in support of the missions and programs of Federal agencies/departments. As with previous standards workshops sponsored by the DOE, views on the technical subject areas under the workshop theme were solicited from and provided by agency Standards Executives and standards program managers, voluntary standards organizations, and the private sector. This report includes vugraphs of the presentations.

  2. Double standards: a cross-European study on differences in norms on voluntary childlessness for men and women. Paper presentation

    Rijken, A.J.; Merz, E.-M.

    2011-01-01

    We examine double standards in norms on voluntary childlessness. Whether choosing childlessness is more accepted for men or for women is not a priori clear; we formulate arguments in both directions. Multilevel analyses are conducted, including individual and societal-level variables. Our sample

  3. Collective variables and dissipation

    Balian, R.

    1984-09-01

    This is an introduction to some basic concepts of non-equilibrium statistical mechanics. We emphasize in particular the relevant entropy relative to a given set of collective variables, the meaning of the projection method in the Liouville space, its use to establish the generalized transport equations for these variables, and the interpretation of dissipation in the framework of information theory

  4. Variability: A Pernicious Hypothesis.

    Noddings, Nel

    1992-01-01

    The hypothesis of greater male variability in test results is discussed in its historical context, and reasons feminists have objected to the hypothesis are considered. The hypothesis acquires political importance if it is considered that variability results from biological, rather than cultural, differences. (SLD)

  5. Reinforcing Saccadic Amplitude Variability

    Paeye, Celine; Madelain, Laurent

    2011-01-01

    Saccadic endpoint variability is often viewed as the outcome of neural noise occurring during sensorimotor processing. However, part of this variability might result from operant learning. We tested this hypothesis by reinforcing dispersions of saccadic amplitude distributions while holding their medians constant. In a first experiment we…

  6. POVMs and hidden variables

    Stairs, Allen

    2007-01-01

    Recent results by Paul Busch and Adan Cabello claim to show that by appealing to POVMs, non-contextual hidden variables can be ruled out in two dimensions. While the results of Busch and Cabello are mathematically correct, interpretive problems render them problematic as no hidden variable proofs

  7. Interdependence Among Organizational Variables

    Knowles, M. C.

    1975-01-01

    The interrelationship between a set of organizational variables was investigated at 14 work organizations within a company. The variables were production, quality, costs, job satisfaction of operatives, job satisfaction of supervisors, work anxiety, accidents, absence, labor turnover, and industrial unrest. (Author)

  8. Genotypic variability enhances the reproducibility of an ecological study.

    Milcu, Alexandru; Puga-Freitas, Ruben; Ellison, Aaron M; Blouin, Manuel; Scheu, Stefan; Freschet, Grégoire T; Rose, Laura; Barot, Sebastien; Cesarz, Simone; Eisenhauer, Nico; Girin, Thomas; Assandri, Davide; Bonkowski, Michael; Buchmann, Nina; Butenschoen, Olaf; Devidal, Sebastien; Gleixner, Gerd; Gessler, Arthur; Gigon, Agnès; Greiner, Anna; Grignani, Carlo; Hansart, Amandine; Kayler, Zachary; Lange, Markus; Lata, Jean-Christophe; Le Galliard, Jean-François; Lukac, Martin; Mannerheim, Neringa; Müller, Marina E H; Pando, Anne; Rotter, Paula; Scherer-Lorenzen, Michael; Seyhun, Rahme; Urban-Mead, Katherine; Weigelt, Alexandra; Zavattaro, Laura; Roy, Jacques

    2018-02-01

    Many scientific disciplines are currently experiencing a 'reproducibility crisis' because numerous scientific findings cannot be repeated consistently. A novel but controversial hypothesis postulates that stringent levels of environmental and biotic standardization in experimental studies reduce reproducibility by amplifying the impacts of laboratory-specific environmental factors not accounted for in study designs. A corollary to this hypothesis is that a deliberate introduction of controlled systematic variability (CSV) in experimental designs may lead to increased reproducibility. To test this hypothesis, we had 14 European laboratories run a simple microcosm experiment using grass (Brachypodium distachyon L.) monocultures and grass and legume (Medicago truncatula Gaertn.) mixtures. Each laboratory introduced environmental and genotypic CSV within and among replicated microcosms established in either growth chambers (with stringent control of environmental conditions) or glasshouses (with more variable environmental conditions). The introduction of genotypic CSV led to 18% lower among-laboratory variability in growth chambers, indicating increased reproducibility, but had no significant effect in glasshouses where reproducibility was generally lower. Environmental CSV had little effect on reproducibility. Although there are multiple causes for the 'reproducibility crisis', deliberately including genetic variability may be a simple solution for increasing the reproducibility of ecological studies performed under stringently controlled environmental conditions.

  9. Simplified propagation of standard uncertainties

    Shull, A.H.

    1997-01-01

    An essential part of any measurement control program is adequate knowledge of the uncertainties of the measurement system standards. Only with an estimate of the standards' uncertainties can one determine whether a standard is adequate for its intended use, or calculate the total uncertainty of the measurement process. Purchased standards usually have estimates of uncertainty on their certificates. However, when standards are prepared and characterized by a laboratory, variance propagation is required to estimate the uncertainty of the standard. Traditional variance propagation typically involves tedious use of partial derivatives, unfriendly software, and the availability of statistical expertise. As a result, the uncertainty of prepared standards is often not determined, or is determined incorrectly. For situations meeting stated assumptions, easier shortcut methods of estimation are now available which eliminate the need for partial derivatives and require only a spreadsheet or calculator. A system of simplifying the calculations by dividing them into subgroups of absolute and relative uncertainties is utilized. These methods also incorporate the International Organization for Standardization (ISO) concepts for combining systematic and random uncertainties as published in its Guide to the Expression of Uncertainty in Measurement. Details of the simplified methods and examples of their use are included in the paper.
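    The shortcut the abstract describes amounts to two quadrature rules: absolute standard uncertainties combine in quadrature for sums and differences, and relative standard uncertainties combine in quadrature for products and quotients. A minimal sketch; the preparation example and its numbers are hypothetical, not taken from the paper:

```python
import math

def combine_absolute(*u):
    """Combine absolute standard uncertainties in quadrature (sums/differences)."""
    return math.sqrt(sum(x ** 2 for x in u))

def combine_relative(*u_rel):
    """Combine relative standard uncertainties in quadrature (products/quotients)."""
    return math.sqrt(sum(x ** 2 for x in u_rel))

# Hypothetical example: a standard prepared as c = m / V, so the relative
# uncertainties of the mass and volume measurements combine.
m, u_m = 1.0000, 0.0002   # g, balance standard uncertainty (illustrative)
V, u_V = 0.100, 0.00005   # L, flask standard uncertainty (illustrative)
u_c_rel = combine_relative(u_m / m, u_V / V)
print(f"c = {m / V:.3f} g/L, relative standard uncertainty = {u_c_rel:.3%}")
```

    For a sum of masses, `combine_absolute` would be used on the balance uncertainties directly instead.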

  10. Effects of variable transformations on errors in FORM results

    Qin Quan; Lin Daojin; Mei Gang; Chen Hao

    2006-01-01

    On the basis of studies of the second partial derivatives of the variable transformation functions for nine different non-normal variables, the paper comprehensively discusses the effects of the transformation on FORM results. It shows that the signs and magnitudes of the errors in FORM results depend on the distributions of the basic variables, on whether the basic variables represent resistances or actions, and on the design point locations in the standard normal space. The transformations of exponential or Gamma resistance variables can generate +24% errors in the FORM failure probability, and the transformation of Frechet action variables can generate -31% errors.
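    The transformation in question maps each non-normal basic variable into standard normal space via u = Φ⁻¹(F_X(x)); the nonlinearity of this map for skewed distributions is the source of the errors the paper quantifies. A minimal sketch using an exponential variable with an illustrative rate parameter (not a case from the paper):

```python
from statistics import NormalDist
import math

std_norm = NormalDist()

def to_standard_normal(x, cdf):
    """Map a non-normal variate into standard normal space: u = Phi^{-1}(F_X(x))."""
    return std_norm.inv_cdf(cdf(x))

# Exponential variable with rate 0.1 (mean 10); values are illustrative.
lam = 0.1
exp_cdf = lambda x: 1.0 - math.exp(-lam * x)

for x in (5.0, 10.0, 30.0):
    u = to_standard_normal(x, exp_cdf)
    print(f"x = {x:5.1f}  ->  u = {u:+.3f}")
```

    The median ln(2)/λ maps exactly to u = 0; FORM linearizes the limit state around the design point in this curved u-space, which is where the distribution-dependent errors arise.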

  11. Variable Work Hours--The MONY Experience

    Fields, Cynthia J.

    1974-01-01

    An experiment with variable work hours in one department of a large company was so successful that it has become standard procedure in various corporate areas, both staff and line. The result? Increased production, fewer errors, improved employee morale, and a significant reduction in lateness and absenteeism. (Author)

  12. 106-17 Telemetry Standards Digitized Audio Telemetry Standard Chapter 5

    2017-07-01

    Digitized Audio Telemetry Standard 5.1 General This chapter defines continuously variable slope delta (CVSD) modulation as the standard for digitizing...audio signal. The CVSD modulator is, in essence, a 1-bit analog-to-digital converter. The output of this 1-bit encoder is a serial bit stream, where
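    The 1-bit encoding loop of a CVSD modulator can be sketched as follows: each output bit is the sign of the difference between the sample and the running estimate, and the slope (step size) grows when a run of identical bits signals slope overload. This is an illustrative implementation with made-up adaptation constants, not the parameters specified in the IRIG 106 standard:

```python
import math

def cvsd_encode(samples, step_min=0.01, step_max=1.0, gain=1.2, decay=0.98, run=3):
    """Minimal CVSD encoder sketch: 1-bit coding with syllabic slope adaptation."""
    bits, history = [], []
    estimate, step = 0.0, step_min
    for x in samples:
        bit = 1 if x > estimate else 0
        bits.append(bit)
        history.append(bit)
        history[:] = history[-run:]
        # If the last `run` bits agree, the estimate is lagging the signal,
        # so grow the step; otherwise let it decay back toward the minimum.
        if len(history) == run and len(set(history)) == 1:
            step = min(step * gain, step_max)
        else:
            step = max(step * decay, step_min)
        estimate += step if bit else -step
    return bits

tone = [math.sin(2 * math.pi * 0.01 * n) for n in range(200)]
bits = cvsd_encode(tone)
print(bits[:16])
```

    A matching decoder would integrate the same ±step sequence to reconstruct the waveform.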

  13. Standards for holdup measurement

    Zucker, M.S.

    1982-01-01

    Holdup measurements, needed for material balance, depend heavily on standards and on the interpretation of the calibration procedure. More than in other measurements, the calibration procedure using the standard becomes part of the standard. Standards practical for field use, and calibration techniques, have been developed. While accuracy in holdup measurements is comparatively poor, avoidance of bias is a necessary goal.

  14. Creating standards: Creating illusions?

    Linneberg, Mai Skjøtt

    written standards may open up for the creation of illusions. These are created when written standards' content is not in accordance with the perception standard adopters and standard users have of the specific practice phenomenon's content. This general theoretical argument is exemplified by the specific...

  15. Short-timescale variability in cataclysmic binaries

    Cordova, F.A.; Mason, K.O.

    1982-01-01

    Rapid variability, including flickering and pulsations, has been detected in cataclysmic binaries at optical and x-ray frequencies. In the case of the novalike variable TT Arietis, simultaneous observations reveal that the x-ray and optical flickering activity is strongly correlated, while short period pulsations are observed that occur at the same frequencies in both wavelength bands

  16. Beyond the Standard Model

    Lykken, Joseph D.

    2010-01-01

    - new directions for BSM model building. Contrary to popular shorthand jargon, supersymmetry (SUSY) is not a BSM model: it is a symmetry principle characterizing a BSM framework with an infinite number of models. Indeed we do not even know the full dimensionality of the SUSY parameter space, since this presumably includes as-yet-unexplored SUSY-breaking mechanisms and combinations of SUSY with other BSM principles. The SUSY framework plays an important role in BSM physics partly because it includes examples of models that are 'complete' in the same sense as the Standard Model, i.e. in principle the model predicts consequences for any observable, from cosmology to b physics to precision electroweak data to LHC collisions. Complete models, in addition to being more explanatory and making connections between diverse phenomena, are also much more experimentally constrained than strawman scenarios that focus more narrowly. One sometimes hears: 'Anything that is discovered at the LHC will be called supersymmetry.' There is truth behind this joke in the sense that the SUSY framework incorporates a vast number of possible signatures accessible to TeV colliders. This is not to say that the SUSY framework is not testable, but we are warned that one should pay attention to other promising frameworks, and should be prepared to make experimental distinctions between them. Since there is no formal classification of BSM frameworks I have invented my own. At the highest level there are six parent frameworks: (1) Terascale supersymmetry; (2) PNGB Higgs; (3) New strong dynamics; (4) Warped extra dimensions; (5) Flat extra dimensions; and (6) Hidden valleys. Here is the briefest possible survey of each framework, with the basic idea, the generic new phenomena, and the energy regime over which the framework purports to make comprehensive predictions.

  17. Beyond the Standard Model

    Lykken, Joseph D.; /Fermilab

    2010-05-01

    - to those who get close enough to listen - new directions for BSM model building. Contrary to popular shorthand jargon, supersymmetry (SUSY) is not a BSM model: it is a symmetry principle characterizing a BSM framework with an infinite number of models. Indeed we do not even know the full dimensionality of the SUSY parameter space, since this presumably includes as-yet-unexplored SUSY-breaking mechanisms and combinations of SUSY with other BSM principles. The SUSY framework plays an important role in BSM physics partly because it includes examples of models that are 'complete' in the same sense as the Standard Model, i.e. in principle the model predicts consequences for any observable, from cosmology to b physics to precision electroweak data to LHC collisions. Complete models, in addition to being more explanatory and making connections between diverse phenomena, are also much more experimentally constrained than strawman scenarios that focus more narrowly. One sometimes hears: 'Anything that is discovered at the LHC will be called supersymmetry.' There is truth behind this joke in the sense that the SUSY framework incorporates a vast number of possible signatures accessible to TeV colliders. This is not to say that the SUSY framework is not testable, but we are warned that one should pay attention to other promising frameworks, and should be prepared to make experimental distinctions between them. Since there is no formal classification of BSM frameworks I have invented my own. At the highest level there are six parent frameworks: (1) Terascale supersymmetry; (2) PNGB Higgs; (3) New strong dynamics; (4) Warped extra dimensions; (5) Flat extra dimensions; and (6) Hidden valleys. Here is the briefest possible survey of each framework, with the basic idea, the generic new phenomena, and the energy regime over which the framework purports to make comprehensive predictions.

  18. United States Shipbuilding Standards Master Plan

    Horsmon, Jr, Albert W

    1992-01-01

    This Shipbuilding Standards Master Plan was developed using extensive surveys, interviews, and an iterative editing process to include the views and opinions of key persons and organizations involved...

  19. 42 CFR 410.100 - Included services.

    2010-10-01

    ... service; however, maintenance therapy itself is not covered as part of these services. (c) Occupational... increase respiratory function, such as graded activity services; these services include physiologic... rehabilitation plan of treatment, including physical therapy services, occupational therapy services, speech...

  20. Rapidly variable relativistic absorption

    Parker, M.; Pinto, C.; Fabian, A.; Lohfink, A.; Buisson, D.; Alston, W.; Jiang, J.

    2017-10-01

    I will present results from the 1.5Ms XMM-Newton observing campaign on the most X-ray variable AGN, IRAS 13224-3809. We find a series of nine absorption lines with a velocity of 0.24c from an ultra-fast outflow. For the first time, we are able to see extremely rapid variability of the UFO features, and can link this to the X-ray variability from the inner accretion disk. We find a clear flux dependence of the outflow features, suggesting that the wind is ionized by increasing X-ray emission.

  1. Collaboration Between Multistakeholder Standards

    Rasche, Andreas; Maclean, Camilla

    Public interest in corporate social responsibility (CSR) has resulted in a wide variety of multistakeholder CSR standards in which companies can choose to participate. While such standards reflect collaborative governance arrangements between public and private actors, the market for corporate...... responsibility is unlikely to support a great variety of partly competing and overlapping standards. Increased collaboration between these standards would enhance both their impact and their adoption by firms. This report examines the nature, benefits, and shortcomings of existing multistakeholder standards...

  2. Static, Lightweight Includes Resolution for PHP

    M.A. Hills (Mark); P. Klint (Paul); J.J. Vinju (Jurgen)

    2014-01-01

    Dynamic languages include a number of features that are challenging to model properly in static analysis tools. In PHP, one of these features is the include expression, where an arbitrary expression provides the path of the file to include at runtime. In this paper we present two

  3. Article Including Environmental Barrier Coating System

    Lee, Kang N. (Inventor)

    2015-01-01

    An enhanced environmental barrier coating for a silicon containing substrate. The enhanced barrier coating may include a bond coat doped with at least one of an alkali metal oxide and an alkali earth metal oxide. The enhanced barrier coating may include a composite mullite bond coat including BSAS and another distinct second phase oxide applied over said surface.

  4. Rare thoracic cancers, including peritoneum mesothelioma

    Siesling, Sabine; van der Zwan, Jan Maarten; Izarzugaza, Isabel; Jaal, Jana; Treasure, Tom; Foschi, Roberto; Ricardi, Umberto; Groen, Harry; Tavilla, Andrea; Ardanaz, Eva

    Rare thoracic cancers include those of the trachea, thymus and mesothelioma (including peritoneum mesothelioma). The aim of this study was to describe the incidence, prevalence and survival of rare thoracic tumours using a large database, which includes cancer patients diagnosed from 1978 to 2002,

  5. Rare thoracic cancers, including peritoneum mesothelioma

    Siesling, Sabine; Zwan, J.M.V.D.; Izarzugaza, I.; Jaal, J.; Treasure, T.; Foschi, R.; Ricardi, U.; Groen, H.; Tavilla, A.; Ardanaz, E.

    2012-01-01

    Rare thoracic cancers include those of the trachea, thymus and mesothelioma (including peritoneum mesothelioma). The aim of this study was to describe the incidence, prevalence and survival of rare thoracic tumours using a large database, which includes cancer patients diagnosed from 1978 to 2002,

  6. Eternity Variables to Simulate Specifications

    Hesselink, WH; Boiten, EA; Moller, B

    2002-01-01

    Simulation of specifications is introduced as a unification and generalization of refinement mappings, history variables, forward simulations, prophecy variables, and backward simulations. Eternity variables are introduced as a more powerful alternative for prophecy variables and backward

  7. Intraspecific chromosome variability

    N Dubinin

    2010-12-01

    (Editorial preface.) The publication is presented in order to remind us of one of the dramatic pages of the history of genetics. It re-opens for the contemporary reader a comprehensive work marking the priority change from plant cytogenetics to animal cytogenetics, led by the wide population studies that were conducted on Drosophila polytene chromosomes. The year of the publication (1937) became the point of irretrievable branching between the directions of Old World and New World genetics connected with the problems of chromosome variability and its significance for the evolution of the species. The famous book of T. Dobzhansky (1937) was published by Columbia University in the US under the title “Genetics and the Origin of Species”, and in the shadow of this American ‘skybuilding’ all other works grew dim. It is remarkable that both Dobzhansky and Dubinin came to similar conclusions about the role of chromosomes in speciation. This is not surprising, given that they both might be considered representatives of the Russian genetic school, by their birth and education. Interestingly, Dobzhansky never referred to the full paper of Dubinin et al. (1937), though a previous short communication in Nature (1936) was included together with all former papers on the related subject. In full, the volume of the original publication printed in the Biological Journal in Moscow comprised 47 pages: 41 pages of Russian text accompanied by 16 figures, a table and a reference list, and, above all, 6 pages of an English summary. This final part in English is now reproduced in the authors' version, with the only addition being the reference list in the originally printed form.

  8. Clinical quality standards for radiotherapy

    2012-01-01

    Aim of the study: The technological progress that is currently being witnessed in the areas of diagnostic imaging, treatment planning systems and therapeutic equipment has caused radiotherapy to become a high-tech and interdisciplinary domain involving staff of various backgrounds. This allows steady improvement in therapy results, but at the same time makes the diagnostic, imaging and therapeutic processes more complex and complicated, requiring every stage of those processes to be planned, organized, controlled and improved so as to assure high quality of services provided. The aim of this paper is to present clinical quality standards for radiotherapy as developed by the author. Material and methods: In order to develop the quality standards, a comparative analysis was performed between European and Polish legal acts adopted in the period of 1980-2006 and the universal industrial ISO 9001:2008 standard, defining requirements for quality management systems, and relevant articles published in 1984-2009 were reviewed, including applicable guidelines and recommendations of American, international, European and Polish bodies, such as the American Association of Physicists in Medicine (AAPM), the European Society for Radiotherapy & Oncology (ESTRO), the International Atomic Energy Agency (IAEA), and the Organisation of European Cancer Institutes (OECI), on quality assurance and management in radiotherapy. Results: As a result, 352 quality standards for radiotherapy were developed and categorized into the following three groups: 1 – organizational standards; 2 – physico-technical standards; and 3 – clinical standards. Conclusion: The proposed clinical quality standards for radiotherapy can be used by any institution using ionizing radiation for medical purposes. However, standards are of value only if they are implemented, reviewed, audited and improved, and if there is a clear mechanism in place to monitor and address failure to meet agreed standards. PMID:23788854

  9. 41 CFR 101-29.218 - Voluntary standards.

    2010-07-01

    ... Regulations System FEDERAL PROPERTY MANAGEMENT REGULATIONS SUPPLY AND PROCUREMENT 29-FEDERAL PRODUCT... standards,” but does not include professional standards of personal conduct, institutional codes of ethics...

  10. Normalization of High Dimensional Genomics Data Where the Distribution of the Altered Variables Is Skewed

    Landfors, Mattias; Philip, Philge; Rydén, Patrik; Stenberg, Per

    2011-01-01

    Genome-wide analysis of gene expression or protein binding patterns using different array or sequencing based technologies is now routinely performed to compare different populations, such as treatment and reference groups. It is often necessary to normalize the data obtained to remove technical variation introduced in the course of conducting experimental work, but standard normalization techniques are not capable of eliminating technical bias in cases where the distribution of the truly altered variables is skewed, i.e. when a large fraction of the variables are either positively or negatively affected by the treatment. However, several experiments are likely to generate such skewed distributions, including ChIP-chip experiments for the study of chromatin, gene expression experiments for the study of apoptosis, and SNP-studies of copy number variation in normal and tumour tissues. A preliminary study using spike-in array data established that the capacity of an experiment to identify altered variables and generate unbiased estimates of the fold change decreases as the fraction of altered variables and the skewness increases. We propose the following work-flow for analyzing high-dimensional experiments with regions of altered variables: (1) Pre-process raw data using one of the standard normalization techniques. (2) Investigate if the distribution of the altered variables is skewed. (3) If the distribution is not believed to be skewed, no additional normalization is needed. Otherwise, re-normalize the data using a novel HMM-assisted normalization procedure. (4) Perform downstream analysis. Here, ChIP-chip data and simulated data were used to evaluate the performance of the work-flow. It was found that skewed distributions can be detected by using the novel DSE-test (Detection of Skewed Experiments). Furthermore, applying the HMM-assisted normalization to experiments where the distribution of the truly altered variables is skewed results in considerably higher
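    The failure mode motivating the proposed work-flow — a standard normalization absorbing part of the true signal when the altered variables are skewed toward one direction — can be demonstrated with a small simulation. All numbers are illustrative, and this is plain median normalization, not the DSE-test or the HMM-assisted procedure itself:

```python
import random
import statistics

random.seed(1)
n = 10_000
frac_up = 0.4       # 40% of variables truly up-regulated: a skewed alteration
true_shift = 1.0    # size of the true biological effect
offset = 0.3        # technical bias introduced by the experiment

reference = [random.gauss(0, 1) for _ in range(n)]
altered = [x + offset + (true_shift if i < frac_up * n else 0.0)
           for i, x in enumerate(reference)]

# Standard median normalization assumes most variables are unchanged; with a
# skewed alteration the median absorbs part of the true signal.
correction = statistics.median(altered) - statistics.median(reference)
normalized = [x - correction for x in altered]

unchanged = [normalized[i] - reference[i] for i in range(int(frac_up * n), n)]
print(f"estimated technical offset: {correction:.3f} (true value {offset})")
print(f"residual bias in unchanged variables: {statistics.mean(unchanged):+.3f}")
```

    The correction overshoots the true technical offset, so the genuinely unchanged variables come out with a negative bias and fold-change estimates are distorted, which is the situation the proposed re-normalization step is meant to detect and repair.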

  11. Variable-Period Undulators for Synchrotron Radiation

    Shenoy, Gopal; Lewellen, John; Shu, Deming; Vinokurov, Nikolai

    2005-02-22

    A new and improved undulator design is provided that enables a variable period length for the production of synchrotron radiation from both medium-energy and high energy storage rings. The variable period length is achieved using a staggered array of pole pieces made up of high permeability material, permanent magnet material, or an electromagnetic structure. The pole pieces are separated by a variable width space. The sum of the variable width space and the pole width would therefore define the period of the undulator. Features and advantages of the invention include broad photon energy tunability, constant power operation and constant brilliance operation.
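    Since the undulator period is the sum of the pole width and the variable space, sweeping that space sweeps the first-harmonic photon energy through the standard on-axis undulator equation λ = (λu / 2γ²)(1 + K²/2). A sketch with illustrative storage-ring parameters, not values from the patent:

```python
def first_harmonic_energy_keV(period_mm, B0_T, electron_energy_GeV):
    """On-axis first-harmonic photon energy of a planar undulator (sketch)."""
    gamma = electron_energy_GeV * 1e3 / 0.511          # Lorentz factor
    K = 0.0934 * period_mm * B0_T                      # deflection parameter
    lam_u = period_mm * 1e-3                           # period in metres
    lam = lam_u / (2 * gamma**2) * (1 + K**2 / 2)      # radiated wavelength (m)
    return 1.23984e-6 / lam / 1e3                      # E[keV] = hc / lambda

for period_mm in (15.0, 20.0, 30.0):   # pole width plus variable space
    e = first_harmonic_energy_keV(period_mm, 0.8, 7.0)
    print(f"period {period_mm:4.1f} mm -> first harmonic ~ {e:5.2f} keV")
```

    Lengthening the period increases both λu and K, so the first harmonic moves to lower photon energy, which is the tunability the variable-width space provides.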

  12. Variable-Period Undulators For Synchrotron Radiation

    Shenoy, Gopal; Lewellen, John; Shu, Deming; Vinokurov, Nikolai

    2005-02-22

    A new and improved undulator design is provided that enables a variable period length for the production of synchrotron radiation from both medium-energy and high-energy storage rings. The variable period length is achieved using a staggered array of pole pieces made up of high permeability material, permanent magnet material, or an electromagnetic structure. The pole pieces are separated by a variable width space. The sum of the variable width space and the pole width would therefore define the period of the undulator. Features and advantages of the invention include broad photon energy tunability, constant power operation and constant brilliance operation.

  13. Variable Attitude Test Stand

    Federal Laboratory Consortium — The Variable Attitude Test Stand designed and built for testing of the V-22 tilt rotor aircraft propulsion system, is used to evaluate the effect of aircraft flight...

  14. Variable-Rate Premiums

    Pension Benefit Guaranty Corporation — These interest rates are used to value vested benefits for variable rate premium purposes as described in PBGC's regulation on Premium Rates (29 CFR Part 4006) and...

  15. Variable Pricing Feasibility Assessment

    2004-01-01

    ...) and Willard Bishop Consulting (Barrington, IL) to evaluate the practicality of using a variable pricing system within DeCA to maintain an average of 30 percent customer savings and lower appropriated fund costs...

  16. Evolution of variable stars

    Becker, S.A.

    1986-08-01

    Throughout the domain of the H-R diagram lie groupings of stars whose luminosity varies with time. These variable stars can be classified, based on their observed properties, into distinct types such as β Cephei stars, δ Cephei stars, and Miras, as well as many other categories. The underlying mechanism for the variability is generally felt to be due to four different causes: geometric effects, rotation, eruptive processes, and pulsation. In this review the focus will be on pulsation variables and how the theory of stellar evolution can be used to explain how the various regions of variability on the H-R diagram are populated. To this end, a generalized discussion of the evolutionary behavior of a massive star, an intermediate mass star, and a low mass star will be presented. 19 refs., 1 fig., 1 tab

  17. Software Testing Requires Variability

    Christensen, Henrik Bærbak

    2003-01-01

    Software variability is the ability of a software system or artefact to be changed, customized or configured for use in a particular context. Variability in software systems is important from a number of perspectives. Some perspectives rightly receive much attention due to their direct economic...... impact in software production. As is also apparent from the call for papers these perspectives focus on qualities such as reuse, adaptability, and maintainability....

  18. Suspended graphene variable capacitor

    AbdelGhany, M.; Mahvash, F.; Mukhopadhyay, M.; Favron, A.; Martel, R.; Siaj, M.; Szkopek, T.

    2016-01-01

    The tuning of electrical circuit resonance with a variable capacitor, or varactor, finds wide application with the most important being wireless telecommunication. We demonstrate an electromechanical graphene varactor, a variable capacitor wherein the capacitance is tuned by voltage controlled deflection of a dense array of suspended graphene membranes. The low flexural rigidity of graphene monolayers is exploited to achieve low actuation voltage in an ultra-thin structure. Large arrays compr...
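    The tuning mechanism can be illustrated with a simple parallel-plate model: deflecting a suspended membrane toward the fixed electrode shrinks the gap and raises the capacitance. The dimensions below are illustrative, not the actual device geometry reported in the paper:

```python
EPS0 = 8.854e-12  # vacuum permittivity, F/m

def capacitance(area_m2, gap_m, deflection_m):
    """Parallel-plate capacitance of a membrane deflected toward its electrode."""
    return EPS0 * area_m2 / (gap_m - deflection_m)

area = 1e-8   # 100 um x 100 um membrane (illustrative)
gap = 200e-9  # 200 nm suspension gap (illustrative)
for frac in (0.0, 0.1, 0.3):
    c = capacitance(area, gap, frac * gap)
    print(f"deflection {frac:.0%} of gap -> C = {c * 1e15:6.1f} fF")
```

    Graphene's low flexural rigidity matters here because the deflection, and hence the tuning range, achievable at a given actuation voltage grows as the membrane stiffness falls.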

  19. Standardisation in standards

    McDonald, J. C.

    2012-01-01

    The following observations are offered by one who has served on national and international standards-writing committees and standards review committees. Service on working groups consists of either updating previous standards or developing new standards. The process of writing either type of document proceeds along similar lines. The first order of business is to recognise the need for developing or updating a standard and to identify the potential user community. It is also necessary to ensure that there is a required number of members willing to do the writing. A justification is required as to why a new standard should be developed, and this is written as a new work item proposal or a project initiation notification system form. This document must be filed officially and approved, and a search is then undertaken to ensure that the proposed new standard will not duplicate a standard that has already been published or is underway in another standards organisation. (author)

  20. Imaging standards for smart cards

    Ellson, Richard N.; Ray, Lawrence A.

    1996-02-01

    "Smart cards" are plastic cards the size of credit cards which contain integrated circuits for the storage of digital information. The applications of these cards for image storage has been growing as card data capacities have moved from tens of bytes to thousands of bytes. This has prompted the recommendation of standards by the X3B10 committee of ANSI for inclusion in ISO standards for card image storage of a variety of image data types including digitized signatures and color portrait images. This paper will review imaging requirements of the smart card industry, challenges of image storage for small memory devices, card image communications, and the present status of standards. The paper will conclude with recommendations for the evolution of smart card image standards towards image formats customized to the image content and more optimized for smart card memory constraints.

  1. Global Standards of Market Civilization

    Global Standards of Market Civilization brings together leading scholars, representing a range of political views, to investigate how global 'standards of market civilization' have emerged, their justification, and their political, economic and social impact. Key chapters show how as the modern...... thought, as well as its historical application part II presents original case studies that demonstrate the emergence of such standards and explore the diffusion of liberal capitalist ideas through the global political economy and the consequences for development and governance; the International Monetary...... Fund's capacity to formulate a global standard of civilization in its reform programs; and problems in the development of the global trade, including the issue of intellectual property rights. This book will be of strong interest to students and scholars in wide range of fields relating to the study...

  2. Radiation control standards and procedures

    1956-12-14

    This manual contains the "Radiation Control Standards" and "Radiation Control Procedures" at Hanford Operations, which have been established to provide the necessary control of radiation exposures within the Irradiation Processing Department. Provision is also made for including, in the form of "Bulletins", other radiological information of general interest to IPD personnel. The purpose of the standards is to establish firm radiological limits within which the Irradiation Processing Department will operate, and to outline our radiation control program in sufficient detail to insure uniform and consistent application throughout all IPD facilities. Radiation Control Procedures are intended to prescribe the best method of accomplishing an objective within the limitations of the Radiation Control Standards. A procedure may be changed at any time provided the suggested change is generally agreeable to the management involved, and is consistent with department policies and the Radiation Control Standards.

  3. NASA's Software Safety Standard

    Ramsay, Christopher M.

    2007-01-01

    requirements. This allows the projects leeway to meet these requirements in many forms that best suit a particular project's needs and safety risk. In other words, it tells the project what to do, not how to do it. This update also incorporated advances in the state of the practice of software safety from academia and private industry. It addresses some of the more common issues now facing software developers in the NASA environment such as the use of Commercial-Off-the-Shelf Software (COTS), Modified OTS (MOTS), Government OTS (GOTS), and reused software. A team from across NASA developed the update and it has had both NASA-wide internal reviews by software engineering, quality, safety, and project management. It has also had expert external review. This presentation and paper will discuss the new NASA Software Safety Standard, its organization, and key features. It will start with a brief discussion of some NASA mission failures and incidents that had software as one of their root causes. It will then give a brief overview of the NASA Software Safety Process. This will include an overview of the key personnel responsibilities and functions that must be performed for safety-critical software.

  4. Impact of pretreatment variables on the outcome of {sup 131}I therapy with a standardized dose of 150 Gray in Graves' disease; Einfluss praetherapeutischer Variablen auf die Wirkung einer standardisierten {sup 131}J-Therapie mit 150 Gray beim Morbus Basedow

    Pfeilschifter, J. [Heidelberg Univ., Radiologische Klinik (Germany). Abt. fuer Nuklearmedizin; Elser, H. [Medizinische Universitaetsklinik und Poliklinik Heidelberg (Germany). Abt. fuer Innere Medizin I; Haufe, S. [Medizinische Universitaetsklinik und Poliklinik Heidelberg (Germany). Abt. fuer Innere Medizin I; Ziegler, R. [Heidelberg Univ., Radiologische Klinik (Germany). Abt. fuer Nuklearmedizin; Georgi, P. [Medizinische Universitaetsklinik und Poliklinik Heidelberg (Germany). Abt. fuer Innere Medizin I

    1997-04-01

    Aim: We examined the impact of several pretreatment variables on thyroid size and function in 61 patients with Graves' disease one year after a standardized {sup 131}I treatment with 150 Gray. Methods: FT3, FT4, and TSH serum concentrations were determined before and 1.5, 3, 6, and 12 months after therapy. Thyroid size was measured by ultrasound and scintigraphy before and one year after therapy. Results: One year after therapy, 30% of the patients had latent or manifest hyperthyroidism, 24% were euthyroid, and 46% had developed latent or manifest hypothyroidism. Age and initial thyroid volume were major predictors of posttherapeutical thyroid function. Thus, persistent hyperthyroidism was observed in 70% of the patients aged 50 years and older with a thyroid size of more than 50 ml. With few exceptions, thyroid size markedly decreased after therapy. Initial thyroid size and age were also major predictors of posttherapeutical thyroid volume. Thyroid size normalized in all patients younger than 50 years of age, independently of initial thyroid size. Conclusion: Radioiodine treatment with 150 Gray causes a considerable decrease in thyroid size in most patients with Graves' disease. Age and initial thyroid volume are important determinants of thyroid function and size after therapy and should be considered in dose calculation. (orig.) [German original, translated:] Aim: In 61 patients with Graves' disease we examined the influence of pretherapeutic variables on the function and volume of the thyroid one year after a {sup 131}I therapy with 150 Gray. Methods: FT3, FT4 and TSH were measured before therapy and 1.5, 3, 6 and 12 months after therapy. Thyroid volume was determined by ultrasound and scintigraphy before therapy and one year after therapy. Results: One year after therapy, 30% of the patients were latently or manifestly hyperthyroid, 24% euthyroid, and 46% latently or manifestly hypothyroid. Age and initial volume of the thyroid

  5. Nuclear standardization development study

    Pan Jianjun

    2010-01-01

    The nuclear industry is an important part of national security and national economic development, and a key area of new energy supported by the government. Nuclear standardization is an important force for nuclear industry development, a fundamental guarantee of safe nuclear production, and a valuable means of bringing China's nuclear industry technology to the world market. Nuclear standardization now faces a new development opportunity; it should implement strategies in standard system building, foreign standard research, company standard building, and talent development to meet the requirements of nuclear industry development. (author)

  6. Development of a quantitative risk standard

    Temme, M.I.

    1982-01-01

    IEEE Working Group SC-5.4 is developing a quantitative risk standard for LWR plant design and operation. The paper describes the Working Group's conclusions on significant issues, including the scope of the standard, the need to define the process (i.e., PRA calculation) for meeting risk criteria, the need for PRA quality requirements and the importance of distinguishing standards from goals. The paper also describes the Working Group's approach to writing this standard

  7. Beyond the standard model; Au-dela du modele standard

    Cuypers, F. [Paul Scherrer Inst. (PSI), Villigen (Switzerland)

    1997-05-01

    These lecture notes are intended as a pedagogical introduction to several popular extensions of the standard model of strong and electroweak interactions. The topics include the Higgs sector, the left-right symmetric model, grand unification and supersymmetry. Phenomenological consequences and search procedures are emphasized. (author) figs., tabs., 18 refs.

  8. An empirical assessment of the impact of technical standards on the export of meat in Nigeria

    Queeneth Odichi Ekeocha

    2017-10-01

    Full Text Available The study is an assessment of the impact of technical standards on meat export in Nigeria. A body of literature was reviewed in relation to meat standards, issues associated with standards compliance, the effects of SPS standards on food exports in developing countries, and causes of the non-export of meat in Nigeria, among others. A survey method was used and a cross tabulation analysis was made to ascertain the relationship among various variables and how significant they were in relation to food product standards. The findings of the study include the following: sanitary conditions for meat processing are a significant factor for meat export; standards compliance is a step in the right direction towards agricultural export diversification; and food standard compliance can create market access for meat exports. The study concluded that technical standards are very significant to meat exports in Nigeria. Therefore, the study recommends, among others, that the government should invest in the productive capacity of SPS requirements for meat export, that standard abattoirs should be built and maintained, and that policymakers should re-think a flexible export diversification policy that could attract foreign investors and meat companies in Nigeria.

  9. Analytical and Experimental Performance Evaluation of BLE Neighbor Discovery Process Including Non-Idealities of Real Chipsets

    David Perez-Diaz de Cerio

    2017-03-01

    Full Text Available The purpose of this paper is to evaluate from a real perspective the performance of Bluetooth Low Energy (BLE) as a technology that enables fast and reliable discovery of a large number of users/devices in a short period of time. The BLE standard specifies a wide range of configurable parameter values that determine the discovery process and need to be set according to the particular application requirements. Many previous works have investigated the discovery process through analytical and simulation models based on the ideal specification of the standard. However, measurements show that additional scanning gaps appear in the scanning process, which reduce the discovery capabilities. These gaps have been identified in all of the analyzed devices and respond to both regular patterns and variable events associated with the decoding process. We have demonstrated that these non-idealities, which are not taken into account in other studies, have a severe impact on the discovery process performance. Extensive performance evaluation for a varying number of devices and feasible parameter combinations has been done by comparing simulations and experimental measurements. This work also includes a simple mathematical model that closely matches both the standard implementation and the different chipset peculiarities for any possible parameter value specified in the standard and for any number of simultaneous advertising devices under scanner coverage.

  10. Variable frequency microwave heating apparatus

    Bible, D.W.; Lauf, R.J.; Johnson, A.C.; Thigpen, L.T.

    1999-10-05

    A variable frequency microwave heating apparatus (10) is designed to allow modulation of the frequency of the microwaves introduced into a multi-mode microwave cavity (34) for testing or other selected applications. The variable frequency microwave heating apparatus (10) includes a microwave signal generator (12) and a high-power microwave amplifier (20) or a high-power microwave oscillator (14). A power supply (22) is provided for operation of the high-power microwave oscillator (14) or microwave amplifier (20). A directional coupler (24) is provided for detecting the direction and amplitude of signals incident upon and reflected from the microwave cavity (34). A first power meter (30) is provided for measuring the power delivered to the microwave furnace (32). A second power meter (26) detects the magnitude of reflected power. Reflected power is dissipated in the reflected power load (28).

  11. The EPICS process variable Gateway Version 2

    Evans, K.

    2005-01-01

    The EPICS Process Variable Gateway is both a Channel Access Server and Channel Access Client that provides a means for many clients, typically on different subnets, to access a process variable while making only one connection to the server that owns the process variable. It also provides additional access security beyond that implemented on the server. It thus protects critical servers while providing suitably restricted access to needed process variables. The original version of the Gateway worked with EPICS Base 3.13 but required a special version, since the changes necessary for its operation were never incorporated into EPICS Base. Version 2 works with any standard EPICS Base 3.14.6 or later and has many improvements in both performance and features over the older version. The Gateway is now used at many institutions and has become a stable, high-performance application. It is capable of handling tens of thousands of process variables with hundreds of thousands of events per second. It has run for over three months in a production environment without having to be restarted. It has many internal process variables that can be used to monitor its state using standard EPICS client tools, such as MEDM and StripTool. Other internal process variables can be used to stop the Gateway, make several kinds of reports, or change the access security without stopping the Gateway. It can even be started on remote workstations from MEDM by using a Secure Shell script. This paper will describe the new Gateway and how it is used. The Gateway is both a server (like an EPICS Input/Output Controller (IOC)) and a client (like the EPICS Motif Editor and Display Manager (MEDM), StripTool, and others). Clients connect to the server side, and the client side connects to IOCs and other servers, possibly other Gateways. See Fig. 1. There are perhaps three principal reasons for using the Gateway: (1) it allows many clients to access a process variable while making only one connection to the server that owns the process variable

  12. Truck Drivers And Risk Of STDs Including HIV

    Bansal R.K

    1995-01-01

    Full Text Available Research Question: Whether long distance truck drivers are at a higher risk of contracting and transmitting STDs including HIV? Objectives: (i) To study the degree of knowledge of HIV and AIDS among long-distance truck drivers. (ii) Assess their sexual behaviour including condom use. (iii) Explore their prevailing social influences and substance abuse patterns. (iv) Explore their treatment seeking behaviour as regards STDs. (v) Deduce their risk of contracting and transmitting STDs including HIV. Study Design: Cross-sectional interview. Setting: Transport Nagar, Indore (M.P.) Participants: 210 senior drivers (first drivers) and 210 junior drivers (second drivers). Study Variables: Extra-marital sexual intercourse, condom usage, past and present history of STDs, treatment and counseling, substance abuse, social-cultural milieu. Outcome Variables: Risk of contraction of STDs. Statistical Analysis: Univariate analysis. Results: 94% of the drivers were totally ignorant about AIDS. 82.9% and 43.8% of the senior and junior drivers had a history of extra-marital sex, and of these only 2 regularly used condoms. 13.8% and 3.3% of the senior and junior drivers had a past or present history suggestive of STD infection. Alcohol and opium were regularly used by them. Conclusion: The studied drivers are at a high risk of contracting and transmitting STDs including HIV.

  13. Some considerations about standardization

    Dewez, Ph L; Fanjas, Y R [C.E.R.C.A., Romans (France)

    1985-07-01

    Complete standardization of research reactor fuel is not possible. However the transition from HEU to LEU should be an opportunity for a double effort towards standardization and optimization in order to reduce cost. (author)

  14. BTS statistical standards manual

    2005-10-01

    The Bureau of Transportation Statistics (BTS), like other federal statistical agencies, establishes professional standards to guide the methods and procedures for the collection, processing, storage, and presentation of statistical data. Standards an...

  15. Dental Assisting Program Standards.

    Georgia Univ., Athens. Dept. of Vocational Education.

    This publication contains statewide standards for the dental assisting program in Georgia. The standards are divided into 12 categories: foundations (philosophy, purpose, goals, program objectives, availability, evaluation); admissions (admission requirements, provisional admission requirements, recruitment, evaluation and planning); program…

  17. Control system architecture: The standard and non-standard models

    Thuot, M.E.; Dalesio, L.R.

    1993-01-01

    Control system architecture development has followed the advances in computer technology through mainframes to minicomputers to micros and workstations. This technology advance and increasingly challenging accelerator data acquisition and automation requirements have driven control system architecture development. In summarizing the progress of control system architecture at the last International Conference on Accelerator and Large Experimental Physics Control Systems (ICALEPCS), B. Kuiper asserted that the system architecture issue was resolved and presented a "standard model". The "standard model" consists of a local area network (Ethernet or FDDI) providing communication between front-end microcomputers, connected to the accelerator, and workstations, providing the operator interface and computational support. Although this model represents many present designs, there are exceptions, including reflected-memory and hierarchical architectures driven by requirements for widely dispersed, large-channel-count, or tightly coupled systems. This paper describes the performance characteristics and features of the "standard model" to determine if the requirements of "non-standard" architectures can be met. Several possible extensions to the "standard model" are suggested, including software as well as hardware architectural features

  18. The temporal variability of species densities

    Redfearn, A.; Pimm, S.L.

    1993-01-01

    Ecologists use the term 'stability' to mean a number of different things (Pimm 1984a). One use is to equate stability with low variability in population density over time (henceforth, temporal variability). Temporal variability varies greatly from species to species, so what affects it? There are at least three sets of factors: the variability of extrinsic abiotic factors, food web structure, and the intrinsic features of the species themselves. We can measure temporal variability using at least three statistics: the coefficient of variation of density (CV); the standard deviation of the logarithms of density (SDL); and the variance in the differences between logarithms of density for pairs of consecutive years (called annual variability, hence AV, by Wolda 1978). There are advantages and disadvantages to each measure (Williamson 1984), though in our experience, the measures are strongly correlated across sets of taxonomically related species. The increasing availability of long-term data sets allows one to calculate these statistics for many species and so to begin to understand the various causes of species differences in temporal variability
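
    The three statistics named above map directly onto simple computations over an annual density series. A minimal sketch in Python (the function name and the choice of base-10 logarithms are illustrative assumptions; the abstract does not fix a log base):

    ```python
    import numpy as np

    def temporal_variability(density):
        """CV, SDL, and AV (Wolda 1978) for a series of annual densities."""
        d = np.asarray(density, dtype=float)
        log_d = np.log10(d)
        cv = d.std(ddof=1) / d.mean()      # coefficient of variation of density
        sdl = log_d.std(ddof=1)            # standard deviation of log densities
        # annual variability: variance of the differences between logarithms
        # of density for pairs of consecutive years
        av = np.var(np.diff(log_d), ddof=1)
        return cv, sdl, av
    ```

    For a strongly fluctuating series such as [10, 100, 10, 100], all three measures are large, illustrating why they tend to be correlated across related species.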

  19. Inter-Trial Gait Variability Reduction Using Continuous Curve Registration

    Sadeghi, H

    2001-01-01

    Timing in peak gait values shifts slightly between gait trials. When gait data are averaged, some of the standard deviation can be attributed to this inter-trial variability unless normalization is carried out beforehand...

  20. Variability in the Anthropometric Status of Four South African ...

    1974-03-30

    optimal' nutrition and undernutrition. It is shown that confidence limits based on a central value of the standard deviation (a) do not take into account the increasing variability with age noted in most parameters in populations.

  1. Evaluation of variable advisory speed limits in work zones.

    2013-08-01

    Variable advisory speed limit (VASL) systems could be effective at both urban and rural work zones, at both uncongested and congested sites. At uncongested urban work zones, the average speeds with VASL were lower than without VASL. But the standard ...

  2. Generalized Network Psychometrics : Combining Network and Latent Variable Models

    Epskamp, S.; Rhemtulla, M.; Borsboom, D.

    2017-01-01

    We introduce the network model as a formal psychometric model, conceptualizing the covariance between psychometric indicators as resulting from pairwise interactions between observable variables in a network structure. This contrasts with standard psychometric models, in which the covariance between

  3. FACTORS AFFECTING THE COMPLIANCE OF MYANMAR NURSES IN PERFORMING STANDARD PRECAUTION

    Sa Sa Aung

    2017-06-01

    Full Text Available Introduction: Exposure to pathogens is a serious issue for nurses. The literature indicates that standard precautions have not been consistently performed in nursing. The purpose of this study was to analyze the factors affecting the compliance of nurses in Myanmar in performing standard precautions. Methods: This study used a cross-sectional design. Samples included 34 nurses in Waibagi Specialist Hospital (SHW), Myanmar. The independent variables were the characteristics of nurses, knowledge of standard precautions, and exposure to blood/body fluids and needle puncture wounds. The dependent variable was the performance of standard precautions. Data were analyzed using descriptive analysis and logistic regression. Results: The results showed that almost all respondents (91.18%) had good knowledge about standard precautions and 73.5% of respondents had good adherence in performing standard precautions. However, in practice nurses were not consistent in correctly recapping used needles. The results showed that nurse characteristics did not significantly affect adherence to standard precautions, with statistical test results as follows: age (p = 0.97), gender (p = 1.00), religion (p = 0.72), education (p = 0.85), work experience at SHW (p = 0.84), education training program (p = 0.71), knowledge (p = 0.76), and needle stick injury (p = 0.17). However, adherence to standard precautions had a significant influence on the incidence of needle puncture injury (p = 0.01). Discussion: The barriers to applying standard precautions by Myanmar nurses can be reduced by providing basic training, supervision, and improvement of standard operating procedures.

  4. The Distance Standard Deviation

    Edelmann, Dominic; Richards, Donald; Vogel, Daniel

    2017-01-01

    The distance standard deviation, which arises in distance correlation analysis of multivariate data, is studied as a measure of spread. New representations for the distance standard deviation are obtained in terms of Gini's mean difference and in terms of the moments of spacings of order statistics. Inequalities for the distance variance are derived, proving that the distance standard deviation is bounded above by the classical standard deviation and by Gini's mean difference. Further, it is ...
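
    For a univariate sample, the distance standard deviation can be computed as the square root of the distance variance, i.e. the mean of the squared entries of the double-centered pairwise distance matrix. A minimal sketch (the function name is an assumption; this is the V-statistic form of the estimator):

    ```python
    import numpy as np

    def distance_std(x):
        """Distance standard deviation of a univariate sample (V-statistic form)."""
        x = np.asarray(x, dtype=float)
        n = len(x)
        a = np.abs(x[:, None] - x[None, :])   # pairwise distance matrix a_ij = |x_i - x_j|
        # double centering: A_ij = a_ij - row mean_i - column mean_j + grand mean
        A = a - a.mean(axis=0) - a.mean(axis=1)[:, None] + a.mean()
        return np.sqrt((A ** 2).sum() / n ** 2)  # sqrt of the distance variance
    ```

    For the sample [1, 2, 3] this gives about 0.703, below the classical (population) standard deviation of about 0.816, consistent with the upper bound stated in the abstract.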

  5. Making standards work

    Stigzelius, Ingrid

    2009-01-01

    Social and environmental standards can function as tools for companies that want to improve their conduct in social and environmental areas in the supply chain. However, relatively little attention has been given to how the adoption of social and environmental standards may influence the actual business practices in the supply chain. The overall aim of this thesis is to examine the institutional context surrounding the adoption of social and environmental standards and how these standards inf...

  6. Standards, the users perspective

    Nason, W.D.

    1993-01-01

    The term standard has little meaning until put into the proper context. What is being standardized? What are the standard conditions to be applied? The list of questions that arise goes on and on. In this presentation, answers to these questions are considered in the interest of providing a basic understanding of what might be useful to the electrical power industry in the way of standards, and what the limitations on their application would be. 16 figs

  7. Variables associated with active spondylolysis.

    Gregg, Chris D; Dean, Sarah; Schneiders, Anthony G

    2009-11-01

    Retrospective non-experimental study. To investigate variables associated with active spondylolysis. A retrospective study audited clinical data over a two-year period from patients with suspected spondylolysis who were referred for a SPECT bone scan. Six exploratory variables were identified and analysed using univariate and multivariate regression from 82 patient records to determine the association between symptomatic, physical and demographic characteristics, and the presence of an active spondylolysis. Tertiary level multidisciplinary private practice sports medicine clinic. All patients with low back pain that required a SPECT bone scan to confirm suspected spondylolysis. 82 subjects were included in the final sample group. The six exploratory variables included Age, Gender, Injury duration, Injury onset, Sports participation and the result of the Single Leg Hyperextension Test. The dependent outcome variable was the result of the SPECT bone scan (scan-positive or scan-negative). Adolescent males had a higher incidence of spondylolysis detected by SPECT bone scan compared to other patients, and a statistically significant association was demonstrated for both age (p=0.01) and gender (p=0.01). Subjects with an active spondylolysis were nearly five times more likely to be male and aged less than 20 years. Furthermore, the likelihood ratio indicated that adolescent males with suspected spondylolysis were three and a half times more likely to have a positive bone scan result. The Single Leg Hyperextension Test did not demonstrate a statistically significant association with spondylolysis (p=0.47). Clinicians assessing for a predisposition to the development of spondylolysis should consider the gender and age of the patient and not rely on the predictive ability of the Single Leg Hyperextension Test.

  8. Variability of femoral muscle attachments.

    Duda, G N; Brand, D; Freitag, S; Lierse, W; Schneider, E

    1996-09-01

    Analytical and experimental models of the musculoskeletal system often assume single values rather than ranges for anatomical input parameters. The hypothesis of the present study was that anatomical variability significantly influences the results of biomechanical analyses, specifically regarding the moment arms of the various thigh muscles. Insertions and origins of muscles crossing or attaching to the femur were digitized in six specimens. Muscle volumes were measured; muscle attachment area and centroid location were computed. To demonstrate the influence of inter-individual anatomic variability on a mechanical modeling parameter, the corresponding range of muscle moment arms was calculated. Standard deviations, as a percentage of the mean, were about 70% for attachment area and 80% for muscle volume and attachment centroid location. The resulting moment arms of the m. gluteus maximus and m. rectus femoris were especially sensitive to anatomical variations (SD 65%). The results indicate that sensitivity to anatomical variations should be analyzed in any investigation simulating musculoskeletal interactions. To avoid misinterpretations, investigators should consider using several anatomical configurations rather than relying on a mean data set.

  9. Consensus standard requirements and guidance

    Putman, V.L.

    1995-01-01

    This report presents information from the ANS Criticality Alarm System Workshop relating to the consensus standard requirements and guidance. Topics presented include: definition; nomenclature; requirements and recommendations; purpose of criticality alarms; design criteria; signal characteristics; reliability, dependability and durability; tests; and emergency preparedness and planning

  10. Scientific Reporting: Raising the Standards

    McLeroy, Kenneth R.; Garney, Whitney; Mayo-Wilson, Evan; Grant, Sean

    2016-01-01

    This article is based on a presentation that was made at the 2014 annual meeting of the editorial board of "Health Education & Behavior." The article addresses critical issues related to standards of scientific reporting in journals, including concerns about external and internal validity and reporting bias. It reviews current…

  11. Radiological Control Technician: Standardized technician Qualification Standard

    1992-10-01

    The Qualification Standard states and defines the knowledge and skill requirements necessary for successful completion of the Radiological Control Technician Training Program. The standard is divided into three phases: Phase I concerns RCT academic training. There are 13 lessons associated with the core academics program and 19 lessons associated with the site academics program. The staff member should sign the appropriate blocks upon successful completion of the examination for that lesson or group of lessons. In addition, facility-specific lesson plans may be added to meet the knowledge requirements in the Job Performance Measures (JPM) of the practical program. Phase II concerns RCT core/site practical (JPM) training. There are thirteen generic tasks associated with the core practical program. Both the trainer/evaluator and student should sign the appropriate block upon successful completion of the JPM. In addition, facility-specific tasks may be added or generic tasks deleted based on the results of the facility job evaluation. Phase III concerns the oral examination board. Successful completion of the oral examination board is documented by the signature of the chairperson of the board. Upon completion of all of the standardized technician qualification requirements, final qualification is verified by the student and the manager of the Radiological Control Department and acknowledged by signatures on the qualification standard. The completed Qualification Standard shall be maintained as an official training record

  12. Automotive Technology Skill Standards

    Garrett, Tom; Asay, Don; Evans, Richard; Barbie, Bill; Herdener, John; Teague, Todd; Allen, Scott; Benshoof, James

    2009-01-01

    The standards in this document are for Automotive Technology programs and are designed to clearly state what the student should know and be able to do upon completion of an advanced high-school automotive program. Minimally, the student will complete a three-year program to achieve all standards. Although these exit-level standards are designed…

  13. Variability of the Wind Turbine Power Curve

    Mahesh M. Bandi

    2016-09-01

    Full Text Available Wind turbine power curves are calibrated by turbine manufacturers under requirements stipulated by the International Electrotechnical Commission to provide a functional mapping between the mean wind speed v̄ and the mean turbine power output P̄. Wind plant operators employ these power curves to estimate or forecast wind power generation under given wind conditions. However, it is general knowledge that wide variability exists in these mean calibration values. We first analyse how the standard deviation in wind speed σ_v affects the mean P̄ and the standard deviation σ_P of wind power. We find that the magnitude of wind power fluctuations scales as the square of the mean wind speed. Using data from three planetary locations, we find that the wind speed standard deviation σ_v systematically varies with mean wind speed v̄, and in some instances follows a scaling of the form σ_v = C × v̄^α, C being a constant and α a fractional power. We show that, when applicable, this scaling form provides a minimal parameter description of the power curve in terms of v̄ alone. Wind data from different locations establishes that (in instances when this scaling exists) the exponent α varies with location, owing to the influence of local environmental conditions on wind speed variability. Since manufacturer-calibrated power curves cannot account for variability influenced by local conditions, this variability translates to forecast uncertainty in power generation. We close with a proposal for operators to perform post-installation recalibration of their turbine power curves to account for the influence of local environmental factors on wind speed variability in order to reduce the uncertainty of wind power forecasts. Understanding the relationship between wind's speed and its variability is likely to lead to lower costs for the integration of wind power into the electric grid.
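
    A scaling of the form σ_v = C × v̄^α can be estimated from paired (v̄, σ_v) measurements by an ordinary least-squares fit in log-log space, since log σ_v = log C + α log v̄. A minimal sketch (the function name is an assumption, not from the paper):

    ```python
    import numpy as np

    def fit_sigma_scaling(v_mean, sigma_v):
        """Fit sigma_v = C * v_mean**alpha by linear regression on the logs."""
        log_v = np.log(np.asarray(v_mean, dtype=float))
        log_s = np.log(np.asarray(sigma_v, dtype=float))
        alpha, log_c = np.polyfit(log_v, log_s, 1)  # slope = alpha, intercept = ln C
        return np.exp(log_c), alpha
    ```

    On synthetic data generated with C = 0.5 and α = 0.7, the fit recovers both parameters exactly, which makes it a useful self-check before applying the procedure to measured wind data.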

  14. Pulsating red variables

    Whitelock, P.A.

    1990-01-01

    The observational characteristics of pulsating red variables are reviewed with particular emphasis on the Miras. These variables represent the last stage in the evolution of stars on the Asymptotic Giant Branch (AGB). A large fraction of the IRAS sources in the Bulge are Mira variables and a subset of these are also OH/IR sources. Their periods range up to 720 days, though most are between 360 and 560 days. At a given period those stars with the highest pulsation amplitudes have the highest mass-loss rates; this is interpreted as evidence for a causal connection between mass-loss and pulsation. It is suggested that once an AGB star has become a Mira it will evolve with increasing pulsation amplitude and mass-loss, but with very little change of luminosity or logarithmic period. 26 refs

  15. Variable stator radial turbine

    Rogo, C.; Hajek, T.; Chen, A. G.

    1984-01-01

    A radial turbine stage with a variable area nozzle was investigated. A high work capacity turbine design with a known high performance base was modified to accept a fixed vane stagger angle moveable sidewall nozzle. The nozzle area was varied by moving the forward and rearward sidewalls. Diffusing and accelerating rotor inlet ramps were evaluated in combinations with hub and shroud rotor exit rings. Performance of contoured sidewalls and the location of the sidewall split line with respect to the rotor inlet was compared to the baseline. Performance and rotor exit survey data are presented for 31 different geometries. Detail survey data at the nozzle exit are given in contour plot format for five configurations. A data base is provided for a variable geometry concept that is a viable alternative to the more common pivoted vane variable geometry radial turbine.

  16. Births and deaths including fetal deaths

    U.S. Department of Health & Human Services — Access to a variety of United States birth and death files including fetal deaths: Birth Files, 1968-2009; 1995-2005; Fetal death file, 1982-2005; Mortality files,...

  17. A Proposed Framework for Applying the National Standards of Quality Assurance in Higher Education in Sudan from the Teaching Staff’s Perspective - Faculties of Business Administration

    Alfatih Alamin Elfaki

    2017-08-01

    Full Text Available This study aimed to clarify the importance of having national standards and their role in achieving quality, as well as establishing a framework for the actual application of national standards in quality assurance so as to achieve quality in higher education institutions. The researchers followed a descriptive analytical method to achieve the objectives of the study and developed a questionnaire covering primary and secondary variables that have a role in the design of specific models to help the Sudanese universities apply the national standards. The questionnaire included one dependent variable, the effective application of national standards of quality assurance in higher education institutions, and four main independent variables: the national standards of quality assurance in higher education in Sudan, the standard of quality assurance, the standard of teaching and learning, and the standard of scientific research and publication. The study revealed a number of conclusions: there were statistically significant differences in the extent of familiarity with the national quality assurance standards in Sudan according to the academic rank of the faculty members; there were also significant differences in the extent of compliance with the national quality assurance standards in Sudan according to the academic rank of the faculty members; there was full agreement between the national standards for quality assurance in Sudan and the international standards for quality assurance; and there were statistically significant differences, according to the academic rank of the faculty members, in the view that the absence of specific models would have a negative impact on effective application of national standards of quality assurance in higher education in Sudan. Keywords: Quality, The program, Standards, University, Total quality management.

  18. The minimal non-minimal standard model

    Bij, J.J. van der

    2006-01-01

    In this Letter I discuss a class of extensions of the standard model that have a minimal number of possible parameters, but can in principle explain dark matter and inflation. It is pointed out that the so-called new minimal standard model contains a large number of parameters that can be put to zero, without affecting the renormalizability of the model. With the extra restrictions one might call it the minimal (new) non-minimal standard model (MNMSM). A few hidden discrete variables are present. It is argued that the inflaton should be higher-dimensional. Experimental consequences for the LHC and the ILC are discussed

  19. Including Indigenous Minorities in Decision-Making

    Pristed Nielsen, Helene

    Based on theories of public sphere participation and deliberative democracy, this book presents empirical results from a study of experiences with including Aboriginal and Maori groups in political decision-making in Western Australia and New Zealand, respectively.

  20. Gas storage materials, including hydrogen storage materials

    Mohtadi, Rana F; Wicks, George G; Heung, Leung K; Nakamura, Kenji

    2013-02-19

    A material for the storage and release of gases comprises a plurality of hollow elements, each hollow element comprising a porous wall enclosing an interior cavity, the interior cavity including structures of a solid-state storage material. In particular examples, the storage material is a hydrogen storage material such as a solid state hydride. An improved method for forming such materials includes the solution diffusion of a storage material solution through a porous wall of a hollow element into an interior cavity.

  1. Resiliencia y variables sociodemograficas

    Calero Martinez, Edgar David

    2015-01-01

    This paper addresses the definition of resilience, one of the variables within what is known as positive psychological capital, along with its main characteristics and some sociodemographic variables; the study aims to assess the degree of relationship (direct or indirect) between each of these variables and a person's resilience process, for subsequent discussion of its implications for business management and future directions.

  2. Impact of Subsurface Temperature Variability on Meteorological Variability: An AGCM Study

    Mahanama, S. P.; Koster, R. D.; Liu, P.

    2006-05-01

    Anomalous atmospheric conditions can lead to surface temperature anomalies, which in turn can lead to temperature anomalies deep in the soil. The deep soil temperature (and the associated ground heat content) has significant memory -- the dissipation of a temperature anomaly may take weeks to months -- and thus deep soil temperature may contribute to the low frequency variability of energy and water variables elsewhere in the system. The memory may even provide some skill to subseasonal and seasonal forecasts. This study uses two long-term AGCM experiments to isolate the contribution of deep soil temperature variability to variability elsewhere in the climate system. The first experiment consists of a standard ensemble of AMIP-type simulations, simulations in which the deep soil temperature variable is allowed to interact with the rest of the system. In the second experiment, the coupling of the deep soil temperature to the rest of the climate system is disabled -- at each grid cell, the local climatological seasonal cycle of deep soil temperature (as determined from the first experiment) is prescribed. By comparing the variability of various atmospheric quantities as generated in the two experiments, we isolate the contribution of interactive deep soil temperature to that variability. The results show that interactive deep soil temperature contributes significantly to surface temperature variability. Interactive deep soil temperature, however, reduces the variability of the hydrological cycle (evaporation and precipitation), largely because it allows for a negative feedback between evaporation and temperature.

  3. Diabetic emergencies including hypoglycemia during Ramadan

    Jamal Ahmad

    2012-01-01

    The majority of physicians are of the opinion that Ramadan fasting is acceptable for well-balanced type 2 patients who are conscious of their disease and compliant with their diet and drug intake. Fasting during Ramadan carries a risk of an assortment of complications for patients with diabetes. Islamic rules allow such patients not to fast. However, if patients with diabetes wish to fast, it is necessary to advise them to undertake regular monitoring of blood glucose levels several times a day, to reduce the risk of hypoglycemia during daytime fasting or hyperglycemia during the night. Patients with type 1 diabetes who fast during Ramadan may be better managed with fast-acting insulin. They should have basic knowledge of carbohydrate metabolism, the standard principles of diabetes care, and the pharmacology of various antidiabetic drugs. This Consensus Statement describes the management of the various diabetic emergencies that may occur during Ramadan.

  4. Diabetic emergencies including hypoglycemia during Ramadan

    Ahmad, Jamal; Pathan, Md Faruque; Jaleel, Mohammed Abdul; Fathima, Farah Naaz; Raza, Syed Abbas; Khan, A. K. Azad; Ishtiaq, Osama; Sheikh, Aisha

    2012-01-01

    The majority of physicians are of the opinion that Ramadan fasting is acceptable for well-balanced type 2 patients who are conscious of their disease and compliant with their diet and drug intake. Fasting during Ramadan carries a risk of an assortment of complications for patients with diabetes. Islamic rules allow such patients not to fast. However, if patients with diabetes wish to fast, it is necessary to advise them to undertake regular monitoring of blood glucose levels several times a day, to reduce the risk of hypoglycemia during daytime fasting or hyperglycemia during the night. Patients with type 1 diabetes who fast during Ramadan may be better managed with fast-acting insulin. They should have basic knowledge of carbohydrate metabolism, the standard principles of diabetes care, and the pharmacology of various antidiabetic drugs. This Consensus Statement describes the management of the various diabetic emergencies that may occur during Ramadan. PMID:22837906

  5. Ambulatory blood pressure monitoring-derived short-term blood pressure variability in primary hyperparathyroidism.

    Concistrè, A; Grillo, A; La Torre, G; Carretta, R; Fabris, B; Petramala, L; Marinelli, C; Rebellato, A; Fallo, F; Letizia, C

    2018-04-01

    Primary hyperparathyroidism is associated with a cluster of cardiovascular manifestations, including hypertension, leading to increased cardiovascular risk. The aim of our study was to investigate ambulatory blood pressure monitoring-derived short-term blood pressure variability in patients with primary hyperparathyroidism, in comparison with patients with essential hypertension and normotensive controls. Twenty-five patients with primary hyperparathyroidism (7 normotensive, 18 hypertensive) underwent ambulatory blood pressure monitoring at diagnosis, and fifteen of them were re-evaluated after parathyroidectomy. Short-term blood pressure variability was derived from ambulatory blood pressure monitoring and calculated as follows: 1) Standard Deviation of 24-h, day-time and night-time BP; 2) the average of day-time and night-time Standard Deviation, weighted for the duration of the day and night periods (24-h "weighted" Standard Deviation of BP); 3) average real variability, i.e., the average of the absolute differences between all consecutive BP measurements. Baseline data of normotensive and essential hypertension patients were matched for age, sex, BMI and 24-h ambulatory blood pressure monitoring values with normotensive and hypertensive primary hyperparathyroidism patients, respectively. Normotensive primary hyperparathyroidism patients showed a significantly higher 24-h weighted Standard Deviation of blood pressure than the 12 normotensive controls. The 24-h average real variability of systolic BP, as well as serum calcium and parathyroid hormone levels, was significantly reduced in operated patients. Short-term blood pressure variability is increased in normotensive patients with primary hyperparathyroidism, is reduced by parathyroidectomy, and may potentially represent an additional cardiovascular risk factor in this disease.
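    The three variability metrics defined above can be sketched numerically. This is a minimal illustration, not the study's analysis code: the function names, the 16 h/8 h day-night weighting, and the sample readings are assumptions for the example.

```python
import statistics

def average_real_variability(bp):
    # ARV: mean of the absolute differences between consecutive readings.
    return sum(abs(b - a) for a, b in zip(bp, bp[1:])) / (len(bp) - 1)

def weighted_sd_24h(day_bp, night_bp, day_hours=16, night_hours=8):
    # 24-h "weighted" SD: day and night SDs weighted by period duration.
    sd_day = statistics.pstdev(day_bp)
    sd_night = statistics.pstdev(night_bp)
    return (sd_day * day_hours + sd_night * night_hours) / (day_hours + night_hours)

# Toy systolic readings (mmHg):
day = [120, 130, 125, 135]
night = [110, 114, 112, 116]
print(average_real_variability(day + night))
print(weighted_sd_24h(day, night))
```

    Note that ARV is sensitive to the ordering of readings, whereas the plain Standard Deviation is not; this is why the two metrics can disagree about which patient group is more variable.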

  6. Avatar Embodiment. Towards a Standardized Questionnaire

    Mar Gonzalez-Franco

    2018-06-01

    Inside virtual reality, users can embody avatars that are collocated from a first-person perspective. When doing so, participants have the feeling that their own body has been substituted by the self-avatar, and that the new body is the source of the sensations. Embodiment is complex, as it includes not only body ownership over the avatar, but also agency, co-location, and external appearance. Despite the multiple variables that influence it, the illusion is quite robust, and it can be produced even if the self-avatar is of a different age, size, gender, or race from the participant's own body. Embodiment illusions are therefore the basis for many social VR experiences and a current active research area in the community. Researchers are interested both in the body manipulations that can be accepted, as well as in studying how different self-avatars produce different attitudinal, social, perceptual, and behavioral effects. However, findings suggest that despite embodiment being strongly associated with performance and reactions inside virtual reality, the extent to which the illusion is experienced varies between participants. In this paper, we review the questionnaires used in past experiments and propose a standardized embodiment questionnaire based on 25 questions that are prevalent in the literature. We encourage future virtual reality experiments that include first-person virtual avatars to administer this questionnaire in order to evaluate the degree of embodiment.

  7. Standards in neurosonology. Part I

    Joanna Wojczal

    2015-09-01

    The paper presents standards related to ultrasound imaging of the cerebral vasculature and structures. The aim of this paper is to standardize both the performance and description of ultrasound imaging of the extracranial and intracranial cerebral arteries as well as the examination of a specific brain structure, i.e. substantia nigra hyperechogenicity. The following aspects are included in the description of standards for each ultrasonographic method: equipment requirements, patient preparation, study technique and documentation as well as the required elements of the ultrasound description. Practical criteria for the diagnosis of certain pathologies in accordance with the latest literature are also presented. Furthermore, additional comments are included in some of the sections. Part I discusses standards for the performance, documentation and description of different ultrasound methods (Duplex, Doppler). Parts II and III are devoted to standards for specific clinical situations (vasospasm, monitoring after the acute stage of stroke, detection of right-to-left shunts, confirmation of the arrest of the cerebral circulation, assessment of the functional efficiency of the circle of Willis, assessment of the cerebrovascular vasomotor reserve, as well as the measurement of substantia nigra hyperechogenicity).

  8. Standards in neurosonology. Part III

    Joanna Wojczal

    2016-06-01

    The paper presents standards related to ultrasound imaging of the cerebral vasculature and structures. The aim of this paper is to standardize both the performance and description of ultrasound imaging of the extracranial and intracranial cerebral arteries as well as the examination of a specific brain structure, i.e. substantia nigra hyperechogenicity. The following aspects are included in the description of standards for each ultrasonographic method: equipment requirements, patient preparation, study technique and documentation as well as the required elements of the ultrasound description. Practical criteria for the diagnosis of certain pathologies in accordance with the latest literature are also presented. Furthermore, additional comments are included in some of the sections. Part I discusses standards for the performance, documentation and description of different ultrasound methods (Duplex, Doppler). Parts II and III are devoted to standards for specific clinical situations (vasospasm, monitoring after the acute stage of stroke, detection of right-to-left shunts, confirmation of the arrest of the cerebral circulation, assessment of the functional efficiency of the circle of Willis, assessment of the cerebrovascular vasomotor reserve, as well as the measurement of substantia nigra hyperechogenicity).

  9. Soil variability in engineering applications

    Vessia, Giovanna

    2014-05-01

    Finite Element Method (RFEM). This method has been used to investigate the random behavior of soils in the context of a variety of classical geotechnical problems. Subsequent studies collected worldwide variability values for many technical parameters of soils (Phoon and Kulhawy 1999a) and their spatial correlation functions (Phoon and Kulhawy 1999b). In Italy, Cherubini et al. (2007) calculated the spatial variability structure of sandy and clayey soils from standard cone penetration test readings. The large worldwide spread of the measured spatial variability of soils and rocks, together with the uncertainties introduced by testing devices and engineering models, heavily affects the reliability of geotechnical design. Several methods have been proposed to deal with these sources of uncertainty in engineering design models (e.g. First Order Reliability Method, Second Order Reliability Method, Response Surface Method, High Dimensional Model Representation, etc.). Nowadays, efforts in this field focus on (1) measuring the spatial variability of different rocks and soils and (2) developing numerical models that take spatial variability into account as an additional physical variable. References Cherubini C., Vessia G. and Pula W. 2007. Statistical soil characterization of Italian sites for reliability analyses. Proc. 2nd Int. Workshop on Characterization and Engineering Properties of Natural Soils, 3-4: 2681-2706. Griffiths D.V. and Fenton G.A. 1993. Seepage beneath water retaining structures founded on spatially random soil, Géotechnique, 43(6): 577-587. Mandelbrot B.B. 1983. The Fractal Geometry of Nature. San Francisco: W H Freeman. Matheron G. 1962. Traité de Géostatistique appliquée. Tome 1, Editions Technip, Paris, 334 p. Phoon K.K. and Kulhawy F.H. 1999a. Characterization of geotechnical variability. Can Geotech J, 36(4): 612-624. Phoon K.K. and Kulhawy F.H. 1999b. Evaluation of geotechnical property
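    The spatial correlation functions mentioned above are commonly modeled with simple parametric forms. A minimal sketch of one widely used choice, the single-exponential (Markovian) autocorrelation model, is given below; the function name and the numeric parameters are assumptions for the example, not values from the cited studies.

```python
import math

def markov_correlation(tau, delta):
    # Single-exponential (Markovian) autocorrelation model for a soil
    # property random field: rho(tau) = exp(-2*|tau|/delta), where tau is
    # the separation distance and delta is the scale of fluctuation.
    return math.exp(-2.0 * abs(tau) / delta)

# Correlation decays with separation; at tau = delta/2 it equals exp(-1).
print(markov_correlation(0.0, 2.0))  # 1.0
print(markov_correlation(1.0, 2.0))  # exp(-1)
```

    In a random-field analysis such as RFEM, a model of this kind determines how strongly soil properties at two points are correlated as a function of their separation.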

  10. Calibration of Flick standards

    Thalmann, Ruedi; Spiller, Jürg; Küng, Alain; Jusko, Otto

    2012-01-01

    Flick standards or magnification standards are widely used for an efficient and functional calibration of the sensitivity of form measuring instruments. The results of a recent measurement comparison were partially unsatisfactory and revealed problems related to the calibration of these standards. In this paper the influence factors for the calibration of Flick standards using roundness measurement instruments are discussed in detail, in particular the bandwidth of the measurement chain, residual form errors of the device under test, profile distortions due to the diameter of the probing element, and questions related to the definition of the measurand. The different contributions are estimated using simulations and are experimentally verified. Alternative methods to calibrate Flick standards are also investigated. Finally, the practical limitations of Flick standard calibration are shown and the usability of Flick standards both to calibrate the sensitivity of roundness instruments and to check the filter function of such instruments is analysed. (paper)

  11. Dosimetry standards for radiation processing

    Farrar, H. IV

    1999-01-01

    For irradiation treatments to be reproducible in the laboratory and then in the commercial environment, and for products to have certified absorbed doses, standardized dosimetry techniques are needed. This need is being satisfied by standards being developed by experts from around the world under the auspices of Subcommittee E10.01 of the American Society for Testing and Materials (ASTM). In the time period since it was formed in 1984, the subcommittee has grown to 150 members from 43 countries, representing a broad cross-section of industry, government and university interests. With cooperation from other international organizations, it has taken the combined part-time effort of all these people more than 13 years to complete 24 dosimetry standards. Four are specifically for food irradiation or agricultural applications, but the majority apply to all forms of gamma, x-ray, Bremsstrahlung and electron beam radiation processing, including dosimetry for sterilization of health care products and the radiation processing of fruits, vegetables, meats, spices, processed foods, plastics, inks, medical wastes and paper. An additional 6 standards are under development. Most of the standards provide exact procedures for using individual dosimetry systems or for characterizing various types of irradiation facilities, but one covers the selection and calibration of dosimetry systems, and another covers the treatment of uncertainties. Together, this set of standards covers essentially all aspects of dosimetry for radiation processing. The first 20 of these standards have been adopted in their present form by the International Organization for Standardization (ISO), and will be published by ISO in 1999. (author)

  12. Importance of international standards on hydrogen technologies

    Bose, T.K.; Gingras, S.

    2001-01-01

    This presentation provided some basic information regarding standards and the International Organization for Standardization (ISO). It also explained the importance of standardization activities, particularly ISO/TC 197 which applies to hydrogen technologies. Standards are established by consensus. They define the minimum requirements that will ensure that products and services are reliable and effective. Standards contribute to the elimination of technical barriers to trade (TBT). The harmonization of standards around the world is desirable in a free trade environment. The influence of the TBT on international standardization was discussed with particular reference to the objectives of ISO/TC 197 hydrogen technologies. One of the priorities for ISO/TC 197 is a hydrogen fuel infrastructure which includes refuelling stations, fuelling connectors, and storage technologies for gaseous and liquid hydrogen. Other priorities include an agreement between the International Electrotechnical Commission (IEC) and the ISO, in particular the IEC/TC 105 and ISO/TC 197 for the development of fuel cell standards. The international standards that have been published thus far include ISO 13984:1999 for liquid hydrogen, land vehicle fuelling system interface, and ISO 14687:1999 for hydrogen fuel product specification. Standards are currently under development for: liquid hydrogen; airport hydrogen fuelling facilities; gaseous hydrogen blends; basic considerations for the safety of hydrogen systems; gaseous hydrogen and hydrogen blends; and gaseous hydrogen for land vehicle filling connectors. It was concluded that the widespread use of hydrogen is dependent on international standardization

  13. Productivity standards for histology laboratories.

    Buesa, René J

    2010-04-01

    The information from 221 US histology laboratories (histolabs) and 104 from 24 other countries with workloads from 600 to 116 000 cases per year was used to calculate productivity standards for 23 technical and 27 nontechnical tasks and for 4 types of work flow indicators. The sample includes 254 human, 40 forensic, and 31 veterinary pathology services. Statistical analyses demonstrate that most productivity standards are not different between services or worldwide. The total workload for the US human pathology histolabs averaged 26 061 cases per year, with 54% between 10 000 and less than 30 000. The total workload for 70% of the histolabs from other countries was less than 20 000, with an average of 15 226 cases per year. The fundamental manual technical tasks in the histolab and their productivity standards are as follows: grossing (14 cases per hour), cassetting (54 cassettes per hour), embedding (50 blocks per hour), and cutting (24 blocks per hour). All the other tasks, each with their own productivity standards, can be completed by auxiliary staff or using automatic instruments. Depending on the level of automation of the histolab, all the tasks derived from a workload of 25 cases will require 15.8 to 17.7 hours of work completed by 2.4 to 2.7 employees with 18% of their working time not directly dedicated to the production of diagnostic slides. This article explains how to extrapolate this productivity calculation for any workload and different levels of automation. The overall performance standard for all the tasks, including 8 hours for automated tissue processing, is 3.2 to 3.5 blocks per hour; and its best indicator is the value of the gross work flow productivity that is essentially dependent on how the work is organized. This article also includes productivity standards for forensic and veterinary histolabs, but the staffing benchmarks for histolabs will be the subject of a separate article.
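    The extrapolation idea can be sketched for just the four fundamental manual tasks quoted above (grossing 14 cases/h, cassetting 54 cassettes/h, embedding 50 blocks/h, cutting 24 blocks/h). This is a hedged illustration, not the article's actual calculation: the cassettes-per-case and blocks-per-case ratios, the single 18% non-production fraction, and the 8-hour shift are assumptions, and the article's full model covers many more of the 50 tasks, so its totals are larger.

```python
def manual_task_hours(cases, cassettes_per_case=3.0, blocks_per_case=3.0,
                      nonproduction_fraction=0.18):
    # Hours for the four manual tasks, using the article's hourly standards.
    hours = cases / 14.0                        # grossing: 14 cases/h
    hours += cases * cassettes_per_case / 54.0  # cassetting: 54 cassettes/h
    hours += cases * blocks_per_case / 50.0     # embedding: 50 blocks/h
    hours += cases * blocks_per_case / 24.0     # cutting: 24 blocks/h
    # Inflate for time not directly spent producing diagnostic slides.
    return hours / (1.0 - nonproduction_fraction)

def staff_needed(cases, shift_hours=8.0):
    # Full-time-equivalent staff for one day's workload.
    return manual_task_hours(cases) / shift_hours

print(round(manual_task_hours(25), 1))  # hours for the four manual tasks only
print(round(staff_needed(25), 2))
```

    The same structure, one term per task at its productivity standard, divided by the productive fraction of the day, extends to any workload or task list.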

  14. INTER-EXAMINER VARIABILITY

    Objective: To establish whether inter-examiner variability is still a significant factor for the undergraduate orthopaedic clinical ... D. The scores for each student were tabulated and the range, mean, and pass rate determined for each of the examiners. ... has not the heart to reject the man”, consistently gave higher scores (1).

  15. Variability in GPS sources

    Jauncey, DL; King, EA; Bignall, HE; Lovell, JEJ; Kedziora-Chudczer, L; Tzioumis, AK; Tingay, SJ; Macquart, JP; McCulloch, PM

    2003-01-01

    Flux density monitoring data at 2.3 and 8.4 GHz is presented for a sample of 33 southern hemisphere GPS sources, drawn from the 2.7 GHz Parkes survey. This monitoring data, together with VLBI monitoring data, shows that a small fraction of these sources, ~10%, vary. Their variability falls

  16. All Those Independent Variables.

    Meacham, Merle L.

    This paper presents a case study of a sixth grade remedial math class which illustrates the thesis that only the "experimental attitude," not the "experimental method," is appropriate in the classroom. The thesis is based on the fact that too many independent variables exist in a classroom situation to allow precise measurement. The case study…

  17. Variable speed generators

    Boldea, Ion

    2005-01-01

    With the deregulation of electrical energy production and distribution, says Boldea (Polytechnical Institute, Timisoara, Romania), producers are looking for ways to tailor their electricity for different markets. Variable-speed electric generators are serving that purpose, up to the 400 megavolt ampere unit size, in Japan since 1996 and Germany since

  18. Instrumented Impact Testing: Influence of Machine Variables and Specimen Position

    Lucon, E.; McCowan, C. N.; Santoyo, R. A.

    2008-09-15

    An investigation has been conducted on the influence of impact machine variables and specimen positioning on characteristic forces and absorbed energies from instrumented Charpy tests. Brittle and ductile fracture behavior has been investigated by testing NIST reference samples of low, high and super-high energy levels. Test machine variables included tightness of foundation, anvil and striker bolts, and the position of the center of percussion with respect to the center of strike. For specimen positioning, we tested samples which had been moved away or sideways with respect to the anvils. In order to assess the influence of the various factors, we compared mean values in the reference (unaltered) and altered conditions; for machine variables, t-test analyses were also performed in order to evaluate the statistical significance of the observed differences. Our results indicate that the only circumstance which resulted in variations larger than 5 percent for both brittle and ductile specimens is when the sample is not in contact with the anvils. These findings should be taken into account in future revisions of instrumented Charpy test standards.

  20. 14 CFR 35.21 - Variable and reversible pitch propellers.

    2010-01-01

    14 CFR, Aeronautics and Space; FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION; AIRCRAFT AIRWORTHINESS STANDARDS: PROPELLERS; Design and Construction; § 35.21 Variable and reversible pitch propellers.

  1. [Standard algorithm of molecular typing of Yersinia pestis strains].

    Eroshenko, G A; Odinokov, G N; Kukleva, L M; Pavlova, A I; Krasnov, Ia M; Shavina, N Iu; Guseva, N P; Vinogradova, N A; Kutyrev, V V

    2012-01-01

    Objective: to develop a standard algorithm for the molecular typing of Yersinia pestis that establishes the subspecies, biovar and focus membership of a studied isolate, and to determine the characteristic strain genotypes of the plague infectious agent of the main and non-main subspecies from various natural plague foci of the Russian Federation and the near abroad. Genotyping of 192 natural Y. pestis strains of main and non-main subspecies was performed by PCR methods, multilocus sequencing and multilocus variable number tandem repeat analysis. A standard algorithm of molecular typing of the plague infectious agent was developed, comprising several stages of Yersinia pestis differentiation by membership: in main and non-main subspecies, in various biovars of the main subspecies, in specific subspecies, and in natural foci and geographic territories. The algorithm is based on 3 typing methods--PCR, multilocus sequence typing and multilocus variable number tandem repeat analysis--using standard DNA targets: life support genes (terC, ilvN, inv, glpD, napA, rhaS and araC) and 7 loci of variable tandem repeats (ms01, ms04, ms06, ms07, ms46, ms62, ms70). The effectiveness of the developed algorithm is shown on a large number of natural Y. pestis strains. Characteristic sequence types of Y. pestis strains of various subspecies and biovars, as well as MLVA7 genotypes of strains from natural plague foci of the Russian Federation and the near abroad, were established. The application of the developed algorithm will increase the effectiveness of epidemiological monitoring of the plague infectious agent and of the analysis of plague epidemics and outbreaks, establishing the source of origin of a strain and the routes of introduction of the infection.
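    The MLVA7 stage of the algorithm can be illustrated schematically. Only the seven locus names come from the abstract; the repeat counts, the profile representation, and the distance function below are invented for the example.

```python
# Seven VNTR loci named in the abstract, in a fixed canonical order.
MLVA7_LOCI = ("ms01", "ms04", "ms06", "ms07", "ms46", "ms62", "ms70")

def mlva7_profile(repeat_counts):
    # Order a {locus: repeat count} mapping into a canonical profile tuple,
    # so strains can be compared locus by locus.
    return tuple(repeat_counts[locus] for locus in MLVA7_LOCI)

def profile_distance(a, b):
    # Number of loci at which two MLVA7 profiles differ.
    return sum(x != y for x, y in zip(a, b))

# Two hypothetical strains differing at a single locus (ms46):
strain_a = mlva7_profile({"ms01": 3, "ms04": 5, "ms06": 2, "ms07": 4,
                          "ms46": 7, "ms62": 3, "ms70": 6})
strain_b = mlva7_profile({"ms01": 3, "ms04": 5, "ms06": 2, "ms07": 4,
                          "ms46": 8, "ms62": 3, "ms70": 6})
print(profile_distance(strain_a, strain_b))
```

    A fixed locus order is what makes MLVA genotypes comparable across laboratories; the actual epidemiological interpretation of a given distance is, of course, defined by the study, not by this sketch.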

  2. Electric Power Monthly, August 1990. [Glossary included

    1990-11-29

    The Electric Power Monthly (EPM) presents monthly summaries of electric utility statistics at the national, Census division, and State level. The purpose of this publication is to provide energy decisionmakers with accurate and timely information that may be used in forming various perspectives on electric issues that lie ahead. Data includes generation by energy source (coal, oil, gas, hydroelectric, and nuclear); generation by region; consumption of fossil fuels for power generation; sales of electric power, cost data; and unusual occurrences. A glossary is included.

  3. Electrochemical cell structure including an ionomeric barrier

    Lambert, Timothy N.; Hibbs, Michael

    2017-06-20

    An apparatus includes an electrochemical half-cell comprising: an electrolyte, an anode; and an ionomeric barrier positioned between the electrolyte and the anode. The anode may comprise a multi-electron vanadium phosphorous alloy, such as VP.sub.x, wherein x is 1-5. The electrochemical half-cell is configured to oxidize the vanadium and phosphorous alloy to release electrons. A method of mitigating corrosion in an electrochemical cell includes disposing an ionomeric barrier in a path of electrolyte or ion flow to an anode and mitigating anion accumulation on the surface of the anode.

  4. Isolators Including Main Spring Linear Guide Systems

    Goold, Ryan (Inventor); Buchele, Paul (Inventor); Hindle, Timothy (Inventor); Ruebsamen, Dale Thomas (Inventor)

    2017-01-01

    Embodiments of isolators, such as three parameter isolators, including a main spring linear guide system are provided. In one embodiment, the isolator includes first and second opposing end portions, a main spring mechanically coupled between the first and second end portions, and a linear guide system extending from the first end portion, across the main spring, and toward the second end portion. The linear guide system expands and contracts in conjunction with deflection of the main spring along the working axis, while restricting displacement and rotation of the main spring along first and second axes orthogonal to the working axis.

  5. 76 FR 70037 - Federal Regulations; OMB Circulars, OFPP Policy Letters, and CASB Cost Accounting Standards...

    2011-11-10

    ... Circulars, OFPP Policy Letters, and CASB Cost Accounting Standards Included in the Semiannual Agenda of..., and Cost Accounting Standards Board (CASB) Cost Accounting Standards. DATES: The withdrawal is...

  6. Tides and Decadal Variability

    Ray, Richard D.

    2003-01-01

    This paper reviews the mechanisms by which oceanic tides and decadal variability in the oceans are connected. We distinguish between variability caused by tides and variability observed in the tides themselves. Both effects have been detected at some level. The most obvious connection with decadal timescales is through the 18.6-year precession of the moon's orbit plane. This precession gives rise to a small tide of the same period and to 18.6-year modulations in the phase and amplitudes of short-period tides. The 18.6-year "node tide" is very small, no more than 2 cm anywhere, and in sea level data it is dominated by the ocean's natural variability. Some authors have naively attributed climate variations with periods near 19 years directly to the node tide, but the amplitude of the tide is too small for this mechanism to be operative. The more likely explanation (Loder and Garrett, JGR, 83, 1967-70, 1978) is that the 18.6-year modulations in short-period tides, especially the principal tide M2, cause variations in ocean mixing, which is then observed in temperature and other climatic indicators. Tidally forced variability has also been proposed by some authors, either in response to occasional (and highly predictable) tidal extremes or as a nonlinear low-frequency oscillation caused by interactions between short-period tides. The former mechanism can produce only short-duration events hardly more significant than normal tidal ranges, but the latter mechanism can in principle induce low-frequency oscillations. The most recent proposal of this type is by Keeling and Whorf, who highlight the 1800-year spectral peak discovered by Bond et al. (1997). But the proposal appears contrived and should be considered, in the words of Munk et al. (2002), "as the most likely among unlikely candidates."

  7. Dynamics of Variable Mass Systems

    Eke, Fidelis O.

    1998-01-01

    This report presents the results of an investigation of the effects of mass loss on the attitude behavior of spinning bodies in flight. The principal goal is to determine whether there are circumstances under which the motion of variable mass systems can become unstable in the sense that their transverse angular velocities become unbounded. Obviously, results from a study of this kind would find immediate application in the aerospace field. The first part of this study features a complete and mathematically rigorous derivation of a set of equations that govern both the translational and rotational motions of general variable mass systems. The remainder of the study is then devoted to the application of the equations obtained to a systematic investigation of the effect of various mass loss scenarios on the dynamics of increasingly complex models of variable mass systems. It is found that mass loss can have a major impact on the dynamics of mechanical systems, including a possible change in the system's stability picture. Factors such as nozzle geometry, combustion chamber geometry, propellant's initial shape, size and relative mass, and propellant location can all have important influences on the system's dynamic behavior. The relative importance of these parameters for system motion is quantified in a way that is useful for design purposes.

  8. Upper abdominal malignancies (not including esophagus)

    Rich, Tyvin A.

    1996-01-01

    Objective: This course will give an overview of the role of radiation therapy in the treatment of gastrointestinal malignancies in the upper abdomen, with an emphasis on carcinomas of the stomach, pancreas and biliary tract. For each site, information will be presented on failure patterns with conventional surgical treatment and the indications for surgery for different stages of disease. The possible uses of radiation therapy as an adjuvant to surgical resection will be discussed, as well as the use of radiation therapy alone. In addition, the combination of radiation therapy with chemotherapy will be discussed for each of these sites, together with the information available at present as to the optimal way to combine chemotherapy with radiation therapy. Radiation therapy is not generally accepted to have a role in the treatment of patients with adenocarcinomas of the stomach. This is related to the fact that gastric cancer has traditionally been treated with surgical resection alone, and delivery of high dose radiation therapy to the upper abdomen can be difficult because of the sensitivity of nearby normal tissues. Nonetheless, data on failure patterns suggest that local recurrence is common in patients with disease through the gastric wall and with positive nodes. Although there are some suggestive data to indicate that radiation therapy is effective as an adjuvant, results of an ongoing trial will be necessary to determine the exact role of radiation therapy. Possible uses of radiation therapy as preoperative therapy or given alone will also be briefly discussed. Radiation therapy has often been used in the treatment of pancreatic adenocarcinomas, either alone or combined with surgical resection. Its use is more common for this site both because of the extremely poor prognosis with standard therapies and because of the difficulty in performing an adequate surgical resection. Data will be reviewed suggesting that radiation therapy has a role when

  9. THE CHANDRA VARIABLE GUIDE STAR CATALOG

    Nichols, Joy S.; Lauer, Jennifer L.; Morgan, Douglas L.; Sundheim, Beth A.; Henden, Arne A.; Huenemoerder, David P.; Martin, Eric

    2010-01-01

    Variable stars have been identified among the optical-wavelength light curves of guide stars used for pointing control of the Chandra X-ray Observatory. We present a catalog of these variable stars along with their light curves and ancillary data. Variability was detected to a lower limit of 0.02 mag amplitude in the 4000-10000 A range using the photometrically stable Aspect Camera on board the Chandra spacecraft. The Chandra Variable Guide Star Catalog (VGUIDE) contains 827 stars, of which 586 are classified as definitely variable and 241 are identified as possibly variable. Of the 586 definite variable stars, we believe 319 are new variable star identifications. Types of variables in the catalog include eclipsing binaries, pulsating stars, and rotating stars. The variability was detected during the course of normal verification of each Chandra pointing and results from analysis of over 75,000 guide star light curves from the Chandra mission. The VGUIDE catalog represents data from only about 9 years of the Chandra mission. Future releases of VGUIDE will include newly identified variable guide stars as the mission proceeds. An important advantage of the use of space data to identify and analyze variable stars is the relatively long observations that are available. The Chandra orbit allows for observations up to 2 days in length. Also, guide stars were often used multiple times for Chandra observations, so many of the stars in the VGUIDE catalog have multiple light curves available from various times in the mission. The catalog is presented as both online data associated with this paper and as a public Web interface. Light curves with data at the instrumental time resolution of about 2 s, overplotted with the data binned at 1 ks, can be viewed on the public Web interface and downloaded for further analysis. VGUIDE is a unique project using data collected during the mission that would otherwise be ignored. The stars available for use as Chandra guide stars are
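The 1 ks binning of the ~2 s cadence light curves mentioned above amounts to a simple fixed-width time-bin average. This is an illustrative rebinning routine, not the actual VGUIDE pipeline; the data and variable names are made up.

```python
# Hedged sketch: rebin a ~2 s cadence light curve into 1 ks bins,
# as done for display in the VGUIDE Web interface (illustrative only).
def rebin(times_s, mags, bin_s=1000.0):
    """Average magnitudes into fixed-width time bins; returns (bin_center, mean_mag)."""
    bins = {}
    for t, m in zip(times_s, mags):
        bins.setdefault(int(t // bin_s), []).append(m)
    return [(k * bin_s + bin_s / 2, sum(v) / len(v)) for k, v in sorted(bins.items())]

times = [i * 2.0 for i in range(2000)]            # 4 ks of data at 2 s cadence
mags = [10.0 + 0.01 * (i % 2) for i in range(2000)]  # toy light curve
binned = rebin(times, mags)
print(len(binned), round(binned[0][1], 3))  # 4 bins, each averaging 500 points
```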

  10. Treatise on water hammer in hydropower standards and guidelines

    Bergant, A; Mazij, J; Karney, B; Pejović, S

    2014-01-01

    This paper reviews critical water hammer parameters as they are presented in official hydropower standards and guidelines. Particular emphasis is given to a number of IEC standards and guidelines that are used worldwide. The paper critically assesses water hammer control strategies including operational scenarios (closing and opening laws), surge control devices (surge tank, pressure regulating valve, flywheel, etc.), redesign of the water conveyance system components (tunnel, penstock), or limitation of operating conditions (limited operating range) that are variably covered in standards and guidelines. Little information is given elsewhere on industrial water hammer models and solutions. These are briefly introduced and discussed in the light of capability (simple versus complex systems), availability of expertise (in house and/or commercial) and uncertainty. The paper concludes with an interesting water hammer case study referencing the rules and recommendations from existing hydropower standards and guidelines with a view to effective water hammer control. Recommendations are given for further work on development of a special guideline on water hammer (hydraulic transients) in hydropower plants.
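The classical first estimate behind many of the parameters such standards reference is the Joukowsky relation, dp = rho * a * dv, the upper-bound pressure rise for a rapid velocity change. A minimal sketch with illustrative penstock values (the numbers are typical textbook figures, not taken from any specific standard):

```python
# Hedged sketch: the Joukowsky equation, the classical upper bound on the
# water hammer pressure surge for an instantaneous closure. Values below
# are illustrative, not from any particular standard or plant.

def joukowsky_pressure_rise(rho, wave_speed, delta_v):
    """Pressure surge in Pa for a sudden velocity change delta_v (m/s)."""
    return rho * wave_speed * delta_v

rho = 1000.0   # water density, kg/m^3
a = 1200.0     # pressure wave speed in a steel penstock, m/s (typical order)
dv = 3.0       # flow velocity brought to rest, m/s

dp = joukowsky_pressure_rise(rho, a, dv)
print(f"Joukowsky surge: {dp / 1e5:.1f} bar")  # 36.0 bar
```

Closing laws, surge tanks and the other control measures listed above all exist to keep the actual transient well below this instantaneous-closure bound.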

  11. Treatise on water hammer in hydropower standards and guidelines

    Bergant, A.; Karney, B.; Pejović, S.; Mazij, J.

    2014-03-01

    This paper reviews critical water hammer parameters as they are presented in official hydropower standards and guidelines. Particular emphasis is given to a number of IEC standards and guidelines that are used worldwide. The paper critically assesses water hammer control strategies including operational scenarios (closing and opening laws), surge control devices (surge tank, pressure regulating valve, flywheel, etc.), redesign of the water conveyance system components (tunnel, penstock), or limitation of operating conditions (limited operating range) that are variably covered in standards and guidelines. Little information is given elsewhere on industrial water hammer models and solutions. These are briefly introduced and discussed in the light of capability (simple versus complex systems), availability of expertise (in house and/or commercial) and uncertainty. The paper concludes with an interesting water hammer case study referencing the rules and recommendations from existing hydropower standards and guidelines with a view to effective water hammer control. Recommendations are given for further work on development of a special guideline on water hammer (hydraulic transients) in hydropower plants.

  12. Requirements of quality standards

    Mueller, J.

    1977-01-01

    The lecture traces the development of nuclear standards, codes, and Federal regulations on quality assurance (QA) for nuclear power plants and associated facilities. The technical evolution of the last twelve years, especially in the area of nuclear technology, led to different activities and regulatory initiatives, with the present result that several nations have their own homemade standards. The lecture discusses former and especially current activities in standard development, and gives a description of the requirements of QA standards used in the USA and Europe, especially Western Germany. Furthermore, the lecture attempts to give a comparison and an evaluation of the international quality standards from the author's viewpoint. Finally, the lecture presents an outlook on the future international implications of QA standards. There is an urgent need within the nuclear industry for simplification and standardization of QA standards. The relationships between the various standards, and the applicability of the standards, need clarification and better transparency. To point out these problems is the purpose of the lecture. (orig.) [de

  13. 28 CFR 20.32 - Includable offenses.

    2010-07-01

    ... Exchange of Criminal History Record Information § 20.32 Includable offenses. (a) Criminal history record... vehicular manslaughter, driving under the influence of drugs or liquor, and hit and run), when unaccompanied by a § 20.32(a) offense. These exclusions may not be applicable to criminal history records...

  14. Including Students with Visual Impairments: Softball

    Brian, Ali; Haegele, Justin A.

    2014-01-01

    Research has shown that while students with visual impairments are likely to be included in general physical education programs, they may not be as active as their typically developing peers. This article provides ideas for equipment modifications and game-like progressions for one popular physical education unit, softball. The purpose of these…

  15. Extending flood damage assessment methodology to include ...

    Optimal and sustainable flood plain management, including flood control, can only be achieved when the impacts of flood control measures are considered for both the man-made and natural environments, and the sociological aspects are fully considered. Until now, methods/models developed to determine the influences ...

  16. BIOLOGIC AND ECONOMIC EFFECTS OF INCLUDING DIFFERENT ...

    The biologic and economic effects of including three agro-industrial by-products as ingredients in turkey poult diets were investigated using 48 turkey poults in a completely randomised design experiment. Diets were formulated to contain the three by-products – wheat offal, rice husk and palm kernel meal, each at 20% level ...

  17. Including Children Dependent on Ventilators in School.

    Levine, Jack M.

    1996-01-01

    Guidelines for including ventilator-dependent children in school are offered, based on experience with six such students at a New York State school. Guidelines stress adherence to the medical management plan, the school-family partnership, roles of the social worker and psychologist, orientation, transportation, classroom issues, and steps toward…

  18. 40 CFR 60.2220 - What must I include in the deviation report?

    2010-07-01

    ... 40 Protection of Environment 6 2010-07-01 2010-07-01 false What must I include in the deviation... PROGRAMS (CONTINUED) STANDARDS OF PERFORMANCE FOR NEW STATIONARY SOURCES Standards of Performance for... Recordkeeping and Reporting § 60.2220 What must I include in the deviation report? In each report required under...

  19. Management plan for the Nuclear Standards Program

    1979-11-01

    This Management Plan was prepared to describe the manner in which Oak Ridge National Laboratory will provide technical management of the Nuclear Standards Program. The organizational structure that has been established within ORNL for this function is the Nuclear Standards Management Center, which includes the Nuclear Standards Office (NSO) already in existence at ORNL. This plan is intended to support the policies and practices for the development and application of technical standards in ETN projects, programs, and technology developments as set forth in a standards policy memorandum from the DOE Program Director for Nuclear Energy

  20. Survey of standards for electronic image displays

    Rowe, William A.

    1996-02-01

    Electronic visual displays have been evolving from a 1960s basis of cathode ray tube (CRT) technology. Now, many other technologies are also available, including both flat panels and projection displays. Standards for these displays are being developed at both the national and international levels. Standards activity within the United States is in its infancy and is fragmented according to the inclination of each of the standards developing organizations. The latest round of flat panel display technology was primarily developed in Japan. Initially, standards arose from component vendor-to-OEM customer relationships. As a result, Japanese standards for components are the best developed. The Electronics Industries Association of Japan (EIAJ) is providing their standards to the International Electrotechnical Commission (IEC) for adoption. On the international level, professional societies such as the Human Factors Society (HFS) and the International Organization for Standardization (ISO) have completed major standards: HFS developed the first ergonomic standard, HFS-100, and the ISO has developed some sections of a broader ergonomic standard, ISO 9241. This paper addresses the organization of standards activity. Active organizations and their areas of focus are identified. The major standards that have been completed or are in development are described. Finally, suggestions for improving this standards activity are proposed.

  1. Implementation of the Brazilian primary standard for x-rays

    Peixoto, J.G.P.; Almeida, C.E.V. de

    2002-01-01

    In the field of ionizing radiation metrology, a primary standard of a given physical quantity is essentially an experimental set-up which allows one to attribute a numerical value to a particular sample of that quantity in terms of a unit given by an abstract definition. The absolute measurement of the radiation quantity air kerma is performed with a free-air ionization chamber. A great deal of research into this absolute measurement has resulted in different designs for primary standard free-air ionization chambers, such as cylindrical or plane-parallel chambers. The implementation of primary standard dosimetry with free-air ionization chambers is limited to the National Metrology Institutes - NMIs. Since 1975, the Bureau International des Poids et Mesures - BIPM has been conducting comparisons of NMIs' primary free-air standard chambers in the medium energy x-ray range. These comparisons are carried out indirectly through the calibration, at both the BIPM and the NMI, of one or more transfer ionization chambers at a series of four reference radiation qualities. The scientific work programme of the National Laboratory for Ionizing Radiation Metrology - LNMRI of the Institute of Radioprotection and Dosimetry - IRD, which belongs to the National Commission of Nuclear Energy - CNEN, includes the establishment of a primary standard for x-rays of the medium energy x-ray range. This activity is justified by the need to periodically calibrate the Brazilian network of secondary standards without losing measurement quality. The LNMRI decided to implement four reference radiation qualities, establishing the use of a transfer chamber calibrated at BIPM. The LNMRI decided to implement the primary standard dosimetry using a free-air ionization chamber with variable volume, made by Victoreen, model 480. Parameters related to the measurement of the quantity air kerma were evaluated, such as: air absorption, scattering inside the ionization chamber, saturation, beam

  2. Classical mechanics including an introduction to the theory of elasticity

    Hentschke, Reinhard

    2017-01-01

    This textbook teaches classical mechanics as one of the foundations of physics. It describes the mechanical stability and motion in physical systems ranging from the molecular to the galactic scale. Aside from the standard topics of mechanics in the physics curriculum, this book includes an introduction to the theory of elasticity and its use in selected modern engineering applications, e.g. dynamic mechanical analysis of viscoelastic materials. The text also covers many aspects of numerical mechanics, ranging from the solution of ordinary differential equations, including molecular dynamics simulation of many particle systems, to the finite element method. Attendant Mathematica programs or parts thereof are provided in conjunction with selected examples. Numerous links allow the reader to connect to related subjects and research topics. Among others this includes statistical mechanics (separate chapter), quantum mechanics, space flight, galactic dynamics, friction, and vibration spectroscopy. An introductory...

  3. Control system architecture: The standard and non-standard models

    Thuot, M.E.; Dalesio, L.R.

    1993-01-01

    Control system architecture development has followed the advances in computer technology through mainframes to minicomputers to micros and workstations. This technology advance and increasingly challenging accelerator data acquisition and automation requirements have driven control system architecture development. In summarizing the progress of control system architecture at the last International Conference on Accelerator and Large Experimental Physics Control Systems (ICALEPCS), B. Kuiper asserted that the system architecture issue was resolved and presented a "standard model". The "standard model" consists of a local area network (Ethernet or FDDI) providing communication between front-end microcomputers, connected to the accelerator, and workstations, providing the operator interface and computational support. Although this model represents many present designs, there are exceptions, including reflected-memory and hierarchical architectures driven by requirements for widely dispersed, large channel count or tightly coupled systems. This paper describes the performance characteristics and features of the "standard model" to determine if the requirements of "non-standard" architectures can be met. Several possible extensions to the "standard model" are suggested, including software as well as hardware architectural features.

  4. Neonatal therapeutic hypothermia outside of standard guidelines: a survey of U.S. neonatologists.

    Burnsed, Jennifer; Zanelli, Santina A

    2017-11-01

    Therapeutic hypothermia is standard of care in term infants with moderate-to-severe hypoxic-ischaemic encephalopathy (HIE). The goal of this survey was to explore the attitudes of U.S. neonatologists caring for infants with HIE who fall outside of current guidelines. Case-based survey administered to members of the Section on Neonatal-Perinatal Medicine of the American Academy of Pediatrics. A total of 447 responses were analysed, a response rate of 19%. We found significant variability amongst U.S. neonatologists with regard to the use of therapeutic hypothermia for infants with HIE who fall outside standard inclusion criteria. Scenarios with the most variability included HIE in a late preterm infant and HIE following a postnatal code. Provision of therapeutic hypothermia outside of standard guidelines was not influenced by number of years in practice, neonatal intensive care unit (NICU) type or NICU size. Significant variability in practice exists when caring for infants with HIE who do not meet standard inclusion criteria, emphasizing the need for continued and rigorous research in this area. Published 2017. This article is a U.S. Government work and is in the public domain in the USA.

  5. Dynamics of the standard model

    Donoghue, John F; Holstein, Barry R

    2014-01-01

    Describing the fundamental theory of particle physics and its applications, this book provides a detailed account of the Standard Model, focusing on techniques that can produce information about real observed phenomena. The book begins with a pedagogic account of the Standard Model, introducing essential techniques such as effective field theory and path integral methods. It then focuses on the use of the Standard Model in the calculation of physical properties of particles. Rigorous methods are emphasized, but other useful models are also described. This second edition has been updated to include recent theoretical and experimental advances, such as the discovery of the Higgs boson. A new chapter is devoted to the theoretical and experimental understanding of neutrinos, and major advances in CP violation and electroweak physics have been given a modern treatment. This book is valuable to graduate students and researchers in particle physics, nuclear physics and related fields.

  6. Perspectives in the standard model

    Ellis, R.K.; Hill, C.T.; Lykken, J.D.

    1992-01-01

    Particle physics is an experimentally based science, with a need for the best theorists to make contact with data and to enlarge and enhance their theoretical descriptions as the subject evolves. The authors felt it imperative that the TASI (Theoretical Advanced Study Institute) program reflect this need. The goal of this conference was to provide the students with a comprehensive look at the current understanding of the standard model, as well as the techniques which promise to advance that understanding in the future. Topics covered include: symmetry breaking in the standard model; physics beyond the standard model; chiral effective Lagrangians; semi-classical string theory; renormalization of electroweak gauge interactions; electroweak experiments at LEP; the CKM matrix and CP violation; axion searches; lattice QCD; perturbative QCD; heavy quark effective field theory; heavy flavor physics on the lattice; and neutrinos. Separate abstracts were prepared for 13 papers in this conference

  7. Photoactive devices including porphyrinoids with coordinating additives

    Forrest, Stephen R; Zimmerman, Jeramy; Yu, Eric K; Thompson, Mark E; Trinh, Cong; Whited, Matthew; Diev, Vlacheslav

    2015-05-12

    Coordinating additives are included in porphyrinoid-based materials to promote intermolecular organization and improve one or more photoelectric characteristics of the materials. The coordinating additives are selected from fullerene compounds and organic compounds having free electron pairs. Combinations of different coordinating additives can be used to tailor the characteristic properties of such porphyrinoid-based materials, including porphyrin oligomers. Bidentate ligands are one type of coordinating additive that can form coordination bonds with a central metal ion of two different porphyrinoid compounds to promote porphyrinoid alignment and/or pi-stacking. The coordinating additives can shift the absorption spectrum of a photoactive material toward higher wavelengths, increase the external quantum efficiency of the material, or both.

  8. Beyond the standard model

    Wilczek, F.

    1993-01-01

    The standard model of particle physics is highly successful, although it is obviously not a complete or final theory. In this presentation the author argues that the structure of the standard model gives some quite concrete, compelling hints regarding what lies beyond. Essentially, this presentation is a record of the author's own judgement of what the central clues for physics beyond the standard model are, and also it is an attempt at some pedagogy. 14 refs., 6 figs

  9. Standard Model processes

    Mangano, M.L.; Aguilar-Saavedra, Juan Antonio; Alekhin, S.; Badger, S.; Bauer, C.W.; Becher, T.; Bertone, V.; Bonvini, M.; Boselli, S.; Bothmann, E.; Boughezal, R.; Cacciari, M.; Carloni Calame, C.M.; Caola, F.; Campbell, J.M.; Carrazza, S.; Chiesa, M.; Cieri, L.; Cimaglia, F.; Febres Cordero, F.; Ferrarese, P.; D'Enterria, D.; Ferrera, G.; Garcia i Tormo, X.; Garzelli, M.V.; Germann, E.; Hirschi, V.; Han, T.; Ita, H.; Jäger, B.; Kallweit, S.; Karlberg, A.; Kuttimalai, S.; Krauss, F.; Larkoski, A.J.; Lindert, J.; Luisoni, G.; Maierhöfer, P.; Mattelaer, O.; Martinez, H.; Moch, S.; Montagna, G.; Moretti, M.; Nason, P.; Nicrosini, O.; Oleari, C.; Pagani, D.; Papaefstathiou, A.; Petriello, F.; Piccinini, F.; Pierini, M.; Pierog, T.; Pozzorini, S.; Re, E.; Robens, T.; Rojo, J.; Ruiz, R.; Sakurai, K.; Salam, G.P.; Salfelder, L.; Schönherr, M.; Schulze, M.; Schumann, S.; Selvaggi, M.; Shivaji, A.; Siodmok, A.; Skands, P.; Torrielli, P.; Tramontano, F.; Tsinikos, I.; Tweedie, B.; Vicini, A.; Westhoff, S.; Zaro, M.; Zeppenfeld, D.; CERN. Geneva. ATS Department

    2017-06-22

    This report summarises the properties of Standard Model processes at the 100 TeV pp collider. We document the production rates and typical distributions for a number of benchmark Standard Model processes, and discuss new dynamical phenomena arising at the highest energies available at this collider. We discuss the intrinsic physics interest in the measurement of these Standard Model processes, as well as their role as backgrounds for New Physics searches.

  10. International Construction Measurement Standard

    Mitchell, Charles

    2016-01-01

    The International Construction Measurement Standard Coalition (the Coalition) was formed on 17 June 2015 after meeting at the International Monetary Fund in Washington DC, USA. The Coalition, comprising the organisations listed below at the date of publication, aims to bring about consistency in construction cost reporting standards internationally. This is achieved by the creation and adoption of this ICMS, an agreed international standard for the structuring and presentation of cost reports...

  11. Electric power monthly, September 1990 (glossary included)

    1990-12-17

    The purpose of this report is to provide energy decision makers with accurate and timely information that may be used in forming various perspectives on electric issues. The power plants considered include coal, petroleum, natural gas, hydroelectric, and nuclear power plants. Data are presented for power generation, fuel consumption, fuel receipts and cost, sales of electricity, and unusual occurrences at power plants. Data are compared at the national, Census division, and state levels. 4 figs., 52 tabs. (CK)

  12. Nuclear reactor shield including magnesium oxide

    Rouse, C.A.; Simnad, M.T.

    1981-01-01

    An improvement is described for nuclear reactor shielding of a type used in reactor applications involving significant amounts of fast neutron flux. The reactor shielding includes means providing structural support, neutron moderator material, neutron absorber material and other components, wherein at least a portion of the neutron moderator material is magnesium in the form of magnesium oxide either alone or in combination with other moderator materials such as graphite and iron

  13. Model for safety reports including descriptive examples

    1995-12-01

    Several safety reports will be produced in the process of planning and constructing the system for disposal of high-level radioactive waste in Sweden. The present report gives a model, with detailed examples, of how these reports should be organized and what steps they should include. In the near future safety reports will deal with the encapsulation plant and the repository. Later reports will treat operation of the handling systems and the repository

  14. Jet-calculus approach including coherence effects

    Jones, L.M.; Migneron, R.; Narayanan, K.S.S.

    1987-01-01

    We show how integrodifferential equations typical of jet calculus can be combined with an averaging procedure to obtain jet-calculus-based results including the Mueller interference graphs. Results in longitudinal-momentum fraction x for physical quantities are higher at intermediate x and lower at large x than with the conventional ''incoherent'' jet calculus. These results resemble those of Marchesini and Webber, who used a Monte Carlo approach based on the same dynamics

  15. Variable Permanent Magnet Quadrupole

    Mihara, T.; Iwashita, Y.; Kyoto U.; Kumada, M.; NIRS, Chiba; Spencer, C.M.; SLAC

    2007-01-01

    A permanent magnet quadrupole (PMQ) is one of the candidates for the final focus lens in a linear collider. An over 120 T/m strong variable permanent magnet quadrupole is achieved by the introduction of saturated iron and a 'double ring structure'. A fabricated PMQ achieved 24 T integrated gradient with 20 mm bore diameter, 100 mm magnet diameter and 20 cm pole length. The strength of the PMQ is adjustable in 1.4 T steps, due to its 'double ring structure': the PMQ is split into two nested rings; the outer ring is sliced along the beam line into four parts and is rotated to change the strength. This paper describes the variable PMQ from fabrication to recent adjustments
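The quoted figures are mutually consistent: the integrated gradient is simply the field gradient times the magnetic length, so 120 T/m over the 20 cm pole length reproduces the stated 24 T.

```python
# Hedged arithmetic check on the figures quoted in the record above:
# integrated gradient = gradient x magnetic length.
gradient = 120.0    # T/m, quoted peak gradient
length = 0.20       # m, quoted pole length (20 cm)

integrated = gradient * length
print(integrated)   # 24.0 T, matching the stated integrated gradient
```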

  16. On Complex Random Variables

    Anwer Khurshid

    2012-07-01

    Full Text Available In this paper, it is shown that a complex multivariate random variable  is a complex multivariate normal random variable of dimensionality if and only if all nondegenerate complex linear combinations of  have a complex univariate normal distribution. The characteristic function of  has been derived, and simpler forms of some theorems have been given using this characterization theorem without assuming that the variance-covariance matrix of the vector  is Hermitian positive definite. Marginal distributions of  have been given. In addition, a complex multivariate t-distribution has been defined and the density derived. A characterization of the complex multivariate t-distribution is given. A few possible uses of this distribution have been suggested.
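The characterization above can be illustrated numerically: a complex linear combination of independent complex normal components is again complex normal, with second moment determined by the coefficients. This sketch assumes the standard circular (proper) complex normal with real and imaginary parts each N(0, 1/2); all names and data are illustrative.

```python
# Hedged sketch: a linear combination a1*Z1 + a2*Z2 of independent standard
# circular complex normals should satisfy E|a1*Z1 + a2*Z2|^2 = |a1|^2 + |a2|^2.
# Here |1+2j|^2 + |3-1j|^2 = 5 + 10 = 15; we check this empirically.
import random
import statistics

random.seed(42)

def complex_normal():
    """One draw from a standard circular complex normal: Re, Im ~ N(0, 1/2)."""
    s = 0.5 ** 0.5
    return complex(random.gauss(0, s), random.gauss(0, s))

a = (1 + 2j, 3 - 1j)   # fixed coefficient vector (illustrative)
n = 20000
samples = [a[0] * complex_normal() + a[1] * complex_normal() for _ in range(n)]

emp_var = statistics.fmean(abs(s) ** 2 for s in samples)
print(round(emp_var, 1))  # close to the theoretical value 15
```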

  17. Technological Capability's Predictor Variables

    Fernanda Maciel Reichert

    2011-03-01

    Full Text Available The aim of this study was to identify the factors that influence the configuration of the technological capability of companies in sectors with medium-low technological intensity. To achieve this goal, a survey was carried out. Based on the framework developed by Lall (1992), which classifies firms at basic, intermediate and advanced levels of technological capability, it was found that the predominant technological capability is intermediate, with 83.7% of respondent companies (plastics companies in Brazil). It is believed that the main contribution of this study is the finding that the dependent variable named "Technological Capability" can be explained at a rate of 65% by six variables: development of new processes; selection of the best equipment supplier; sales of internally developed new technology to third parties; design and manufacture of equipment; study of work methods and inventory control; and improvement of product quality.

  18. Towards common technical standards

    Rahmat, H.; Suardi, A.R.

    1993-01-01

    In 1989, PETRONAS launched its Total Quality Management (TQM) program. In the same year the decision was taken by the PETRONAS Management to introduce common technical standards group wide. These standards apply to the design, construction, operation and maintenance of all PETRONAS installations in the upstream, downstream and petrochemical sectors. The introduction of common company standards is seen as part of an overall technical management system, which is an integral part of Total Quality Management. The Engineering and Safety Unit in the PETRONAS Central Office in Kuala Lumpur has been charged with the task of putting in place a set of technical standards throughout PETRONAS and its operating units

  19. The Standard Model course

    CERN. Geneva HR-RFA

    2006-01-01

    Suggested Readings: Aspects of Quantum Chromodynamics/A Pich, arXiv:hep-ph/0001118. - The Standard Model of Electroweak Interactions/A Pich, arXiv:hep-ph/0502010. - The Standard Model of Particle Physics/A Pich The Standard Model of Elementary Particle Physics will be described. A detailed discussion of the particle content, structure and symmetries of the theory will be given, together with an overview of the most important experimental facts which have established this theoretical framework as the Standard Theory of particle interactions.

  20. Flight Standards Automation System -

    Department of Transportation — FAVSIS supports Flight Standards Service (AFS) by maintaining their information on entities such as air carriers, air agencies, designated airmen, and check airmen....

  1. Variable Kernel Density Estimation

    Terrell, George R.; Scott, David W.

    1992-01-01

    We investigate some of the possibilities for improvement of univariate and multivariate kernel density estimates by varying the window over the domain of estimation, pointwise and globally. Two general approaches are to vary the window width by the point of estimation and by the point of the sample observation. The first possibility is shown to be of little efficacy in one variable. In particular, nearest-neighbor estimators in all versions perform poorly in one and two dimensions, but begin to b...
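The first approach mentioned, varying the window width with the point of estimation, is often called a "balloon" estimator. A minimal pure-Python sketch using a k-nearest-neighbor bandwidth and a Gaussian kernel (both choices are illustrative assumptions, not the authors' implementation):

```python
import math

def knn_distance(x, data, k):
    """Distance from x to its k-th nearest sample point."""
    return sorted(abs(x - xi) for xi in data)[k - 1]

def balloon_kde(x, data, k=3):
    """Balloon estimator: the bandwidth varies with the point of
    estimation x, here set to the k-th nearest-neighbor distance."""
    # Guard against h = 0 when x coincides with k or more sample points.
    h = knn_distance(x, data, k) or 1e-9
    n = len(data)
    return sum(math.exp(-0.5 * ((x - xi) / h) ** 2)
               for xi in data) / (n * h * math.sqrt(2 * math.pi))
```

Near a cluster of samples the bandwidth shrinks and the estimate rises; far from the data the bandwidth inflates and the estimate flattens, which is exactly the pointwise adaptivity the abstract refers to.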

  2. Internet interventions for chronic pain including headache: A systematic review

    Monica Buhrman

    2016-05-01

    Full Text Available Chronic pain is a major health problem, and behaviorally based treatments have been shown to be effective. However, the availability of these kinds of treatments is scarce, and internet-based treatments have been shown to be promising in this area. The objective of the present systematic review is to evaluate internet-based interventions for persons with chronic pain. The specific aims are to do an updated review with a broad inclusion of different chronic pain diagnoses and to assess disability and pain as well as measures of catastrophizing, depression and anxiety. A systematic search identified 891 studies, and 22 trials were selected as eligible for review. Two of the selected trials included children/youth and five included individuals with chronic headache and/or migraine. The most frequently measured domain reflected in the primary outcomes was interference/disability, followed by catastrophizing. Results across the studies showed a number of beneficial effects. Twelve trials reported significant effects on disability/interference outcomes and pain intensity. Positive effects were also found on psychological variables such as catastrophizing, depression and anxiety. Several studies (n = 12) were assessed to have an unclear level of risk of bias. The attrition levels ranged from 4% to 54%, with the headache trials having the highest drop-out levels. However, findings suggest that internet-based treatments based on cognitive behavioural therapy (CBT) are efficacious as measured with different outcome variables. Results are in line with trials in clinical settings. Meta-analytic statistics were calculated for interference/disability, pain intensity, catastrophizing and mood ratings. Results showed that the effect size for interference/disability was Hedges' g = −0.39, for pain intensity Hedges' g = −0.33, for catastrophizing Hedges' g = −0.49 and for mood variables (depression) Hedges' g = −0.26.
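The Hedges' g values quoted above are standardized mean differences with a small-sample bias correction. A minimal sketch of how such an effect size is computed from two-group summaries; the correction factor J = 1 − 3/(4·df − 1) is the common approximation, and any numbers fed to it would come from each trial's treatment and control arms (none are taken from this review):

```python
import math

def hedges_g(mean1, sd1, n1, mean2, sd2, n2):
    """Hedges' g: Cohen's d scaled by a small-sample bias correction."""
    # Pooled standard deviation across the two groups
    sp = math.sqrt(((n1 - 1) * sd1 ** 2 + (n2 - 1) * sd2 ** 2)
                   / (n1 + n2 - 2))
    d = (mean1 - mean2) / sp
    # Small-sample correction factor J, with df = n1 + n2 - 2
    j = 1 - 3 / (4 * (n1 + n2 - 2) - 1)
    return d * j
```

For groups of 30 with a raw d of −1, the correction shrinks the estimate only slightly (to about −0.987); the correction matters most for the small trials that dominate reviews like this one.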

  3. Short timescale variability in the faint sky variability survey

    Morales-Rueda, L.; Groot, P.J.; Augusteijn, T.; Nelemans, G.A.; Vreeswijk, P.M.; Besselaar, E.J.M. van den

    2006-01-01

    We present the V-band variability analysis of the Faint Sky Variability Survey (FSVS). The FSVS combines colour and time variability information, from timescales of 24 minutes to tens of days, down to V = 24. We find that ∼1% of all point sources are variable along the main sequence reaching ∼3.5%

  4. State Standards and State Assessment Systems: A Guide to Alignment. Series on Standards and Assessments.

    La Marca, Paul M.; Redfield, Doris; Winter, Phoebe C.

    Alignment of content standards, performance standards, and assessments is crucial. This guide contains information to assist states and districts in aligning their assessment systems to their content and performance standards. It includes a review of current literature, both published and fugitive. The research is woven together with a few basic…

  5. Can Images Obtained With High Field Strength Magnetic Resonance Imaging Reduce Contouring Variability of the Prostate?

    Usmani, Nawaid; Sloboda, Ron; Kamal, Wafa; Ghosh, Sunita; Pervez, Nadeem; Pedersen, John; Yee, Don; Danielson, Brita; Murtha, Albert; Amanie, John; Monajemi, Tara

    2011-01-01

    Purpose: The objective of this study is to determine whether there is less contouring variability of the prostate using higher-strength magnetic resonance images (MRI) compared with standard MRI and computed tomography (CT). Methods and Materials: Forty patients treated with prostate brachytherapy were accrued to a prospective study that included the acquisition of 1.5-T MR and CT images at specified time points. A subset of 10 patients had additional 3.0-T MR images acquired at the same time as their 1.5-T MR scans. Images from each of these patients were contoured by 5 radiation oncologists, with a random subset of patients repeated to quantify intraobserver contouring variability. To minimize bias in contouring the prostate, the image sets were placed in folders in a random order with all identifiers removed from the images. Results: Although there was less interobserver contouring variability in the overall prostate volumes in 1.5-T MRI compared with 3.0-T MRI (p < 0.01), there were no significant differences in contouring variability in the different regions of the prostate between 1.5-T MRI and 3.0-T MRI. MRI demonstrated significantly less interobserver contouring variability at both 1.5 T and 3.0 T compared with CT in overall prostate volumes (p < 0.01, p = 0.01), with the greatest benefits being appreciated in the base of the prostate. Overall, there was less intraobserver contouring variability than interobserver contouring variability for all of the measurements analyzed. Conclusions: Use of 3.0-T MRI does not demonstrate a significant improvement in contouring variability compared with 1.5-T MRI, although both field strengths demonstrated less contouring variability compared with CT.

  6. Articulatory variability in cluttering.

    Hartinger, Mariam; Mooshammer, Christine

    2008-01-01

    In order to investigate the articulatory processes of the hasty and mumbled speech in cluttering, the kinematic variability was analysed by means of electromagnetic midsagittal articulography. In contrast to persons with stuttering, those with cluttering improve their intelligibility by concentrating on their speech task. Variability has always been an important criterion in comparable studies of stuttering and is discussed in terms of the stability of the speech motor system. The aim of the current study was to analyse the spatial and temporal variability in the speech of three persons with cluttering (PWC) and three control speakers. All participants were native speakers of German. The speech material consisted of repetitive CV syllables and loan words such as 'emotionalisieren', because PWC have the severest problems with long words with a complex syllable structure. The results showed a significantly higher coefficient of variation for PWC in loan word production, both in the temporal and in the spatial domain, whereas the means of displacements and durations did not differ between groups. These findings were discussed in terms of the effects of the linguistic complexity, since for the syllable repetition task, no significant differences between PWC and controls were found. Copyright 2008 S. Karger AG, Basel.

  7. Penalized variable selection in competing risks regression.

    Fu, Zhixuan; Parikh, Chirag R; Zhou, Bingqing

    2017-07-01

    Penalized variable selection methods have been extensively studied for standard time-to-event data. Such methods cannot be directly applied when subjects are at risk of multiple mutually exclusive events, known as competing risks. The proportional subdistribution hazard (PSH) model proposed by Fine and Gray (J Am Stat Assoc 94:496-509, 1999) has become a popular semi-parametric model for time-to-event data with competing risks. It allows for direct assessment of covariate effects on the cumulative incidence function. In this paper, we propose a general penalized variable selection strategy that simultaneously handles variable selection and parameter estimation in the PSH model. We rigorously establish the asymptotic properties of the proposed penalized estimators and modify the coordinate descent algorithm for implementation. Simulation studies are conducted to demonstrate the good performance of the proposed method. Data from deceased donor kidney transplants from the United Network of Organ Sharing illustrate the utility of the proposed method.

  8. Measuring Variability in the Presence of Noise

    Welsh, W. F.

    Quantitative measurement of a variable signal in the presence of noise requires very careful attention to subtle effects which can easily bias the measurements. This is not limited to the low-count-rate regime, nor is the bias error necessarily small. In this talk I will mention some of the dangers in applying standard techniques which are appropriate for high signal-to-noise data but fail in the cases where the S/N is low. I will discuss methods for correcting the bias in these cases, both for periodic and non-periodic variability, and will introduce the concept of the ``filtered de-biased RMS''. I will also illustrate some common abuses of power spectrum interpretation. All of these points will be illustrated with examples from recent work on CV and AGN variability.
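"Filtered de-biased RMS" is the speaker's own construct; the underlying de-biasing idea, subtracting in quadrature the variance expected from measurement noise alone, can be sketched as follows (an illustrative sketch of the standard excess-variance correction, not the talk's actual method):

```python
import math

def debiased_rms(fluxes, errors):
    """Estimate intrinsic variability by subtracting, in quadrature,
    the variance expected from the quoted measurement errors."""
    n = len(fluxes)
    mean = sum(fluxes) / n
    raw_var = sum((f - mean) ** 2 for f in fluxes) / (n - 1)
    noise_var = sum(e ** 2 for e in errors) / n
    excess = raw_var - noise_var
    # Negative excess means noise dominates: no detected variability.
    return math.sqrt(excess) if excess > 0 else 0.0
```

This is where the bias the talk warns about lives: the raw RMS of a noisy but intrinsically constant source is always positive, so quoting it uncorrected overstates the variability, most severely at low S/N.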

  9. Evolving spiking networks with variable resistive memories.

    Howard, Gerard; Bull, Larry; de Lacy Costello, Ben; Gale, Ella; Adamatzky, Andrew

    2014-01-01

    Neuromorphic computing is a brainlike information processing paradigm that requires adaptive learning mechanisms. A spiking neuro-evolutionary system is used for this purpose; plastic resistive memories are implemented as synapses in spiking neural networks. The evolutionary design process exploits parameter self-adaptation and allows the topology and synaptic weights to be evolved for each network in an autonomous manner. Variable resistive memories are the focus of this research; each synapse has its own conductance profile which modifies the plastic behaviour of the device and may be altered during evolution. These variable resistive networks are evaluated on a noisy robotic dynamic-reward scenario against two static resistive memories and a system containing standard connections only. The results indicate that the extra behavioural degrees of freedom available to the networks incorporating variable resistive memories enable them to outperform the comparative synapse types.

  10. Size and Topology Optimization for Trusses with Discrete Design Variables by Improved Firefly Algorithm

    Yue Wu

    2017-01-01

    Full Text Available The Firefly Algorithm (FA, for short) is inspired by the social behavior of fireflies and their phenomenon of bioluminescent communication. Based on the fundamentals of FA, two improved strategies are proposed to conduct size and topology optimization for trusses with discrete design variables. Firstly, the development of structural topology optimization methods and the basic principle of the standard FA are introduced in detail. Then, in order to apply the algorithm to optimization problems with discrete variables, the initial positions of the fireflies and the position updating formula are discretized. By embedding a random weight and enhancing the attractiveness, the performance of the algorithm is improved, and thus an Improved Firefly Algorithm (IFA, for short) is proposed. Furthermore, using size variables that are capable of encoding topology variables, size and topology optimization for trusses with discrete variables is formulated based on the Ground Structure Approach. The essential techniques of variable elastic modulus technology and geometric construction analysis are applied in the structural analysis process. Subsequently, an optimization method for the size and topological design of trusses based on the IFA is introduced. Finally, two numerical examples are shown to verify the feasibility and efficiency of the proposed method by comparison with different deterministic methods.
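The discretized position update the abstract describes (firefly positions as index vectors into a discrete section catalogue, attraction toward brighter fireflies, then rounding back onto the grid) can be sketched as below. This is a generic illustrative sketch, not the paper's IFA: the section catalogue and the toy weight objective are invented, all parameter values are assumptions, and the random-weight and topology-variable enhancements are omitted.

```python
import math
import random

# Hypothetical catalogue of discrete cross-section areas.
SECTIONS = [1.0, 1.5, 2.2, 3.0, 4.5, 6.0]

def weight(design):
    """Toy objective: total member 'weight' to minimize. A real truss
    run would add penalties for stress/displacement violations."""
    return sum(SECTIONS[i] for i in design)

def discrete_fa(n_bars=5, n_fireflies=8, iters=50,
                beta0=1.0, gamma=0.1, alpha=0.3, seed=1):
    """Minimal discrete Firefly Algorithm over index vectors."""
    rng = random.Random(seed)
    hi = len(SECTIONS) - 1
    pop = [[rng.randint(0, hi) for _ in range(n_bars)]
           for _ in range(n_fireflies)]
    for _ in range(iters):
        for i in range(len(pop)):
            for j in range(len(pop)):
                if weight(pop[j]) < weight(pop[i]):  # j is brighter
                    r2 = sum((a - b) ** 2 for a, b in zip(pop[i], pop[j]))
                    beta = beta0 * math.exp(-gamma * r2)  # attractiveness
                    # Move toward the brighter firefly, add a random step,
                    # then round back onto the discrete catalogue.
                    pop[i] = [
                        min(hi, max(0, round(a + beta * (b - a)
                                             + alpha * rng.uniform(-1, 1))))
                        for a, b in zip(pop[i], pop[j])]
    return min(pop, key=weight)
```

The rounding-plus-clamping step is the essence of the discretization: the continuous FA update is computed in index space and then snapped back to a valid catalogue entry.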

  11. Climatological variability in regional air pollution

    Shannon, J.D.; Trexler, E.C. Jr.

    1995-01-01

    Although some air pollution modeling studies examine events that have already occurred (e.g., the Chernobyl plume) with relevant meteorological conditions largely known, most pollution modeling studies address expected or potential scenarios for the future. Future meteorological conditions, the major pollutant forcing function other than emissions, are inherently uncertain although much relevant information is contained in past observational data. For convenience in our discussions of regional pollutant variability unrelated to emission changes, we define meteorological variability as short-term (within-season) pollutant variability and climatological variability as year-to-year changes in seasonal averages and accumulations of pollutant variables. In observations and in some of our simulations the effects are confounded because for seasons of two different years both the mean and the within-season character of a pollutant variable may change. Effects of climatological and meteorological variability on means and distributions of air pollution parameters, particularly those related to regional visibility, are illustrated. Over periods of up to a decade climatological variability may mask or overstate improvements resulting from emission controls. The importance of including climatological uncertainties in assessing potential policies, particularly when based partly on calculated source-receptor relationships, is highlighted

  12. Construction of Database for Pulsating Variable Stars

    Chen, B. Q.; Yang, M.; Jiang, B. W.

    2011-07-01

    A database for the pulsating variable stars is constructed for Chinese astronomers to study the variable stars conveniently. The database includes about 230000 variable stars in the Galactic bulge, LMC and SMC observed by the MACHO (MAssive Compact Halo Objects) and OGLE (Optical Gravitational Lensing Experiment) projects at present. The software used for the construction is LAMP, i.e., Linux+Apache+MySQL+PHP. A web page is provided to search the photometric data and the light curve in the database through the right ascension and declination of the object. More data will be incorporated into the database.

  13. Including investment risk in large-scale power market models

    Lemming, Jørgen Kjærgaard; Meibom, P.

    2003-01-01

    Long-term energy market models can be used to examine investments in production technologies; however, with market liberalisation it is crucial that such models include investment risks and investor behaviour. This paper analyses how the effect of investment risk on production technology selection can be included in large-scale partial equilibrium models of the power market. The analyses are divided into a part about risk measures appropriate for power market investors and a more technical part about the combination of a risk-adjustment model and a partial-equilibrium model. To illustrate the analyses quantitatively, a framework based on an iterative interaction between the equilibrium model and a separate risk-adjustment module was constructed. To illustrate the features of the proposed modelling approach we examined how uncertainty in demand and variable costs affects the optimal choice...

  14. Variable Cycle Intake for Reverse Core Engine

    Suciu, Gabriel L (Inventor); Chandler, Jesse M (Inventor); Staubach, Joseph B (Inventor)

    2016-01-01

    A gas generator for a reverse core engine propulsion system has a variable cycle intake for the gas generator, which variable cycle intake includes a duct system. The duct system is configured for being selectively disposed in a first position and a second position, wherein free stream air is fed to the gas generator when in the first position, and fan stream air is fed to the gas generator when in the second position.

  15. Handbook of latent variable and related models

    Lee, Sik-Yum

    2011-01-01

    This Handbook covers latent variable models, which are a flexible class of models for modeling multivariate data to explore relationships among observed and latent variables.- Covers a wide class of important models- Models and statistical methods described provide tools for analyzing a wide spectrum of complicated data- Includes illustrative examples with real data sets from business, education, medicine, public health and sociology.- Demonstrates the use of a wide variety of statistical, computational, and mathematical techniques.

  16. [Renal patient's diet: Can fish be included?].

    Castro González, M I; Maafs Rodríguez, A G; Galindo Gómez, C

    2012-01-01

    Medical and nutritional treatment for renal disease, now a major public health issue, is highly complicated. Nutritional therapy must seek to retard renal dysfunction, maintain an optimal nutritional status and prevent the development of underlying pathologies. To analyze ten fish species to identify those that, because of their low phosphorus content, high biological value protein and elevated content of the n-3 fatty acids EPA and DHA, could be included in the renal patient's diet. The following fish species (Little tunny, Red drum, Spotted eagle ray, Escolar, Swordfish, Big-scale pomfret, Cortez flounder, Largemouth black bass, Periche mojarra, Florida pompano) were analyzed according to the AOAC and Keller techniques to determine their protein, phosphorus, sodium, potassium, cholesterol, vitamins D(3) and E, and n-3 EPA+DHA content. These results were used to calculate relations between nutrients. The protein in the analyzed species ranged from 16.5 g/100 g of fillet (Largemouth black bass) to 27.2 g/100 g (Red drum); the lowest phosphorus value was 28.6 mg/100 g (Periche mojarra) and the highest 216.3 mg/100 g (Spotted eagle ray). 80% of the fish presented > 100 mg EPA + DHA in 100 g of fillet. By their phosphorus/g protein ratios, Escolar and Swordfish could not be included in the renal diet; Little tunny, Escolar, Big-scale pomfret, Largemouth black bass, Periche mojarra and Florida pompano presented a lower phosphorus/(EPA + DHA) ratio. Florida pompano is the most recommended species for renal patients, due to its optimal nutrient relations. However, all analyzed species, except Escolar and Swordfish, could be included in renal diets.

  17. MOS modeling hierarchy including radiation effects

    Alexander, D.R.; Turfler, R.M.

    1975-01-01

    A hierarchy of modeling procedures has been developed for MOS transistors, circuit blocks, and integrated circuits which include the effects of total dose radiation and photocurrent response. The models were developed for use with the SCEPTRE circuit analysis program, but the techniques are suitable for other modern computer aided analysis programs. The modeling hierarchy permits the designer or analyst to select the level of modeling complexity consistent with circuit size, parametric information, and accuracy requirements. Improvements have been made in the implementation of important second order effects in the transistor MOS model, in the definition of MOS building block models, and in the development of composite terminal models for MOS integrated circuits

  18. Drug delivery device including electrolytic pump

    Foulds, Ian G.; Buttner, Ulrich; Yi, Ying

    2016-01-01

    Systems and methods are provided for a drug delivery device and use of the device for drug delivery. In various aspects, the drug delivery device combines a “solid drug in reservoir” (SDR) system with an electrolytic pump. In various aspects an improved electrolytic pump is provided including, in particular, an improved electrolytic pump for use with a drug delivery device, for example an implantable drug delivery device. A catalytic reformer can be incorporated in a periodically pulsed electrolytic pump to provide stable pumping performance and reduced actuation cycle.

  19. Drug delivery device including electrolytic pump

    Foulds, Ian G.

    2016-03-31

    Systems and methods are provided for a drug delivery device and use of the device for drug delivery. In various aspects, the drug delivery device combines a “solid drug in reservoir” (SDR) system with an electrolytic pump. In various aspects an improved electrolytic pump is provided including, in particular, an improved electrolytic pump for use with a drug delivery device, for example an implantable drug delivery device. A catalytic reformer can be incorporated in a periodically pulsed electrolytic pump to provide stable pumping performance and reduced actuation cycle.

  20. About hidden influence of predictor variables: Suppressor and mediator variables

    Milovanović Boško

    2013-01-01

    Full Text Available In this paper, a procedure for uncovering the hidden influence of predictor variables in regression models and for detecting suppressor variables and mediator variables is shown. It is also shown that the detection of suppressor variables and mediator variables can provide refined information about the research problem. As an example of applying this procedure, the relation between Atlantic atmospheric centers and air temperature and precipitation amounts in Serbia is chosen. [Project of the Ministry of Science of the Republic of Serbia, no. 47007]

  1. The standard model and beyond

    Marciano, W.J.

    1989-05-01

    In these lectures, my aim is to present a status report on the standard model and some key tests of electroweak unification. Within that context, I also discuss how and where hints of new physics may emerge. To accomplish those goals, I have organized my presentation as follows. I survey the standard model parameters with particular emphasis on the gauge coupling constants and vector boson masses. Examples of new physics appendages are also commented on. In addition, I have included an appendix on dimensional regularization and a simple example which employs that technique. I focus on weak charged current phenomenology. Precision tests of the standard model are described and up-to-date values for the Cabibbo-Kobayashi-Maskawa (CKM) mixing matrix parameters are presented. Constraints implied by those tests for a 4th generation, extra Z' bosons, and compositeness are discussed. An overview of the physics of tau decays is also included. I discuss weak neutral current phenomenology and the extraction of sin²θW from experiment. The results presented there are based on a global analysis of all existing data. I have chosen to concentrate that discussion on radiative corrections, the effect of a heavy top quark mass, implications for grand unified theories (GUTs), extra Z' gauge bosons, and atomic parity violation. The potential for further experimental progress is also commented on. Finally, I depart from the narrowest version of the standard model and discuss effects of neutrino masses, mixings, and electromagnetic moments. 32 refs., 3 figs., 5 tabs

  2. Energy principle with included boundary conditions

    Lehnert, B.

    1994-01-01

    Earlier comments by the author on the limitations of the classical form of the extended energy principle are supported by a complementary analysis on the potential energy change arising from free-boundary displacements of a magnetically confined plasma. In the final formulation of the extended principle, restricted displacements, satisfying pressure continuity by means of plasma volume currents in a thin boundary layer, are replaced by unrestricted (arbitrary) displacements which can give rise to induced surface currents. It is found that these currents contribute to the change in potential energy, and that their contribution is not taken into account by such a formulation. A general expression is further given for surface currents induced by arbitrary displacements. The expression is used to reformulate the energy principle for the class of displacements which satisfy all necessary boundary conditions, including that of the pressure balance. This makes a minimization procedure of the potential energy possible, for the class of all physically relevant test functions which include the constraints imposed by the boundary conditions. Such a procedure is also consistent with a corresponding variational calculus. (Author)

  3. Aerosol simulation including chemical and nuclear reactions

    Marwil, E.S.; Lemmon, E.C.

    1985-01-01

    The numerical simulation of aerosol transport, including the effects of chemical and nuclear reactions presents a challenging dynamic accounting problem. Particles of different sizes agglomerate and settle out due to various mechanisms, such as diffusion, diffusiophoresis, thermophoresis, gravitational settling, turbulent acceleration, and centrifugal acceleration. Particles also change size, due to the condensation and evaporation of materials on the particle. Heterogeneous chemical reactions occur at the interface between a particle and the suspending medium, or a surface and the gas in the aerosol. Homogeneous chemical reactions occur within the aersol suspending medium, within a particle, and on a surface. These reactions may include a phase change. Nuclear reactions occur in all locations. These spontaneous transmutations from one element form to another occur at greatly varying rates and may result in phase or chemical changes which complicate the accounting process. This paper presents an approach for inclusion of these effects on the transport of aerosols. The accounting system is very complex and results in a large set of stiff ordinary differential equations (ODEs). The techniques for numerical solution of these ODEs require special attention to achieve their solution in an efficient and affordable manner. 4 refs

  4. Addressing Stillbirth in India Must Include Men.

    Roberts, Lisa; Montgomery, Susanne; Ganesh, Gayatri; Kaur, Harinder Pal; Singh, Ratan

    2017-07-01

    Millennium Development Goal 4, to reduce child mortality, can only be achieved by reducing stillbirths globally. A confluence of medical and sociocultural factors contribute to the high stillbirth rates in India. The psychosocial aftermath of stillbirth is a well-documented public health problem, though less is known of the experience for men, particularly outside of the Western context. Therefore, men's perceptions and knowledge regarding reproductive health, as well as maternal-child health are important. Key informant interviews (n = 5) were analyzed and 28 structured interviews were conducted using a survey based on qualitative themes. Qualitative themes included men's dual burden and right to medical and reproductive decision making power. Wives were discouraged from expressing grief and pushed to conceive again. If not successful, particularly if a son was not conceived, a second wife was considered a solution. Quantitative data revealed that men with a history of stillbirths had greater anxiety and depression, perceived less social support, but had more egalitarian views towards women than men without stillbirth experience. At the same time fathers of stillbirths were more likely to be emotionally or physically abusive. Predictors of mental health, attitudes towards women, and perceived support are discussed. Patriarchal societal values, son preference, deficient women's autonomy, and sex-selective abortion perpetuate the risk for future poor infant outcomes, including stillbirth, and compounds the already higher risk of stillbirth for males. Grief interventions should explore and take into account men's perceptions, attitudes, and behaviors towards reproductive decision making.

  5. Performance Standards: Utility for Different Uses of Assessments

    Robert L. Linn

    2003-09-01

    Full Text Available Performance standards are arguably one of the most controversial topics in educational measurement. There are uses of assessments such as licensure and certification where performance standards are essential. There are many other uses, however, where performance standards have been mandated or become the preferred method of reporting assessment results where the standards are not essential to the use. Distinctions between essential and nonessential uses of performance standards are discussed. It is argued that the insistence on reporting in terms of performance standards in situations where they are not essential has been more harmful than helpful. Variability in the definitions of proficient academic achievement by states for purposes of the No Child Left Behind Act of 2001 is discussed and it is argued that the variability is so great that characterizing achievement is meaningless. Illustrations of the great uncertainty in standards are provided.

  6. Control principles of confounders in ecological comparative studies: standardization and regression models

    Varaksin Anatoly

    2014-03-01

    Full Text Available Methods for the analysis of research data that include concomitant variables (confounders, i.e., variables associated with both the response and the factor under study) are considered. There are two usual ways to take such variables into account: first, at the stage of planning the experiment, and second, in analyzing the received data. Despite the equal effectiveness of these approaches, there is strong reason to restrict the use of regression methods, such as ANCOVA, for accounting for confounders. The authors consider standardization by stratification to be a reliable method of accounting for the effect of confounding factors, as opposed to the widely implemented application of logistic regression and the analysis of covariance. A program for the automation of the standardization procedure is proposed; it is available at the site of the Institute of Industrial Ecology.
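Standardization by stratification, the method the authors favor, can be illustrated with a direct standardization of rates. The age strata, event counts and standard population below are hypothetical, chosen only to show the mechanism:

```python
def stratum_rates(events, counts):
    """Stratum-specific (e.g. age-specific) event rates."""
    return [e / n for e, n in zip(events, counts)]

def direct_standardized_rate(rates, standard_pop):
    """Direct standardization: weight each stratum's observed rate by
    a common standard population, removing confounding by the
    stratifying variable."""
    return (sum(r * w for r, w in zip(rates, standard_pop))
            / sum(standard_pop))
```

For example, two groups with identical age-specific rates but different age structures have different crude rates (the confounded comparison), yet identical directly standardized rates once both are projected onto the same standard population.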

  7. 77 FR 43542 - Cost Accounting Standards: Cost Accounting Standards 412 and 413-Cost Accounting Standards...

    2012-07-25

    ... rule that revised Cost Accounting Standard (CAS) 412, ``Composition and Measurement of Pension Cost... Accounting Standards: Cost Accounting Standards 412 and 413--Cost Accounting Standards Pension Harmonization Rule AGENCY: Cost Accounting Standards Board, Office of Federal Procurement Policy, Office of...

  8. State Skill Standards: Welding

    Pointer, Mike; Naylor, Randy; Warden, John; Senek, Gene; Shirley, Charles; Lefcourt, Lew; Munson, Justin; Johnson, Art

    2005-01-01

    The Department of Education has undertaken an ambitious effort to develop statewide occupational skill standards. The standards in this document are for welding programs and are designed to clearly state what the student should know and be able to do upon completion of an advanced high-school program. The writing team determined that any statewide…

  9. State Skill Standards: Photography

    Howell, Frederick; Reed, Loretta; Jensen, Capra; Robison, Gary; Taylor, Susan; Pavesich, Christine

    2007-01-01

    The Department of Education has undertaken an ambitious effort to develop statewide skill standards for all content areas in career and technical education. The standards in this document are for photography programs and are designed to clearly state what the student should know and be able to do upon completion of an advanced high-school program.…

  10. How many standards?

    Maegaard, Marie

    2009-01-01

    Discussions of standardisation and standard languages have a long history in linguistics. Tore Kristiansen has contributed to these discussions in various ways, and in this chapter I will focus on his claim that young Danes operate with two standards, one for the media and one for the school...

  11. Environmental radiation standards

    Kocher, D.C.

    1987-01-01

    This document contains an outline of an oral presentation on environmental radiation standards presented to the American Nuclear Society's Topical Conference on Population Exposure from the Nuclear Fuel Cycle. The paper contains several definitions, a summary of current radiation exposure limits, and numerous proposed changes to current standards. 7 figs

  12. The Genomic Standards Consortium

    Field, Dawn; Amaral-Zettler, Linda; Cochrane, Guy

    2011-01-01

    Standards Consortium (GSC), an open-membership organization that drives community-based standardization activities. Here we provide a short history of the GSC, provide an overview of its range of current activities, and make a call for the scientific community to join forces to improve the quality...

  13. Weston Standard battery

    This is a Weston AOIP standard battery with its calibration certificate (1956). Inside, the glassware forms an "H". Its name comes from the British physicist Edward Weston. A standard is the materialization of a given quantity whose value is known with great accuracy.

  14. Surface soil contamination standards

    Boothe, G.F.

    1979-01-01

    The purpose of this document is to define surface soil contamination limits for radioactive materials below which posting, restrictions and environmental controls are not necessary in order to protect personnel and the environment. The standards can also be used to determine if solid waste or other material is contaminated relative to disposal requirements. The derivation of the standards is given

  15. Standard classification: Physics

    1977-01-01

    This is a draft standard classification of physics. The conception is based on the physics part of the systematic catalogue of the Bayerische Staatsbibliothek and on the classification given in standard textbooks. The ICSU-AB classification now used worldwide by physics information services was not taken into account. (BJ) [de

  16. Governing through standards

    Brøgger, Katja

    This abstract addresses the ways in which new education standards have become integral to new modes of education governance. The paper explores the role of standards in accelerating the shift from national to transnational governance in higher education. Drawing on the case of higher education...

  17. Environmental radiation protection standards

    Richings, L.D.G.; Morley, F.; Kelley, G.N.

    1978-04-01

    The principles involved in the setting of radiological protection standards are reviewed, and the differences in procedures used by various countries in implementing them are outlined. Standards are taken here to mean the specific numerical limits relating to radiation doses to people or to amounts of radioactive material released into the environment. (author)

  18. Natural circulation under variable primary mass inventories at BETHSY facility

    Bazin, P.; Clement, P.; Deruaz, R.

    1989-01-01

    BETHSY is a high-pressure integral test facility which models a three-loop Framatome PWR with the intent of studying PWR accidents. The BETHSY programme includes both accident transients and tests under successive steady-state conditions. So far, tests of the latter type have been especially devoted to situations where natural circulation takes place in the primary coolant system (PCS). Tests 4.1a and 4.1a TC, the results of which are presented, deal with PCS natural circulation patterns and related heat transport mechanisms under two different core power levels (2 and 5% of nominal power), variable primary mass inventory (100% to 30-40% according to core power) and at two different steam generator liquid levels (standard value and 1 meter). (orig.)

  19. Normalization method for metabolomics data using optimal selection of multiple internal standards

    Yetukuri Laxman

    2007-03-01

    Full Text Available Background: Success of metabolomics as a phenotyping platform largely depends on its ability to detect various sources of biological variability. Removal of platform-specific sources of variability, such as systematic error, is therefore one of the foremost priorities in data preprocessing. However, the chemical diversity of molecular species included in typical metabolic profiling experiments leads to different responses to variations in experimental conditions, making normalization a very demanding task. Results: With the aim of removing unwanted systematic variation, we present an approach that utilizes variability information from multiple internal standard compounds to find an optimal normalization factor for each individual molecular species detected by the metabolomics approach (NOMIS). We demonstrate the method on mouse liver lipidomic profiles using Ultra Performance Liquid Chromatography coupled to high-resolution mass spectrometry, and compare its performance to two commonly utilized normalization methods: normalization by the l2 norm and by retention-time-region-specific standard compound profiles. The NOMIS method proved superior in its ability to reduce the effect of systematic error across the full spectrum of metabolite peaks. We also demonstrate that the method can be used to select the best combinations of standard compounds for normalization. Conclusion: Depending on experiment design and biological matrix, the NOMIS method is applicable either as a one-step normalization method or as a two-step method where the normalization parameters, influenced by the variabilities of internal standard compounds and their correlation to metabolites, are first calculated from a study conducted under repeatability conditions. The method can also be used in the analytical development of metabolomics methods by helping to select the best combinations of standard compounds for a particular biological matrix and analytical platform.
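The core idea, deriving a per-metabolite normalization factor from the joint variation of several internal standards, can be sketched as a regression on the log scale: each metabolite's log intensity is regressed on the centered log intensities of the internal standards, and the fitted (systematic) part is removed. This is a simplified illustration of the NOMIS idea, not the authors' implementation; the function name and data shapes are assumptions:

```python
import numpy as np

def nomis_normalize(metabolites, standards):
    """NOMIS-style sketch: remove systematic variation shared with
    multiple internal standards.

    metabolites: array (samples, n_metabolites) of raw peak intensities
    standards:   array (samples, n_standards) of internal-standard intensities
    """
    log_m = np.log(metabolites)
    log_s = np.log(standards)
    z = log_s - log_s.mean(axis=0)            # centered standard variation
    normalized = np.empty_like(log_m)
    for j in range(log_m.shape[1]):
        y = log_m[:, j] - log_m[:, j].mean()
        coef, *_ = np.linalg.lstsq(z, y, rcond=None)
        # subtract the part of the metabolite's variation explained by
        # the internal standards (the systematic error estimate)
        normalized[:, j] = log_m[:, j] - z @ coef
    return np.exp(normalized)
```

On data where all species share a common instrument drift, the residual coefficient of variation of each metabolite drops after normalization, which is the behaviour the abstract reports for NOMIS.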

  20. 1996 DOE technical standards program workshop: Proceedings

    NONE

    1996-07-01

    The workshop theme is "The Strategic Standardization Initiative - A Technology Exchange and Global Competitiveness Challenge for DOE." The workshop goal is to inform the DOE technical standards community of strategic standardization activities taking place in the Department, other Government agencies, standards developing organizations, and industry. Individuals working on technical standards will be challenged to improve cooperation and communications with the involved organizations in response to the initiative. Workshop sessions include presentations by representatives from various Government agencies that focus on coordination among and participation of Government personnel in the voluntary standards process; reports by standards organizations, industry, and DOE representatives on current technology exchange programs; and how the road ahead appears for "information superhighway" standardization. Another session highlights successful standardization case studies selected from several sites across the DOE complex. The workshop concludes with a panel discussion on the goals and objectives of the DOE Technical Standards Program as envisioned by senior DOE management. The annual workshop on technical standards has proven to be an effective medium for communicating information related to standards throughout the DOE community. Technical standards are used to transfer technology and standardize work processes to produce consistent, acceptable results. They provide a practical solution to the Department's challenge to protect the environment and the health and safety of the public and workers during all facility operations. Through standards, the technologies of industries and governments worldwide are available to DOE. The DOE Technical Standards Program, a Department-wide effort that crosscuts all organizations and disciplines, links the Department to those technologies.

  1. The surgery of peripheral nerves (including tumors)

    Fugleholm, Kåre

    2013-01-01

    Surgical pathology of the peripheral nervous system includes traumatic injury, entrapment syndromes, and tumors. The recent significant advances in the understanding of the pathophysiology and cellular biology of peripheral nerve degeneration and regeneration have yet to be translated into improved surgical techniques and better outcome after peripheral nerve injury. Decision making in peripheral nerve surgery continues to be a complex challenge, where the mechanism of injury, repeated clinical evaluation, neuroradiological and neurophysiological examination, and detailed knowledge of the peripheral nervous system response to injury are prerequisites for obtaining the best possible outcome. Surgery continues to be the primary treatment modality for peripheral nerve tumors, and advances in adjuvant oncological treatment have improved outcome after malignant peripheral nerve tumors. The present chapter...

  2. AMS at the ANU including biomedical applications

    Fifield, L K; Allan, G L; Cresswell, R G; Ophel, T R [Australian National Univ., Canberra, ACT (Australia); King, S J; Day, J P [Manchester Univ. (United Kingdom). Dept. of Chemistry

    1994-12-31

    An extensive accelerator mass spectrometry program has been conducted on the 14UD accelerator at the Australian National University since 1986. In the two years since the previous conference, the research program has expanded significantly to include biomedical applications of {sup 26}Al and studies of landform evolution using isotopes produced in situ in surface rocks by cosmic ray bombardment. The system is now used for the measurement of {sup 10}Be, {sup 14}C, {sup 26}Al, {sup 36}Cl, {sup 59}Ni and {sup 129}I, and research is being undertaken in hydrology, environmental geochemistry, archaeology and biomedicine. On the technical side, a new test system has permitted the successful off-line development of a high-intensity ion source. A new injection line to the 14UD has been established and the new source is now in position and providing beams to the accelerator. 4 refs.

  3. AMS at the ANU including biomedical applications

    Fifield, L.K.; Allan, G.L.; Cresswell, R.G.; Ophel, T.R. [Australian National Univ., Canberra, ACT (Australia); King, S.J.; Day, J.P. [Manchester Univ. (United Kingdom). Dept. of Chemistry

    1993-12-31

    An extensive accelerator mass spectrometry program has been conducted on the 14UD accelerator at the Australian National University since 1986. In the two years since the previous conference, the research program has expanded significantly to include biomedical applications of {sup 26}Al and studies of landform evolution using isotopes produced in situ in surface rocks by cosmic ray bombardment. The system is now used for the measurement of {sup 10}Be, {sup 14}C, {sup 26}Al, {sup 36}Cl, {sup 59}Ni and {sup 129}I, and research is being undertaken in hydrology, environmental geochemistry, archaeology and biomedicine. On the technical side, a new test system has permitted the successful off-line development of a high-intensity ion source. A new injection line to the 14UD has been established and the new source is now in position and providing beams to the accelerator. 4 refs.

  4. CERN Technical Training: LABVIEW courses include RADE

    HR Department

    2009-01-01

    The contents of the "LabView Basic I" and "LabView Intermediate II" courses have recently been changed to include, respectively, an introduction to and expert training in the Rapid Application Development Environment (RADE). RADE is a LabView-based application developed at CERN to integrate LabView in the accelerator and experiment control infrastructure. It is a suitable solution for developing expert tools, machine development analysis and independent test facilities. The course names have also been changed to "LabVIEW Basics I with RADE Introduction" and "LabVIEW Intermediate II with Advanced RADE Application". "LabVIEW Basics I with RADE Introduction" is designed for: Users preparing to develop applications using LabVIEW, or NI Developer Suite; users and technical managers evaluating LabVIEW or NI Developer Suite in purchasing decisions; users pursuing the Certified LabVIEW Developer certification. The course pr...

  5. CERN Technical Training: LABVIEW courses include RADE

    HR Department

    2009-01-01

    The contents of the "LabView Basic I" and "LabView Intermediate II" courses have recently been changed to include, respectively, an introduction to and expert training in the Rapid Application Development Environment (RADE). RADE is a LabView-based application developed at CERN to integrate LabView in the accelerator and experiment control infrastructure. It is a suitable solution for developing expert tools, machine development analysis and independent test facilities. The course names have also been changed to "LabVIEW Basics I with RADE Introduction" and "LabVIEW Intermediate II with Advanced RADE Application". "LabVIEW Basics I with RADE Introduction" is designed for: Users preparing to develop applications using LabVIEW, or NI Developer Suite; users and technical managers evaluating LabVIEW or NI Developer Suite in purchasing decisions; users pursuing the Certified LabVIEW Developer certification. The course prepares participants to develop test and measurement, da...

  6. CERN Technical Training: LABVIEW courses include RADE

    HR Department

    2009-01-01

    The contents of the "LabView Basic I" and "LabView Intermediate II" courses have recently been changed to include, respectively, an introduction to and expert training in the Rapid Application Development Environment (RADE). RADE is a LabView-based application developed at CERN to integrate LabView in the accelerator and experiment control infrastructure. It is a suitable solution for developing expert tools, machine development analysis and independent test facilities. The course names have also been changed to "LabVIEW Basics I with RADE Introduction" and "LabVIEW Intermediate II with Advanced RADE Application". "LabVIEW Basics I with RADE Introduction" is designed for: Users preparing to develop applications using LabVIEW, or NI Developer Suite; users and technical managers evaluating LabVIEW or NI Developer Suite in purchasing decisions; users pursuing the Certified LabVIEW Developer certification. The course prepare...

  7. Critical point anomalies include expansion shock waves

    Nannan, N. R., E-mail: ryan.nannan@uvs.edu [Mechanical Engineering Discipline, Anton de Kom University of Suriname, Leysweg 86, PO Box 9212, Paramaribo, Suriname and Process and Energy Department, Delft University of Technology, Leeghwaterstraat 44, 2628 CA Delft (Netherlands); Guardone, A., E-mail: alberto.guardone@polimi.it [Department of Aerospace Science and Technology, Politecnico di Milano, Via La Masa 34, 20156 Milano (Italy); Colonna, P., E-mail: p.colonna@tudelft.nl [Propulsion and Power, Delft University of Technology, Kluyverweg 1, 2629 HS Delft (Netherlands)

    2014-02-15

    From first-principle fluid dynamics, complemented by a rigorous state equation accounting for critical anomalies, we discovered that expansion shock waves may occur in the vicinity of the liquid-vapor critical point in the two-phase region. Due to universality of near-critical thermodynamics, the result is valid for any common pure fluid in which molecular interactions are only short-range, namely, for so-called 3-dimensional Ising-like systems, and under the assumption of thermodynamic equilibrium. In addition to rarefaction shock waves, diverse non-classical effects are admissible, including composite compressive shock-fan-shock waves, due to the change of sign of the fundamental derivative of gasdynamics.

  8. CLIC expands to include the Southern Hemisphere

    Roberto Cantoni

    2010-01-01

    Australia has recently joined the CLIC collaboration: the enlargement will bring new expertise and resources to the project, and is especially welcome in the wake of CERN budget redistributions following the recent adoption of the Medium Term Plan. The countries involved in the CLIC collaboration. With the signing of a Memorandum of Understanding on 26 August 2010, the ACAS network (Australian Collaboration for Accelerator Science) became the 40th member of the multilateral CLIC collaboration, making Australia the 22nd country to join the collaboration. “The new MoU was signed by the ACAS network, which includes the Australian Synchrotron and the University of Melbourne”, explains Jean-Pierre Delahaye, CLIC Study Leader. “Thanks to their expertise, the Australian institutes will contribute greatly to the CLIC damping rings and the two-beam test modules." Institutes from any country wishing to join the CLIC collaboration are invited to assume responsibility o...

  9. Should Broca's area include Brodmann area 47?

    Ardila, Alfredo; Bernal, Byron; Rosselli, Monica

    2017-02-01

    Understanding the brain organization of speech production has been a principal goal of neuroscience. Historically, speech production has been associated with the so-called Broca's area (Brodmann areas, BA, 44 and 45); however, modern neuroimaging developments suggest speech production is associated with networks rather than with areas. The purpose of this paper was to analyze the connectivity of BA47 (pars orbitalis) in relation to language. A meta-analysis was conducted to assess the language network in which BA47 is involved. The BrainMap database was used. Twenty papers corresponding to 29 experimental conditions with a total of 373 subjects were included. Our results suggest that BA47 participates in a “frontal language production system” (or extended Broca’s system). The BA47 connectivity found is also concordant with a minor role in language semantics. BA47 plays a central role in the language production system.

  10. Including climate change in energy investment decisions

    Ybema, J.R.; Boonekamp, P.G.M.; Smit, J.T.J.

    1995-08-01

    To properly take climate change into account in the analysis of energy investment decisions, it is necessary to apply decision analysis methods capable of considering the specific characteristics of climate change (large uncertainties, long time horizon). Such decision analysis methods do exist. They can explicitly include evolving uncertainties, multi-stage decisions, cumulative effects and risk-averse attitudes. Various methods are considered in this report and two of them have been selected: hedging calculations and sensitivity analysis. These methods are applied to illustrative examples, and their limitations are discussed. The examples are (1a) space heating and hot water for new houses from a private investor perspective and (1b) the same example from a government perspective, (2) electricity production with an integrated coal gasification combined cycle (ICGCC) with or without CO2 removal, and (3) a national energy strategy to hedge for climate change. 9 figs., 21 tabs., 42 refs., 1 appendix

  11. Education Program on Fossil Resources Including Coal

    Usami, Masahiro

    Fossil fuels including coal play a key role as crucial energies contributing to economic development in Asia. On the other hand, their limited quantity and the environmental problems caused by their usage have become a serious global issue, and countermeasures to solve such problems are in high demand. Along with the pursuit of sustainable development, environmentally friendly and highly efficient use of fossil resources should therefore be pursued. Kyushu University's sophisticated research, built on long years of accumulated experience in the fossil resources and environmental sectors, together with its advanced large-scale commercial and empirical equipment, will enable us to foster cooperative research and provide an internship program for future researchers. This program was executed as a consignment business from the Ministry of Economy, Trade and Industry from fiscal year 2007 to fiscal year 2009. A lecture course using the textbooks developed by this program is scheduled to start in fiscal year 2010.

  12. Individual Movement Variability Magnitudes Are Explained by Cortical Neural Variability.

    Haar, Shlomi; Donchin, Opher; Dinstein, Ilan

    2017-09-13

    Humans exhibit considerable motor variability even across trivial reaching movements. This variability can be separated into specific kinematic components such as extent and direction that are thought to be governed by distinct neural processes. Here, we report that individual subjects (males and females) exhibit different magnitudes of kinematic variability, which are consistent (within individual) across movements to different targets and regardless of which arm (right or left) was used to perform the movements. Simultaneous fMRI recordings revealed that the same subjects also exhibited different magnitudes of fMRI variability across movements in a variety of motor system areas. These fMRI variability magnitudes were also consistent across movements to different targets when performed with either arm. Cortical fMRI variability in the posterior-parietal cortex of individual subjects explained their movement-extent variability. This relationship was apparent only in posterior-parietal cortex and not in other motor system areas, thereby suggesting that individuals with more variable movement preparation exhibit larger kinematic variability. We therefore propose that neural and kinematic variability are reliable and interrelated individual characteristics that may predispose individual subjects to exhibit distinct motor capabilities. SIGNIFICANCE STATEMENT Neural activity and movement kinematics are remarkably variable. Although intertrial variability is rarely studied, here, we demonstrate that individual human subjects exhibit distinct magnitudes of neural and kinematic variability that are reproducible across movements to different targets and when performing these movements with either arm. Furthermore, when examining the relationship between cortical variability and movement variability, we find that cortical fMRI variability in parietal cortex of individual subjects explained their movement extent variability. This enabled us to explain why some subjects

  13. Variability in large-scale wind power generation: Variability in large-scale wind power generation

    Kiviluoma, Juha [VTT Technical Research Centre of Finland, Espoo Finland; Holttinen, Hannele [VTT Technical Research Centre of Finland, Espoo Finland; Weir, David [Energy Department, Norwegian Water Resources and Energy Directorate, Oslo Norway; Scharff, Richard [KTH Royal Institute of Technology, Electric Power Systems, Stockholm Sweden; Söder, Lennart [Royal Institute of Technology, Electric Power Systems, Stockholm Sweden; Menemenlis, Nickie [Institut de recherche Hydro-Québec, Montreal Canada; Cutululis, Nicolaos A. [DTU, Wind Energy, Roskilde Denmark; Danti Lopez, Irene [Electricity Research Centre, University College Dublin, Dublin Ireland; Lannoye, Eamonn [Electric Power Research Institute, Palo Alto California USA; Estanqueiro, Ana [LNEG, Laboratorio Nacional de Energia e Geologia, UESEO, Lisbon Spain; Gomez-Lazaro, Emilio [Renewable Energy Research Institute and DIEEAC/EDII-AB, Castilla-La Mancha University, Albacete Spain; Zhang, Qin [State Grid Corporation of China, Beijing China; Bai, Jianhua [State Grid Energy Research Institute Beijing, Beijing China; Wan, Yih-Huei [National Renewable Energy Laboratory, Transmission and Grid Integration Group, Golden Colorado USA; Milligan, Michael [National Renewable Energy Laboratory, Transmission and Grid Integration Group, Golden Colorado USA

    2015-10-25

    The paper demonstrates the characteristics of wind power variability and net load variability in multiple power systems based on real data from multiple years. Demonstrated characteristics include probability distribution for different ramp durations, seasonal and diurnal variability and low net load events. The comparison shows regions with low variability (Sweden, Spain and Germany), medium variability (Portugal, Ireland, Finland and Denmark) and regions with higher variability (Quebec, Bonneville Power Administration and Electric Reliability Council of Texas in North America; Gansu, Jilin and Liaoning in China; and Norway and offshore wind power in Denmark). For regions with low variability, the maximum 1 h wind ramps are below 10% of nominal capacity, and for regions with high variability, they may be close to 30%. Wind power variability is mainly explained by the extent of geographical spread, but also higher capacity factor causes higher variability. It was also shown how wind power ramps are autocorrelated and dependent on the operating output level. When wind power was concentrated in smaller area, there were outliers with high changes in wind output, which were not present in large areas with well-dispersed wind power.
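The ramp statistics quoted above (maximum 1 h ramps below 10% of nominal capacity for low-variability regions, close to 30% for high-variability ones) can be computed directly from an output time series. A minimal sketch; the hourly output values below are synthetic and purely illustrative:

```python
import numpy as np

def ramps(output_mw, capacity_mw, lag=1):
    """Changes in wind output over `lag` time steps, as a share of
    nominal capacity (positive = up-ramp, negative = down-ramp)."""
    out = np.asarray(output_mw, dtype=float)
    return (out[lag:] - out[:-lag]) / capacity_mw

# Synthetic hourly output for a hypothetical 100 MW region
output = [40, 42, 55, 70, 68, 50, 45]
hourly = ramps(output, capacity_mw=100)
max_ramp = np.abs(hourly).max()  # largest 1 h ramp as share of capacity
```

Longer ramp durations are obtained with `lag > 1`, and the empirical probability distribution of `hourly` (e.g. via `np.percentile`) gives the ramp-duration curves the paper compares across regions.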

  14. Knee extension torque variability after exercise in ACL reconstructed knees.

    Goetschius, John; Kuenze, Christopher M; Hart, Joseph M

    2015-08-01

    The purpose of this study was to compare knee extension torque variability in patients with ACL reconstructed knees before and after exercise. Thirty two patients with an ACL reconstructed knee (ACL-R group) and 32 healthy controls (control group) completed measures of maximal isometric knee extension torque (90° flexion) at baseline and following a 30-min exercise protocol (post-exercise). Exercise included 30-min of repeated cycles of inclined treadmill walking and hopping tasks. Dependent variables were the coefficient of variation (CV) and raw-change in CV (ΔCV): CV = (torque standard deviation/torque mean x 100), ΔCV = (post-exercise - baseline). There was a group-by-time interaction (p = 0.03) on CV. The ACL-R group demonstrated greater CV than the control group at baseline (ACL-R = 1.07 ± 0.55, control = 0.79 ± 0.42, p = 0.03) and post-exercise (ACL-R = 1.60 ± 0.91, control = 0.94 ± 0.41, p = 0.001). ΔCV was greater (p = 0.03) in the ACL-R group (0.52 ± 0.82) than control group (0.15 ± 0.46). CV significantly increased from baseline to post-exercise (p = 0.001) in the ACL-R group, while the control group did not (p = 0.06). The ACL-R group demonstrated greater knee extension torque variability than the control group. Exercise increased torque variability more in the ACL-R group than control group. © 2015 Orthopaedic Research Society. Published by Wiley Periodicals, Inc.
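The two dependent variables are defined explicitly in the abstract, so they translate directly into code. A minimal sketch (the torque values in the example are made up, not study data):

```python
import statistics

def coefficient_of_variation(torques):
    """CV = torque standard deviation / torque mean * 100 (percent)."""
    return statistics.stdev(torques) / statistics.mean(torques) * 100

def delta_cv(baseline, post_exercise):
    """Raw change in CV: post-exercise minus baseline."""
    return coefficient_of_variation(post_exercise) - coefficient_of_variation(baseline)

# Hypothetical repeated maximal isometric knee extension torques (N*m)
baseline_trials = [152.0, 150.5, 153.1, 151.2]
post_trials = [148.9, 154.3, 145.0, 152.8]
change = delta_cv(baseline_trials, post_trials)  # positive = more variable
```

A positive `delta_cv`, as found for the ACL-R group, means the repeated-trial torques became relatively more dispersed after exercise.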

  15. The SYVAC standards (ed. 2)

    Resnick, M.

    1983-04-01

    The Department of the Environment has embarked on a programme to develop computer models to help with assessment of sites suitable for the disposal of nuclear wastes. The first priority is to produce a system, based on the System Variability Analysis Code (SYVAC) obtained from Atomic Energy of Canada Ltd., suitable for assessing radioactive waste disposal in land repositories containing non heat producing wastes from typical UK sources. The requirements of the SYVAC system development were so diverse that each portion of the development was contracted to a different company. Scicon are responsible for software coordination, system integration and user interface. Their present report deals with SYVAC standards: FORTRAN coding; program documentation; testing guidelines; change control. (U.K.)

  16. Understanding diagnostic variability in breast pathology: lessons learned from an expert consensus review panel

    Allison, Kimberly H; Reisch, Lisa M; Carney, Patricia A; Weaver, Donald L; Schnitt, Stuart J; O’Malley, Frances P; Geller, Berta M; Elmore, Joann G

    2015-01-01

    Aims To gain a better understanding of the reasons for diagnostic variability, with the aim of reducing the phenomenon. Methods and results In preparation for a study on the interpretation of breast specimens (B-PATH), a panel of three experienced breast pathologists reviewed 336 cases to develop consensus reference diagnoses. After independent assessment, cases coded as diagnostically discordant were discussed at consensus meetings. By the use of qualitative data analysis techniques, transcripts of 16 h of consensus meetings for a subset of 201 cases were analysed. Diagnostic variability could be attributed to three overall root causes: (i) pathologist-related; (ii) diagnostic coding/study methodology-related; and (iii) specimen-related. Most pathologist-related root causes were attributable to professional differences in pathologists’ opinions about whether the diagnostic criteria for a specific diagnosis were met, most frequently in cases of atypia. Diagnostic coding/study methodology-related root causes were primarily miscategorizations of descriptive text diagnoses, which led to the development of a standardized electronic diagnostic form (BPATH-Dx). Specimen-related root causes included artefacts, limited diagnostic material, and poor slide quality. After re-review and discussion, a consensus diagnosis could be assigned in all cases. Conclusions Diagnostic variability is related to multiple factors, but consensus conferences, standardized electronic reporting formats and comments on suboptimal specimen quality can be used to reduce diagnostic variability. PMID:24511905

  17. Variability in large-scale wind power generation

    Kiviluoma, Juha; Holttinen, Hannele; Weir, David

    2016-01-01

    The paper demonstrates the characteristics of wind power variability and net load variability in multiple power systems based on real data from multiple years. Demonstrated characteristics include probability distribution for different ramp durations, seasonal and diurnal variability and low net ...... with well-dispersed wind power. Copyright © 2015 John Wiley & Sons, Ltd....

  18. Linear variable voltage diode capacitor and adaptive matching networks

    Larson, L.E.; De Vreede, L.C.N.

    2006-01-01

    An integrated variable voltage diode capacitor topology applied to a circuit providing a variable voltage load for controlling variable capacitance. The topology includes a first pair of anti-series varactor diodes, wherein the diode power-law exponent n for the first pair of anti-series varactor

  19. Mathematical model of thyristor inverter including a series-parallel resonant circuit

    Luft, M.; Szychta, E.

    2008-01-01

    The article presents a mathematical model of a thyristor inverter including a series-parallel resonant circuit, with the aid of the state-variable method. Maple procedures are used to compute current and voltage waveforms in the inverter.

  20. Mathematical Model of Thyristor Inverter Including a Series-parallel Resonant Circuit

    Miroslaw Luft; Elzbieta Szychta

    2008-01-01

    The article presents a mathematical model of a thyristor inverter including a series-parallel resonant circuit, with the aid of the state-variable method. Maple procedures are used to compute current and voltage waveforms in the inverter.

  1. Mathematical Model of Thyristor Inverter Including a Series-parallel Resonant Circuit

    Miroslaw Luft

    2008-01-01

    Full Text Available The article presents a mathematical model of a thyristor inverter including a series-parallel resonant circuit, with the aid of the state-variable method. Maple procedures are used to compute current and voltage waveforms in the inverter.
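The state-variable method named in these records writes the circuit as a system of first-order differential equations in the inductor current and capacitor voltages, which can then be integrated numerically. The sketch below (in Python rather than Maple) does this for a generic series-parallel (LCC) resonant load driven by a square-wave inverter voltage; the topology and component values are illustrative assumptions, not those of the paper:

```python
import numpy as np

# States: iL (series inductor current), vCs (series capacitor voltage),
# vCp (parallel capacitor voltage, across the load resistance R).
L, Cs, Cp, R = 100e-6, 1e-6, 1e-6, 10.0  # assumed component values

def derivs(x, u):
    """State equations of the series-parallel resonant circuit."""
    iL, vCs, vCp = x
    diL = (u - vCs - vCp) / L        # KVL around the series branch
    dvCs = iL / Cs                   # series capacitor charging
    dvCp = (iL - vCp / R) / Cp       # KCL at the parallel node
    return np.array([diL, dvCs, dvCp])

def simulate(f_sw=5e3, t_end=5e-4, dt=1e-7, Vdc=100.0):
    """Integrate the state equations with classic 4th-order Runge-Kutta,
    holding the square-wave source voltage constant over each step."""
    n = int(t_end / dt)
    x = np.zeros(3)
    out = np.empty((n, 3))
    for k in range(n):
        t = k * dt
        u = Vdc if (t * f_sw) % 1.0 < 0.5 else -Vdc  # square-wave inverter
        k1 = derivs(x, u)
        k2 = derivs(x + dt / 2 * k1, u)
        k3 = derivs(x + dt / 2 * k2, u)
        k4 = derivs(x + dt * k3, u)
        x = x + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        out[k] = x
    return out
```

The rows of `out` are the current and voltage waveforms over time, the same quantities the Maple procedures in the paper compute symbolically.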

  2. Analysis of Smart Composite Structures Including Debonding

    Chattopadhyay, Aditi; Seeley, Charles E.

    1997-01-01

    Smart composite structures with distributed sensors and actuators have the capability to actively respond to a changing environment while offering significant weight savings and additional passive controllability through ply tailoring. Piezoelectric sensing and actuation of composite laminates is the most promising concept due to the static and dynamic control capabilities. Essential to the implementation of these smart composites are the development of accurate and efficient modeling techniques and experimental validation. This research addresses each of these important topics. A refined higher order theory is developed to model composite structures with surface bonded or embedded piezoelectric transducers. These transducers are used as both sensors and actuators for closed loop control. The theory accurately captures the transverse shear deformation through the thickness of the smart composite laminate while satisfying stress free boundary conditions on the free surfaces. The theory is extended to include the effect of debonding at the actuator-laminate interface. The developed analytical model is implemented using the finite element method utilizing an induced strain approach for computational efficiency. This allows general laminate geometries and boundary conditions to be analyzed. The state space control equations are developed to allow flexibility in the design of the control system. Circuit concepts are also discussed. Static and dynamic results of smart composite structures, obtained using the higher order theory, are correlated with available analytical data. Comparisons, including debonded laminates, are also made with a general purpose finite element code and available experimental data. Overall, very good agreement is observed. Convergence of the finite element implementation of the higher order theory is shown with exact solutions. Additional results demonstrate the utility of the developed theory to study piezoelectric actuation of composite

  3. SELECTING QUASARS BY THEIR INTRINSIC VARIABILITY

    Schmidt, Kasper B.; Rix, Hans-Walter; Jester, Sebastian; Hennawi, Joseph F.; Marshall, Philip J.; Dobler, Gregory

    2010-01-01

    We present a new and simple technique for selecting extensive, complete, and pure quasar samples, based on their intrinsic variability. We parameterize the single-band variability by a power-law model for the light-curve structure function, with amplitude A and power-law index γ. We show that quasars can be efficiently separated from other non-variable and variable sources by the location of the individual sources in the A-γ plane. We use ∼60 epochs of imaging data, taken over ∼5 years, from the SDSS stripe 82 (S82) survey, where extensive spectroscopy provides a reference sample of quasars, to demonstrate the power of variability as a quasar classifier in multi-epoch surveys. For UV-excess selected objects, variability performs just as well as the standard SDSS color selection, identifying quasars with a completeness of 90% and a purity of 95%. In the redshift range 2.5 < z < 3, where color selection is known to be problematic, variability can select quasars with a completeness of 90% and a purity of 96%. This is a factor of 5-10 times more pure than existing color selection of quasars in this redshift range. Selecting objects from a broad griz color box without u-band information, variability selection in S82 can afford completeness and purity of 92%, despite a factor of 30 more contaminants than quasars in the color-selected feeder sample. This confirms that the fraction of quasars hidden in the 'stellar locus' of color space is small. To test variability selection in the context of Pan-STARRS 1 (PS1) we created mock PS1 data by down-sampling the S82 data to just six epochs over 3 years. Even with this much sparser time sampling, variability is an encouragingly efficient classifier. For instance, a 92% pure and 44% complete quasar candidate sample is attainable from the above griz-selected catalog. Finally, we show that the presented A-γ technique, besides selecting clean and pure samples of quasars (which are stochastically varying objects), is also
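    The A-γ selection described above can be sketched in a few lines. The following is a minimal, hypothetical implementation of a single-band structure-function fit (the binning scheme, units, and function name are my own choices, not the authors' pipeline; observation times are assumed to be in days):

```python
import numpy as np

def structure_function_params(times, mags):
    """Fit a power-law model SF(dt) = A * (dt / 1 yr)**gamma to a light curve.

    Returns (A, gamma) from a log-log least-squares fit of the mean
    pairwise magnitude difference per lag bin, as in the A-gamma plane.
    """
    times = np.asarray(times, dtype=float)
    mags = np.asarray(mags, dtype=float)
    i, j = np.triu_indices(len(times), k=1)
    dt = np.abs(times[j] - times[i]) / 365.25   # pairwise lags in years
    dm = np.abs(mags[j] - mags[i])              # magnitude differences
    # bin lags logarithmically and take the mean |dm| per bin
    bins = np.logspace(np.log10(dt.min()), np.log10(dt.max()), 10)
    idx = np.digitize(dt, bins)
    lags, sf = [], []
    for b in range(1, len(bins)):
        sel = idx == b
        if sel.sum() >= 3:                      # require a few pairs per bin
            lags.append(dt[sel].mean())
            sf.append(dm[sel].mean())
    gamma, log_a = np.polyfit(np.log10(lags), np.log10(sf), 1)
    return 10.0 ** log_a, gamma
```

    A light curve whose variability amplitude grows linearly with lag should recover gamma near 1, while a non-variable star would cluster near A ≈ 0, which is the separation the A-γ plane exploits.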

  4. Alternating phase focussing including space charge

    Cheng, W.H.; Gluckstern, R.L.

    1992-01-01

    Longitudinal stability can be obtained in a non-relativistic drift tube accelerator by traversing each gap as the rf accelerating field rises. However, the rising accelerating field leads to a transverse defocusing force which is usually overcome by magnetic focussing inside the drift tubes. The radio frequency quadrupole is one way of providing simultaneous longitudinal and transverse focussing without the use of magnets. One can also avoid the use of magnets by traversing alternate gaps between drift tubes as the field is rising and falling, thus providing an alternation of focussing and defocusing forces in both the longitudinal and transverse directions. The stable longitudinal phase space area is quite small, but recent efforts suggest that alternating phase focussing (APF) may permit low-velocity acceleration of currents in the 100-300 mA range. This paper presents a study of the parameter space and a test of crude analytic predictions by adapting the code PARMILA, which includes space charge, to APF. 6 refs., 3 figs

  5. Probabilistic production simulation including CHP plants

    Larsen, H.V.; Palsson, H.; Ravn, H.F.

    1997-04-01

    A probabilistic production simulation method is presented for an energy system containing combined heat and power plants. The method permits incorporation of stochastic failures (forced outages) of the plants and is well suited for analysis of the dimensioning of the system, that is, for finding the appropriate types and capacities of production plants in relation to expansion planning. The method is in the tradition of similar approaches for the analysis of power systems, based on the load duration curve. The present method extends this by considering a two-dimensional load duration curve where the two dimensions represent heat and power. The method permits the analysis of a combined heat and power system which includes all the basic relevant types of plants, viz., condensing plants, back pressure plants, extraction plants and heat plants. The focus of the method is on the situation where the heat side has priority. This implies that on the power side there may be imbalances between demand and production. The method permits quantification of the expected power overflow, the expected unserviced power demand, and the expected unserviced heat demand. It is shown that a discretization method as well as double Fourier series may be applied in algorithms based on the method. (au) 1 tab., 28 ills., 21 refs.
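    The two-dimensional method builds on the classic one-dimensional load-duration-curve convolution of forced outages. As an illustrative sketch only (not the paper's two-dimensional heat-and-power algorithm), the standard recursion folds one unit's forced outage rate into a discretized curve F(x) = P(equivalent load >= x):

```python
import numpy as np

def convolve_unit(load_prob, cap, outage_rate):
    """Fold one generating unit into a discretized load duration curve.

    load_prob[x] = P(equivalent load >= x), in capacity steps.
    cap: unit capacity in discretization steps.
    outage_rate: forced outage rate q; the unit is available with p = 1 - q.
    Returns the new curve: F'(x) = p*F(x) + q*F(x - cap), with F(x) = 1 for x < 0.
    """
    p = 1.0 - outage_rate
    new = np.empty_like(load_prob)
    for x in range(len(load_prob)):
        shifted = load_prob[x - cap] if x >= cap else 1.0
        new[x] = p * load_prob[x] + outage_rate * shifted
    return new
```

    Applying this unit by unit yields the equivalent load duration curve, from which expected unserved demand can be read off; the paper's contribution is carrying the same idea over to a joint heat-and-power curve.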

  6. Langevin simulations of QCD, including fermions

    Kronfeld, A.S.

    1986-02-01

    We encounter critical slow down in updating when ξ/a → ∞ and in matrix inversion (needed to include fermions) when m_q a → 0. A simulation that purports to solve QCD numerically will encounter these limits, so to face the challenge in the title of this workshop, we must cure the disease of critical slow down. Physically, this critical slow down is due to the reluctance of changes at short distances to propagate to large distances. Numerically, the stability of an algorithm at short wavelengths requires a (moderately) small step size; critical slow down occurs when the effective long wavelength step size becomes tiny. The remedy for this disease is an algorithm that propagates signals quickly throughout the system; i.e. one whose effective step size is not reduced for the long wavelength components of the fields. (Here the effective "step size" is essentially an inverse decorrelation time.) To do so one must resolve various wavelengths of the system and modify the dynamics (in CPU time) of the simulation so that all modes evolve at roughly the same rate. This can be achieved by introducing Fourier transforms. I show how to implement Fourier acceleration for Langevin updating and for conjugate gradient matrix inversion. The crucial feature of these algorithms that lends them to Fourier acceleration is that they update the lattice globally; hence the Fourier transforms are computed once per sweep rather than once per hit. (orig./HSI)
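    A toy version of Fourier-accelerated Langevin updating can be written down for a free scalar field on a one-dimensional periodic lattice, where the force kernel is diagonal in momentum space. This is only a sketch of the idea (the free action stands in for QCD, and all names and conventions below are my own):

```python
import numpy as np

def fal_step(phi, rng, mass=1.0, eps=0.1):
    """One Fourier-accelerated Langevin sweep for a free scalar field on a
    1D periodic lattice. The step size is rescaled per Fourier mode by the
    force kernel, so long- and short-wavelength modes relax at roughly the
    same rate -- the cure for critical slow down sketched in the abstract.
    """
    n = len(phi)
    k = 2.0 * np.pi * np.fft.fftfreq(n)
    # lattice momentum-squared plus mass^2; mass > 0 keeps the zero mode regular
    kernel = 4.0 * np.sin(k / 2.0) ** 2 + mass ** 2
    phi_k = np.fft.fft(phi)
    drift_k = -kernel * phi_k          # -dS/dphi for the free action
    eps_k = eps / kernel               # mode-dependent step size: eps_k * kernel = eps
    noise_k = np.fft.fft(rng.standard_normal(n))
    phi_k = phi_k + eps_k * drift_k + np.sqrt(2.0 * eps_k) * noise_k
    return np.fft.ifft(phi_k).real
```

    Because eps_k * kernel is the same constant for every mode, every wavelength decorrelates in about the same number of sweeps, whereas a uniform step size would leave the long-wavelength modes crawling.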

  7. Analysis of suspension with variable stiffness and variable damping force for automotive applications

    Lalitkumar Maikulal Jugulkar

    2016-05-01

    Passive shock absorbers are designed for a standard load condition and give good vibration isolation performance only at that condition. If the sprung mass is less than the standard mass, comfort and road-holding ability are affected; it is demonstrated that sprung-mass acceleration increases by 50% when the vehicle mass varies by 100 kg. To obtain consistent damping performance from the shock absorber, it is essential to vary its stiffness and damping properties. In this article, a variable stiffness system is presented, which comprises two helical springs and a variable fluid damper. The fluid damper intensity is changed in four discrete levels to achieve variable stiffness of the prototype. Numerical simulations have been performed with MATLAB Simscape and Simulink and validated against experiments on the prototype. The numerical model of the prototype is then used in the design of a full-size shock absorber with variable stiffness and damping. Simulation results on the full-size model indicate that peak acceleration will improve by 15% in comparison with the conventional passive solution, without significant deterioration of road-holding ability. The arrangement of sensors and actuators for incorporating the system in a vehicle suspension is also discussed.
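    The kind of simulation described here can be sketched with a standard two-degree-of-freedom quarter-car model whose damping coefficient is switched between discrete levels. The parameter values and function below are illustrative assumptions of mine, not the authors' MATLAB model:

```python
import numpy as np

def quarter_car_step(state, road, c, ms=250.0, mu=40.0,
                     ks=16000.0, kt=160000.0, dt=1e-3):
    """One semi-implicit Euler step of a 2-DOF quarter-car model.

    state = [zs, vs, zu, vu]: sprung/unsprung displacement and velocity (m, m/s).
    road: road profile height (m); c: damping coefficient (N*s/m), which a
    variable damper would switch between discrete levels.
    ms/mu: sprung/unsprung mass; ks/kt: suspension/tyre stiffness.
    """
    zs, vs, zu, vu = state
    fs = ks * (zu - zs) + c * (vu - vs)   # suspension force on sprung mass
    ft = kt * (road - zu)                 # tyre force on unsprung mass
    vs += dt * fs / ms                    # update velocities first ...
    vu += dt * (ft - fs) / mu
    return np.array([zs + dt * vs, vs,    # ... then positions (semi-implicit)
                     zu + dt * vu, vu])
```

    Sweeping c over, say, four discrete levels and comparing the resulting sprung-mass acceleration histories mimics the four-level variable damper of the prototype.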

  8. A Metadata Standard for Hydroinformatic Data Conforming to International Standards

    Notay, Vikram; Carstens, Georg; Lehfeldt, Rainer

    2017-04-01

    worldwide, the profile provides a means to describe hydroinformatic data that conforms to existing metadata standards. Additionally, EU and German national standards, INSPIRE and GDI-DE have been considered to ensure interoperability on an international and national level. Finally, elements of the GovData profile of the Federal Government of Germany have been integrated to be able to participate in its Open Data initiative. All these factors make the metadata profile developed at BAW highly suitable for describing hydroinformatic data in particular and physical state variables in general. Further details about this metadata profile will be presented at the conference. Acknowledgements: The authors would like to thank Christoph Wosniok and Peter Schade for their contributions towards the development of this metadata standard.

  9. Protecting chips against hold time violations due to variability

    Neuberger, Gustavo; Reis, Ricardo

    2013-01-01

    With the development of very-deep sub-micron technologies, process variability is becoming an increasingly important issue in the design of complex circuits. Process variability is the statistical variation of process parameters, meaning that these parameters do not always have the same value but become random variables, with a given mean value and standard deviation. This effect can lead to several issues in digital circuit design. The logical consequence of this parameter variation is that circuit characteristics, such as delay and power, also become random variables. Becaus
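    The central point, that delays become random variables, is easy to illustrate with a small Monte Carlo sketch (all numbers below are invented for illustration and are not taken from the book):

```python
import numpy as np

def hold_violation_rate(rng, n_gates=4, mean=50.0, sigma=5.0,
                        hold=190.0, trials=100_000):
    """Monte Carlo sketch of a hold-time check under process variability.

    Each gate delay is drawn as N(mean, sigma) (in ps, illustrative); the
    short-path delay is their sum, and a hold violation occurs whenever
    the data path is faster than the required hold time. Returns the
    estimated fraction of violating samples.
    """
    delays = rng.normal(mean, sigma, size=(trials, n_gates))
    return float(np.mean(delays.sum(axis=1) < hold))
```

    With these numbers the path delay is approximately N(200, 10), so roughly the Gaussian tail below 190 ps, about 16% of samples, violates hold: a circuit that is safe at nominal corner values can still fail statistically.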

  10. A case of standardization?

    Rod, Morten Hulvej; Høybye, Mette Terp

    2016-01-01

    the ones envisioned by the makers of standards. In 2012, the Danish National Health Authorities introduced a set of health promotion guidelines that were meant to guide the decision making and priority setting of Denmark's 98 local governments. The guidelines provided recommendations for health promotion...... and standardization. It remains an open question whether or not the guidelines lead to more standardized policies and interventions, but we suggest that the guidelines promote a risk factor-oriented approach as the dominant frame for knowledge, reasoning, decision making and priority setting in health promotion. We...

  11. The Standard Model

    Sutton, Christine

    1994-01-01

    The initial evidence from Fermilab for the long awaited sixth ('top') quark puts another rivet in the already firm structure of today's Standard Model of physics. Analysis of the Fermilab CDF data gives a top mass of 174 GeV with an error of ten per cent either way. This falls within the mass band predicted by the sum total of world Standard Model data and underlines our understanding of physics in terms of six quarks and six leptons. In this specially commissioned overview, physics writer Christine Sutton explains the Standard Model

  12. Wireless installation standard

    Lim, Hwang Bin

    2007-12-01

    This is divided into six parts: the radio regulation law on the securing, use, and protection of radio resources; the radio regulation enforcement ordinance on the securing, distribution, and assignment of radio resources; the radio regulation enforcement regulation on the use of radio resources and on technical qualification examinations; the wireless installation regulation on technical standards and safety facility standards; radio regulations such as the certification regulation for information and communication equipment; and the regulation of radio stations concerning compliance with signal security, radio equipment in radio stations, standard frequency stations, and emergency communications.

  13. Operator licensing examiner standards

    1993-01-01

    The Operator Licensing Examiner Standards provide policy and guidance to NRC examiners and establish the procedures and practices for examining licensees and applicants for reactor operator and senior reactor operator licenses at power reactor facilities pursuant to Part 55 of Title 10 of the Code of Federal Regulations (10 CFR 55). The Examiner Standards are intended to assist NRC examiners and facility licensees to better understand the initial and requalification examination processes and to ensure the equitable and consistent administration of examinations to all applicants. These standards are not a substitute for the operator licensing regulations and are subject to revision or other internal operator licensing policy changes

  14. International radiofrequency standards

    Lincoln, J.

    2001-01-01

    Of the various radiofrequency standards in use around the world, many are based on or similar to the Guidelines published by ICNIRP (The International Commission on Non-ionising Radiation Protection). This organisation is a working group operating in co-operation with the Environmental Health division of the World Health Organisation (WHO). This paper presents a very brief overview of current international standards, beginning with a summary of the salient points of the ICNIRP Guidelines. It should be remembered that these are guidelines only and do not exist as a separate standard. Copyright (2001) Australasian Radiation Protection Society Inc

  15. Description of broadband structure-borne and airborne noise transmission from the powertrain (engine-gear combination including engine intake and exhaust system) in modern combustion process as well as new systems for variable control of gas exchange. Binaural transfer path analysis and synthesis. Interim report; Beschreibung der breitbandigen Koerper- und Luftschallausbreitung aus dem Powertrain (Motor-Getriebe-Verband inklusive Ansaug- und Abgasanlage) bei modernen Verbrennungsverfahren sowie neuer Systeme zur variablen Ladungswechselsteuerung. Binaurale Transferpfadanalyse und -synthese. Zwischenbericht

    Sottek, R. [HEAD acoustics GmbH, Herzogenrath (Germany); Behler, G.; Kellert, T. [RWTH Aachen (DE). Inst. fuer Technische Akustik (ITA); Bernhard, U.

    2004-07-01

    The modern combustion procedures and new valve train generations lead to a different temporal and spectral behaviour of the vibrations between the interfaces of a powertrain and the adjoining structures and at the same time to a different airborne sound radiation via the engine compartment and the orifices and component surfaces at the intake and exhaust system into the passenger compartment. The influence of the high-frequency components on the vehicle interior noise becomes more and more important. Coupling and mass effects have to be taken into consideration now, because otherwise results might increasingly be misinterpreted. Previous methods including the binaural transfer path analysis and synthesis do not take account of these effects. This research project shall fill this gap. Regarding the airborne sound component the engine compartment can at best be considered as a pressure chamber for low frequencies only. However, for higher frequencies the positions of the partial sound sources, the corresponding transfer functions, near-field effects and modal structures in the engine compartment become increasingly relevant. In this project these influencing parameters shall be classified with regard to quality and quantity. This knowledge is also of fundamental interest for the determination of the primary sound sources on the test bench and the transferability of the results to the vehicle. The most important aim of this project is to develop simplified models for the structure-borne and airborne noise transmission from a precise and complex database and to reduce them to the essential by means of parameter studies. In the final stage of the project, the complicated fine structures of the transfer functions will be reduced to a few model functions, similar to the procedure of the modal analysis. From this simple model a "black box" will be derived which is the basis for simulating driving conditions, applying modifications and judging them.

  16. Standards of Quality: Accreditation Guidelines Redesigned

    Forsythe, Hazel; Andrews, Frances; Stanley, M. Sue; Anderson, Carol L.

    2011-01-01

    To ensure optimal standards for AAFCS program accreditation, the Council for Accreditation (CFA) conducted a review and revision of the "2001 AAFCS Standards for Accreditation." The CFA took a three-pronged approach including (a) a review of academic accreditations that had relationships to the FCS disciplines, (b) concept, content, and process…

  17. China's High-technology Standards Development

    2007-01-01

    There are several major technology standards, including audio video coding (AVS), automotive electronics, third generation (3G) mobile phones, mobile television, wireless networks and digital terrestrial television broadcasting, that have been released or are currently under development in China. This article offers a detailed analysis of each standard and studies their impact on China's high-technology industry.

  18. Phytoscreening with SPME: Variability Analysis.

    Limmer, Matt A; Burken, Joel G

    2015-01-01

    Phytoscreening has been demonstrated at a variety of sites over the past 15 years as a low-impact, sustainable tool in delineation of shallow groundwater contaminated with chlorinated solvents. Collection of tree cores is rapid and straightforward, but low concentrations in tree tissues require sensitive analytics. Solid-phase microextraction (SPME) is amenable to the complex matrix while allowing for solvent-less extraction. Accurate quantification requires the absence of competitive sorption, examined here both in laboratory experiments and through comprehensive examination of field data. Analysis of approximately 2,000 trees at numerous field sites also allowed testing of the tree genus and diameter effects on measured tree contaminant concentrations. Collectively, while these variables were found to significantly affect site-adjusted perchloroethylene (PCE) concentrations, the explanatory power of these effects was small (adjusted R(2) = 0.031). The 90th-quantile chemical concentrations in trees were significantly reduced by increasing Henry's constant and increasing hydrophobicity. Analysis of replicate tree core data showed no correlation between replicate relative standard deviation (RSD) and wood type or tree diameter, with an overall median RSD of 30%. Collectively, these findings indicate SPME is an appropriate technique for sampling and analyzing chlorinated solvents in wood and that phytoscreening is robust against changes in tree type and diameter.
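    The replicate RSD statistic used above is simply the sample standard deviation expressed as a percentage of the mean; a minimal helper (hypothetical, not the authors' code):

```python
import numpy as np

def relative_standard_deviation(replicates):
    """RSD (%) of replicate core concentrations: 100 * s / mean,
    using the sample standard deviation (ddof=1)."""
    x = np.asarray(replicates, dtype=float)
    return 100.0 * x.std(ddof=1) / x.mean()
```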

  19. Gallbladder microbiota variability in Colombian gallstones patients.

    Arteta, Ariel Antonio; Carvajal-Restrepo, Hernan; Sánchez-Jiménez, Miryan Margot; Diaz-Rodriguez, Sergio; Cardona-Castro, Nora

    2017-03-31

    Gallbladder stones are a very frequently occurring condition. Despite bile bactericidal activity, many bacteria have been detected inside the gallbladder, and gallstones facilitate their presence. Between 3% and 5% of the patients with Salmonella spp. infection develop the carrier stage, with the bacteria persisting inside the gallbladder, shedding bacteria in their feces without signs of infection. The aim of this study was to isolate bacteria from Colombian patients with gallstones, using standard culturing methods, and to identify Salmonella spp. carriers by molecular techniques. A total of 149 patients (120 female and 29 male) diagnosed with gallstones who underwent cholecystectomy and who did not have symptoms of acute inflammation were included. Gallbladder tissue and bile were cultured and used for DNA extraction and Salmonella spp. hilA gene detection. Of the 149 patients 28 (19%) had positive cultures. Twenty-one (75%) patients with positive cultures were from Medellín's metropolitan area. In this geographical location, the most frequent isolations were Pseudomonas spp. (38%), Klebsiella spp. (23%), and Proteus spp. (9%) in addition to unique cases of other bacteria. In Apartado, the isolates found were Enterobacter cloacae (50%), Raoultella terrigena (32%), and both Enterobacter cloacae and Raoultella terrigena were isolated in one (18%) male patient. Five (3.3%) of the 149 patients had positive polymerase chain reaction (PCR) results for the hilA gene of Salmonella spp., all of whom were female and residents of the Medellín metropolitan area. The gallbladder microbiota variability found could be related to geographical, ethnic, and environmental conditions.

  20. Brown Dwarf Variability: What's Varying and Why?

    Marley, Mark Scott

    2014-01-01

    Surveys by ground-based telescopes, HST, and Spitzer have revealed that brown dwarfs of most spectral classes exhibit variability. The spectral and temporal signatures of the variability are complex and apparently defy simplistic classification, which complicates efforts to model the changes. Important questions include understanding if clearings are forming in an otherwise uniform cloud deck or if thermal perturbations, perhaps associated with breaking gravity waves, are responsible. If clouds are responsible, how long does it take for the atmospheric thermal profile to relax from a hot cloudy to a cooler cloudless state? If thermal perturbations are responsible, then what atmospheric layers are varying? How do the observed variability timescales compare to atmospheric radiative, chemical, and dynamical timescales? I will address such questions by presenting modeling results for time-varying partly cloudy atmospheres and explore the importance of various atmospheric processes over the relevant timescales for brown dwarfs of a range of effective temperatures. Regardless of the origin of the observed variability, the complexity seen in the atmospheres of the field dwarfs hints at the variability that we may encounter in the next few years in directly imaged young Jupiters. Thus understanding the nature of variability in the field dwarfs, including sensitivity to gravity and metallicity, is of particular importance for exoplanet characterization.