WorldWideScience

Sample records for test defined baseline

  1. Baseline LAW Glass Formulation Testing

    International Nuclear Information System (INIS)

    Kruger, Albert A.; Mooers, Cavin; Bazemore, Gina; Pegg, Ian L.; Hight, Kenneth; Lai, Shan Tao; Buechele, Andrew; Rielley, Elizabeth; Gan, Hao; Muller, Isabelle S.; Cecil, Richard

    2013-01-01

The major objective of the baseline glass formulation work was to develop and select glass formulations that are compliant with contractual and processing requirements for each of the LAW waste streams. Other objectives of the work included preparation and characterization of glasses with respect to the properties of interest, optimization of sulfate loading in the glasses, evaluation of the ability to achieve waste loading limits, testing to demonstrate compatibility of glass melts with melter materials of construction, development of glass formulations to support ILAW qualification activities, and identification of glass formulation issues with respect to contract specifications and processing requirements.

  2. FAQs about Baseline Testing among Young Athletes

    Science.gov (United States)

    ... a similar exam conducted by a health care professional during the season if an athlete has a suspected concussion. Baseline testing generally takes place during the pre-season—ideally prior to the first practice. It is important to note that some baseline ...

  3. Arc melter demonstration baseline test results

    International Nuclear Information System (INIS)

    Soelberg, N.R.; Chambers, A.G.; Anderson, G.L.; Oden, L.L.; O'Connor, W.K.; Turner, P.C.

    1994-07-01

This report describes the test results and evaluation for the Phase 1 (baseline) arc melter vitrification test series conducted for the Buried Waste Integrated Demonstration (BWID) program. Phase 1 tests were conducted on surrogate mixtures of as-incinerated wastes and soil. Some buried wastes, soils, and stored wastes at the INEL and other DOE sites are contaminated with transuranic (TRU) radionuclides and hazardous organics and metals. The high-temperature environment in an electric arc furnace may be used to process these wastes to produce materials suitable for final disposal. An electric arc furnace system can treat heterogeneous wastes and contaminated soils by (a) dissolving and retaining TRU elements and selected toxic metals as oxides in the slag phase, (b) destroying organic materials by dissociation, pyrolyzation, and combustion, and (c) capturing separated volatilized metals in the offgas system for further treatment. Structural metals in the waste may be melted and tapped separately for recycle or disposal, or these metals may be oxidized and dissolved into the slag. The molten slag, after cooling, will provide a glass/ceramic final waste form that is homogeneous, highly nonleachable, and extremely durable. These features make this waste form suitable for immobilization of TRU radionuclides and toxic metals for geologic timeframes. Further, the volume of contaminated wastes and soils will be substantially reduced in the process.

  4. Baseline for trust: defining 'new and additional' climate funding

    Energy Technology Data Exchange (ETDEWEB)

    Stadelmann, Martin [University of Zurich (Switzerland); Roberts, J. Timmons [Brown University (United States); Huq, Saleemul

    2010-06-15

    Climate finance is becoming a dark curve on the road from Copenhagen to Cancún. Poorer nations fear that richer ones will fulfil the US$30 billion 'fast-start' climate finance promises made in the non-binding Copenhagen Accord by relabelling or diverting basic development aid, or by simply delivering on past climate finance pledges. The problem is simple: contributor countries are operating with no clear baseline against which their promise of 'new and additional' funding can be counted – and they do not accept the baselines put forth by developing countries. A viable solution for the short term is to use projections of business-as-usual development assistance as baselines. The longer-term benchmark could be the provision of truly 'new' funds from new funding sources. Substantial up-front negotiations may be required, but seizing this opportunity to define baselines will build confidence on both sides and create predictability for future finance.

  5. Validity and Reliability of Baseline Testing in a Standardized Environment.

    Science.gov (United States)

    Higgins, Kathryn L; Caze, Todd; Maerlender, Arthur

    2017-08-11

The Immediate Postconcussion Assessment and Cognitive Testing (ImPACT) is a computerized neuropsychological test battery commonly used to determine cognitive recovery from concussion based on comparing post-injury scores to baseline scores. This model is based on the premise that ImPACT baseline test scores are a valid and reliable measure of optimal cognitive function at baseline. Growing evidence suggests that this premise may not be accurate, and that a large contributor to invalid and unreliable baseline test scores may be the protocol and environment in which baseline tests are administered. This study examined the effects of a standardized environment and administration protocol on the reliability and performance validity of athletes' baseline test scores on ImPACT by comparing scores obtained in two different group-testing settings. Three hundred sixty-one Division 1 cohort-matched collegiate athletes' baseline data were assessed using a variety of indicators of potential performance invalidity; internal reliability was also examined. Thirty-one to thirty-nine percent of the baseline cases had at least one indicator of low performance validity, but there were no significant differences in validity indicators based on the environment in which the testing was conducted. Internal consistency reliability scores were in the acceptable to good range, with no significant differences between administration conditions. These results suggest that athletes may be reliably performing at levels lower than their best effort would produce.
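Internal consistency of a multi-item battery is commonly summarized with Cronbach's alpha. As a rough illustration only (not the authors' actual analysis, and the item matrix below is made up), alpha can be computed from an (subjects × items) score matrix:

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_subjects, n_items) score matrix.

    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)
    """
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                        # number of items
    item_var = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_var / total_var)

# Perfectly consistent items yield alpha = 1.0
print(cronbach_alpha([[1, 1], [2, 2], [3, 3]]))  # → 1.0
```

Values of roughly 0.7-0.9 are conventionally read as "acceptable to good," matching the range reported in the abstract.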

  6. Functional Testing Protocols for Commercial Building Efficiency Baseline Modeling Software

    Energy Technology Data Exchange (ETDEWEB)

    Jump, David; Price, Phillip N.; Granderson, Jessica; Sohn, Michael

    2013-09-06

This document describes procedures for testing and validating the accuracy of proprietary baseline energy modeling software in predicting energy use over the period of interest, such as a month or a year. The procedures are designed according to the methodology used for public-domain baselining software in another LBNL report that was (like the present report) prepared for Pacific Gas and Electric Company: "Commercial Building Energy Baseline Modeling Software: Performance Metrics and Method Testing with Open Source Models and Implications for Proprietary Software Testing Protocols" (referred to here as the "Model Analysis Report"). The test procedure focuses on the quality of the software's predictions rather than on the specific algorithms used to predict energy use. In this way the software vendor is not required to divulge or share proprietary information about how their software works, while stakeholders can still assess its performance.
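Prediction quality for baseline energy models is typically scored with goodness-of-fit metrics such as CV(RMSE) and NMBE; the sketch below shows these two common metrics under the assumption that they resemble those in the Model Analysis Report (the monthly kWh data are hypothetical):

```python
import numpy as np

def cv_rmse(actual, predicted):
    """Coefficient of variation of the RMSE, relative to mean actual use."""
    actual, predicted = np.asarray(actual, float), np.asarray(predicted, float)
    rmse = np.sqrt(np.mean((actual - predicted) ** 2))
    return rmse / actual.mean()

def nmbe(actual, predicted):
    """Normalized mean bias error: positive when the model under-predicts overall."""
    actual, predicted = np.asarray(actual, float), np.asarray(predicted, float)
    return (actual - predicted).sum() / (actual.size * actual.mean())

# Measured vs. model-predicted monthly energy use (kWh), hypothetical values
measured = [1000, 1100, 900, 1050]
modeled = [980, 1120, 930, 1040]
print(cv_rmse(measured, modeled), nmbe(measured, modeled))
```

Because the procedure scores only predictions against measurements, a vendor can supply predicted series without revealing the model internals.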

  7. Defining historical baselines for conservation: ecological changes since European settlement on Vancouver Island, Canada.

    Science.gov (United States)

    Bjorkman, Anne D; Vellend, Mark

    2010-12-01

Conservation and restoration goals are often defined by historical baseline conditions that occurred prior to a particular period of human disturbance, such as European settlement in North America. Nevertheless, if ecosystems were heavily influenced by native peoples prior to European settlement, conservation efforts may require active management rather than simple removal of or reductions in recent forms of disturbance. We used pre-European settlement land survey records (1859-1874) and contemporary vegetation surveys to assess changes over the past 150 years in tree species and habitat composition, forest density, and tree size structure on southern Vancouver Island and Saltspring Island, British Columbia, Canada. Several lines of evidence support the hypothesis that frequent historical burning by native peoples, and subsequent fire suppression, have played dominant roles in shaping this landscape. First, the relative frequency of fire-sensitive species (e.g., cedar [Thuja plicata]) has increased, whereas fire-tolerant species (e.g., Douglas-fir [Pseudotsuga menziesii]) have decreased. Tree density has increased 2-fold, and the proportion of the landscape in forest has greatly increased at the expense of open habitats (plains, savannas), which today contain most of the region's threatened species. Finally, the frequency distribution of tree size has shifted from unimodal to monotonically decreasing, which suggests removal of an important barrier to tree recruitment. In addition, although most of the open habitats are associated with Garry oak (Quercus garryana) at present, most of the open habitats prior to European settlement were associated with Douglas-fir, which suggests that the current focus on Garry oak as a flagship for the many rare species in savannas may be misguided. Overall, our results indicate that the maintenance and restoration of open habitats will require active management and that historical records can provide critical guidance to such efforts.

  8. Item response theory analysis of the mechanics baseline test

    Science.gov (United States)

    Cardamone, Caroline N.; Abbott, Jonathan E.; Rayyan, Saif; Seaton, Daniel T.; Pawl, Andrew; Pritchard, David E.

    2012-02-01

    Item response theory is useful in both the development and evaluation of assessments and in computing standardized measures of student performance. In item response theory, individual parameters (difficulty, discrimination) for each item or question are fit by item response models. These parameters provide a means for evaluating a test and offer a better measure of student skill than a raw test score, because each skill calculation considers not only the number of questions answered correctly, but the individual properties of all questions answered. Here, we present the results from an analysis of the Mechanics Baseline Test given at MIT during 2005-2010. Using the item parameters, we identify questions on the Mechanics Baseline Test that are not effective in discriminating between MIT students of different abilities. We show that a limited subset of the highest quality questions on the Mechanics Baseline Test returns accurate measures of student skill. We compare student skills as determined by item response theory to the more traditional measurement of the raw score and show that a comparable measure of learning gain can be computed.
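In the two-parameter logistic (2PL) model commonly used in item response theory, the probability of a correct answer depends on the student's skill θ and the item's discrimination a and difficulty b. A minimal sketch (the MBT analysis may use a different parameterization):

```python
import math

def p_correct(theta, a, b):
    """2PL item response model: P(correct | skill theta) for an item
    with discrimination a and difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# A student whose skill equals the item's difficulty has a 50% chance of
# answering correctly; higher discrimination steepens the curve around b.
print(p_correct(0.0, 1.5, 0.0))   # → 0.5
print(p_correct(1.0, 1.5, 0.0), p_correct(1.0, 0.5, 0.0))
```

Items with near-zero discrimination a contribute little information about skill, which is the sense in which low-quality questions fail to discriminate between students of different abilities.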

  9. Evaluation of the results of a randomized controlled trial : how to define changes between baseline and follow-up

    NARCIS (Netherlands)

    Twisk, J.; Proper, K.

    2004-01-01

    The most common way to evaluate the effect of an intervention is to compare the intervention and non-intervention groups regarding the change in the outcome variable between baseline and follow-up; however, there are many different ways to define "changes". The purpose of this article is to

  10. Technical baseline description for in situ vitrification laboratory test equipment

    International Nuclear Information System (INIS)

    Beard, K.V.; Bonnenberg, R.W.; Watson, L.R.

    1991-09-01

In situ vitrification (ISV) has been identified as a possible waste treatment technology. ISV was developed by Pacific Northwest Laboratory (PNL), Richland, Washington, as a thermal treatment process to treat contaminated soils in place. The process, which electrically melts and dissolves soils and associated inorganic materials, simultaneously destroys and/or removes organic contaminants while incorporating inorganic contaminants into a stable, glass-like residual product. This Technical Baseline Description has been prepared to provide high-level descriptions of the design of the Laboratory Test model, including all design modifications and safety improvements made to date. Furthermore, the Technical Baseline Description provides a basic overview of the interface documents for configuration management, program management interfaces, safety, quality, and security requirements. 8 figs

  11. Defining and testing a granular continuum element

    Energy Technology Data Exchange (ETDEWEB)

    Rycroft, Chris H.; Kamrin, Ken; Bazant, Martin Z.

    2007-12-03

Continuum mechanics relies on the fundamental notion of a mesoscopic volume "element" in which properties averaged over discrete particles obey deterministic relationships. Recent work on granular materials suggests a continuum law may be inapplicable, revealing inhomogeneities at the particle level, such as force chains and slow cage breaking. Here, we analyze large-scale Discrete-Element Method (DEM) simulations of different granular flows and show that a "granular element" can indeed be defined at the scale of dynamical correlations, roughly three to five particle diameters. Its rheology is rather subtle, combining liquid-like dependence on deformation rate and solid-like dependence on strain. Our results confirm some aspects of classical plasticity theory (e.g., coaxiality of stress and deformation rate), while contradicting others (i.e., incipient yield), and can guide the development of more realistic continuum models.

  12. Validity of urinary monoamine assay sales under the "spot baseline urinary neurotransmitter testing marketing model".

    Science.gov (United States)

    Hinz, Marty; Stein, Alvin; Uncini, Thomas

    2011-01-01

    Spot baseline urinary monoamine assays have been used in medicine for over 50 years as a screening test for monoamine-secreting tumors, such as pheochromocytoma and carcinoid syndrome. In these disease states, when the result of a spot baseline monoamine assay is above the specific value set by the laboratory, it is an indication to obtain a 24-hour urine sample to make a definitive diagnosis. There are no defined applications where spot baseline urinary monoamine assays can be used to diagnose disease or other states directly. No peer-reviewed published original research exists which demonstrates that these assays are valid in the treatment of individual patients in the clinical setting. Since 2001, urinary monoamine assay sales have been promoted for numerous applications under the "spot baseline urinary neurotransmitter testing marketing model". There is no published peer-reviewed original research that defines the scientific foundation upon which the claims for these assays are made. On the contrary, several articles have been published that discredit various aspects of the model. To fill the void, this manuscript is a comprehensive review of the scientific foundation and claims put forth by laboratories selling urinary monoamine assays under the spot baseline urinary neurotransmitter testing marketing model.

  13. Baselines and test data for cross-lingual inference

    DEFF Research Database (Denmark)

    Agic, Zeljko; Schluter, Natalie

    2018-01-01

The recent years have seen a revival of interest in textual entailment, sparked by i) the emergence of powerful deep neural network learners for natural language processing and ii) the timely development of large-scale evaluation datasets such as SNLI. Recast as natural language inference, the problem now amounts to detecting the relation between pairs of statements: they either contradict or entail one another, or they are mutually neutral. Current research in natural language inference is effectively exclusive to English. In this paper, we propose to advance the research in SNLI-style natural language inference toward multilingual evaluation. To that end, we provide test data for four major languages: Arabic, French, Spanish, and Russian. We experiment with a set of baselines. Our systems are based on cross-lingual word embeddings and machine translation. While our best system scores an average...

  14. Baseline Testing of the Club Car Carryall With Asymmetric Ultracapacitors

    Science.gov (United States)

    Eichenberg, Dennis J.

    2003-01-01

    The NASA John H. Glenn Research Center initiated baseline testing of the Club Car Carryall with asymmetric ultracapacitors as a way to reduce pollution in industrial settings, reduce fossil fuel consumption, and reduce operating costs for transportation systems. The Club Car Carryall provides an inexpensive approach to advance the state of the art in electric vehicle technology in a practical application. The project transfers space technology to terrestrial use via non-traditional partners, and provides power system data valuable for future space applications. The work was done under the Hybrid Power Management (HPM) Program, which includes the Hybrid Electric Transit Bus (HETB). The Carryall is a state of the art, ground up, electric utility vehicle. A unique aspect of the project was the use of a state of the art, long life ultracapacitor energy storage system. Innovative features, such as regenerative braking through ultracapacitor energy storage, are planned. Regenerative braking recovers much of the kinetic energy of the vehicle during deceleration. The Carryall was tested with the standard lead acid battery energy storage system, as well as with an asymmetric ultracapacitor energy storage system. The report concludes that the Carryall provides excellent performance, and that the implementation of asymmetric ultracapacitors in the power system can provide significant performance improvements.

  15. Baseline Testing of The EV Global E-Bike

    Science.gov (United States)

    Eichenberg, Dennis J.; Kolacz, John S.; Tavernelli, Paul F.

    2001-01-01

The NASA John H. Glenn Research Center initiated baseline testing of the EV Global E-Bike as a way to reduce pollution in urban areas, reduce fossil fuel consumption, and reduce operating costs for transportation systems. The work was done under the Hybrid Power Management (HPM) Program, which includes the Hybrid Electric Transit Bus (HETB). The E-Bike is a state-of-the-art, ground-up hybrid electric bicycle. Unique features of the vehicle's power system include the use of an efficient 400 W electric hub motor and a 7-speed derailleur system that permits operation as fully electric, fully pedal, or a combination of the two. Other innovative features, such as regenerative braking through ultracapacitor energy storage, are planned. Regenerative braking recovers much of the kinetic energy of the vehicle during deceleration. The E-Bike is an inexpensive approach to advance the state of the art in hybrid technology in a practical application. The project transfers space technology to terrestrial use via nontraditional partners, and provides power system data valuable for future space applications. This report presents a description of the E-Bike, the results of performance testing, and future vehicle development plans. The report concludes that the E-Bike provides excellent performance, and that the implementation of ultracapacitors in the power system can provide significant performance improvements.

  16. Reliability of a Computerized Neurocognitive Test in Baseline Concussion Testing of High School Athletes.

    Science.gov (United States)

    MacDonald, James; Duerson, Drew

    2015-07-01

    Baseline assessments using computerized neurocognitive tests are frequently used in the management of sport-related concussions. Such testing is often done on an annual basis in a community setting. Reliability is a fundamental test characteristic that should be established for such tests. Our study examined the test-retest reliability of a computerized neurocognitive test in high school athletes over 1 year. Repeated measures design. Two American high schools. High school athletes (N = 117) participating in American football or soccer during the 2011-2012 and 2012-2013 academic years. All study participants completed 2 baseline computerized neurocognitive tests taken 1 year apart at their respective schools. The test measures performance on 4 cognitive tasks: identification speed (Attention), detection speed (Processing Speed), one card learning accuracy (Learning), and one back speed (Working Memory). Reliability was assessed by measuring the intraclass correlation coefficient (ICC) between the repeated measures of the 4 cognitive tasks. Pearson and Spearman correlation coefficients were calculated as a secondary outcome measure. The measure for identification speed performed best (ICC = 0.672; 95% confidence interval, 0.559-0.760) and the measure for one card learning accuracy performed worst (ICC = 0.401; 95% confidence interval, 0.237-0.542). All tests had marginal or low reliability. In a population of high school athletes, computerized neurocognitive testing performed in a community setting demonstrated low to marginal test-retest reliability on baseline assessments 1 year apart. Further investigation should focus on (1) improving the reliability of individual tasks tested, (2) controlling for external factors that might affect test performance, and (3) identifying the ideal time interval to repeat baseline testing in high school athletes. 
Computerized neurocognitive tests are used frequently in high school athletes, often within a model of baseline testing
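Test-retest reliability in this record is quantified with intraclass correlation coefficients. As an illustration under the assumption of a one-way random-effects model, ICC(1,1) can be computed from an ANOVA decomposition of a subjects × sessions score matrix (the study may have used a different ICC form; the scores below are made up):

```python
import numpy as np

def icc_oneway(scores):
    """One-way random-effects ICC(1,1) for an (n_subjects, k_sessions) matrix."""
    scores = np.asarray(scores, dtype=float)
    n, k = scores.shape
    grand = scores.mean()
    subj_means = scores.mean(axis=1)
    # Between-subjects and within-subjects mean squares from one-way ANOVA
    ms_between = k * ((subj_means - grand) ** 2).sum() / (n - 1)
    ms_within = ((scores - subj_means[:, None]) ** 2).sum() / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

# Identical scores across two sessions give perfect reliability (ICC = 1.0)
print(icc_oneway([[10, 10], [12, 12], [15, 15]]))  # → 1.0
```

Against this scale, the reported ICCs of roughly 0.40-0.67 sit well below the levels usually demanded for individual clinical decision making.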

  17. The challenge of defining risk-based metrics to improve food safety: inputs from the BASELINE project.

    Science.gov (United States)

    Manfreda, Gerardo; De Cesare, Alessandra

    2014-08-01

In 2002, Regulation (EC) No 178 of the European Parliament and of the Council stated that, in order to achieve the general objective of a high level of protection of human health and life, food law shall be based on risk analysis. However, Commission Regulation (EC) No 2073/2005 on microbiological criteria for foodstuffs requires that food business operators ensure that foodstuffs comply with the relevant microbiological criteria. Such criteria define the acceptability of a product, a batch of foodstuffs or a process, based on the absence, presence or number of micro-organisms, and/or on the quantity of their toxins/metabolites, per unit(s) of mass, volume, area or batch. The same Regulation describes a food safety criterion as a means of defining the acceptability of a product or a batch of foodstuff applicable to products placed on the market; moreover, it defines a process hygiene criterion as a means of indicating the acceptable functioning of the production process. Neither food safety criteria nor process hygiene criteria are based on risk analysis. On the contrary, the metrics formulated by the Codex Alimentarius Commission in 2004, named Food Safety Objective (FSO) and Performance Objective (PO), are risk-based and fit the indications of Regulation 178/2002. The main aims of this review are to illustrate the key differences between microbiological criteria and the risk-based metrics defined by the Codex Alimentarius Commission and to explore the opportunity, and also the possibility, of implementing future European Regulations that include PO and FSO as supporting parameters to microbiological criteria. This review also clarifies the implications of defining an appropriate level of human protection, how to establish FSO and PO, and how to implement them in practice, linked to each other through quantitative risk assessment models.
The contents of this review should clarify the context for application of the results collected during the EU funded project named BASELINE (www

  18. Discrete Address Beacon System (DABS) Baseline Test and Evaluation.

    Science.gov (United States)

    1980-04-01

...maturing of the DABS/Automatic Traffic Advisory and Resolution Service (ATARS) system concept, would require further study and test prior to issuance of... surveillance, Automatic Traffic Advisory and Resolution Service (ATARS) and communications capabilities required for air traffic control (ATC) in the... 1980's and 1990's. Compatibility with ATCRBS has been emphasized to permit an extended and economical transition. The requirement for the development...

  19. System maintenance test plan for the TWRS controlled baseline database system

    International Nuclear Information System (INIS)

    Spencer, S.G.

    1998-01-01

TWRS [Tank Waste Remediation System] Controlled Baseline Database, formerly known as the Performance Measurement Control System, is used to track and monitor TWRS project management baseline information. This document contains the maintenance testing approach for software testing of the TCBD system once SCR/PRs are implemented.

  20. Experiment data report for semiscale Mod-1 test S-04-1 (baseline ECC test)

    International Nuclear Information System (INIS)

    Crapo, H.S.; Collins, B.L.; Sackett, K.E.

    1976-09-01

Recorded test data are presented for Test S-04-1 of the Semiscale Mod-1 Baseline ECC Test Series. This test is among several Semiscale Mod-1 experiments conducted to investigate the thermal and hydraulic phenomena accompanying a hypothesized loss-of-coolant accident in a pressurized water reactor system. Test S-04-1 was conducted from an initial cold leg fluid temperature of 542°F and an initial pressure of 2,263 psia. A simulated double-ended offset shear cold leg break was used to investigate the system response to a depressurization and reflood transient using system volume scaled coolant injection parameters. System flow was set to achieve a core fluid temperature differential of 66°F at a full core power of 1.6 MW. The flow resistance of the intact loop was based on core area scaling. An electrically heated core with a flat radial power profile was used in the pressure vessel to simulate the effects of a nuclear core. During system depressurization, core power was reduced from the initial level of 1.6 MW in such a manner as to simulate the surface heat flux response of nuclear fuel rods until such time that departure from nucleate boiling might occur. Blowdown to the pressure suppression system was accompanied by simulated emergency core cooling injection into both the intact and broken loops. Coolant injection was continued until test termination at 200 seconds after initiation of blowdown.

  1. Baseline neurocognitive testing in sports-related concussions: the importance of a prior night's sleep.

    Science.gov (United States)

    McClure, D Jake; Zuckerman, Scott L; Kutscher, Scott J; Gregory, Andrew J; Solomon, Gary S

    2014-02-01

The management of sports-related concussions (SRCs) utilizes serial neurocognitive assessments and self-reported symptom inventories to assess recovery and safety for return to play (RTP). Because postconcussive RTP goals include symptom resolution and a return to neurocognitive baseline levels, clinical decisions rest in part on understanding modifiers of this baseline. Several studies have reported age and sex to influence baseline neurocognitive performance, but few have assessed the potential effect of sleep. We chose to investigate the effect of reported sleep duration on baseline Immediate Post-Concussion Assessment and Cognitive Testing (ImPACT) performance and the number of patient-reported symptoms. We hypothesized that athletes receiving less sleep before baseline testing would perform worse on neurocognitive metrics and report more symptoms. Cross-sectional study; Level of evidence, 3. We retrospectively reviewed 3686 nonconcussed athletes (2371 male, 1315 female; 3305 high school, 381 college) with baseline symptom and ImPACT neurocognitive scores. Patients were stratified into 3 groups based on self-reported sleep duration the night before testing: (1) short, (2) intermediate, and (3) long. A multivariate ANCOVA (MANCOVA) was performed to investigate the influence of sleep duration on baseline ImPACT performance. A univariate ANCOVA was performed to investigate the influence of sleep on total self-reported symptoms. When controlling for age and sex as covariates, the MANCOVA revealed significant group differences on ImPACT reaction time, verbal memory, and visual memory scores but not visual-motor (processing) speed scores. The ANCOVA also revealed significant group differences in total reported symptoms. For baseline symptoms and ImPACT scores, subsequent pairwise comparisons revealed these associations to be most significant when comparing the short and intermediate sleep groups. Our results indicate that athletes sleeping fewer than 7 hours before baseline testing perform worse on 3 of 4 ImPACT scores and report more symptoms.
Because SRC management and RTP

  2. Experiment data report for semiscale Mod-1 test S-04-2 (baseline ECC test)

    International Nuclear Information System (INIS)

    Crapo, H.S.; Collins, B.L.; Sackett, K.E.

    1976-09-01

Recorded test data are presented for Test S-04-2 of the Semiscale Mod-1 Baseline ECC test series. This test is among Semiscale Mod-1 experiments conducted to investigate the thermal and hydraulic phenomena accompanying a hypothesized loss-of-coolant accident in a pressurized water reactor system. Test S-04-2 was conducted from an initial cold leg fluid temperature of 542°F and an initial pressure of 2,263 psia. A simulated double-ended offset shear cold leg break was used to investigate the system response to a depressurization and reflood transient using emergency core coolant injection parameters based on downcomer volume scaling. System flow was set to achieve a core fluid temperature differential of 66°F at a full core power of 1.6 MW. The flow resistance of the intact loop was based on core area scaling. An electrically heated core with a flat radial power profile was used in the pressure vessel to simulate the effects of a nuclear core. During system depressurization, core power was reduced from the initial level of 1.6 MW to simulate the surface heat flux response of nuclear fuel rods until such time that departure from nucleate boiling might occur. Blowdown to the pressure suppression system was accompanied by simulated emergency core coolant injection into both the intact and broken loops. Coolant injection was continued until test termination at 200 seconds after initiation of blowdown. The purpose of the report is to make available the uninterpreted data from Test S-04-2 for future data analysis and test results reporting activities. The data, presented in the form of graphs in engineering units, have been analyzed only to the extent necessary to assure that they are reasonable and consistent.

  3. The Effect of Pretest Exercise on Baseline Computerized Neurocognitive Test Scores.

    Science.gov (United States)

    Pawlukiewicz, Alec; Yengo-Kahn, Aaron M; Solomon, Gary

    2017-10-01

Baseline neurocognitive assessment plays a critical role in return-to-play decision making following sport-related concussions. Prior studies have assessed the effect of a variety of modifying factors on neurocognitive baseline test scores. However, relatively little investigation has been conducted regarding the effect of pretest exercise on baseline testing. The aim of our investigation was to determine the effect of pretest exercise on baseline Immediate Post-Concussion Assessment and Cognitive Testing (ImPACT) scores in adolescent and young adult athletes. We hypothesized that athletes undergoing self-reported strenuous exercise within 3 hours of baseline testing would perform more poorly on neurocognitive metrics and would report a greater number of symptoms than those who had not completed such exercise. Cross-sectional study; Level of evidence, 3. The ImPACT records of 18,245 adolescent and young adult athletes were retrospectively analyzed. After application of inclusion and exclusion criteria, participants were dichotomized into groups based on a positive (n = 664) or negative (n = 6609) self-reported history of strenuous exercise within 3 hours of the baseline test. Participants with a positive history of exercise were then randomly matched, based on age, sex, education level, concussion history, and hours of sleep prior to testing, on a 1:2 basis with individuals who had reported no pretest exercise. The baseline ImPACT composite scores of the 2 groups were then compared. Significant differences were observed for the ImPACT composite scores of verbal memory, visual memory, reaction time, and impulse control as well as for the total symptom score. No significant between-group difference was detected for the visual motor composite score. Furthermore, pretest exercise was associated with a significant increase in the overall frequency of invalid test results. Our results suggest a statistically significant difference in ImPACT composite scores between the 2 groups.

  4. Effect of education and language on baseline concussion screening tests in professional baseball players.

    Science.gov (United States)

    Jones, Nathaniel S; Walter, Kevin D; Caplinger, Roger; Wright, Daniel; Raasch, William G; Young, Craig

    2014-07-01

    The purpose of the present study was to investigate the possible effects of sociocultural influences, specifically pertaining to language and education, on baseline neuropsychological concussion testing as obtained via Immediate Post-Concussion Assessment and Cognitive Testing (ImPACT) of players from a professional baseball team. A retrospective chart review. Baseline testing of a professional baseball organization. Four hundred five professional baseball players. Age, languages spoken, hometown country location (United States/Canada vs overseas), and years of education. The 5 ImPACT composite scores (verbal memory, visual memory, visual motor speed, reaction time, impulse control) and ImPACT total symptom score from the initial baseline testing. The results of t tests revealed significant differences (P < 0.05) between groups on several composite scores; after controlling for education, significant differences (P < 0.05) remained in some scores. Sociocultural differences may result in differences in computer-based neuropsychological testing scores.

  5. The stability of baseline-defined categories of alcohol consumption during the adult life-course: a 28-year prospective cohort study.

    Science.gov (United States)

    Knott, Craig S; Bell, Steven; Britton, Annie

    2018-01-01

    Studies that report the relationship between alcohol consumption and disease risk have predominantly operationalized drinking according to a single baseline measure. The resulting assumption of longitudinal stability may be simplistic and complicate interpretation of risk estimates. This study aims to describe changes to the volume of consumption during the adult life-course according to baseline categories of drinking. A prospective observational study. United Kingdom. A cohort of British civil servants totalling 6838 men and 3372 women aged 34-55 years at baseline, followed for a mean of 19.1 (standard deviation = 9.5) years. The volume of weekly alcohol consumption was estimated from data concerning the frequency and number of drinks consumed. Baseline categories were defined: non-current drinkers, infrequent drinkers, 0.1-50.0 g/week, 50.1-100.0 g/week, 100.1-150.0 g/week, 150.1-250.0 g/week and >250.0 g/week. For women, the highest category was defined as > 100.0 g/week. Baseline frequency was derived as 'daily or almost daily' and 'not daily or almost daily'. Trajectories were estimated within baseline categories using growth curve models. Trajectories differed between men and women, but were relatively stable within light-to-moderate categories of baseline consumption. Drinking was least stable within the highest categories of baseline consumption (men: > 250.0 g/week; women: > 100.0 g/week), declining by 47.0 [95% confidence interval (CI) = 40.7, 53.2] and 16.8 g/week (95% CI = 12.6, 21.0), respectively, per 10-year increase in age. These declines were not a consequence of sudden transitions to complete abstention. Rates of decline appear greatest in older age, with trajectories converging toward moderate volumes. Among UK civil servants, consumption within baseline drinking categories is generally stable during the life-course, except among heavier baseline drinkers, for whom intakes decline with increasing age. This shift does not appear to be a consequence of sudden transitions to complete abstention.
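
    The reported declines (e.g. 47.0 g/week per 10-year increase in age among the heaviest-drinking men) are slopes rescaled to a decade. The study fits growth curve models; as a simplified stand-in, the rescaling can be illustrated with an ordinary least-squares slope on a synthetic longitudinal record (the data below are illustrative, not the cohort's):

```python
def ols_slope(xs, ys):
    """Ordinary least-squares slope of ys regressed on xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    return cov / var

# Synthetic heavy drinker: intake falls 4.7 g/week for every year of age.
ages = list(range(35, 63))                      # 28 years of follow-up
grams = [300.0 - 4.7 * (a - 35) for a in ages]  # weekly intake, g/week
per_decade = ols_slope(ages, grams) * 10        # rescale to a 10-year change
```

    Here `per_decade` comes out at -47.0 g/week, i.e. the same scale on which the trajectories above are reported.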

  6. The relationship between psychological distress and baseline sports-related concussion testing.

    Science.gov (United States)

    Bailey, Christopher M; Samples, Hillary L; Broshek, Donna K; Freeman, Jason R; Barth, Jeffrey T

    2010-07-01

    This study examined the effect of psychological distress on neurocognitive performance measured during baseline concussion testing. Archival data were utilized to examine correlations between personality testing and computerized baseline concussion testing. Significantly correlated personality measures were entered into linear regression analyses, predicting baseline concussion testing performance. Suicidal ideation was examined categorically. Athletes underwent testing and screening at a university athletic training facility. Participants included 47 collegiate football players 17 to 19 years old, the majority of whom were in their first year of college. Participants were administered the Concussion Resolution Index (CRI), an internet-based neurocognitive test designed to monitor and manage both at-risk and concussed athletes. Participants took the Personality Assessment Inventory (PAI), a self-administered inventory designed to measure clinical syndromes, treatment considerations, and interpersonal style. Scales and subscales from the PAI were utilized to determine the influence psychological distress had on the CRI indices: simple reaction time, complex reaction time, and processing speed. Analyses revealed several significant correlations among aspects of somatic concern, depression, anxiety, substance abuse, and suicidal ideation and CRI performance, each with at least a moderate effect. When entered into a linear regression, the block of combined psychological symptoms accounted for a significant amount of the variance in baseline CRI performance, with moderate to large effects (r = 0.23-0.30). When examined categorically, participants with suicidal ideation showed significantly slower simple reaction time and complex reaction time, with a similar trend on processing speed. Given the possibility of obscured concussion deficits after injury, implications for premature return to play, and the need to target psychological distress outright, these findings heighten the clinical relevance of assessing psychological distress during baseline concussion testing.
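
    The correlation step described above (PAI scales against CRI indices) reduces to Pearson product-moment correlations. A generic sketch of that computation, not the study's statistical software:

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)
```

    As a rough guide to the effect sizes quoted above, an r of about 0.3 corresponds to roughly 9% shared variance (r squared).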

  7. The importance of proper administration and interpretation of neuropsychological baseline and postconcussion computerized testing.

    Science.gov (United States)

    Moser, Rosemarie Scolaro; Schatz, Philip; Lichtenstein, Jonathan D

    2015-01-01

    Media coverage, litigation, and new legislation have resulted in a heightened awareness of the prevalence of sports concussion in both adult and youth athletes. Baseline and postconcussion testing is now commonly used for the assessment and management of sports-related concussion in schools and in youth sports leagues. With increased use of computerized neurocognitive sports concussion testing, there is a need for standards for proper administration and interpretation. To date, there has been a lack of standardized procedures by which assessments are administered. More specifically, individuals who are not properly trained often interpret test results, and their methods of interpretation vary considerably. The purpose of this article is to outline factors affecting the validity of test results, to provide examples of misuse and misinterpretation of test results, and to communicate the need to administer testing in the most effective and useful manner. An increase in the quality of test administration and application may serve to decrease the prevalence of invalid test results and increase the accuracy and utility of baseline test results if an athlete sustains a concussion. Standards for test use should model the American Psychological Association and Centers for Disease Control and Prevention guidelines, as well as the recent findings of the joint position paper on computerized neuropsychological assessment devices.

  8. Baseline Fracture Toughness and CGR testing of alloys X-750 and XM-19 (EPRI Phase I)

    International Nuclear Information System (INIS)

    Jackson, J.H.; Teysseyre, S.P.

    2012-01-01

    The Advanced Test Reactor National Scientific User Facility (ATR NSUF) and Electric Power Research Institute (EPRI) formed an agreement to test representative alloys used as reactor structural materials as a pilot program toward establishing guidelines for future ATR NSUF research programs. This report contains results from the portion of this program established as Phase I (of three phases) that entails baseline fracture toughness, stress corrosion cracking (SCC), and tensile testing of selected materials for comparison to similar tests conducted at GE Global Research. The intent of this Phase I research program is to determine baseline properties for the materials of interest prior to irradiation, and to ensure comparability between laboratories using similar testing techniques, prior to applying these techniques to the same materials after having been irradiated at the Advanced Test Reactor (ATR). The materials chosen for this research are the nickel-based superalloy X-750 and the nitrogen-strengthened austenitic stainless steel XM-19. A spare core shroud upper support bracket of alloy X-750 was purchased by EPRI from Southern Co. and a section of XM-19 plate was purchased by EPRI from GE-Hitachi. These materials were sectioned at GE Global Research and provided to INL.

  9. Post launch calibration and testing of the Advanced Baseline Imager on the GOES-R satellite

    Science.gov (United States)

    Lebair, William; Rollins, C.; Kline, John; Todirita, M.; Kronenwetter, J.

    2016-05-01

    The Geostationary Operational Environmental Satellite R (GOES-R) series is the planned next generation of operational weather satellites for the United States' National Oceanic and Atmospheric Administration. The first launch of the GOES-R series is planned for October 2016. The GOES-R series satellites and instruments are being developed by the National Aeronautics and Space Administration (NASA). One of the key instruments on the GOES-R series is the Advanced Baseline Imager (ABI). The ABI is a multi-channel, visible through infrared, passive imaging radiometer. The ABI will provide moderate spatial and spectral resolution at high temporal and radiometric resolution to accurately monitor rapidly changing weather. Initial on-orbit calibration and performance characterization is crucial to establishing the baseline used to maintain performance throughout mission life. A series of tests has been planned to establish the post launch performance and the parameters needed to process the data in the Ground Processing Algorithm. The large number of detectors for each channel required to provide the needed temporal coverage presents unique challenges for accurately calibrating ABI and minimizing striping. This paper discusses the planned tests to be performed on ABI over the six-month Post Launch Test period and the expected performance as it relates to ground tests.

  10. Sex Differences and Self-Reported Attention Problems During Baseline Concussion Testing.

    Science.gov (United States)

    Brooks, Brian L; Iverson, Grant L; Atkins, Joseph E; Zafonte, Ross; Berkner, Paul D

    2016-01-01

    Amateur athletic programs often use computerized cognitive testing as part of their concussion management programs. There is evidence that athletes with preexisting attention problems will have worse cognitive performance and more symptoms at baseline testing. The purpose of this study was to examine whether attention problems affect assessments differently for male and female athletes. Participants were drawn from a database that included 6,840 adolescents from Maine who completed Immediate Postconcussion Assessment and Cognitive Testing (ImPACT) at baseline (primary outcome measure). The final sample included 249 boys and 100 girls with self-reported attention problems. Each participant was individually matched for sex, age, number of past concussions, and sport to a control participant (249 boys, 100 girls). Boys with attention problems had worse reaction time than boys without attention problems. Girls with attention problems had worse visual-motor speed than girls without attention problems. Boys with attention problems reported more total symptoms, including more cognitive-sensory and sleep-arousal symptoms, compared with boys without attention problems. Girls with attention problems reported more cognitive-sensory, sleep-arousal, and affective symptoms than girls without attention problems. When considering the assessment, management, and outcome from concussions in adolescent athletes, it is important to consider both sex and preinjury attention problems regarding cognitive test results and symptom reporting.

  11. Final Report. Baseline LAW Glass Formulation Testing, VSL-03R3460-1, Rev. 0

    Energy Technology Data Exchange (ETDEWEB)

    Muller, Isabelle S. [The Catholic University of America, Washington, DC (United States); Pegg, Ian L. [The Catholic University of America, Washington, DC (United States); Gan, Hao [The Catholic University of America, Washington, DC (United States); Buechele, Andrew [The Catholic University of America, Washington, DC (United States); Rielley, Elizabeth [The Catholic University of America, Washington, DC (United States); Bazemore, Gina [The Catholic University of America, Washington, DC (United States); Cecil, Richard [The Catholic University of America, Washington, DC (United States); Hight, Kenneth [The Catholic University of America, Washington, DC (United States); Mooers, Cavin [The Catholic University of America, Washington, DC (United States); Lai, Shan-Tao T. [The Catholic University of America, Washington, DC (United States); Kruger, Albert A. [The Catholic University of America, Washington, DC (United States)

    2015-06-18

    The major objective of the baseline glass formulation work was to develop and select glass formulations that are compliant with contractual and processing requirements for each of the LAW waste streams. Other objectives of the work included preparation and characterization of glasses with respect to the properties of interest, optimization of sulfate loading in the glasses, evaluation of ability to achieve waste loading limits, testing to demonstrate compatibility of glass melts with melter materials of construction, development of glass formulations to support ILAW qualification activities, and identification of glass formulation issues with respect to contract specifications and processing requirements.

  12. Standard for baseline water-well testing for coalbed methane/natural gas in coal operations

    International Nuclear Information System (INIS)

    2006-04-01

    Interest in developing coalbed methane (CBM) is increasing with the decline of conventional natural gas reserves. In Alberta, where CBM is in the early stages of development, the drilling, production and operational rules for CBM are the same as those that apply to natural gas. The government of Alberta is presently examining the rules and regulations that apply to CBM to determine if they are appropriate for responsible development and balanced with environmental protection. CBM development has the potential to affect water aquifers and water supply. As such, a new standard has been developed by Alberta Environment in collaboration with the Alberta Energy and Utilities Board which requires that companies involved in the development of shallow CBM must offer to test rural Albertans' water wells prior to drilling. The companies will submit baseline groundwater data to both Alberta Environment and the landowner. The broader application of groundwater testing will also support Alberta Environment's objective of mapping all groundwater resources in the province. This new standard will help achieve continued protection of provincial groundwater resources and Albertans' groundwater supplies. It will also facilitate responsible CBM development and the government's Water for Life strategy. This document explains the protocols for testing, sampling and analyzing groundwater. The standard provides scientific information to support achievement of the outcomes as well as a regulatory basis for water well testing and baseline data collection prior to CBM development. If a landowner registers a complaint regarding a perceived change in well water quantity and quality after CBM development, then the developers must retest the water well to address the landowner's concerns.
The tests evaluate water well capacity and water quality, including routine potability analysis, major ionic constituents, bacteriological analysis, and the presence or absence of gas.

  13. Comprehensive baseline environmental audit of former underground test areas in Colorado, Nevada, and New Mexico

    International Nuclear Information System (INIS)

    1994-05-01

    This report documents the results of the Comprehensive Baseline Environmental Audit of Former Underground Test Areas (FUTAS) in the States of Colorado, Nevada, and New Mexico. DOE and contractor systems for management of environmental protection activities on the Nevada Test Site (NTS) were not within the scope of the audit. The audit was conducted May 16-May 26, 1994, by the Office of Environmental Audit (EH-24). DOE 5482.1B, "Environment, Safety, and Health Appraisal Program", establishes the mission of EH-24, which is to provide comprehensive, independent oversight of Department-wide environmental programs on behalf of the Secretary of Energy. The ultimate goal of EH-24 is to enhance environmental protection and minimize risk to public health and the environment. EH-24 accomplishes its mission using systematic and periodic evaluations of DOE's environmental programs within line organizations and supplemental activities that strengthen self-assessment and oversight functions within program, field, and contractor organizations. These evaluations function as a vehicle through which the Secretary and program managers are apprised of the status and vulnerabilities of Departmental environmental activities and environmental management systems. Several types of evaluations are conducted, including: (1) comprehensive baseline environmental audits; (2) routine environmental audits; (3) environmental management assessments; and (4) special issue reviews

  14. Baseline staging tests based on molecular subtype is necessary for newly diagnosed breast cancer.

    Science.gov (United States)

    Chen, Xuesong; Sun, Lichun; Cong, Yingying; Zhang, Tingting; Lin, Qiushi; Meng, Qingwei; Pang, Hui; Zhao, Yanbin; Li, Yu; Cai, Li; Dong, Xiaoqun

    2014-03-17

    Bone scanning (BS), liver ultrasonography (LUS), and chest radiography (CXR) are commonly recommended for baseline staging in patients with newly diagnosed breast cancer. The purpose of this study is to demonstrate whether these tests are indicated for specific patient subpopulations based on clinical staging and molecular subtype. A retrospective study on 5406 patients with newly diagnosed breast cancer was conducted to identify differences in occurrence of metastasis based on clinical staging and molecular subtypes. All patients had been evaluated by BS, LUS and CXR at diagnosis. Complete information on clinical staging was available in 5184 patients. For stage I, II, and III, the bone metastasis rate was 0%, 0.6% and 2.7%, respectively, a significant difference across stages. These findings support selecting baseline staging tests according to clinical stage and molecular subtype in patients with newly diagnosed breast cancer.

  15. Loneliness, social integration and consumption of sugar-containing beverages: testing the social baseline theory.

    Science.gov (United States)

    Henriksen, Roger Ekeberg; Torsheim, Torbjørn; Thuen, Frode

    2014-01-01

    Social Baseline Theory (SBT) proposes that close relationships aid in metabolic resource management and that individuals without significant relationships may experience more demands on their own neural metabolic resources on a daily basis when solving problems, remaining vigilant against potential threats and regulating emotional responses. This study tests a hypothesised consequence derived from SBT: relative social isolation leads to increased levels of sugar intake. Based on cross-sectional, self-reported data from the Norwegian Mother and Child Cohort Study (N = 90 084), information on social integration and the consumption of both sugar-sweetened and artificially sweetened sodas and juices was obtained from a large number of women in early pregnancy. Multiple regression analyses were conducted to assess whether loneliness, marital status, relationship satisfaction, advice from others than partner, and cohesion at work is associated with consumption of sodas and juices. Perceived loneliness was associated with elevated intake of all sugary beverages, while relationship satisfaction was negatively associated with all sugary beverages. Being married or cohabitating, having supportive friends, and having a sense of togetherness at work were associated with lower intake of two out of three sugar-containing beverages. These associations were significant, even after controlling for factors such as body mass index, weight related self-image, depression, physical activity, educational level, age and income. In comparison, a statistically significant relationship emerged between relationship satisfaction and artificially sweetened cola. No other predictor variables were significantly associated with any type of artificially sweetened beverage. This study indicates that loneliness and social integration influence the level of consumption of sugary beverages. 
The results support the hypothesis derived from the Social Baseline Theory that relative social isolation leads to increased levels of sugar intake.

  16. Is Baseline Cardiac Autonomic Modulation Related to Performance and Physiological Responses Following a Supramaximal Judo Test?

    Science.gov (United States)

    Blasco-Lafarga, Cristina; Martínez-Navarro, Ignacio; Mateo-March, Manuel

    2013-01-01

    Little research exists concerning Heart Rate (HR) Variability (HRV) following supramaximal efforts focused on upper-body explosive strength-endurance. Since they may be very demanding, it seems of interest to analyse the relationship among performance, lactate and HR dynamics (i.e. HR, HRV and complexity) following them, as well as to know how baseline cardiac autonomic modulation mediates these relationships. The present study aimed to analyse associations between baseline and post-exercise HR dynamics following a supramaximal Judo test, and their relationship with lactate, in a sample of 22 highly-trained male judoists (20.70±4.56 years). A large association between the increase in HR from resting to exercise condition and performance suggests that individuals exerted a greater sympathetic response to achieve a better performance (Rating of Perceived Exertion: 20; post-exercise peak lactate: 11.57±2.24 mmol/L; 95.76±4.13 % of age-predicted HRmax). Athletes with higher vagal modulation and lower sympathetic modulation at rest achieved both a significantly larger ∆HR and a faster post-exercise lactate removal. An enhanced resting parasympathetic modulation might therefore be related to a further usage of autonomic resources and a better immediate metabolic recovery during supramaximal exertions. Furthermore, analyses of variance displayed a persistent increase in α1 and a decrease in lnRMSSD along the 15 min of recovery, which are indicative of a diminished vagal modulation together with a sympathovagal balance leaning toward sympathetic domination. Finally, time-domain indices (lnRMSSD) showed no lactate correlations, while nonlinear indices (α1 and lnSaEn) appeared to be moderately to strongly correlated with it, thus pointing to shared mechanisms between neuroautonomic and metabolic regulation. PMID:24205273
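
    Of the heart-rate variability indices named above, lnRMSSD is the simplest to compute directly. A minimal sketch, assuming the input is a list of R-R intervals in milliseconds (the nonlinear indices α1 and sample entropy require considerably more machinery):

```python
import math

def ln_rmssd(rr_ms):
    """Natural log of the root mean square of successive differences
    (RMSSD) of an R-R interval series in ms: a time-domain vagal index."""
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    rmssd = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    return math.log(rmssd)
```

    A fall in lnRMSSD across a recovery period, as reported above, reflects smaller beat-to-beat differences and hence reduced vagal modulation.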

  17. Baseline Testing of the EV Global E-Bike with Ultracapacitors

    Science.gov (United States)

    Eichenberg, Dennis J.; Kolacz, John S.; Tavernelli, Paul F.

    2001-01-01

    The NASA John H. Glenn Research Center initiated baseline testing of the EV Global E-Bike SX with ultracapacitors as a way to reduce pollution in urban areas, reduce fossil fuel consumption, and reduce operating costs for transportation systems. The E-Bike provides an inexpensive approach to advance the state of the art in hybrid technology in a practical application. The project transfers space technology to terrestrial use via nontraditional partners, and provides power system data valuable for future space applications. The work was done under the Hybrid Power Management (HPM) Program, which includes the Hybrid Electric Transit Bus (HETB). The E-Bike is a state-of-the-art, ground-up hybrid electric bicycle. Unique features of the vehicle's power system include the use of an efficient, 400 W electric hub motor, and a seven-speed derailleur system that permits operation as fully electric, fully pedal, or a combination of the two. Other innovative features, such as regenerative braking through ultracapacitor energy storage, are planned. Regenerative braking recovers much of the kinetic energy of the vehicle during deceleration. A description of the E-Bike, the results of performance testing, and future vehicle development plans are given in this report. The report concludes that the E-Bike provides excellent performance, and that the implementation of ultracapacitors in the power system can provide significant performance improvements.

  18. Development of a smart-antenna test-bed, demonstrating software defined digital beamforming

    NARCIS (Netherlands)

    Kluwer, T.; Slump, Cornelis H.; Schiphorst, Roelof; Hoeksema, F.W.

    2001-01-01

    This paper describes a smart-antenna test-bed consisting of commercial off-the-shelf (COTS) hardware and software defined radio components. The use of software radio components enables a flexible platform to implement and test mobile communication systems as a real-world system. The test-bed is

  19. Fusion power demonstration - a baseline for the mirror engineering test reactor

    International Nuclear Information System (INIS)

    Henning, C.D.; Logan, B.G.; Neef, W.S.

    1983-01-01

    Developing a definition of an engineering test reactor (ETR) is a current goal of the Office of Fusion Energy (OFE). As a baseline for the mirror ETR, the Fusion Power Demonstration (FPD) concept has been pursued at Lawrence Livermore National Laboratory (LLNL) in cooperation with Grumman Aerospace, TRW, and the Idaho National Engineering Laboratory. Envisioned as an intermediate step to fusion power applications, the FPD would achieve DT ignition in the central cell, after which blankets and power conversion would be added to produce net power. To achieve ignition, a minimum central cell length of 67.5 m is needed to supply the ion and alpha-particle radial drift pumping losses in the transition region. The resulting fusion power is 360 MW. Low electron-cyclotron heating power of 12 MW, ion-cyclotron heating of 2.5 MW, and a sloshing ion beam power of 1.0 MW result in a net plasma Q of 22. A primary technological challenge is the 24-T, 45-cm bore choke coil, comprising a copper hybrid insert within a 15 to 18 T superconducting coil

  20. A Baseline Patient Model to Support Testing of Medical Cyber-Physical Systems.

    Science.gov (United States)

    Silva, Lenardo C; Perkusich, Mirko; Almeida, Hyggo O; Perkusich, Angelo; Lima, Mateus A M; Gorgônio, Kyller C

    2015-01-01

    Medical Cyber-Physical Systems (MCPS) are currently a trending topic of research. The main challenges are related to the integration and interoperability of connected medical devices, patient safety, physiologic closed-loop control, and the verification and validation of these systems. In this paper, we focus on patient safety and MCPS validation. We present a formal patient model to be used in health care systems validation without jeopardizing the patient's health. To determine the basic patient conditions, our model considers the four main vital signs: heart rate, respiratory rate, blood pressure and body temperature. To generate the vital signs we used regression models based on statistical analysis of a clinical database. Our solution should be used as a starting point for a behavioral patient model and adapted to specific clinical scenarios. We present the modeling process of the baseline patient model and show its evaluation. The conception process may be used to build different patient models. The results show the feasibility of the proposed model as an alternative to the immediate need for clinical trials to test these medical systems.
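
    A toy version of such a baseline patient model can be sketched as a generator for the four vital signs. The means and standard deviations below are illustrative assumptions standing in for the paper's regression models fitted to a clinical database:

```python
import random

# Illustrative resting parameters (mean, SD) -- assumed values, not fitted ones.
VITAL_MODELS = {
    "heart_rate":  (70.0, 8.0),    # beats/min
    "resp_rate":   (14.0, 2.0),    # breaths/min
    "systolic_bp": (118.0, 10.0),  # mmHg
    "temperature": (36.8, 0.3),    # degrees Celsius
}

def sample_baseline_patient(seed=None):
    """Draw one synthetic baseline patient from the per-sign models."""
    rng = random.Random(seed)
    return {sign: rng.gauss(mu, sd) for sign, (mu, sd) in VITAL_MODELS.items()}
```

    A validation harness can then replay such synthetic patients against an MCPS under test instead of exposing real patients, which is the safety motivation given above.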

  1. Hardware test program for evaluation of baseline range-range rate sensor concept

    Science.gov (United States)

    1985-01-01

    The baseline range/range rate sensor concept was evaluated. Testing of the Interrupted CW (ICW) mode of operation continued, with emphasis on establishing receiver sensitivity; the measured sensitivity of the video portion of the receiver was 7 dB less than the theoretical value. This departs from test results of previous implementations, in which achieved sensitivity was within 1.5 to 2 dB of the theoretical value. Several potential causes of this discrepancy in performance were identified and are scheduled for further investigation. Results indicate that cost savings in both per-unit and program costs are realizable by eliminating one of the modes of operation. An acquisition (total program) cost savings of approximately 10% is projected by eliminating the CW mode of operation. The modified R/R sensor would operate in the ICW mode only and would provide coverage from initial acquisition at 12 nmi to within a few hundred feet of the OMV. If the ICW mode only were selected, then an accompanying sensor would be required to provide coverage from a few hundred feet to docking.

  2. Baseline tests for arc melter vitrification of INEL buried wastes. Volume 1: Facility description and summary data report

    International Nuclear Information System (INIS)

    Oden, L.L.; O'Connor, W.K.; Turner, P.C.; Soelberg, N.R.; Anderson, G.L.

    1993-01-01

    This report presents field results and raw data from the Buried Waste Integrated Demonstration (BWID) Arc Melter Vitrification Project Phase 1 baseline test series conducted by the Idaho National Engineering Laboratory (INEL) in cooperation with the U.S. Bureau of Mines (USBM). The baseline test series was conducted using the electric arc melter facility at the USBM Albany Research Center in Albany, Oregon. Five different surrogate waste feed mixtures were tested that simulated thermally-oxidized, buried, TRU-contaminated, mixed wastes and soils present at the INEL. The USBM Arc Furnace Integrated Waste Processing Test Facility includes a continuous feed system, the arc melting furnace, an offgas control system, and utilities. The melter is a sealed, 3-phase alternating current (ac) furnace approximately 2 m high and 1.3 m wide. The furnace has a capacity of 1 metric ton of steel and can process as much as 1,500 lb/h of soil-type waste materials. The surrogate feed materials included five mixtures designed to simulate incinerated TRU-contaminated buried waste materials mixed with INEL soil. Process samples, melter system operations data and offgas composition data were obtained during the baseline tests to evaluate the melter performance and meet test objectives. Samples and data gathered during this program included (a) automatically and manually logged melter systems operations data, (b) process samples of slag, metal and fume solids, and (c) offgas composition, temperature, velocity, flowrate, moisture content, particulate loading and metals content. This report consists of 2 volumes: Volume I summarizes the baseline test operations. It includes an executive summary, system and facility description, review of the surrogate waste mixtures, and a description of the baseline test activities, measurements, and sample collection. Volume II contains the raw test data and sample analyses from samples collected during the baseline tests

  3. Baseline Neurocognitive Test Results In Non-concussed Athletes: Does Sleep Matter?

    OpenAIRE

    McClure, D. Jake; Zuckerman, Scott L.; Kutscher, Scott J.; Gregory, Andrew; Solomon, Gary S.

    2013-01-01

    Objectives: When managing sport-related concussions (SRC), sports medicine physicians utilize serial neurocognitive assessments and self-reported symptom inventories when evaluating athlete recovery and safety for returning to play (RTP). Since post-concussive RTP goals include symptom resolution and return to neurocognitive baseline, clinical decisions rest on an understanding of modifiers of baseline performance. Several studies have reported the influence of age, gender and sport on baseline performance.

  4. Baseline rationing

    DEFF Research Database (Denmark)

    Hougaard, Jens Leth; Moreno-Ternero, Juan D.; Østerdal, Lars Peter Raahave

    The standard problem of adjudicating conflicting claims describes a situation in which a given amount of a divisible good has to be allocated among agents who hold claims against it exceeding the available amount. This paper considers more general rationing problems in which, in addition to claims...... to international protocols for the reduction of greenhouse emissions, or water distribution in drought periods. We define a family of allocation methods for such general rationing problems - called baseline rationing rules - and provide an axiomatic characterization for it. Any baseline rationing rule within...... the family is associated with a standard rule and we show that if the latter obeys some properties reflecting principles of impartiality, priority and solidarity, the former obeys them too....

  5. Prevalence of Invalid Performance on Baseline Testing for Sport-Related Concussion by Age and Validity Indicator.

    Science.gov (United States)

    Abeare, Christopher A; Messa, Isabelle; Zuccato, Brandon G; Merker, Bradley; Erdodi, Laszlo

    2018-03-12

    Estimated base rates of invalid performance on baseline testing (base rates of failure) for the management of sport-related concussion range from 6.1% to 40.0%, depending on the validity indicator used. The instability of this key measure represents a challenge in the clinical interpretation of test results that could undermine the utility of baseline testing. To determine the prevalence of invalid performance on baseline testing and to assess whether the prevalence varies as a function of age and validity indicator. This retrospective, cross-sectional study included data collected between January 1, 2012, and December 31, 2016, from a clinical referral center in the Midwestern United States. Participants included 7897 consecutively tested, equivalently proportioned male and female athletes aged 10 to 21 years, who completed baseline neurocognitive testing for the purpose of concussion management. Baseline assessment was conducted with the Immediate Postconcussion Assessment and Cognitive Testing (ImPACT), a computerized neurocognitive test designed for assessment of concussion. Base rates of failure on published ImPACT validity indicators were compared within and across age groups. Hypotheses were developed after data collection but prior to analyses. Of the 7897 study participants, 4086 (51.7%) were male, mean (SD) age was 14.71 (1.78) years, 7820 (99.0%) were primarily English speaking, and the mean (SD) educational level was 8.79 (1.68) years. The base rate of failure ranged from 6.4% to 47.6% across individual indicators. Most of the sample (55.7%) failed at least 1 of 4 validity indicators. The base rate of failure varied considerably across age groups (117 of 140 [83.6%] for those aged 10 years to 14 of 48 [29.2%] for those aged 21 years), representing a risk ratio of 2.86 (95% CI, 2.60-3.16). The base rate of failure varied as a function of the validity indicator and the age of the examinee. The strong age association, with 3 of 4 participants aged 10 to 12 years failing validity indicators, suggests that the
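The age-group comparison above reduces to a standard two-proportion risk ratio. A minimal sketch of that calculation, using the counts quoted in the abstract; note the published interval (2.60-3.16) presumably comes from a model-based estimator, so the textbook log-scale Wald interval computed here will be wider:

```python
from math import exp, log, sqrt

def risk_ratio(events_a, n_a, events_b, n_b, z=1.96):
    """Risk ratio of group A vs. group B with a log-scale Wald 95% CI."""
    rr = (events_a / n_a) / (events_b / n_b)
    # standard error of ln(RR) for two independent proportions
    se = sqrt(1 / events_a - 1 / n_a + 1 / events_b - 1 / n_b)
    return rr, exp(log(rr) - z * se), exp(log(rr) + z * se)

# Counts quoted in the abstract: 117/140 failures at age 10 vs. 14/48 at age 21
rr, lo, hi = risk_ratio(117, 140, 14, 48)  # rr ≈ 2.87
```

This reproduces the reported point estimate of roughly 2.86 directly from the published counts.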

  6. Recommendation to include fragrance mix 2 and hydroxyisohexyl 3-cyclohexene carboxaldehyde (Lyral) in the European baseline patch test series.

    Science.gov (United States)

    Bruze, Magnus; Andersen, Klaus Ejner; Goossens, An

    2008-03-01

    The currently used fragrance mix in the European baseline patch test series (baseline series) fails to detect a substantial number of clinically relevant fragrance allergies. To investigate whether it is justified to include hydroxyisohexyl 3-cyclohexene carboxaldehyde (Lyral) and fragrance mix 2 containing hydroxyisohexyl 3-cyclohexene carboxaldehyde, citral, farnesol, coumarin, citronellol, and alpha-hexyl cinnamal in the European baseline patch test series. Survey of the literature on reported frequencies of contact allergy and allergic contact dermatitis from fragrance mix 2 and hydroxyisohexyl 3-cyclohexene carboxaldehyde (Lyral) as well as reported results of experimental provocation tests. Fragrance mix 2 has been demonstrated to be a useful additional marker of fragrance allergy with contact allergy rates up to 5% when included in various national baseline patch test series. Of the fragrance substances present in fragrance mix 2, hydroxyisohexyl 3-cyclohexene carboxaldehyde is the most common sensitizer. Contact allergy rates between 1.5% and 3% have been reported for hydroxyisohexyl 3-cyclohexene carboxaldehyde in petrolatum (pet.) at 5% from various European centres when tested in consecutive dermatitis patients. From 2008, pet. preparations of fragrance mix 2 at 14% w/w (5.6 mg/cm²) and hydroxyisohexyl 3-cyclohexene carboxaldehyde at 5% w/w (2.0 mg/cm²) are recommended for inclusion in the baseline series. With the Finn Chamber technique, a dose of 20 mg pet. preparation is recommended. Whenever there is a positive reaction to fragrance mix 2, additional patch testing with the 6 ingredients, 5 if there are simultaneous positive reactions to hydroxyisohexyl 3-cyclohexene carboxaldehyde and fragrance mix 2, is recommended.

  7. Recommendation to include fragrance mix 2 and hydroxyisohexyl 3-cyclohexene carboxaldehyde (Lyral) in the European baseline patch test series

    DEFF Research Database (Denmark)

    Bruze, Magnus; Andersen, Klaus Ejner; Goossens, An

    2008-01-01

    various European centres when tested in consecutive dermatitis patients. CONCLUSIONS: From 2008, pet. preparations of fragrance mix 2 at 14% w/w (5.6 mg/cm²) and hydroxyisohexyl 3-cyclohexene carboxaldehyde at 5% w/w (2.0 mg/cm²) are recommended for inclusion in the baseline series. With the Finn...

  8. Moral Schemas and Tacit Judgement or How the Defining Issues Test Is Supported by Cognitive Science.

    Science.gov (United States)

    Narvaez, Darcia; Bock, Tonia

    2002-01-01

    Discusses three core moral judgement ideas: (1) modern schema theory, (2) automatic decision-making frequency, and (3) implicit processes as the default mode of human information processing. Compares the Defining Issues Test (measures the beginnings of moral judgement) and the Lawrence Kohlberg Moral Judgement Interview (measures the highest level…

  9. Cumulative Effects of Concussion History on Baseline Computerized Neurocognitive Test Scores: Systematic Review and Meta-analysis.

    Science.gov (United States)

    Alsalaheen, Bara; Stockdale, Kayla; Pechumer, Dana; Giessing, Alexander; He, Xuming; Broglio, Steven P

    It is unclear whether individuals with a history of single or multiple clinically recovered concussions exhibit worse cognitive performance on baseline testing compared with individuals with no concussion history. To analyze the effects of concussion history on baseline neurocognitive performance using a computerized neurocognitive test. PubMed, CINAHL, and PsycINFO were searched in November 2015. The search was supplemented by a hand search of references. Studies were included if participants completed the Immediate Post-concussion Assessment and Cognitive Test (ImPACT) at baseline (ie, preseason) and if performance was stratified by previous history of single or multiple concussions. Systematic review and meta-analysis. Level 2. Sample size, demographic characteristics of participants, as well as performance of participants on verbal memory, visual memory, visual-motor processing speed, and reaction time were extracted from each study. A random-effects pooled meta-analysis revealed that, with the exception of worsened visual memory for those with 1 previous concussion (Hedges g = 0.10), no differences were observed between participants with 1 or multiple concussions compared with participants without previous concussions. With the exception of decreased visual memory based on history of 1 concussion, history of 1 or multiple concussions was not associated with worse baseline cognitive performance.
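A random-effects pooled estimate of the kind reported above is commonly computed with the DerSimonian-Laird method. A self-contained sketch with made-up effect sizes and variances (not the study's data):

```python
def dersimonian_laird(effects, variances):
    """Pool per-study effect sizes (e.g., Hedges g) under a random-effects
    model using the DerSimonian-Laird estimate of between-study variance."""
    w = [1.0 / v for v in variances]                      # fixed-effect weights
    y_fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    q = sum(wi * (yi - y_fixed) ** 2 for wi, yi in zip(w, effects))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)         # between-study variance
    w_re = [1.0 / (v + tau2) for v in variances]          # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_re, effects)) / sum(w_re)
    se = (1.0 / sum(w_re)) ** 0.5
    return pooled, se, tau2

# Hypothetical per-study Hedges g values and sampling variances:
pooled, se, tau2 = dersimonian_laird([0.0, 0.2], [0.01, 0.01])  # pooled = 0.1
```

With symmetric inputs the pooled effect lands at the midpoint, and tau² picks up the spread between the two studies.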

  10. The 1993 baseline biological studies and proposed monitoring plan for the Device Assembly Facility at the Nevada Test Site

    Energy Technology Data Exchange (ETDEWEB)

    Woodward, B.D.; Hunter, R.B.; Greger, P.D.; Saethre, M.B.

    1995-02-01

    This report contains baseline data and recommendations for future monitoring of plants and animals near the new Device Assembly Facility (DAF) on the Nevada Test Site (NTS). The facility is a large structure designed for safely assembling nuclear weapons. Baseline data was collected in 1993, prior to the scheduled beginning of DAF operations in early 1995. Studies were not performed prior to construction and part of the task of monitoring operational effects will be to distinguish those effects from the extensive disturbance effects resulting from construction. Baseline information on species abundances and distributions was collected on ephemeral and perennial plants, mammals, reptiles, and birds in the desert ecosystems within three kilometers (km) of the DAF. Particular attention was paid to effects of selected disturbances, such as the paved road, sewage pond, and the flood-control dike, associated with the facility. Radiological monitoring of areas surrounding the DAF is not included in this report.

  11. Space-Based Reconfigurable Software Defined Radio Test Bed Aboard International Space Station

    Science.gov (United States)

    Reinhart, Richard C.; Lux, James P.

    2014-01-01

    The National Aeronautics and Space Administration (NASA) recently launched a new software defined radio research test bed to the International Space Station. The test bed, sponsored by the Space Communications and Navigation (SCaN) Office within NASA, is referred to as the SCaN Testbed. The SCaN Testbed is a highly capable communications system, composed of three software defined radios, integrated into a flight system, and mounted to the truss of the International Space Station. Software defined radios offer the future promise of in-flight reconfigurability, autonomy, and eventually cognitive operation. The adoption of software defined radios offers space missions a new way to develop and operate space transceivers for communications and navigation. Reconfigurable or software defined radios with communications and navigation functions implemented in software or VHDL (VHSIC Hardware Description Language) provide the capability to change the functionality of the radio during development or after launch. The ability to change the operating characteristics of a radio through software once deployed to space offers the flexibility to adapt to new science opportunities, recover from anomalies within the science payload or communication system, and potentially reduce development cost and risk by adapting generic space platforms to meet specific mission requirements. The software defined radios on the SCaN Testbed are each compliant to NASA's Space Telecommunications Radio System (STRS) Architecture. The STRS Architecture is an open, non-proprietary architecture that defines interfaces for the connections between radio components. It provides an operating environment to abstract the communication waveform application from the underlying platform specific hardware such as digital-to-analog converters, analog-to-digital converters, oscillators, RF attenuators, automatic gain control circuits, FPGAs, general-purpose processors, etc. and the interconnections among

  12. Hardware test program for evaluation of baseline range/range rate sensor concept

    Science.gov (United States)

    Pernic, E.

    1985-01-01

    The test program Phase II effort provides additional design information in terms of range and range rate (R/R) sensor performance when observing and tracking a typical spacecraft target. The target used in the test program was a one-third scale model of the Hubble Space Telescope (HST) available at the MSFC test site where the tests were performed. A modified Bendix millimeter wave radar served as the R/R sensor test bed for evaluation of range and range rate tracking performance, and generation of radar signature characteristics of the spacecraft target. A summary of program test results and conclusions is presented along with a detailed description of the Bendix test bed radar and accompanying instrumentation. The MSFC test site and facilities are described. The test procedures used to establish background levels, and the calibration procedures used in the range accuracy tests and RCS (radar cross section) signature measurements, are presented, along with a condensed version of the daily log kept during the 5 September through 17 September test period. The test program results are given starting with the RCS signature measurements, then continuing with range measurement accuracy test results and finally the range and range rate tracking accuracy test results.

  13. Baseline studies in the desert ecosystem at East Mesa Geothermal Test Site, Imperial Valley, California

    Energy Technology Data Exchange (ETDEWEB)

    Romney, E.M.; Wallace, A.; Lunt, O.R.; Ackerman, T.A.; Kinnear, J.E.

    1977-09-01

    Baseline data reported herein for soil, vegetation, and small mammal components of the East Mesa desert ecosystem represent a collection period from October 1975 to September 1977. Inasmuch as changes in salt balance from geothermal brine sources are of potential impact upon the ecosystem, considerable analytical effort was given to the determination of element constituents in soil, plant, and animal samples. A preliminary synthesis of data was done to investigate the heterogeneity of element constituents among the sampled population and to summarize results. Findings indicate that periodic sampling and chemical analysis of vegetation around an industrialized geothermal energy source is probably the best way to monitor the surrounding ecosystem for assuring containment of any resource pollutants.

  14. Gender Differences in Symptom Reporting on Baseline Sport Concussion Testing Across the Youth Age Span.

    Science.gov (United States)

    Moser, Rosemarie Scolaro; Olek, Lauren; Schatz, Philip

    2018-02-06

    Little is known regarding gender differences in concussion symptom reporting developmentally across the age span, specifically in pre-adolescent athletes. The present study asks: Do boys and girls differ in symptom reporting across the pre-adolescent to post-adolescent age span? This retrospective study utilized baseline assessments from 11,695 10-22 year-old athletes assigned to 3 independent groups: Pre-adolescent 10-12 year olds (n = 1,367; 12%), Adolescent 13-17 year olds (n = 2,974; 25%), and Late Adolescent 18-22 year olds (n = 7,354; 63%). Males represented 60% of the sample. Baseline ImPACT composite scores and Post-Concussion Symptom Scale scores (Total, Physical, Cognitive, Emotional, Sleep) were analyzed for the effects of age and gender. Statistically significant main effects were found for age and gender on all ImPACT composites, Total Symptoms, and Symptom factors. Significant interaction effects were noted between age and gender for all ImPACT composites, Total Symptoms, and Symptom factors. Total Symptoms and all Symptom factors were highest in adolescents (ages 13-17) for males and females. In the 10-12 age group, females displayed lower Total Symptoms, Physical, and Sleep factors than males. The notion of females being more likely than males to report symptoms does not appear to apply across the developmental age span, particularly prior to adolescence. Females show greater emotional endorsement across the youth age span (10-22 years). Adolescence (13-17 years) appears to be a time of increased symptomatology that may lessen after the age of 18. © The Author(s) 2018. Published by Oxford University Press. All rights reserved.

  15. Baseline monitoring and simulated liquid release test report for Tank W-9, Oak Ridge National Laboratory, Oak Ridge, Tennessee

    International Nuclear Information System (INIS)

    1997-08-01

    This document provides the Environmental Restoration Program with the baseline dry well conductivity monitoring data and simulated liquid release tests to support the use of Gunite and Associated Tank (GAAT) W-9 as a temporary consolidation tank during waste removal operations. Information provided in this report forms part of the technical basis for criticality safety, systems safety, engineering design and waste management as they apply to the GAAT treatability study and waste removal actions

  16. The CEGB approach to defining the commissioning tests for prime movers

    International Nuclear Information System (INIS)

    Horne, B.E.

    1986-01-01

    This paper describes the CEGB approach to demonstrating during commissioning the adequacy of the reliability of the large on-site essential electrical power sources installed in the CAGR power stations. In this approach, the reliability requirements of the essential electrical supplies at the power stations are defined and the reliability requirements of the particular gas turbine and diesel generator installation are then derived. The paper outlines the probabilistic methods used in arriving at the specific start and run test programmes which were subsequently carried out. The results achieved in these test programmes in demonstrating that the reliability requirements were satisfied are presented in the paper. (author)

  17. The CEGB approach to defining the commissioning tests for prime movers

    Energy Technology Data Exchange (ETDEWEB)

    Horne, B. E. [CEGB, Generation Development and Construction Division, Barnett Way, Barnwood, Gloucester GL4 7RS (United Kingdom)

    1986-02-15

    This paper describes the CEGB approach to demonstrating during commissioning the adequacy of the reliability of the large on-site essential electrical power sources installed in the CAGR power stations. In this approach, the reliability requirements of the essential electrical supplies at the power stations are defined and the reliability requirements of the particular gas turbine and diesel generator installation are then derived. The paper outlines the probabilistic methods used in arriving at the specific start and run test programmes which were subsequently carried out. The results achieved in these test programmes in demonstrating that the reliability requirements were satisfied are presented in the paper. (author)

  18. Baseline tests of the C. H. Waterman DAF electric passenger vehicle

    Science.gov (United States)

    Sargent, N. B.; Maslowski, E. A.; Soltis, R. F.; Schuh, R. M.

    1977-01-01

    An electric vehicle was tested as part of an Energy Research and Development Administration (ERDA) project to characterize the state-of-the-art of electric vehicles. The Waterman vehicle performance test results are presented in this report. The vehicle is a converted four-passenger DAF 46 sedan. It is powered by sixteen 6-volt traction batteries through a three-step contactor controller actuated by a foot throttle to change the voltage applied to the 6.7 kW motor. The braking system is a conventional hydraulic braking system.

  19. Test documentation to convert TWRS baseline data for RDD-100 upgrades

    International Nuclear Information System (INIS)

    Gneiting, B.C.

    1997-01-01

    This document describes the test documentation required for converting between different versions of the RDD-100 software application. The area of focus is the successful conversion of the master data set between different versions of the database tool and their corresponding data structures

  20. Baseline test data for the EVA electric vehicle. [low energy consumption automobiles

    Science.gov (United States)

    Harhay, W. C.; Bozek, J.

    1976-01-01

    Two electric vehicles from Electric Vehicle Associates were evaluated for ERDA at the Transportation Research Center of Ohio. The vehicles, loaded to a gross vehicle weight of 3750 pounds, had a range of 56.3 miles at a steady speed of 25 mph and a range of 27.4 miles during acceleration-deceleration tests to a top speed of 30 mph. Energy consumption varied from 0.48 kW-hr/mi to 0.59 kW-hr/mi.

  1. Baseline tests of the C. H. Waterman Renault 5 electric passenger vehicle

    Science.gov (United States)

    Sargent, N. B.; Mcbrien, E. F.; Slavick, R. J.

    1977-01-01

    Performance test results are presented for the Waterman vehicle, a four-passenger Renault 5 GTL, as part of an effort to characterize the state-of-the-art of electric vehicles. It was powered by sixteen 6-volt traction batteries through a two-step contactor controller actuated by a foot throttle to change the voltage applied to the 6.7-kilowatt motor. The motor output shaft was connected to a front-wheel-drive transaxle that contains a four-speed manual transmission and clutch. The braking system was a conventional hydraulic braking system.

  2. Baseline investigations of bats and birds at Wind Turbine Test Centre Østerild

    DEFF Research Database (Denmark)

    The Department of Bioscience, Aarhus University was commissioned by the Danish Nature Agency to undertake a bat and bird monitoring programme prior to the construction of a national test centre for wind turbines near Østerild in Thy, Denmark. The occurrence and activity level of bats in Østerild...... Plantation and the vicinity were monitored in summer and autumn 2011. Bats were recorded on 57-100% of surveyed nights at individual wind turbine sites, ponds and lakes. A total of seven species were recorded. Pond bats were recorded at all sites and throughout the survey period in the plantation. Whooper...... swan, taiga bean goose, pink-footed goose and common crane were included as focal species in the ornithological investigations. In addition, species specific data on all bird species occurring regularly in the study area were collected. On the basis of a preliminary assessment of collision risk...

  3. Test of Lorentz and CPT violation with short baseline neutrino oscillation excesses

    Energy Technology Data Exchange (ETDEWEB)

    Aguilar-Arevalo, A.A. [Instituto de Ciencias Nucleares, Universidad Nacional Autonoma de Mexico, D.F. 04510 (Mexico); Anderson, C.E. [Yale University, New Haven, CT 06520 (United States); Bazarko, A.O. [Princeton University, Princeton, NJ 08544 (United States); Brice, S.J.; Brown, B.C. [Fermi National Accelerator Laboratory, Batavia, IL 60510 (United States); Bugel, L. [Massachusetts Institute of Technology, Cambridge, MA 02139 (United States); Cao, J. [University of Michigan, Ann Arbor, MI 48109 (United States); Coney, L. [Columbia University, New York, NY 10027 (United States); Conrad, J.M. [Massachusetts Institute of Technology, Cambridge, MA 02139 (United States); Cox, D.C. [Indiana University, Bloomington, IN 47405 (United States); Curioni, A. [Yale University, New Haven, CT 06520 (United States); Dharmapalan, R. [University of Alabama, Tuscaloosa, AL 35487 (United States); Djurcic, Z. [Argonne National Laboratory, Argonne, IL 60439 (United States); Finley, D.A. [Fermi National Accelerator Laboratory, Batavia, IL 60510 (United States); Fleming, B.T. [Yale University, New Haven, CT 06520 (United States); Ford, R.; Garcia, F.G. [Fermi National Accelerator Laboratory, Batavia, IL 60510 (United States); Garvey, G.T. [Los Alamos National Laboratory, Los Alamos, NM 87545 (United States); Grange, J. [University of Florida, Gainesville, FL 32611 (United States); Green, C. [Fermi National Accelerator Laboratory, Batavia, IL 60510 (United States); Los Alamos National Laboratory, Los Alamos, NM 87545 (United States); and others

    2013-01-29

    The sidereal time dependence of MiniBooNE νe and ν̄e appearance data is analyzed to search for evidence of Lorentz and CPT violation. An unbinned Kolmogorov-Smirnov (K-S) test shows both the νe and ν̄e appearance data are compatible with the null sidereal variation hypothesis to more than 5%. Using an unbinned likelihood fit with a Lorentz-violating oscillation model derived from the Standard Model Extension (SME) to describe any excess events over background, we find that the νe appearance data prefer a sidereal time-independent solution, and the ν̄e appearance data slightly prefer a sidereal time-dependent solution. Limits of order 10⁻²⁰ GeV are placed on combinations of SME coefficients. These limits give the best limits on certain SME coefficients for νμ→νe and ν̄μ→ν̄e oscillations. The fit values and limits of combinations of SME coefficients are provided.

  4. Test of Lorentz and CPT violation with short baseline neutrino oscillation excesses

    International Nuclear Information System (INIS)

    Aguilar-Arevalo, A.A.; Anderson, C.E.; Bazarko, A.O.; Brice, S.J.; Brown, B.C.; Bugel, L.; Cao, J.; Coney, L.; Conrad, J.M.; Cox, D.C.; Curioni, A.; Dharmapalan, R.; Djurcic, Z.; Finley, D.A.; Fleming, B.T.; Ford, R.; Garcia, F.G.; Garvey, G.T.; Grange, J.; Green, C.

    2013-01-01

    The sidereal time dependence of MiniBooNE νe and ν̄e appearance data is analyzed to search for evidence of Lorentz and CPT violation. An unbinned Kolmogorov–Smirnov (K–S) test shows both the νe and ν̄e appearance data are compatible with the null sidereal variation hypothesis to more than 5%. Using an unbinned likelihood fit with a Lorentz-violating oscillation model derived from the Standard Model Extension (SME) to describe any excess events over background, we find that the νe appearance data prefer a sidereal time-independent solution, and the ν̄e appearance data slightly prefer a sidereal time-dependent solution. Limits of order 10⁻²⁰ GeV are placed on combinations of SME coefficients. These limits give the best limits on certain SME coefficients for νμ→νe and ν̄μ→ν̄e oscillations. The fit values and limits of combinations of SME coefficients are provided.
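The null-hypothesis check described in both versions of this record is a one-sample K-S comparison of event arrival phases, folded on the sidereal day, against a flat distribution. A minimal pure-Python sketch of that statistic (the event times here are illustrative, not MiniBooNE data):

```python
def ks_uniform(times, period=86164.1):
    """One-sample Kolmogorov-Smirnov statistic comparing event times,
    folded on one sidereal day (~86164.1 s), against a uniform phase
    distribution (the null hypothesis of no sidereal variation)."""
    phases = sorted((t % period) / period for t in times)
    n = len(phases)
    d = 0.0
    for i, x in enumerate(phases):
        # distance of the empirical CDF from the uniform CDF on both sides of the step
        d = max(d, abs((i + 1) / n - x), abs(x - i / n))
    return d

# Perfectly evenly spread events give a small statistic (1/n here):
d = ks_uniform([i * 8616.41 for i in range(10)])
```

The statistic D is then compared against the K-S critical value for the sample size to accept or reject uniformity; in the paper's analysis both data sets stayed compatible with the flat (sidereal-time-independent) hypothesis.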

  5. Recommendation to test limonene hydroperoxides 0·3% and linalool hydroperoxides 1·0% in the British baseline patch test series.

    Science.gov (United States)

    Wlodek, C; Penfold, C M; Bourke, J F; Chowdhury, M M U; Cooper, S M; Ghaffar, S; Green, C; Holden, C R; Johnston, G A; Mughal, A A; Reckling, C; Sabroe, R A; Stone, N M; Thompson, D; Wilkinson, S M; Buckley, D A

    2017-12-01

    There is a significant rate of sensitization worldwide to the oxidized fragrance terpenes limonene and linalool. Patch testing to oxidized terpenes is not routinely carried out; the ideal patch test concentration is unknown. To determine the best test concentrations for limonene and linalool hydroperoxides, added to the British baseline patch test series, to optimize detection of true allergy and to minimize irritant reactions. During 2013-2014, 4563 consecutive patients in 12 U.K. centres were tested to hydroperoxides of limonene in petrolatum (pet.) 0·3%, 0·2% and 0·1%, and hydroperoxides of linalool 1·0%, 0·5% and 0·25% pet. Irritant reactions were recorded separately from doubtful reactions. Concomitant reactions to other fragrance markers and clinical relevance were documented. Limonene hydroperoxide 0·3% gave positive reactions in 241 (5·3%) patients, irritant reactions in 93 (2·0%) and doubtful reactions in 110 (2·4%). Linalool hydroperoxide 1·0% gave positive reactions in 352 (7·7%), irritant reactions in 178 (3·9%) and doubtful reactions in 132 (2·9%). A total of 119 patients with crescendo reactions to 0·3% limonene would have been missed if only tested with 0·1% and 131 patients with crescendo reactions to 1·0% linalool would have been missed if only tested with 0·25%. In almost two-thirds of patients with positive patch tests to limonene and linalool the reaction was clinically relevant. The majority of patients did not react to any fragrance marker in the baseline series. We recommend that limonene hydroperoxides be tested at 0·3% and linalool hydroperoxides at 1·0% in the British baseline patch test series. © 2017 British Association of Dermatologists.

  6. Commercial Building Energy Baseline Modeling Software: Performance Metrics and Method Testing with Open Source Models and Implications for Proprietary Software Testing

    Energy Technology Data Exchange (ETDEWEB)

    Price, Phillip N.; Granderson, Jessica; Sohn, Michael; Addy, Nathan; Jump, David

    2013-09-01

    The overarching goal of this work is to advance the capabilities of technology evaluators in evaluating the building-level baseline modeling capabilities of Energy Management and Information System (EMIS) software. Through their customer engagement platforms and products, EMIS software products have the potential to produce whole-building energy savings through multiple strategies: building system operation improvements, equipment efficiency upgrades and replacements, and inducement of behavioral change among the occupants and operations personnel. Some offerings may also automate the quantification of whole-building energy savings, relative to a baseline period, using empirical models that relate energy consumption to key influencing parameters, such as ambient weather conditions and building operation schedule. These automated baseline models can be used to streamline the whole-building measurement and verification (M&V) process, and therefore are of critical importance in the context of multi-measure whole-building focused utility efficiency programs. This report documents the findings of a study that was conducted to begin answering critical questions regarding quantification of savings at the whole-building level, and the use of automated and commercial software tools. To evaluate the modeling capabilities of EMIS software particular to the use case of whole-building savings estimation, four research questions were addressed: 1. What is a general methodology that can be used to evaluate baseline model performance, both in terms of a) overall robustness, and b) relative to other models? 2. How can that general methodology be applied to evaluate proprietary models that are embedded in commercial EMIS tools? How might one handle practical issues associated with data security, intellectual property, appropriate testing ‘blinds’, and large data sets? 3. How can buildings be pre-screened to identify those that are the most model-predictable, and therefore those
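One concrete instance of research question 1 above is fitting a weather-driven baseline and scoring it with the CV(RMSE) and NMBE goodness-of-fit metrics commonly used in whole-building M&V. The single-predictor linear model and data below are an illustrative sketch, not the report's actual method or data:

```python
def fit_linear_baseline(temps, loads):
    """Least-squares fit of load = a + b * outdoor_temp (one-predictor sketch)."""
    n = len(temps)
    mt, ml = sum(temps) / n, sum(loads) / n
    b = sum((t - mt) * (y - ml) for t, y in zip(temps, loads)) / \
        sum((t - mt) ** 2 for t in temps)
    return ml - b * mt, b  # intercept, slope

def cv_rmse(actual, predicted):
    """Coefficient of variation of the RMSE, as a fraction of mean load."""
    n = len(actual)
    rmse = (sum((y - p) ** 2 for y, p in zip(actual, predicted)) / n) ** 0.5
    return rmse / (sum(actual) / n)

def nmbe(actual, predicted):
    """Normalized mean bias error: summed residuals over summed load."""
    return sum(y - p for y, p in zip(actual, predicted)) / sum(actual)

# Illustrative daily data: outdoor temperature (°F) vs. whole-building load (kWh)
temps = [50.0, 60.0, 70.0, 80.0]
loads = [100.0, 118.0, 142.0, 160.0]
a, b = fit_linear_baseline(temps, loads)
pred = [a + b * t for t in temps]
```

An OLS fit has zero mean bias on its training data by construction, so NMBE comes out near zero here, while CV(RMSE) captures the residual scatter; evaluating both on held-out data is one simple way to compare baseline models as the report's methodology discussion proposes.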

  7. Challenges in defining a radiologic and hydrologic source term for underground nuclear test centers, Nevada Test Site, Nye County, Nevada

    International Nuclear Information System (INIS)

    Smith, D.K.

    1995-06-01

    The compilation of a radionuclide inventory for long-lived radioactive contaminants residual from nuclear testing provides a partial measure of the radiologic source term at the Nevada Test Site. The radiologic source term also includes potentially mobile short-lived radionuclides excluded from the inventory. The radiologic source term for tritium is known with accuracy and is equivalent to the hydrologic source term within the saturated zone. Definition of the total hydrologic source term for fission and activation products that have high activities for decades following underground testing involves knowledge and assumptions which are presently unavailable. Systematic investigation of the behavior of fission products, activation products and actinides under saturated or partially saturated conditions is imperative to define a representative total hydrologic source term. This is particularly important given the heterogeneous distribution of radionuclides within testing centers. Data quality objectives which emphasize a combination of measurements and credible estimates of the hydrologic source term are a priority for near-field investigations at the Nevada Test Site

  8. Moral Rationality and Intuition: An Exploration of Relationships between the Defining Issues Test and the Moral Foundations Questionnaire

    Science.gov (United States)

    Glover, Rebecca J.; Natesan, Prathiba; Wang, Jie; Rohr, Danielle; McAfee-Etheridge, Lauri; Booker, Dana D.; Bishop, James; Lee, David; Kildare, Cory; Wu, Minwei

    2014-01-01

    Explorations of relationships between Haidt's Moral Foundations Questionnaire (MFQ) and indices of moral decision-making assessed by the Defining Issues Test have been limited to correlational analyses. This study used Harm, Fairness, Ingroup, Authority and Purity to predict overall moral judgment and individual Defining Issues Test-2 (DIT-2)…

  9. Insights From the Defining Issues Test on Moral Reasoning Competencies Development in Community Pharmacists.

    Science.gov (United States)

    Roche, Cicely; Thoma, Steve

    2017-10-01

    Objective. To investigate whether a profession-specific educational intervention affected the development of moral reasoning competencies in community pharmacists, as measured by the Defining Issues Test (DIT2). Methods. This research used a repeated measures pre-post educational intervention design as a quasi-randomized, controlled, crossover study to evaluate changes in the moral reasoning scores of 27 volunteer community pharmacists in Ireland. Results. Changes in pharmacists' moral reasoning competencies development, as reported by P-Scores and N2-Scores, were found to be significant. In addition, interaction effects were observed between developmental scores on the DIT2 and whether participants were determined to be consolidated in their reasoning pre- and post-engagement with the educational intervention. Conclusion. Short profession-specific educational interventions have the potential to positively affect the development of moral reasoning competencies of community pharmacists.

  10. Baseline growth and reproductive parameters in Lymnaea stagnalis for OECD test guideline development: optimization of diets and culturing conditions

    DEFF Research Database (Denmark)

    Holbech, Henrik; Hutchinson, Tom

    ... of a mollusc reproduction test guideline. An ad hoc mollusc expert group has been formed in Europe to validate methods that can meet this need. Currently, a key species for use in this context is the freshwater gastropod Lymnaea stagnalis. An important aspect of this work is to first develop a specific pathogen-free defined strain of L. stagnalis and second to establish a historical database of growth and reproductive rates under defined culturing conditions. A mass culture of the RENILYS® strain of L. stagnalis has been established at INRA (France) since 2002 and has been distributed to research laboratories in Denmark, Germany and the UK for the OECD pre-validation work to date. Laboratory cultures of L. stagnalis are traditionally fed fresh (preferably organic) lettuce; however, interrupted supplies of fresh lettuce in some countries in 2011 highlighted a potential problem for the draft OECD test...

  11. Assessing group differences in biodiversity by simultaneously testing a user-defined selection of diversity indices.

    Science.gov (United States)

    Pallmann, Philip; Schaarschmidt, Frank; Hothorn, Ludwig A; Fischer, Christiane; Nacke, Heiko; Priesnitz, Kai U; Schork, Nicholas J

    2012-11-01

    Comparing diversities between groups is a task biologists are frequently faced with, for example in ecological field trials or when dealing with metagenomics data. However, researchers often waver about which measure of diversity to choose, as there is a multitude of approaches available. As Jost (2008, Molecular Ecology, 17, 4015) has pointed out, widely used measures such as the Shannon or Simpson index have undesirable properties which make them hard to compare and interpret. Many of the problems associated with the use of these 'raw' indices can be corrected by transforming them into 'true' diversity measures. We introduce a technique that allows the comparison of two or more groups of observations while simultaneously testing a user-defined selection of 'true' diversity measures. This procedure yields multiplicity-adjusted P-values according to the method of Westfall and Young (1993, Resampling-Based Multiple Testing: Examples and Methods for p-Value Adjustment, 49, 941), which ensures that the rate of false positives (type I error) does not rise when the number of groups and/or diversity indices is extended. Software is available in the R package 'simboot'. © 2012 Blackwell Publishing Ltd.
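A simplified sketch of the approach described above, not the authors' 'simboot' implementation: compute Hill numbers ('true' diversities) of several orders and adjust for multiplicity with a Westfall–Young style max-T permutation scheme. The diversity orders, group sizes, and abundance vectors below are illustrative assumptions.

```python
import numpy as np

def hill_number(counts, q):
    """'True' diversity (Hill number) of order q from abundance counts."""
    p = counts / counts.sum()
    p = p[p > 0]
    if q == 1:
        return np.exp(-np.sum(p * np.log(p)))     # exp(Shannon entropy)
    return np.sum(p ** q) ** (1.0 / (1.0 - q))

def maxT_pvalues(group_a, group_b, orders=(0, 1, 2), n_perm=999, seed=1):
    """Westfall-Young style max-T permutation adjustment across several
    diversity orders tested simultaneously (a sketch, not 'simboot' itself)."""
    rng = np.random.default_rng(seed)
    a = np.array([[hill_number(s, q) for q in orders] for s in group_a])
    b = np.array([[hill_number(s, q) for q in orders] for s in group_b])
    obs = np.abs(a.mean(axis=0) - b.mean(axis=0))
    pooled = np.vstack([a, b])
    n_a = len(a)
    max_null = np.empty(n_perm)
    for i in range(n_perm):                        # permute group labels
        idx = rng.permutation(len(pooled))
        diff = np.abs(pooled[idx[:n_a]].mean(axis=0) - pooled[idx[n_a:]].mean(axis=0))
        max_null[i] = diff.max()                   # max statistic across all indices
    # adjusted p: share of permutations whose max statistic beats each observed one
    return (1 + np.sum(max_null[:, None] >= obs[None, :], axis=0)) / (n_perm + 1)

# Illustrative data: equal richness, very different evenness.
even = [np.array([10, 10, 10, 10, 10])] * 5
skew = [np.array([40, 5, 2, 2, 1])] * 5
p_adj = maxT_pvalues(even, skew)
```

With these toy data, richness (order 0) is identical between groups while the evenness-sensitive orders differ sharply, so only the latter yield small adjusted P-values.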

  12. Software-Defined Network Solutions for Science Scenarios: Performance Testing Framework and Measurements

    Energy Technology Data Exchange (ETDEWEB)

    Settlemyer, Bradley [Los Alamos National Laboratory (LANL); Kettimuthu, R. [Argonne National Laboratory (ANL); Boley, Josh [Argonne National Laboratory (ANL); Katramatos, Dimitrios [Brookhaven National Laboratory (BNL); Rao, Nageswara S. [ORNL; Sen, Satyabrata [ORNL; Liu, Qiang [ORNL

    2018-01-01

    High-performance scientific workflows utilize supercomputers, scientific instruments, and large storage systems. Their execution requires fast setup of a small number of dedicated network connections across the geographically distributed facility sites. We present Software-Defined Network (SDN) solutions consisting of site daemons that use dpctl, Floodlight, ONOS, or OpenDaylight controllers to set up these connections. The development of these SDN solutions could be quite disruptive to the infrastructure, while requiring a close coordination among multiple sites; in addition, the large number of possible controller and device combinations to investigate could make the infrastructure unavailable to regular users for extended periods of time. In response, we develop a Virtual Science Network Environment (VSNE) using virtual machines, Mininet, and custom scripts that support the development, testing, and evaluation of SDN solutions, without the constraints and expenses of multi-site physical infrastructures; furthermore, the chosen solutions can be directly transferred to production deployments. By complementing VSNE with a physical testbed, we conduct targeted performance tests of various SDN solutions to help choose the best candidates. In addition, we propose a switching response method to assess the setup times and throughput performances of different SDN solutions, and present experimental results that show their advantages and limitations.
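The "switching response method" for assessing path-setup times can be illustrated with a stub. The controller class below is a hypothetical stand-in for a dpctl/Floodlight/ONOS/OpenDaylight interface (its methods are invented for this sketch); the measurement loop simply records the time from the path request until the first successful probe.

```python
import time

class StubController:
    """Hypothetical stand-in for an SDN controller interface; a real
    deployment would issue flow-mod calls to install the path here."""
    def __init__(self, install_delay):
        self.install_delay = install_delay      # simulated flow-installation time
        self._ready_at = None
    def request_path(self, src, dst):
        self._ready_at = time.monotonic() + self.install_delay
    def path_ready(self, src, dst):
        return self._ready_at is not None and time.monotonic() >= self._ready_at

def measure_setup_time(ctrl, src, dst, poll=0.001, timeout=5.0):
    """Switching-response style measurement: time from the path request
    until the first probe succeeds across the new connection."""
    start = time.monotonic()
    ctrl.request_path(src, dst)
    while not ctrl.path_ready(src, dst):
        if time.monotonic() - start > timeout:
            raise TimeoutError("path setup exceeded timeout")
        time.sleep(poll)
    return time.monotonic() - start

setup = measure_setup_time(StubController(install_delay=0.05), "site-a", "site-b")
```

In a real testbed the probe would be an actual reachability check (e.g. a ping across the provisioned circuit) rather than a polled flag.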

  13. Current patch test results with the European baseline series and extensions to it from the 'European Surveillance System on Contact Allergy' network, 2007-2008

    DEFF Research Database (Denmark)

    Uter, Wolfgang; Aberer, Werner; Armario-Hita, José Carlos

    2012-01-01

    The pattern of contact sensitization to the supposedly most important allergens assembled in the baseline series differs between countries, presumably at least partly because of exposure differences. Objectives. To describe the prevalence of contact sensitization to allergens tested in consecutiv...

  14. Current patch test results with the European baseline series and extensions to it from the 'European Surveillance System on Contact Allergy' network, 2007-2008

    NARCIS (Netherlands)

    Uter, Wolfgang; Aberer, Werner; Armario-Hita, José Carlos; Fernandez-Vozmediano, José M; Ayala, Fabio; Balato, Anna; Bauer, Andrea; Ballmer-Weber, Barbara; Beliauskiene, Aiste; Fortina, Anna Belloni; Bircher, Andreas; Brasch, Jochen; Chowdhury, Mahbub M U; Coenraads, Pieter-Jan; Schuttelaar, Marie-Louise; Cooper, Sue; Czarnecka-Operacz, Magda; Zmudzinska, Maria; Elsner, Peter; English, John S C; Frosch, Peter J; Fuchs, Thomas; García-Gavín, Juan; Fernández-Redondo, Virginia; Gawkrodger, David J; Giménez-Arnau, Ana; Green, Cathy M; Horne, Helen L; Johansen, Jeanne Duus; Jolanki, Riitta; Pesonen, Maria; King, Clodagh M; Krêcisz, Beata; Chomiczewska, Dorota; Kiec-Swierczynska, Marta; Larese, Francesca; Mahler, Vera; Ormerod, Anthony D; Peserico, Andrea; Rantanen, Tapio; Rustemeyer, Thomas; Sánchez-Pérez, Javier; Sansom, Jane E; Silvestre, Juan Fco; Simon, Dagmar; Spiewak, Radoslaw; Statham, Barry N; Stone, Natalie; Wilkinson, Mark; Schnuch, Axel

    BACKGROUND: The pattern of contact sensitization to the supposedly most important allergens assembled in the baseline series differs between countries, presumably at least partly because of exposure differences. Objectives. To describe the prevalence of contact sensitization to allergens tested in

  15. Baseline seismic survey for the 2nd offshore methane hydrate production test in the Eastern Nankai Trough

    Science.gov (United States)

    Teranishi, Y.; Inamori, T.; Kobayashi, T.; Fujii, T.; Saeki, T.; Takahashi, H.; Kobayashi, F.

    2017-12-01

    JOGMEC carries out seismic monitoring surveys before and after the 2nd offshore methane hydrate (MH) production test in the Eastern Nankai Trough and evaluates MH dissociation behavior from the time-lapse seismic response. In 2016, JOGMEC deployed an Ocean Bottom Cable (OBC) system provided by OCC in the Daini Atsumi Knoll at water depths of 900-1100 m. The main challenge of the seismic survey was to optimize the cable layout to ensure effective time-lapse seismic detectability while overcoming the following two issues: 1. OBC receiver lines were limited to only two lines; it was predicted that the imaging of shallow reflectors would suffer from a lack of continuity and resolution due to this limitation. 2. The seafloor and shallow sedimentary layers, including the monitoring target, dip to the northwest; it was predicted that the reflection points would shift laterally in the up-dip (southeast) direction. In order to understand the impact of these issues, the seismic survey was designed with elastic wave-field simulation. The reflection seismic survey for baseline data was conducted in August 2016. A total of 70 receiver stations distributed along one cable were deployed successfully, and a total of 9952 shots were fired. After the baseline seismic survey, the hydrophone and geophone vertical-component datasets were processed as follows: designaturing, denoising, surface-consistent deconvolution and surface-consistent amplitude correction. High-frequency imaging with Reverse Time Migration (RTM) was introduced to these datasets. Improvements in imaging from the RTM are remarkable compared with the Kirchhoff migration and the existing pre-stack time migration of 3D marine surface seismic data obtained and processed in 2002, especially in the following parts: the MH concentrated zone, which has complex structures, and the zone below the Bottom Simulating Reflector (BSR), which is present as an impedance-contrast boundary.

  16. Lymphatic filariasis mapping by Immunochromatographic Test cards and baseline microfilaria survey prior to mass drug administration in Sierra Leone

    Directory of Open Access Journals (Sweden)

    Koroma Joseph B

    2012-01-01

    Background: National mapping of lymphatic filariasis (LF) was conducted using immunochromatographic tests (ICT) in 2005 to determine endemicity and geographic spread of the disease. A baseline microfilaria survey was then conducted to determine LF prevalence and microfilaria intensity. Methods: In 2005, 1,982 persons of 15 years and over from 14 health districts were selected and fingertip blood samples were tested with ICT cards. In 2007-8, blood samples were taken between 10 p.m. and 2 a.m. and examined for microfilariae (mf) from 9,288 persons from 16 sentinel sites representing each district, with 2 additional sites for districts with populations over 500,000 (Bo and Kenema). Results: The overall LF prevalence by ICT cards was 21% (males 28%, females 15%). All districts had a prevalence of Wuchereria bancrofti antigen > 1%. The distribution of LF prevalence showed a strong spatial correlation pattern, with high prevalence in a large area in the northeast gradually decreasing to a relatively low prevalence on the southwest coast. High prevalence was found in the northeast: Bombali (52%), Koinadugu (46%), Tonkolili (37%) and Kono (30%). Low prevalence was found in the southwest: Bonthe (3%) and Pujehun (4%). The mf prevalence was higher in the northeast: Bombali 6.7%, Koinadugu 5.7%, Port Loko 4.4% and Kono 2.4%. Overall there was a significant difference in mf prevalence by gender (males 2.9%, females 1.8%; p = 0.0002) and within districts in Kailahun, Kono, Port Loko, Moyamba and Koinadugu (all p < 0.05). Mf prevalence was also higher in people > 20 years (2.5%) than in people ≤ 20 years (1.7%) (p = 0.043). The overall arithmetic mean mf density was 50.30 mf/ml among mf-positive individuals and 1.19 mf/ml in the population examined, which varied significantly between districts. Conclusions: The ICT results showed that LF was endemic nationwide and that preventive chemotherapy (PCT) was justified across the country. Both the ICT and microfilaraemia surveys found that prevalence was greater in males than females
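The gender comparison reported above (male vs. female mf prevalence) is the kind of result a two-proportion z-test produces. The sketch below uses hypothetical counts chosen only to match the reported prevalences of roughly 2.9% and 1.8%, since the abstract does not give the sex-specific denominators.

```python
from math import erf, sqrt

def two_proportion_z(pos1, n1, pos2, n2):
    """Two-sided two-proportion z-test (pooled, normal approximation):
    the standard test behind a male-vs-female prevalence comparison."""
    p1, p2 = pos1 / n1, pos2 / n2
    pooled = (pos1 + pos2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_two_sided = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # 2 * (1 - Phi(|z|))
    return z, p_two_sided

# Hypothetical counts (illustrative only; not the study's actual denominators).
z, p = two_proportion_z(130, 4500, 86, 4788)   # ~2.9% vs. ~1.8%
```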

  17. Links between early baseline cortisol, attachment classification, and problem behaviors: A test of differential susceptibility versus diathesis-stress.

    Science.gov (United States)

    Fong, Michelle C; Measelle, Jeffrey; Conradt, Elisabeth; Ablow, Jennifer C

    2017-02-01

    The purpose of the current study was to predict concurrent levels of problem behaviors from young children's baseline cortisol and attachment classification, a proxy for the quality of caregiving experienced. In a sample of 58 children living at or below the federal poverty threshold, children's baseline cortisol levels, attachment classification, and problem behaviors were assessed at 17 months of age. We hypothesized that an interaction between baseline cortisol and attachment classification would predict problem behaviors above and beyond any main effects of baseline cortisol and attachment. However, based on limited prior research, we did not predict whether or not this interaction would be more consistent with diathesis-stress or differential susceptibility models. Consistent with diathesis-stress theory, the results indicated no significant differences in problem behavior levels among children with high baseline cortisol. In contrast, children with low baseline cortisol had the highest level of problem behaviors in the context of a disorganized attachment relationship. However, in the context of a secure attachment relationship, children with low baseline cortisol looked no different, with respect to problem behavior levels, than children with high cortisol levels. These findings have substantive implications for the socioemotional development of children reared in poverty. Copyright © 2017 Elsevier Inc. All rights reserved.

  18. Analysis of Baseline Computerized Neurocognitive Testing Results among 5–11-Year-Old Male and Female Children Playing Sports in Recreational Leagues in Florida

    Directory of Open Access Journals (Sweden)

    Karen D. Liller

    2017-09-01

    There is a paucity of data related to sports injuries, concussions, and computerized neurocognitive testing (CNT) among very young athletes playing sports in recreational settings. The purpose of this study was to report baseline CNT results among male and female children, ages 5–11, playing sports in Hillsborough County, Florida, using ImPACT Pediatric, which is specifically designed for this population. Data were collected from 2016 to 2017. A total of 657 baseline tests were conducted, and t-tests and linear regression were used to assess significant differences in mean composite scores by sex and age. Results showed that females scored better on visual memory and that, in general, baseline scores improved as age increased. The results can be used to build further studies on the use of CNT in recreational settings and its role in concussion treatment, management, and interventions.
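The analysis plan above (t-tests and linear regression on composite scores by sex and age) can be sketched on synthetic data; the ImPACT Pediatric scores themselves are not public, so the coefficients, effect sizes, and sample below are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic stand-in for baseline composite scores: scores improve with age,
# with a small female advantage (both effects assumed for illustration).
n = 300
age = rng.integers(5, 12, n).astype(float)   # ages 5-11
sex = rng.integers(0, 2, n).astype(float)    # 0 = male, 1 = female
score = 40 + 3.0 * age + 2.0 * sex + rng.normal(0, 4, n)

# Linear regression of composite score on age and sex.
X = np.column_stack([np.ones(n), age, sex])
beta, *_ = np.linalg.lstsq(X, score, rcond=None)

# Welch t statistic for the raw sex difference in scores.
a, b = score[sex == 1], score[sex == 0]
t = (a.mean() - b.mean()) / np.sqrt(a.var(ddof=1) / a.size + b.var(ddof=1) / b.size)
```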

  19. Software-Defined Radio Global System for Mobile Communications Transmitter Development for Heterogeneous Network Vulnerability Testing

    Science.gov (United States)

    2013-12-01

    [Record abstract unavailable: the indexed text consists of report-documentation front matter and reference-list fragments. Author: Carson C. McAbee. Recoverable citation: AbdelWahab, "2G/3G Inter-RAT Handover Performance Analysis," Second European Conference on Antennas and Propagation, Nov. 2007.]

  20. A COTS RF Optical Software Defined Radio for the Integrated Radio and Optical Communications Test Bed

    Science.gov (United States)

    Nappier, Jennifer M.; Zeleznikar, Daniel J.; Wroblewski, Adam C.; Tokars, Roger P.; Schoenholz, Bryan L.; Lantz, Nicholas C.

    2016-01-01

    The Integrated Radio and Optical Communications (iROC) project at the National Aeronautics and Space Administration (NASA) is investigating the merits of a hybrid radio frequency (RF) and optical communication system for deep space missions. In an effort to demonstrate the feasibility and advantages of a hybrid RF/optical software defined radio (SDR), a laboratory prototype was assembled from primarily commercial-off-the-shelf (COTS) hardware components. This COTS platform has been used to demonstrate simultaneous transmission of the radio and optical communications waveforms through to the physical layer (telescope and antenna). This paper details the hardware and software used in the platform and various measures of its performance. A laboratory optical receiver platform has also been assembled in order to demonstrate hybrid free space links in combination with the transmitter.

  1. Could Daylight Glare Be Defined Mathematically? Results of Testing the DGIN Method in Japan

    Science.gov (United States)

    Nazzal, Ali; Oki, Masato

    Discomfort glare from daylight is a common problem without valid prediction methods so far. A new mathematical DGIN (New Daylight Glare Index) method tries to respond to the challenge. This paper reports on experiments carried out in a daylit office environment in Japan to test the applicability of the method. A slight positive correlation was found between the DGIN and the subjective evaluation. Additionally, a high L_adaptation value together with a small ratio of L_window to L_adaptation was evidently sufficient to neutralize the effect of glare discomfort. However, subjective assessments are poor glare indicators and not reliable in testing glare prediction methods. DGIN is a good indicator of daylight glare, and when the DGIN value is analyzed together with the measured illuminance ratios, discomfort glare from daylight can be analyzed in a quantitative manner. The DGIN method could serve architects and lighting designers in testing daylighting systems, and also guide the action of daylight-responsive lighting controls.

  2. Correlations between tests of aging in Hiroshima subjects: an attempt to define physiologic age

    Energy Technology Data Exchange (ETDEWEB)

    Hollingsworth, J W; Hashizume, Asaji; Jablon, Seymour

    1964-12-01

    Nine physiologic functions which change with age were measured in 437 subjects during their regular visits to the Atomic Bomb Casualty Commission (ABCC) clinic in Hiroshima, Japan. This pilot study was undertaken to determine the feasibility of collecting such data in a population sample and of constructing a physiologic age score. Tests conducted consisted of: skin elasticity, systolic blood pressure, vital capacity, hand grip strength, light extinction time, vibrometer, visual acuity, audiometry, and serum cholesterol. The study demonstrated that adequate sample data could be obtained, and that statistical treatment could construct a physiologic age for individual subjects. However, the tests were of limited value below age 40, and the validation of the concept of physiologic age requires eventual correlation with mortality. Since the ABCC program includes a highly accurate mortality survey, it is hoped that data on physiologic aging can be collected and eventually related to mortality. 11 references, 3 figures, 6 tables.
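One common way to construct a "physiologic age" from such a test battery, sketched here on synthetic data (the measures, coefficients, and noise levels below are assumptions, not the ABCC values): regress chronological age on the measures, then score each subject with the fitted model.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic stand-in for three of the nine measures: each drifts with age.
n = 437
age = rng.uniform(30, 70, n)
grip = 60 - 0.4 * age + rng.normal(0, 4, n)        # hand grip strength declines
systolic = 100 + 0.6 * age + rng.normal(0, 8, n)   # systolic blood pressure rises
vital_cap = 6.0 - 0.03 * age + rng.normal(0, 0.3, n)  # vital capacity declines

# "Physiologic age" as the age predicted from the test battery.
X = np.column_stack([np.ones(n), grip, systolic, vital_cap])
beta, *_ = np.linalg.lstsq(X, age, rcond=None)
physiologic_age = X @ beta
r = np.corrcoef(age, physiologic_age)[0, 1]        # agreement with chronological age
```

Subjects whose physiologic age exceeds their chronological age would then be candidates for the mortality correlation the abstract calls for.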

  3. Using Bilateral Functional and Anthropometric Tests to Define Symmetry in Cross-Country Skiers

    Directory of Open Access Journals (Sweden)

    Björklund Glenn

    2017-12-01

    The aim of this study was to evaluate the symmetry of anthropometry and muscle function in cross-country skiers and their association to vertical jumping power. Twenty cross-country skiers were recruited (21.7 ± 3.8 yrs, 180.6 ± 7.6 cm, 73.2 ± 7.6 kg). Anthropometric data was obtained using an iDXA scan. VO2max was determined using the diagonal stride technique on a ski treadmill. Bilateral functional tests for the upper and lower body were the handgrip and standing heel-rise tests. Vertical jump height and power were assessed with a counter movement jump. Percent asymmetry was calculated using a symmetry index and four absolute symmetry index levels. At a group level the upper body was more asymmetrical with regard to lean muscle mass (p = 0.022, d = 0.17) and functional strength (p = 0.019, d = 0.51) than the lower body. At an individual level the expected frequencies for absolute symmetry level indexes showed the largest deviation from zero for the heel-rise test (χ2 = 16.97, p = 0.001), while the leg lean mass deviated the least (χ2 = 0.42, p = 0.517). No relationships were observed between absolute symmetry level indexes of the lower body and counter movement jump performance (p > 0.05). As a group the skiers display a more asymmetrical upper body than lower body regarding muscle mass and strength. Interestingly, at the individual level, despite symmetrical lean leg muscle mass, the heel-rise test showed the largest asymmetry. This finding indicates a mismatch in muscle function for the lower body.

  4. Using Bilateral Functional and Anthropometric Tests to Define Symmetry in Cross-Country Skiers.

    Science.gov (United States)

    Björklund, Glenn; Alricsson, Marie; Svantesson, Ulla

    2017-12-01

    The aim of this study was to evaluate the symmetry of anthropometry and muscle function in cross-country skiers and their association to vertical jumping power. Twenty cross-country skiers were recruited (21.7 ± 3.8 yrs, 180.6 ± 7.6 cm, 73.2 ± 7.6 kg). Anthropometric data was obtained using an iDXA scan. VO2max was determined using the diagonal stride technique on a ski treadmill. Bilateral functional tests for the upper and lower body were the handgrip and standing heel-rise tests. Vertical jump height and power were assessed with a counter movement jump. Percent asymmetry was calculated using a symmetry index and four absolute symmetry index levels. At a group level the upper body was more asymmetrical with regard to lean muscle mass (p = 0.022, d = 0.17) and functional strength (p = 0.019, d = 0.51) than the lower body. At an individual level the expected frequencies for absolute symmetry level indexes showed the largest deviation from zero for the heel-rise test (χ2 = 16.97, p = 0.001), while the leg lean mass deviated the least (χ2 = 0.42, p = 0.517). No relationships were observed between absolute symmetry level indexes of the lower body and counter movement jump performance (p > 0.05). As a group the skiers display a more asymmetrical upper body than lower body regarding muscle mass and strength. Interestingly at the individual level, despite symmetrical lean leg muscle mass the heel-rise test showed the largest asymmetry. This finding indicates a mismatch in muscle function for the lower body.
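The symmetry index and absolute symmetry levels used in this study can be sketched as follows. The abstract does not print the exact formula or cutoff values, so the SI definition and level boundaries below are common conventions assumed for illustration.

```python
def symmetry_index(left, right):
    """Percent symmetry index: 0 means perfect symmetry, and the sign shows
    which side is stronger. Assumed convention (not quoted from the paper):
    SI = (right - left) / (0.5 * (right + left)) * 100."""
    return (right - left) / (0.5 * (right + left)) * 100.0

def absolute_symmetry_level(si, cutoffs=(2.5, 5.0, 10.0)):
    """Bin |SI| into four ordinal levels, mirroring the paper's four absolute
    symmetry index levels (the cutoff values here are assumptions)."""
    a = abs(si)
    for level, cut in enumerate(cutoffs):
        if a <= cut:
            return level
    return len(cutoffs)

si = symmetry_index(left=41.0, right=45.0)   # e.g. heel-rise repetitions per leg
level = absolute_symmetry_level(si)
```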

  5. Standardisation of defined approaches for skin sensitisation testing to support regulatory use and international adoption: position of the International Cooperation on Alternative Test Methods.

    Science.gov (United States)

    Casati, S; Aschberger, K; Barroso, J; Casey, W; Delgado, I; Kim, T S; Kleinstreuer, N; Kojima, H; Lee, J K; Lowit, A; Park, H K; Régimbald-Krnel, M J; Strickland, J; Whelan, M; Yang, Y; Zuang, Valérie

    2018-02-01

    Skin sensitisation is the regulatory endpoint that has been at the centre of concerted efforts to replace animal testing in recent years, as demonstrated by the Organisation for Economic Co-operation and Development (OECD) adoption of five non-animal methods addressing mechanisms under the first three key events of the skin sensitisation adverse outcome pathway. Nevertheless, the currently adopted methods, when used in isolation, are not sufficient to fulfil regulatory requirements on the skin sensitisation potential and potency of chemicals comparable to that provided by the regulatory animal tests. For this reason, a number of defined approaches integrating data from these methods with other relevant information have been proposed and documented by the OECD. With the aim to further enhance regulatory consideration and adoption of defined approaches, the European Union Reference Laboratory for Alternatives to Animal testing in collaboration with the International Cooperation on Alternative Test Methods hosted, on 4-5 October 2016, a workshop on the international regulatory applicability and acceptance of alternative non-animal approaches, i.e., defined approaches, to skin sensitisation assessment of chemicals used in a variety of sectors. The workshop convened representatives from more than 20 regulatory authorities from the European Union, United States, Canada, Japan, South Korea, Brazil and China. There was a general consensus among the workshop participants that to maximise global regulatory acceptance of data generated with defined approaches, international harmonisation and standardisation are needed. Potential assessment criteria were defined for a systematic evaluation of existing defined approaches that would facilitate their translation into international standards, e.g., into a performance-based Test Guideline. Informed by the discussions at the workshop, the ICATM members propose practical ways to further promote the regulatory use and facilitate

  6. KIGAM Seafloor Observation System (KISOS) for the baseline study in monitoring of gas hydrate test production in the Ulleung Basin, Korea

    Science.gov (United States)

    Lee, Sung-rock; Chun, Jong-hwa

    2013-04-01

    For the baseline study in monitoring of the gas hydrate test production in the Ulleung Basin, the Korea Institute of Geoscience and Mineral Resources (KIGAM) has developed the KIGAM Seafloor Observation System (KISOS) for seafloor exploration using an unmanned remotely operated vehicle (ROV) connected to a ship by a cable. The KISOS consists of a transponder of an acoustic positioning system (USBL), a bottom-finding pinger, a still camera, a video camera, a water sampler, and measuring devices (methane, oxygen, CTD, and turbidity sensors) mounted on the unmanned ROV, together with a device for collecting sediment on the seafloor. It is very important to monitor the environmental risks (gas leakage and production water/drilling mud discharge) which may occur during the gas hydrate test production drilling. The KISOS will be applied to solely conduct the baseline study with the KIGAM seafloor monitoring system (KIMOS) of the Korean gas hydrate program in the future. The large-scale environmental monitoring program includes environmental impact assessment covering seafloor disturbance and subsidence, detection of methane gas leakage around the well and cold seeps, methane bubbles and dissolved methane, changes in marine environments, chemical factor variation of the water column and seabed, diffusion of drilling mud and production water, and biological factors of biodiversity and marine habitats before and after drilling the test well and in nearby areas. The design of the baseline survey will be determined based on the result of SIMAP simulation in 2013. The baseline survey will be performed to provide baseline data on the gas leakage and production water/drilling mud discharge before and after the gas hydrate test production. The field data of the baseline study will be evaluated by the simulation and verification of the SIMAP simulator in 2014. In the presentation, the authors would like to introduce the configuration of KISOS and its applicability to seafloor observation for the gas hydrate test production in

  7. Defining and Developing "Critical Thinking" Through Devising and Testing Multiple Explanations of the Same Phenomenon

    Science.gov (United States)

    Etkina, Eugenia; Planinšič, Gorazd

    2015-10-01

    Most physics teachers would agree that one of the main reasons for her/his students to take physics is to learn to think critically. However, for years we have been assessing our students mostly on the knowledge of physics content (conceptually and quantitatively). Only recently have science educators started moving systematically towards achieving and assessing this critical thinking goal. In this paper we seek to show how guiding students to devise and test multiple explanations of observed phenomena can be used to improve their critical thinking.

  8. Patch test results with fragrance markers of the baseline series - analysis of the European Surveillance System on Contact Allergies (ESSCA) network 2009-2012

    DEFF Research Database (Denmark)

    Frosch, Peter J; Duus Johansen, Jeanne; Schuttelaar, Marie-Louise A

    2015-01-01

    BACKGROUND: Contact allergy to fragrances is common, and impairs quality of life, particularly in young women. OBJECTIVE: To provide current results on the prevalences of sensitization to fragrance allergens used as markers in the baseline series of most European countries. METHODS: Data of patients consecutively patch tested between 2009 and 2012 in 12 European countries with fragrance allergens contained in the baseline series were collected by the European Surveillance System on Contact Allergies network and descriptively analysed. Four departments used the TRUE Test® system. RESULTS: ... Contact allergy to fragrances is common throughout Europe, with regional variation probably being explained by patch test technique, and differences in exposure and referral patterns. The current basic markers of fragrance sensitivity in the baseline series should be supplemented with additional fragrance...

  9. Patch test results with fragrance markers of the baseline series - analysis of the European Surveillance System on Contact Allergies (ESSCA) network 2009-2012.

    Science.gov (United States)

    Frosch, Peter J; Duus Johansen, Jeanne; Schuttelaar, Marie-Louise A; Silvestre, Juan F; Sánchez-Pérez, Javier; Weisshaar, Elke; Uter, Wolfgang

    2015-09-01

    Contact allergy to fragrances is common, and impairs quality of life, particularly in young women. To provide current results on the prevalences of sensitization to fragrance allergens used as markers in the baseline series of most European countries. Data of patients consecutively patch tested between 2009 and 2012 in 12 European countries with fragrance allergens contained in the baseline series were collected by the European Surveillance System on Contact Allergies network and descriptively analysed. Four departments used the TRUE Test(®) system. The 'basic markers' were tested on 51 477 [fragrance mix II (FM II)] to 57 123 [Myroxylon pereirae, balsam of Peru] patients, and yielded positive reactions as follows: fragrance mix I 6.9%, Myroxylon pereirae 5.4%, FM II 3.8%, colophonium 2.6%, and hydroxyisohexyl 3-cyclohexene carboxaldehyde 1.7%, with some regional differences. Prevalences with TRUE Test(®) allergens were lower. Additional fragrances were tested on 3643 (trimethylbenzenepropanol) to 14 071 (oil of turpentine) patients, and yielded between 2.6% (Cananga odorata) and 0.7% (trimethylbenzenepropanol) positive reactions. Contact allergy to fragrances is common throughout Europe, with regional variation probably being explained by patch test technique, and differences in exposure and referral patterns. The current basic markers of fragrance sensitivity in the baseline series should be supplemented with additional fragrance allergens. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  10. Does the Reporting Quality of Diagnostic Test Accuracy Studies, as Defined by STARD 2015, Affect Citation?

    Science.gov (United States)

    Choi, Young Jun; Chung, Mi Sun; Koo, Hyun Jung; Park, Ji Eun; Yoon, Hee Mang; Park, Seong Ho

    2016-01-01

    To determine the rate with which diagnostic test accuracy studies that are published in a general radiology journal adhere to the Standards for Reporting of Diagnostic Accuracy Studies (STARD) 2015, and to explore the relationship between adherence rate and citation rate while avoiding confounding by journal factors. All eligible diagnostic test accuracy studies that were published in the Korean Journal of Radiology in 2011-2015 were identified. Five reviewers assessed each article for yes/no compliance with 27 of the 30 STARD 2015 checklist items (items 28, 29, and 30 were excluded). The total STARD score (number of fulfilled STARD items) was calculated. The score of the 15 STARD items that related directly to the Quality Assessment of Diagnostic Accuracy Studies (QUADAS)-2 was also calculated. The number of times each article was cited (as indicated by the Web of Science) after publication until March 2016 and the article exposure time (time in months between publication and March 2016) were extracted. Sixty-three articles were analyzed. The mean (range) total and QUADAS-2-related STARD scores were 20.0 (14.5-25) and 11.4 (7-15), respectively. The mean citation number was 4 (0-21). Citation number did not associate significantly with either STARD score after accounting for exposure time (total score: correlation coefficient = 0.154, p = 0.232; QUADAS-2-related score: correlation coefficient = 0.143, p = 0.266). The degree of adherence to STARD 2015 was moderate for this journal, indicating that there is room for improvement. When adjusted for exposure time, the degree of adherence did not affect the citation rate.
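The exposure-time adjustment described above can be approximated with a first-order partial correlation (correlation of score and citations after partialling out exposure time). The sketch below is a hypothetical pure-Python illustration, not the authors' analysis code; the toy STARD scores, citation counts, and exposure times are invented.

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def partial_corr(x, y, z):
    """Correlation of x and y after partialling out z (first-order formula)."""
    rxy, rxz, ryz = pearson(x, y), pearson(x, z), pearson(y, z)
    return (rxy - rxz * ryz) / math.sqrt((1 - rxz ** 2) * (1 - ryz ** 2))

# Hypothetical toy data: STARD scores, citation counts, exposure time (months)
stard = [18, 20, 22, 24, 19, 21]
cites = [2, 5, 6, 9, 3, 7]
months = [10, 30, 35, 55, 15, 40]
print(round(partial_corr(stard, cites, months), 3))
```

The point of the adjustment is that older articles have had more time to accrue citations, so raw correlations with citation counts are confounded by exposure time.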

  11. Does the reporting quality of diagnostic test accuracy studies, as defined by STARD 2015, affect citation?

    International Nuclear Information System (INIS)

    Choi, Young Jun; Chung, Mi Sun; Koo, Hyun Jung; Park, Ji Eun; Yoon, Hee Mang; Park, Seong Ho

    2016-01-01

    To determine the rate with which diagnostic test accuracy studies that are published in a general radiology journal adhere to the Standards for Reporting of Diagnostic Accuracy Studies (STARD) 2015, and to explore the relationship between adherence rate and citation rate while avoiding confounding by journal factors. All eligible diagnostic test accuracy studies that were published in the Korean Journal of Radiology in 2011–2015 were identified. Five reviewers assessed each article for yes/no compliance with 27 of the 30 STARD 2015 checklist items (items 28, 29, and 30 were excluded). The total STARD score (number of fulfilled STARD items) was calculated. The score of the 15 STARD items that related directly to the Quality Assessment of Diagnostic Accuracy Studies (QUADAS)-2 was also calculated. The number of times each article was cited (as indicated by the Web of Science) after publication until March 2016 and the article exposure time (time in months between publication and March 2016) were extracted. Sixty-three articles were analyzed. The mean (range) total and QUADAS-2-related STARD scores were 20.0 (14.5–25) and 11.4 (7–15), respectively. The mean citation number was 4 (0–21). Citation number did not associate significantly with either STARD score after accounting for exposure time (total score: correlation coefficient = 0.154, p = 0.232; QUADAS-2-related score: correlation coefficient = 0.143, p = 0.266). The degree of adherence to STARD 2015 was moderate for this journal, indicating that there is room for improvement. When adjusted for exposure time, the degree of adherence did not affect the citation rate

  12. Does the reporting quality of diagnostic test accuracy studies, as defined by STARD 2015, affect citation?

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Young Jun; Chung, Mi Sun; Koo, Hyun Jung; Park, Ji Eun; Yoon, Hee Mang; Park, Seong Ho [Dept. of Radiology and Research Institute of Radiology, University of Ulsan College of Medicine, Asan Medical Center, Seoul (Korea, Republic of)

    2016-09-15

    To determine the rate with which diagnostic test accuracy studies that are published in a general radiology journal adhere to the Standards for Reporting of Diagnostic Accuracy Studies (STARD) 2015, and to explore the relationship between adherence rate and citation rate while avoiding confounding by journal factors. All eligible diagnostic test accuracy studies that were published in the Korean Journal of Radiology in 2011–2015 were identified. Five reviewers assessed each article for yes/no compliance with 27 of the 30 STARD 2015 checklist items (items 28, 29, and 30 were excluded). The total STARD score (number of fulfilled STARD items) was calculated. The score of the 15 STARD items that related directly to the Quality Assessment of Diagnostic Accuracy Studies (QUADAS)-2 was also calculated. The number of times each article was cited (as indicated by the Web of Science) after publication until March 2016 and the article exposure time (time in months between publication and March 2016) were extracted. Sixty-three articles were analyzed. The mean (range) total and QUADAS-2-related STARD scores were 20.0 (14.5–25) and 11.4 (7–15), respectively. The mean citation number was 4 (0–21). Citation number did not associate significantly with either STARD score after accounting for exposure time (total score: correlation coefficient = 0.154, p = 0.232; QUADAS-2-related score: correlation coefficient = 0.143, p = 0.266). The degree of adherence to STARD 2015 was moderate for this journal, indicating that there is room for improvement. When adjusted for exposure time, the degree of adherence did not affect the citation rate.

  13. An Optical Receiver Post Processing System for the Integrated Radio and Optical Communications Software Defined Radio Test Bed

    Science.gov (United States)

    Nappier, Jennifer M.; Tokars, Roger P.; Wroblewski, Adam C.

    2016-01-01

    The Integrated Radio and Optical Communications (iROC) project at the National Aeronautics and Space Administration's (NASA) Glenn Research Center is investigating the feasibility of a hybrid radio frequency (RF) and optical communication system for future deep space missions. As a part of this investigation, a test bed for a radio frequency (RF) and optical software defined radio (SDR) has been built. Receivers and modems for the NASA deep space optical waveform are not commercially available so a custom ground optical receiver system has been built. This paper documents the ground optical receiver, which is used in order to test the RF and optical SDR in a free space optical communications link.

  14. An Optical Receiver Post-Processing System for the Integrated Radio and Optical Communications Software Defined Radio Test Bed

    Science.gov (United States)

    Nappier, Jennifer M.; Tokars, Roger P.; Wroblewski, Adam C.

    2016-01-01

    The Integrated Radio and Optical Communications (iROC) project at the National Aeronautics and Space Administration's (NASA) Glenn Research Center is investigating the feasibility of a hybrid radio frequency (RF) and optical communication system for future deep space missions. As a part of this investigation, a test bed for a radio frequency (RF) and optical software defined radio (SDR) has been built. Receivers and modems for the NASA deep space optical waveform are not commercially available so a custom ground optical receiver system has been built. This paper documents the ground optical receiver, which is used in order to test the RF and optical SDR in a free space optical communications link.

  15. Baseline conditions at Olkiluoto

    International Nuclear Information System (INIS)

    2003-09-01

    The main purpose of this report is to establish a reference point - defined as the data collected up until the end of year 2002 - for the coming phases of the Finnish spent nuclear fuel disposal programme. The focus is: to define the current surface and underground conditions at the site, both as regards the properties for which a change is expected and for the properties which are of particular interest for long-term safety or environmental impact; to establish, as far as possible, the natural fluctuation of properties that are potentially affected by construction of the underground laboratory, the ONKALO; and to provide references to data on parameters for use in model development and testing, and to use models to assist in understanding and interpreting the data. The emphasis of the baseline description is on bedrock characteristics that are relevant to the long-term safety of a spent fuel repository and, hence, to include the hydrogeological, hydrogeochemical, rock mechanical, tectonic and seismic conditions of the site. The construction of the ONKALO will also affect some conditions on the surface, and, therefore, a description of the main characteristics of the nature and the man-made constructions at Olkiluoto is also given. This report is primarily a road map to the available information on the prevailing conditions at the Olkiluoto site and a framework for understanding of data collected. Hence, it refers to numerous available background reports and other archived information produced over the past 20 years or more, and forms a recapitulation and re-evaluation of the characterisation data of the Olkiluoto site. (orig.)

  16. Defining surgical criteria for empty nose syndrome: Validation of the office-based cotton test and clinical interpretability of the validated Empty Nose Syndrome 6-Item Questionnaire.

    Science.gov (United States)

    Thamboo, Andrew; Velasquez, Nathalia; Habib, Al-Rahim R; Zarabanda, David; Paknezhad, Hassan; Nayak, Jayakar V

    2017-08-01

    The validated Empty Nose Syndrome 6-Item Questionnaire (ENS6Q) identifies empty nose syndrome (ENS) patients. The unvalidated cotton test assesses improvement in ENS-related symptoms. By first validating the cotton test using the ENS6Q, we define the minimal clinically important difference (MCID) score for the ENS6Q. Individual case-control study. Fifteen patients diagnosed with ENS and 18 controls with non-ENS sinonasal conditions underwent office cotton placement. Both groups completed ENS6Q testing in three conditions (precotton, cotton in situ, and postcotton) to measure the reproducibility of ENS6Q scoring. Participants also completed a five-item transition scale ranging from "much better" to "much worse" to rate subjective changes in nasal breathing with and without cotton placement. Mean changes for each transition point, and the ENS6Q MCID, were then calculated. In the precotton condition, significant differences (P < .001) in all ENS6Q questions between ENS and controls were noted. With cotton in situ, nearly all prior ENS6Q differences normalized between ENS and control patients. For ENS patients, the changes in the mean differences between the precotton and cotton in situ conditions compared to postcotton versus cotton in situ conditions were insignificant among individuals. Including all 33 participants, the mean change in the ENS6Q between the parameters "a little better" and "about the same" was 4.25 (standard deviation [SD] = 5.79) and -2.00 (SD = 3.70), giving an MCID of 6.25. Cotton testing is a validated office test to assess for ENS patients. Cotton testing also helped to determine the MCID of the ENS6Q, which is a 7-point change from the baseline ENS6Q score. 3b. Laryngoscope, 127:1746-1752, 2017. © 2017 The American Laryngological, Rhinological and Otological Society, Inc.
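The anchor-based MCID derivation reported above (mean ENS6Q change of 4.25 for "a little better" versus -2.00 for "about the same") reduces to simple arithmetic: the MCID is the separation between the two adjacent transition anchors. A minimal sketch:

```python
# Anchor-based MCID: the gap between the mean score change of patients who
# report minimal improvement and those who report no change.
def anchor_based_mcid(mean_change_little_better, mean_change_about_same):
    return mean_change_little_better - mean_change_about_same

# Values reported in the abstract above:
mcid = anchor_based_mcid(4.25, -2.00)
print(mcid)  # 6.25, reported clinically as a 7-point change after rounding up
```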

  17. A pilot investigation of the Motivation Behaviors Checklist (MBC): An observational rating scale of effort towards testing for baseline sports-concussion assessment.

    Science.gov (United States)

    Rabinowitz, Amanda R; Merritt, Victoria; Arnett, Peter A

    2016-08-01

    Baseline neuropsychological testing is commonly used in the management of sports-related concussion. However, underperformance due to poor effort could lead to invalid conclusions regarding postconcussion cognitive decline. We designed the Motivation Behaviors Checklist (MBC) as an observational rating scale to assess effort towards baseline neuropsychological testing. Here we present preliminary data in support of its reliability and validity. MBC items were generated based on the consensus of a panel of graduate students, undergraduates, and a clinical neuropsychologist who conduct neuropsychological evaluations for a sports concussion management program. A total of 261 college athletes were administered a standard neuropsychological test battery in addition to the MBC. A subset of evaluations (n = 101) was videotaped and viewed by a second rater. Exploratory factor analysis (EFA) was used to refine the scale, and reliability and validity were evaluated. EFA revealed that the MBC items represent four latent factors: Complaints, Poor Focus, Psychomotor Agitation, and Impulsivity. Reliability analyses demonstrated that the MBC has good inter-rater reliability (intraclass correlation coefficient, ICC = .767) and internal consistency (α = .839). The construct validity of the MBC is supported by large correlations with examiners' ratings of effort (ρ = -.623) and medium-sized relationships with cognitive performance and self-ratings of effort (|ρ| between .263 and .345). Discriminant validity was supported by nonsignificant correlations with measures of depression and postconcussion symptoms (ρ = .056 and .082, respectively). These findings provide preliminary evidence that the MBC could be a useful adjunct to baseline neuropsychological evaluations for sports-concussion management.
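The internal-consistency figure above (α = .839) is Cronbach's alpha, which compares the summed item variances against the variance of the total score. A minimal pure-Python sketch on hypothetical item ratings (not the MBC data):

```python
# Cronbach's alpha on hypothetical checklist ratings (3 items, 5 athletes).
def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)  # sample variance

def cronbach_alpha(items):
    """items: one inner list of scores per scale item, aligned by subject."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]  # per-subject total score
    item_var = sum(variance(item) for item in items)
    return k / (k - 1) * (1 - item_var / variance(totals))

# Hypothetical ratings for three items across five athletes:
items = [[1, 2, 3, 4, 5], [2, 2, 3, 5, 5], [1, 3, 3, 4, 4]]
print(round(cronbach_alpha(items), 3))
```

Alpha rises when items co-vary (subjects scored high on one item tend to score high on the others), which is exactly what "internal consistency" measures.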

  18. C-Peptide, Baseline and Postprandial Insulin Resistance after a Carbohydrate-Rich Test Meal - Evidence for an Increased Insulin Clearance in PCOS Patients?

    Science.gov (United States)

    Stassek, J; Erdmann, J; Ohnolz, F; Berg, F D; Kiechle, M; Seifert-Klauss, V

    2017-01-01

    Introduction Known characteristics of patients with PCOS include infertility, menstrual disorders, hirsutism and also often insulin resistance. These symptoms increase with increasing body weight. In the LIPCOS study (Lifestyle Intervention for Patients with Polycystic Ovary Syndrome [PCOS]) long-term changes of the PCOS in dependence on pregnancy and parenthood were systematically assessed. In the framework of the LIPCOS study, PCOS patients were given a standardised carbohydrate-rich test meal in order to examine glucose homeostasis and insulin secretion. The results were compared with those of a eumenorrhoeic control group who all had corresponding BMI values and corresponding ages. Methods and Patients 41 PCOS patients (without diabetes) and 68 controls received a standardised carbohydrate-rich test meal (260 kcal, 62% carbohydrates, 32% fat, 6% proteins) in order to generate a submaximal insulin and glucose stimulation. The values were determined at baseline and postprandial after 60, 120 and 180 minutes. In addition, the corresponding C-peptide levels were recorded. Results In the PCOS patients (n = 41), the insulin secretion test after a standardised test meal showed almost identical baseline and postprandial insulin levels when compared with those of the age- and BMI-matched eumenorrhoeic controls (n = 68). In the PCOS patients, the baseline and postprandial glucose levels were significantly elevated (92.88 ± 10.28 [PCOS] vs. 85.07 ± 9.42 mg/dL [controls]; p PCOS patients formally exhibit a higher fasting insulin resistance than controls. In spite of the higher stimulated C-peptide levels, the insulin levels did not increase more strongly with increasing glucose levels than in controls which may be indicative of a higher insulin clearance in PCOS patients.

  19. Testing environment shape differentially modulates baseline and nicotine-induced changes in behavior: Sex differences, hypoactivity, and behavioral sensitization.

    Science.gov (United States)

    Illenberger, J M; Mactutus, C F; Booze, R M; Harrod, S B

    2018-02-01

    In those who use nicotine, the likelihood of dependence, negative health consequences, and failed treatment outcomes differ as a function of gender. Women may be more sensitive to learning processes driven by repeated nicotine exposure that influence conditioned approach and craving. Sex differences in nicotine's influence over overt behaviors (i.e. hypoactivity or behavioral sensitization) can be examined using passive drug administration models in male and female rats. Following repeated intravenous (IV) nicotine injections, behavioral sensitization is enhanced in female rats compared to males. Nonetheless, characteristics of the testing environment also mediate rodent behavior following drug administration. The current experiment used a within-subjects design to determine if nicotine-induced changes in horizontal activity, center entries, and rearing displayed by male and female rats are detected when behavior was recorded in round vs. square chambers. Behaviors were recorded from each group (males-round: n=19; males-square: n=18; females-square: n=19; and females-round: n=19) immediately following IV injection of saline, acute nicotine, and repeated nicotine (0.05 mg/kg/injection). Prior to nicotine treatment, sex differences were apparent only in round chambers. Following nicotine administration, the order of magnitude for the chamber that provided enhanced detection of hypoactivity or sensitization was contingent upon both the dependent measure under examination and the animal's biological sex. As such, round and square testing chambers provide different, and sometimes contradictory, accounts of how male and female rats respond to nicotine treatment. It is possible that a central mechanism such as stress or cue sensitivity is impacted by both drug exposure and environment to drive the sex differences observed in the current experiment. Until these complex relations are better understood, experiments considering sex differences in drug responses should balance

  20. Resting-state test-retest reliability of a priori defined canonical networks over different preprocessing steps.

    Science.gov (United States)

    Varikuti, Deepthi P; Hoffstaedter, Felix; Genon, Sarah; Schwender, Holger; Reid, Andrew T; Eickhoff, Simon B

    2017-04-01

    Resting-state functional connectivity analysis has become a widely used method for the investigation of human brain connectivity and pathology. The measurement of neuronal activity by functional MRI, however, is impeded by various nuisance signals that reduce the stability of functional connectivity. Several methods exist to address this predicament, but little consensus has yet been reached on the most appropriate approach. Given the crucial importance of reliability for the development of clinical applications, we here investigated the effect of various confound removal approaches on the test-retest reliability of functional-connectivity estimates in two previously defined functional brain networks. Our results showed that gray matter masking improved the reliability of connectivity estimates, whereas denoising based on principal components analysis reduced it. We additionally observed that refraining from using any correction for global signals provided the best test-retest reliability, but failed to reproduce anti-correlations between what have been previously described as antagonistic networks. This suggests that improved reliability can come at the expense of potentially poorer biological validity. Consistent with this, we observed that reliability was proportional to the retained variance, which presumably included structured noise, such as reliable nuisance signals (for instance, noise induced by cardiac processes). We conclude that compromises are necessary between maximizing test-retest reliability and removing variance that may be attributable to non-neuronal sources.

  1. Resting-state test-retest reliability of a priori defined canonical networks over different preprocessing steps

    Science.gov (United States)

    Varikuti, Deepthi P.; Hoffstaedter, Felix; Genon, Sarah; Schwender, Holger; Reid, Andrew T.; Eickhoff, Simon B.

    2016-01-01

    Resting-state functional connectivity analysis has become a widely used method for the investigation of human brain connectivity and pathology. The measurement of neuronal activity by functional MRI, however, is impeded by various nuisance signals that reduce the stability of functional connectivity. Several methods exist to address this predicament, but little consensus has yet been reached on the most appropriate approach. Given the crucial importance of reliability for the development of clinical applications, we here investigated the effect of various confound removal approaches on the test-retest reliability of functional-connectivity estimates in two previously defined functional brain networks. Our results showed that grey matter masking improved the reliability of connectivity estimates, whereas de-noising based on principal components analysis reduced it. We additionally observed that refraining from using any correction for global signals provided the best test-retest reliability, but failed to reproduce anti-correlations between what have been previously described as antagonistic networks. This suggests that improved reliability can come at the expense of potentially poorer biological validity. Consistent with this, we observed that reliability was proportional to the retained variance, which presumably included structured noise, such as reliable nuisance signals (for instance, noise induced by cardiac processes). We conclude that compromises are necessary between maximizing test-retest reliability and removing variance that may be attributable to non-neuronal sources. PMID:27550015

  2. Classification of soft-shell materials for leisure outdoor jackets by clo defined from thermal properties testing

    Science.gov (United States)

    Tesinova, P.; Steklova, P.; Duchacova, T.

    2017-10-01

    Materials for outdoor activities are produced in various combinations, and lamination helps to combine two or more components for gaining high comfort properties and lightening the structure. Producers can choose the exact material suitable for the construction of a part, or a set, of so-called layered clothing for the expected activity. Decreasing the weight of materials while preserving high water-vapour permeability, wind resistivity, hydrostatic resistivity and other comfort and usage properties is a major task nowadays. This paper is focused on thermal properties as an important parameter for comfort during outdoor activities. Softshell materials were chosen for testing and computation of clo. Results compared with the standardised clo table help us to classify the thermal insulation of the set of fabrics when defining the proper clothing category.
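Clo values can be computed from measured thermal resistance via the definition 1 clo = 0.155 m²·K/W. A minimal sketch with hypothetical Rct values (not the paper's measurements):

```python
# Converting measured thermal resistance Rct (m2·K/W) into clo units.
# Definition: 1 clo = 0.155 m2·K/W. Sample Rct values below are hypothetical.
CLO_IN_SI = 0.155  # m2·K/W per clo

def to_clo(rct):
    return rct / CLO_IN_SI

samples = {"softshell A": 0.031, "softshell B": 0.0465}  # hypothetical Rct
for name, rct in samples.items():
    print(f"{name}: {to_clo(rct):.2f} clo")
```

The resulting clo values can then be compared against a standardised clo table to place each fabric in a clothing category, as the abstract describes.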

  3. Birth weight predicted baseline muscular efficiency, but not response of energy expenditure to calorie restriction: An empirical test of the predictive adaptive response hypothesis.

    Science.gov (United States)

    Workman, Megan; Baker, Jack; Lancaster, Jane B; Mermier, Christine; Alcock, Joe

    2016-07-01

    Aiming to test the evolutionary significance of relationships linking prenatal growth conditions to adult phenotypes, this study examined whether birth size predicts energetic savings during fasting. We specifically tested a Predictive Adaptive Response (PAR) model that predicts greater energetic saving among adults who were born small. Data were collected from a convenience sample of young adults living in Albuquerque, NM (n = 34). Indirect calorimetry quantified changes in resting energy expenditure (REE) and active muscular efficiency that occurred in response to a 29-h fast. Multiple regression analyses linked birth weight to baseline and postfast metabolic values while controlling for appropriate confounders (e.g., sex, body mass). Birth weight did not moderate the relationship between body size and energy expenditure, nor did it predict the magnitude change in REE or muscular efficiency observed from baseline to after fasting. Alternative indicators of birth size were also examined (e.g., low v. normal birth weight, comparison of tertiles), with no effects found. However, baseline muscular efficiency improved by 1.1% per 725 g (S.D.) increase in birth weight (P = 0.037). Birth size did not influence the sensitivity of metabolic demands to fasting, neither at rest nor during activity. Moreover, small birth size predicted a reduction in the efficiency with which muscles convert energy expended into work accomplished. These results do not support the ascription of adaptive function to phenotypes associated with small birth size. © 2015 Wiley Periodicals, Inc. Am. J. Hum. Biol. 28:484-492, 2016.

  4. The potential of clustering methods to define intersection test scenarios: Assessing real-life performance of AEB.

    Science.gov (United States)

    Sander, Ulrich; Lubbe, Nils

    2018-04-01

    Intersection accidents are frequent and harmful. The accident types 'straight crossing path' (SCP), 'left turn across path - oncoming direction' (LTAP/OD), and 'left-turn across path - lateral direction' (LTAP/LD) represent around 95% of all intersection accidents and one-third of all police-reported car-to-car accidents in Germany. The European New Car Assessment Program (Euro NCAP) have announced that intersection scenarios will be included in their rating from 2020; however, how these scenarios are to be tested has not been defined. This study investigates whether clustering methods can be used to identify a small number of test scenarios sufficiently representative of the accident dataset to evaluate Intersection Automated Emergency Braking (AEB). Data from the German In-Depth Accident Study (GIDAS) and the GIDAS-based Pre-Crash Matrix (PCM) from 1999 to 2016, containing 784 SCP and 453 LTAP/OD accidents, were analyzed with principal component methods to identify variables that account for the relevant total variances of the sample. Three different methods for data clustering were applied to each of the accident types, two similarity-based approaches, namely Hierarchical Clustering (HC) and Partitioning Around Medoids (PAM), and the probability-based Latent Class Clustering (LCC). The optimum number of clusters was derived for HC and PAM with the silhouette method. The PAM algorithm was both initiated with random start medoid selection and medoids from HC. For LCC, the Bayesian Information Criterion (BIC) was used to determine the optimal number of clusters. Test scenarios were defined from optimal cluster medoids weighted by their real-life representation in GIDAS. The set of variables for clustering was further varied to investigate the influence of variable type and character. We quantified how accurately each cluster variation represents real-life AEB performance using pre-crash simulations with PCM data and a generic algorithm for AEB intervention. The
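The silhouette method mentioned above chooses the cluster count by scoring each point's within-cluster cohesion against its separation from the nearest other cluster. A minimal pure-Python sketch on hypothetical 1-D data (invented relative speeds, not GIDAS/PCM values):

```python
# Mean silhouette width of a clustering; higher means better-separated clusters.
def mean_silhouette(points, labels):
    n = len(points)
    dist = lambda a, b: abs(a - b)
    scores = []
    for i in range(n):
        same = [j for j in range(n) if j != i and labels[j] == labels[i]]
        if not same:  # singleton cluster: silhouette conventionally 0
            scores.append(0.0)
            continue
        a = sum(dist(points[i], points[j]) for j in same) / len(same)
        b = min(  # mean distance to the nearest other cluster
            sum(dist(points[i], points[j]) for j in range(n) if labels[j] == l)
            / sum(1 for j in range(n) if labels[j] == l)
            for l in set(labels) if l != labels[i]
        )
        scores.append((b - a) / max(a, b))
    return sum(scores) / n

# Two well-separated hypothetical groups of approach speeds (km/h):
speeds = [10, 12, 11, 50, 52, 51]
two = [0, 0, 0, 1, 1, 1]      # candidate clustering with k = 2
three = [0, 0, 1, 2, 2, 2]    # candidate clustering with k = 3
print(round(mean_silhouette(speeds, two), 3),
      round(mean_silhouette(speeds, three), 3))
```

Here the k = 2 partition scores much higher, so the silhouette criterion would select two clusters, mirroring how the study picks the optimum cluster number for HC and PAM.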

  5. Recommendation to increase the test concentration of methylchloroisothiazolinone/methylisothiazolinone in the European baseline patch test series - on behalf of the European Society of Contact Dermatitis and the European Environmental and Contact Dermatitis Research Group.

    Science.gov (United States)

    Bruze, Magnus; Goossens, An; Isaksson, Marléne

    2014-07-01

    Methylchloroisothiazolinone (MCI)/methylisothiazolinone (MI) in aqua is present in the European baseline patch test series at 100 ppm, whereas 200 ppm has been used in Sweden since 1986, in Spain in the late 1980s, and, in recent years, also in the United Kingdom and Ireland. With regard to MCI/MI, to investigate the data on contact allergy rates in dermatitis patients, the frequencies of allergic contact dermatitis in the same group, and adverse reactions, particularly patch test sensitization in tested dermatitis patients, and to find the optimal patch test concentration as dose in mg/cm². We performed a survey of the literature found via the National Library of Medicine (PubMed, http://www.ncbi.nlm.nih.gov/pubmed, last accessed 20 February 2014). MCI/MI at 200 ppm aq. diagnoses substantially more contact allergy and allergic contact dermatitis, without any registered increase in patch test sensitization, than the presently used concentration of 100 ppm. MCI/MI at 200 ppm aq. is recommended to be included in the European baseline patch test series. To avoid patch test sensitization, a dose of 0.006 mg/cm² must not be exceeded, which means a volume of 15 µl for Finn Chambers® (Ø 8 mm). © 2014 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
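The 0.006 mg/cm² ceiling and the 15 µl figure for an 8 mm Finn Chamber can be checked with simple arithmetic, since a 200 ppm aqueous preparation carries roughly 0.2 mg of MCI/MI per millilitre:

```python
import math

def dose_per_area(conc_ppm, volume_ul, chamber_diameter_mm):
    """Patch test dose in mg/cm2 for an aqueous preparation.

    1 ppm aqueous ~ 0.001 mg/mL, so 200 ppm ~ 0.2 mg/mL.
    """
    mass_mg = conc_ppm * 1e-3 * volume_ul * 1e-3        # (mg/mL) * mL
    area_cm2 = math.pi * (chamber_diameter_mm / 20) ** 2  # radius converted to cm
    return mass_mg / area_cm2

# 15 uL of MCI/MI 200 ppm aq. in an 8 mm Finn Chamber:
print(round(dose_per_area(200, 15, 8), 4))
```

With these inputs the dose comes out just under 0.006 mg/cm², consistent with the recommendation that 15 µl is the maximum safe volume at 200 ppm.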

  6. Nonlinear Dynamic Inversion Baseline Control Law: Flight-Test Results for the Full-scale Advanced Systems Testbed F/A-18 Airplane

    Science.gov (United States)

    Miller, Christopher J.

    2011-01-01

    A model reference nonlinear dynamic inversion control law has been developed to provide a baseline controller for research into simple adaptive elements for advanced flight control laws. This controller has been implemented and tested in a hardware-in-the-loop simulation and in flight. The flight results agree well with the simulation predictions and show good handling qualities throughout the tested flight envelope with some noteworthy deficiencies highlighted both by handling qualities metrics and pilot comments. Many design choices and implementation details reflect the requirements placed on the system by the nonlinear flight environment and the desire to keep the system as simple as possible to easily allow the addition of the adaptive elements. The flight-test results and how they compare to the simulation predictions are discussed, along with a discussion about how each element affected pilot opinions. Additionally, aspects of the design that performed better than expected are presented, as well as some simple improvements that will be suggested for follow-on work.

  7. A SYSTEMATIC SEARCH FOR PERIODICALLY VARYING QUASARS IN PAN-STARRS1: AN EXTENDED BASELINE TEST IN MEDIUM DEEP SURVEY FIELD MD09

    Energy Technology Data Exchange (ETDEWEB)

    Liu, T.; Gezari, S. [Department of Astronomy, University of Maryland, College Park, MD 20742 (United States); Burgett, W. [GMTO Corp, 465 N. Halstead St, Suite 250, Pasadena, CA 91107 (United States); Chambers, K.; Hodapp, K.; Huber, M.; Kudritzki, R.-P.; Magnier, E.; Tonry, J.; Wainscoat, R.; Waters, C. [Institute for Astronomy, University of Hawaii at Manoa, 2680 Woodlawn Drive, Honolulu, HI 96822 (United States); Draper, P.; Metcalfe, N., E-mail: tingting@astro.umd.edu [Department of Physics, University of Durham, South Road, Durham DH1 3LE (United Kingdom)

    2016-12-10

    We present a systematic search for periodically varying quasars and supermassive black hole binary (SMBHB) candidates in the Pan-STARRS1 (PS1) Medium Deep Survey’s MD09 field. From a color-selected sample of 670 quasars extracted from a multi-band deep-stack catalog of point sources, we locally select variable quasars and look for coherent periods with the Lomb–Scargle periodogram. Three candidates from our sample demonstrate strong variability for more than ∼3 cycles, and their PS1 light curves are well fitted to sinusoidal functions. We test the persistence of the candidates’ apparent periodic variations detected during the 4.2 years of the PS1 survey with archival photometric data from the SDSS Stripe 82 survey or new monitoring with the Large Monolithic Imager at the Discovery Channel Telescope. None of the three periodic candidates (including PSO J334.2028+1.4075) remain persistent over the extended baseline of 7–14 years, corresponding to a detection rate of <1 in 670 quasars in a search area of ≈5 deg². Even though SMBHBs should be a common product of the hierarchical growth of galaxies, and periodic variability in SMBHBs has been theoretically predicted, a systematic search for such signatures in a large optical survey is strongly limited by its temporal baseline and the “red noise” associated with normal quasar variability. We show that follow-up long-term monitoring (≳5 cycles) is crucial to our search for these systems.
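The Lomb–Scargle periodogram used for this period search handles the irregular sampling of survey light curves directly, without interpolation. A minimal NumPy sketch of the classic periodogram on a hypothetical irregularly sampled light curve (invented data, not PS1 photometry):

```python
import numpy as np

def lomb_scargle(t, y, freqs):
    """Classic Lomb-Scargle periodogram; `freqs` are angular frequencies."""
    y = y - y.mean()
    power = np.empty(len(freqs))
    for i, w in enumerate(freqs):
        # Time offset tau makes the sine/cosine terms orthogonal.
        tau = np.arctan2(np.sum(np.sin(2 * w * t)),
                         np.sum(np.cos(2 * w * t))) / (2 * w)
        c = np.cos(w * (t - tau))
        s = np.sin(w * (t - tau))
        power[i] = 0.5 * ((y @ c) ** 2 / (c @ c) + (y @ s) ** 2 / (s @ s))
    return power

# Hypothetical irregularly sampled light curve with a 60-day period:
rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0, 1500, 150))
y = np.sin(2 * np.pi * t / 60) + 0.3 * rng.standard_normal(150)

periods = np.linspace(20, 200, 2000)
power = lomb_scargle(t, y, 2 * np.pi / periods)
best = periods[np.argmax(power)]
print(f"best period ~ {best:.1f} days")
```

As the abstract cautions, a strong peak over a few cycles is not proof of a true period: red-noise quasar variability can mimic short coherent signals, which is why the extended-baseline follow-up matters.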

  8. Polytrauma Defined by the New Berlin Definition: A Validation Test Based on Propensity-Score Matching Approach.

    Science.gov (United States)

    Rau, Cheng-Shyuan; Wu, Shao-Chun; Kuo, Pao-Jen; Chen, Yi-Chun; Chien, Peng-Chen; Hsieh, Hsiao-Yun; Hsieh, Ching-Hua

    2017-09-11

    Background: Polytrauma patients are expected to have a higher risk of mortality than that obtained by the summation of expected mortality owing to their individual injuries. This study was designed to investigate the outcome of patients with polytrauma, which was defined using the new Berlin definition, as cases with an Abbreviated Injury Scale (AIS) ≥ 3 for two or more different body regions and one or more additional variables from five physiologic parameters (hypotension [systolic blood pressure ≤ 90 mmHg], unconsciousness [Glasgow Coma Scale score ≤ 8], acidosis [base excess ≤ -6.0], coagulopathy [partial thromboplastin time ≥ 40 s or international normalized ratio ≥ 1.4], and age [≥70 years]). Methods: We retrieved detailed data on 369 polytrauma patients and 1260 non-polytrauma patients with an overall Injury Severity Score (ISS) ≥ 18 who were hospitalized between 1 January 2009 and 31 December 2015 for the treatment of all traumatic injuries, from the Trauma Registry System at a level I trauma center. Patients with burn injury or incomplete registered data were excluded. Categorical data were compared with two-sided Fisher exact or Pearson chi-square tests. The unpaired Student t-test and the Mann-Whitney U-test were used to analyze normally distributed continuous data and non-normally distributed data, respectively. A 1:1 propensity score-matched cohort was allocated using the NCSS software with logistic regression to evaluate the effect of polytrauma on patient outcomes. Results: The polytrauma patients had a significantly higher ISS than non-polytrauma patients (median (interquartile range Q1-Q3), 29 (22-36) vs. 24 (20-25), respectively; p Polytrauma patients had a 1.9-fold higher odds of mortality than non-polytrauma patients (95% CI 1.38-2.49; p polytrauma patients, polytrauma patients had a substantially longer hospital length of stay (LOS). In addition, a higher proportion of polytrauma patients were admitted to the intensive
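The Berlin criteria quoted above amount to a simple conjunction and can be written down directly. A minimal sketch; the function name, argument names, and input shape are illustrative assumptions, not the registry's actual schema:

```python
def is_polytrauma_berlin(ais_by_region, sbp, gcs, base_excess, ptt_s, inr, age):
    """Apply the new Berlin definition as summarized in the abstract:
    AIS >= 3 in two or more body regions, plus at least one of five
    physiologic criteria."""
    serious_regions = sum(1 for severity in ais_by_region.values() if severity >= 3)
    physiologic = (
        sbp <= 90                       # hypotension (systolic BP, mmHg)
        or gcs <= 8                     # unconsciousness (Glasgow Coma Scale)
        or base_excess <= -6.0          # acidosis
        or ptt_s >= 40 or inr >= 1.4    # coagulopathy
        or age >= 70                    # age criterion
    )
    return serious_regions >= 2 and physiologic

# Two AIS-3+ regions plus hypotension -> polytrauma under this definition.
print(is_polytrauma_berlin({"head": 4, "chest": 3}, sbp=85, gcs=14,
                           base_excess=-2.0, ptt_s=32, inr=1.1, age=45))  # True
```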

  9. Evolution of short cognitive test performance in stroke patients with vascular cognitive impairment and vascular dementia: Baseline evaluation and follow-up

    Science.gov (United States)

    Custodio, Nilton; Montesinos, Rosa; Lira, David; Herrera-Perez, Eder; Bardales, Yadira; Valeriano-Lorenzo, Lucia

    2017-01-01

    ABSTRACT. There is limited evidence about the progression of cognitive performance during the post-stroke stage. Objective: To assess the evolution of cognitive performance in stroke patients without vascular cognitive impairment (VCI), patients with vascular mild cognitive impairment (MCI), and patients with vascular dementia (VD). Methods: A prospective cohort of stroke outpatients from two secondary medical centers in Lima, Peru was studied. We performed standardized evaluations at definitive diagnosis (baseline evaluation), and control follow-ups at 6 and 12 months, including a battery of short cognitive tests: Clinical Dementia Rating (CDR), Addenbrooke's Cognitive Examination (ACE), and INECO Frontal Screening (IFS). Results: 152 outpatients completed the follow-up, showing progressive increase in mean score on the CDR(0.34 to 0.46), contrary to the pattern observed on the ACE and IFS (78.18 to 76.48 and 23.63 to 22.24). The box plot for the CDR test showed that VCI patients had progressive worsening (0.79 to 0.16). Conversely, this trend was not observed in subjects without VCI. The box plot for the ACE and IFS showed that, for the majority of the differentiated stroke types, both non-VCI and VCI patients had progressive worsening. Conclusion: According to both ACE and IFS results during a 1-year follow-up, the cognitive performance of stroke patients worsened, a trend which was particularly consistent in infarction-type stroke patients. PMID:29354218

  10. Evolution of short cognitive test performance in stroke patients with vascular cognitive impairment and vascular dementia: Baseline evaluation and follow-up

    Directory of Open Access Journals (Sweden)

    Nilton Custodio

    Full Text Available ABSTRACT. There is limited evidence about the progression of cognitive performance during the post-stroke stage. Objective: To assess the evolution of cognitive performance in stroke patients without vascular cognitive impairment (VCI), patients with vascular mild cognitive impairment (MCI), and patients with vascular dementia (VD). Methods: A prospective cohort of stroke outpatients from two secondary medical centers in Lima, Peru was studied. We performed standardized evaluations at definitive diagnosis (baseline evaluation), and control follow-ups at 6 and 12 months, including a battery of short cognitive tests: Clinical Dementia Rating (CDR), Addenbrooke's Cognitive Examination (ACE), and INECO Frontal Screening (IFS). Results: 152 outpatients completed the follow-up, showing progressive increase in mean score on the CDR (0.34 to 0.46), contrary to the pattern observed on the ACE and IFS (78.18 to 76.48 and 23.63 to 22.24). The box plot for the CDR test showed that VCI patients had progressive worsening (0.79 to 0.16). Conversely, this trend was not observed in subjects without VCI. The box plot for the ACE and IFS showed that, for the majority of the differentiated stroke types, both non-VCI and VCI patients had progressive worsening. Conclusion: According to both ACE and IFS results during a 1-year follow-up, the cognitive performance of stroke patients worsened, a trend which was particularly consistent in infarction-type stroke patients.

  11. Neutrino physics with short baseline experiments

    International Nuclear Information System (INIS)

    Zimmerman, E.D.

    2006-01-01

    Neutrino physics with low- to medium-energy beams has progressed steadily over the last several years. Neutrino oscillation searches at short baseline (defined here as those sensitive to mass splittings Δm² ≳ 0.1 eV²) have been carried out. One positive signal, from the LSND collaboration, exists and is being tested by the MiniBooNE experiment. Neutrino cross-section measurements are being made by MiniBooNE and K2K, which will be important for reducing systematic errors in present and future oscillation measurements. In the near future, dedicated cross-section experiments will begin operating at Fermilab. (author)

  12. 16 CFR Table 3 to Part 1512 - Minimum Acceptable Values for the Quantity A Defined in the Retroreflective Tire and Rim Test...

    Science.gov (United States)

    2010-01-01

    ... 16 Commercial Practices 2 2010-01-01 2010-01-01 false Minimum Acceptable Values for the Quantity A Defined in the Retroreflective Tire and Rim Test Procedure 3 Table 3 to Part 1512 Commercial Practices... Retroreflective Tire and Rim Test Procedure Observation angle (degrees) Entrance angle (degrees) Minimum...

  13. SLUDGE BATCH 4 BASELINE MELT RATE FURNACE AND SLURRY-FED MELT RATE FURNACE TESTS WITH FRITS 418 AND 510 (U)

    International Nuclear Information System (INIS)

    Smith, M.; Jones, T.; Miller, D.

    2007-01-01

    Several Slurry-Fed Melt Rate Furnace (SMRF) tests with earlier projections of the Sludge Batch 4 (SB4) composition have been performed.1,2 The first SB4 SMRF test used Frits 418 and 320; however, it was found after the test that the REDuction/OXidation (REDOX) correlation at that time did not have the proper oxidation state for manganese. Because the manganese level in the SB4 sludge was higher than in previous sludge batches tested, the impact of the higher manganese oxidation state was greater. The glasses were highly oxidized and very foamy, and therefore the results were inconclusive. After resolving this REDOX issue, Frits 418, 425, and 503 were tested in the SMRF with the updated baseline SB4 projection. Based on dry-fed Melt Rate Furnace (MRF) tests and the above-mentioned SMRF tests, two previous frit recommendations were made by the Savannah River National Laboratory (SRNL) for processing of SB4 in the Defense Waste Processing Facility (DWPF). The first was Frit 503, based on the June 2006 composition projections.3 The recommendation was changed to Frit 418 as a result of the October 2006 composition projections (after the Tank 40 decant was implemented as part of the preparation plan). However, the start of SB4 processing was delayed due to the control room consolidation outage and the repair of the valve box in the Tank 51 to Tank 40 transfer line. These delays resulted in changes to the projected SB4 composition. Due to the slight change in composition and based on preliminary dry-fed MRF testing, SRNL believed that Frit 510 would increase throughput in processing SB4 in DWPF. Frit 418, which was used in processing Sludge Batch 3 (SB3), was a viable candidate and available in DWPF. Therefore, it was used during the initial SB4 processing. Due to the potential for higher melt rates with Frit 510, SMRF tests with the latest SB4 composition (1298 canisters) and Frits 510 and 418 were performed at a targeted waste loading (WL) of 35%. The '1298 canisters

  14. Defining the baseline in social life cycle assessment

    DEFF Research Database (Denmark)

    Jørgensen, Andreas; Finkbeiner, Matthias; Jørgensen, Michael Søgaard

    2010-01-01

    A relatively broad consensus has formed that the purpose of developing and using the social life cycle assessment (SLCA) is to improve the social conditions for the stakeholders affected by the assessed product's life cycle. To create this effect, the SLCA, among other things, needs to provide valid assessments of the consequence of the decision that it is to support. The consequence of a decision to implement a life cycle of a product can be seen as the difference between the implemented and the 'non-implemented' product life cycle. This difference can to some extent be found using the consequential environmental life cycle assessment (ELCA) methodology to identify the processes that change as a consequence of the decision. However, if social impacts are understood as certain changes in the lives of the stakeholders, then social impacts are not only related to product life...

  15. Supplemental Environmental Baseline Survey for Proposed Land Use Permit Modification for Expansion of the Dynamic Explosive Test Site (DETS) 9940 Main Complex Parking Lot

    Energy Technology Data Exchange (ETDEWEB)

    Peek, Dennis W. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-10-01

    The “subject property” comprises a parcel of land within the Kirtland Military Reservation, Bernalillo County, New Mexico, as shown on the map in Appendix B of this document. The land requirement for the parking lot addition to the 9940 Main Complex is approximately 2.7 acres. The scope of this Supplemental Environmental Baseline Survey (SEBS) is for the parking lot addition land transfer only. For details on the original 9940 Main Complex see Environmental Baseline Survey, Land Use Permit Request for the 9940 Complex PERM/0-KI-00-0001, August 21, 2003, and for details on the 9940 Complex Expansion see Environmental Baseline Survey, Proposed Land Use Permit Expansion for 9940 DETS Complex, June 24, 2009. The 2.7-acre parcel of land for the new parking lot, which is the subject of this SEBS (also referred to as the “subject property”), is adjacent to the southwest boundary of the original 12.3-acre 9940 Main Complex. No testing is known to have taken place on the subject property site. The only activity known to have taken place was the burial of overhead utility lines in 2014. Adjacent to the subject property, the 9940 Main Complex was originally a 12.3-acre site used by the Department of Energy (DOE) under a land use permit from the United States Air Force (USAF). Historical use of the site, dating from 1964, included arming, fusing, and firing of explosives and testing of explosives systems components. In the late 1970s and early 1980s experiments at the 9940 Main Complex shifted toward reactor safety issues. From 1983 to 1988, fuel coolant interaction (FCI) experiments were conducted, as were experiments with conventional high explosives (HE). Today, the land is used for training of the Nuclear Emergency Response community and for research on energetic materials. In 2009, the original complex was expanded to include four additional 20-acre areas: 9940 Training South, 9940 Training East, T-Range 6, and Training West Landing Zone. The proposed use of

  16. Serological IgG Testing to Diagnose Alimentary Induced Diseases and Monitoring Efficacy of an Individual Defined Diet in Dogs

    OpenAIRE

    Anne-Margré C. Vink

    2014-01-01

    Background. Food-related allergies and intolerances occur frequently in dogs. Diagnosis and monitoring of elimination efficiency according to the ‘Golden Standard’ are, however, time-consuming, expensive, and require an expert clinical setting. In order to facilitate rapid and robust, quantitative testing of intolerance and determination of the individual offending foods, a serological test is indicated for Alimentary Induced Diseases and manifestations. Method. As we developed Medisynx IgG Human ...

  17. Define Project

    DEFF Research Database (Denmark)

    Munk-Madsen, Andreas

    2005-01-01

    "Project" is a key concept in IS management. The word is frequently used in textbooks and standards. Yet we seldom find a precise definition of the concept. This paper discusses how to define the concept of a project. The proposed definition covers both heavily formalized projects and informally...... organized, agile projects. Based on the proposed definition popular existing definitions are discussed....

  18. "Dermatitis" defined.

    Science.gov (United States)

    Smith, Suzanne M; Nedorost, Susan T

    2010-01-01

    The term "dermatitis" can be defined narrowly or broadly, clinically or histologically. A common and costly condition, dermatitis is underresourced compared to other chronic skin conditions. The lack of a collectively understood definition of dermatitis and its subcategories could be the primary barrier. To investigate how dermatologists define the term "dermatitis" and determine if a consensus on the definition of this term and other related terms exists. A seven-question survey of dermatologists nationwide was conducted. Of respondents (n  =  122), half consider dermatitis to be any inflammation of the skin. Nearly half (47.5%) use the term interchangeably with "eczema." Virtually all (> 96%) endorse the subcategory "atopic" under the terms "dermatitis" and "eczema," but the subcategories "contact," "drug hypersensitivity," and "occupational" are more highly endorsed under the term "dermatitis" than under the term "eczema." Over half (55.7%) personally consider "dermatitis" to have a broad meaning, and even more (62.3%) believe that dermatologists as a whole define the term broadly. There is a lack of consensus among experts in defining dermatitis, eczema, and their related subcategories.

  19. How Not to Evaluate a Psychological Measure: Rebuttal to Criticism of the Defining Issues Test of Moral Judgment Development by Curzer and Colleagues

    Science.gov (United States)

    Thoma, Stephen J.; Bebeau, Muriel J.; Narvaez, Darcia

    2016-01-01

    In a 2014 paper in "Theory and Research in Education," Howard Curzer and colleagues critique the Defining Issues Test of moral judgment development according to eight criteria that are described as difficulties any measure of educational outcomes must address. This article highlights how Curzer et al. do not consult existing empirical…

  20. A COTS RF/Optical Software Defined Radio for the Integrated Radio and Optical Communications Test Bed

    Science.gov (United States)

    Nappier, Jennifer M.; Zeleznikar, Daniel J.; Wroblewski, Adam C.; Tokars, Roger P.; Schoenholz, Bryan L.; Lantz, Nicholas C.

    2017-01-01

    The Integrated Radio and Optical Communications (iROC) project at the National Aeronautics and Space Administration (NASA) is investigating the merits of a hybrid radio frequency (RF) and optical communication system for deep space missions. In an effort to demonstrate the feasibility and advantages of a hybrid RF/Optical software defined radio (SDR), a laboratory prototype was assembled from primarily commercial-off-the-shelf (COTS) hardware components. This COTS platform has been used to demonstrate simultaneous transmission of the radio and optical communications waveforms through to the physical layer (telescope and antenna). This paper details the hardware and software used in the platform and various measures of its performance. A laboratory optical receiver platform has also been assembled in order to demonstrate hybrid free space links in combination with the transmitter.

  1. Baseline restoration using current conveyors

    International Nuclear Information System (INIS)

    Morgado, A.M.L.S.; Simoes, J.B.; Correia, C.M.

    1996-01-01

    A good performance of high-resolution nuclear spectrometry systems at high pulse rates demands restoration of the baseline between pulses, in order to remove rate-dependent baseline shifts. This restoration is performed by circuits named baseline restorers (BLRs), which also remove low-frequency noise such as power supply hum and detector microphonics. This paper presents simple circuits for baseline restoration based on a commercial current conveyor (CCII01). Tests were performed on two circuits with periodic trapezoidal shaped pulses in order to measure the baseline restoration for several pulse rates and restorer duty cycles. For the current conveyor based Robinson restorer, the peak shift was less than 10 mV for duty cycles up to 60%, at high pulse rates. Duty cycles up to 80% were also tested, with a maximum peak shift of 21 mV. The peak shift for the current conveyor based Grubic restorer was also measured; the maximum value found was 30 mV at 82% duty cycle. Keeping the duty cycle below 60% greatly improves the restorer performance. The ability of both baseline restorer architectures to reject low-frequency modulation was also measured, with good results for both circuits

  2. Toxicity testing in the 21st century: defining new risk assessment approaches based on perturbation of intracellular toxicity pathways.

    Directory of Open Access Journals (Sweden)

    Sudin Bhattacharya

    Full Text Available The approaches to quantitatively assessing the health risks of chemical exposure have not changed appreciably in the past 50 to 80 years, the focus remaining on high-dose studies that measure adverse outcomes in homogeneous animal populations. This expensive, low-throughput approach relies on conservative extrapolations to relate animal studies to much lower-dose human exposures and is of questionable relevance to predicting risks to humans at their typical low exposures. It makes little use of a mechanistic understanding of the mode of action by which chemicals perturb biological processes in human cells and tissues. An alternative vision, proposed by the U.S. National Research Council (NRC) report Toxicity Testing in the 21st Century: A Vision and a Strategy, called for moving away from traditional high-dose animal studies to an approach based on perturbation of cellular responses using well-designed in vitro assays. Central to this vision are (a) "toxicity pathways" (the innate cellular pathways that may be perturbed by chemicals) and (b) the determination of chemical concentration ranges where those perturbations are likely to be excessive, thereby leading to adverse health effects if present for a prolonged duration in an intact organism. In this paper we briefly review the original NRC report and responses to that report over the past 3 years, and discuss how the change in testing might be achieved in the U.S. and in the European Union (EU). EU initiatives in developing alternatives to animal testing of cosmetic ingredients have run very much in parallel with the NRC report. Moving from current practice to the NRC vision would require using prototype toxicity pathways to develop case studies showing the new vision in action. In this vein, we also discuss how the proposed strategy for toxicity testing might be applied to the toxicity pathways associated with DNA damage and repair.

  3. Defined culture medium for stem cell differentiation: applicability of serum-free conditions in the mouse embryonic stem cell test.

    Science.gov (United States)

    Riebeling, Christian; Schlechter, Katharina; Buesen, Roland; Spielmann, Horst; Luch, Andreas; Seiler, Andrea

    2011-06-01

    The embryonic stem cell test (EST) is a validated method to assess the developmental toxicity potency of chemicals. It was developed to reduce animal use and allow faster testing for hazard assessment. The cells used in this method are maintained and differentiated in media containing foetal calf serum. This animal product is of considerable variation in quality, and individual batches require extensive testing for their applicability in the EST. Moreover, its production involves a large number of foetuses and possible animal suffering. We demonstrate the serum-free medium and feeder cell-free maintenance of the mouse embryonic stem cell line D3 and investigate the use of specific growth factors for induction of cardiac differentiation. Using a combination of bone morphogenetic protein-2, bone morphogenetic protein-4, activin A and ascorbic acid, embryoid bodies efficiently differentiated into contracting myocardium. Additionally, examining levels of intracellular marker proteins by flow cytometry not only confirmed differentiation into cardiomyocytes, but demonstrated significant differentiation into neuronal cells in the same time frame. Thus, this approach might allow for simultaneous detection of developmental effects on both early mesodermal and neuroectodermal differentiation. The serum-free conditions for maintenance and differentiation of D3 cells described here enhance the transferability and standardisation and hence the performance of the EST. Copyright © 2011 Elsevier Ltd. All rights reserved.

  4. Defining chaos.

    Science.gov (United States)

    Hunt, Brian R; Ott, Edward

    2015-09-01

    In this paper, we propose, discuss, and illustrate a computationally feasible definition of chaos which can be applied very generally to situations that are commonly encountered, including attractors, repellers, and non-periodically forced systems. This definition is based on an entropy-like quantity, which we call "expansion entropy," and we define chaos as occurring when this quantity is positive. We relate and compare expansion entropy to the well-known concept of topological entropy to which it is equivalent under appropriate conditions. We also present example illustrations, discuss computational implementations, and point out issues arising from attempts at giving definitions of chaos that are not entropy-based.

  5. Patch test results with fragrance markers of the baseline series - analysis of the European Surveillance System on Contact Allergies (ESSCA) network 2009-2012

    NARCIS (Netherlands)

    Frosch, Peter J.; Johansen, Jeanne Duus; Schuttelaar, Marie-Louise A.; Silvestre, Juan F.; Sanchez-Perez, Javier; Weisshaar, Elke; Uter, Wolfgang

    Background. Contact allergy to fragrances is common, and impairs quality of life, particularly in young women. Objective. To provide current results on the prevalences of sensitization to fragrance allergens used as markers in the baseline series of most European countries. Methods. Data of patients

  6. Patch test study of 90 patients with tattoo reactions: negative outcome of allergy patch test to baseline batteries and culprit inks suggests allergen(s) are generated in the skin through haptenization.

    Science.gov (United States)

    Serup, Jørgen; Hutton Carlsen, Katrina

    2014-11-01

    Tattoo reactions, especially in red tattoos, are often suggested to be allergic in nature; however, systematic evaluation by patch testing has not been performed in the past. To report the results of patch testing in 90 patients with non-infectious chronic tattoo reactions. From 2009 to 2013 at the 'Tattoo Clinic', Department of Dermatology, Bispebjerg University Hospital, 90 patients were patch tested with batteries of baseline allergens, disperse dyes/textile allergens, and a selection of tattoo ink stock products, which, according to case observations, were problematic, supplemented with individual culprit inks when accessible. Patients with reactions to the tattoo colour red, the most predominant colour associated with skin reactions, showed negative patch test results with common allergens. Outcomes were also negative in patients who had experienced concomitant reactions in another hitherto tolerated tattoo of the same colour as the problematic tattoo. The allergen or allergens responsible for tattoo reactions are not present directly in tattoo ink stock products. This is despite the fact that clinical histories suggest that the vast majority of clinical reactions, especially reactions to red and red nuances, are likely to be allergic events caused by the injected inks. We suggest that the responsible allergen results from a complicated and slow process of haptenization, which may even include photochemical cleavage of red azo pigment. © 2014 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  7. Testing a machine-learning algorithm to predict the persistence and severity of major depressive disorder from baseline self-reports.

    Science.gov (United States)

    Kessler, R C; van Loo, H M; Wardenaar, K J; Bossarte, R M; Brenner, L A; Cai, T; Ebert, D D; Hwang, I; Li, J; de Jonge, P; Nierenberg, A A; Petukhova, M V; Rosellini, A J; Sampson, N A; Schoevers, R A; Wilcox, M A; Zaslavsky, A M

    2016-10-01

    Heterogeneity of major depressive disorder (MDD) illness course complicates clinical decision-making. Although efforts to use symptom profiles or biomarkers to develop clinically useful prognostic subtypes have had limited success, a recent report showed that machine-learning (ML) models developed from self-reports about incident episode characteristics and comorbidities among respondents with lifetime MDD in the World Health Organization World Mental Health (WMH) Surveys predicted MDD persistence, chronicity and severity with good accuracy. We report results of model validation in an independent prospective national household sample of 1056 respondents with lifetime MDD at baseline. The WMH ML models were applied to these baseline data to generate predicted outcome scores that were compared with observed scores assessed 10-12 years after baseline. ML model prediction accuracy was also compared with that of conventional logistic regression models. Area under the receiver operating characteristic curve based on ML (0.63 for high chronicity and 0.71-0.76 for the other prospective outcomes) was consistently higher than for the logistic models (0.62-0.70) despite the latter models including more predictors. A total of 34.6-38.1% of respondents with subsequent high persistence chronicity and 40.8-55.8% with the severity indicators were in the top 20% of the baseline ML-predicted risk distribution, while only 0.9% of respondents with subsequent hospitalizations and 1.5% with suicide attempts were in the lowest 20% of the ML-predicted risk distribution. These results confirm that clinically useful MDD risk-stratification models can be generated from baseline patient self-reports and that ML methods improve on conventional methods in developing such models.
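Both summary statistics in the abstract above (area under the ROC curve, and the share of eventual cases landing in the top 20% of predicted risk) can be computed from a vector of predicted scores. A minimal sketch on synthetic data, using the Mann–Whitney rank identity for AUC rather than any particular ML library; the outcomes and scores are stand-ins, not the WMH models:

```python
import numpy as np

def auc_score(y_true, scores):
    """AUC via the rank-sum (Mann-Whitney U) identity; assumes no tied scores."""
    y_true = np.asarray(y_true, dtype=bool)
    ranks = np.empty(len(scores))
    ranks[np.argsort(scores)] = np.arange(1, len(scores) + 1)
    n_pos, n_neg = y_true.sum(), (~y_true).sum()
    return (ranks[y_true].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

# Synthetic outcomes and risk scores.
rng = np.random.default_rng(1)
y = rng.random(1000) < 0.2                  # ~20% eventual cases
score = y + rng.standard_normal(1000)       # cases shifted up by 1 SD
auc = auc_score(y, score)

cut = np.quantile(score, 0.8)               # top-20% risk threshold
top20_capture = (y & (score >= cut)).sum() / y.sum()
print(f"AUC = {auc:.2f}, top-20% capture = {top20_capture:.2f}")
```

The abstract's comparison of ML and logistic models (AUC 0.63 to 0.76 vs. 0.62 to 0.70) is exactly this statistic evaluated on prospectively observed outcomes.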

  8. Defining Cyberbullying.

    Science.gov (United States)

    Englander, Elizabeth; Donnerstein, Edward; Kowalski, Robin; Lin, Carolyn A; Parti, Katalin

    2017-11-01

    Is cyberbullying essentially the same as bullying, or is it a qualitatively different activity? The lack of a consensual, nuanced definition has limited the field's ability to examine these issues. Evidence suggests that being a perpetrator of one is related to being a perpetrator of the other; furthermore, strong relationships can also be noted between being a victim of either type of attack. It also seems that both types of social cruelty have a psychological impact, although the effects of being cyberbullied may be worse than those of being bullied in a traditional sense (evidence here is by no means definitive). A complicating factor is that the 3 characteristics that define bullying (intent, repetition, and power imbalance) do not always translate well into digital behaviors. Qualities specific to digital environments often render cyberbullying and bullying different in circumstances, motivations, and outcomes. To make significant progress in addressing cyberbullying, certain key research questions need to be addressed. These are as follows: How can we define, distinguish between, and understand the nature of cyberbullying and other forms of digital conflict and cruelty, including online harassment and sexual harassment? Once we have a functional taxonomy of the different types of digital cruelty, what are the short- and long-term effects of exposure to or participation in these social behaviors? What are the idiosyncratic characteristics of digital communication that users can be taught? Finally, how can we apply this information to develop and evaluate effective prevention programs? Copyright © 2017 by the American Academy of Pediatrics.

  9. Evaluation of different jumping tests in defining position-specific and performance-level differences in high level basketball players

    Directory of Open Access Journals (Sweden)

    Miran Pehar

    2017-04-01

    Full Text Available The importance of jumping ability in basketball is well known, but there is an evident lack of studies that have examined different jumping testing protocols in basketball players at advanced levels. The aim of this study was to assess the applicability of different tests of jumping capacity in identifying differences between (i) playing position and (ii) competitive levels of professional players. Participants were 110 male professional basketball players (height: 194.92±8.09 cm; body mass: 89.33±10.91 kg; 21.58±3.92 years of age; Guards, 49; Forwards, 22; Centres, 39) who competed in the first (n = 58) and second division (n = 52). The variables included anthropometrics and jumping test performance. Jumping performances were evaluated by the standing broad jump (SBJ), countermovement jump (CMJ), reactive strength index (RSI), repeated reactive strength ability (RRSA) and four running vertical jumps: maximal jump with (i) take-off from the dominant leg and (ii) non-dominant leg, lay-up shot jump with take-off from the (iii) dominant leg and (iv) non-dominant leg. First-division players were taller (ES: 0.76, 95%CI: 0.35-1.16, moderate differences), heavier (0.69, 0.29-1.10), had higher maximal reach height (0.67, 0.26-1.07, moderate differences), and had lower body fat % (-0.87, -1.27-0.45, moderate differences) than second-division players. The playing positions differed significantly in three of four running jump achievements, RSI and RRSA, with Centres being least successful. The first-division players were superior to second-division players in SBJ (0.63, 0.23-1.03; 0.87, 0.26-1.43; 0.76, 0.11-1.63, all moderate differences, for total sample, Guards, and Forwards, respectively). Running vertical jumps and repeated jumping capacity can be used as valid measures of position-specific jumping ability in basketball. The differences between playing levels in vertical jumping achievement can be observed by assessing vertical jump scores together with differences

  10. Evaluation of different jumping tests in defining position-specific and performance-level differences in high level basketball players.

    Science.gov (United States)

    Pehar, Miran; Sekulic, Damir; Sisic, Nedim; Spasic, Miodrag; Uljevic, Ognjen; Krolo, Ante; Milanovic, Zoran; Sattler, Tine

    2017-09-01

    The importance of jumping ability in basketball is well known, but there is an evident lack of studies that have examined different jumping testing protocols in basketball players at advanced levels. The aim of this study was to assess the applicability of different tests of jumping capacity in identifying differences between (i) playing position and (ii) competitive levels of professional players. Participants were 110 male professional basketball players (height: 194.92±8.09 cm; body mass: 89.33±10.91 kg; 21.58±3.92 years of age; Guards, 49; Forwards, 22; Centres, 39) who competed in the first (n = 58) and second division (n = 52). The variables included anthropometrics and jumping test performance. Jumping performances were evaluated by the standing broad jump (SBJ), countermovement jump (CMJ), reactive strength index (RSI), repeated reactive strength ability (RRSA) and four running vertical jumps: maximal jump with (i) take-off from the dominant leg and (ii) non-dominant leg, lay-up shot jump with take-off from the (iii) dominant leg and (iv) non-dominant leg. First-division players were taller (ES: 0.76, 95%CI: 0.35-1.16, moderate differences), heavier (0.69, 0.29-1.10), had higher maximal reach height (0.67, 0.26-1.07, moderate differences), and had lower body fat % (-0.87, -1.27-0.45, moderate differences) than second-division players. The playing positions differed significantly in three of four running jump achievements, RSI and RRSA, with Centres being least successful. The first-division players were superior to second-division players in SBJ (0.63, 0.23-1.03; 0.87, 0.26-1.43; 0.76, 0.11-1.63, all moderate differences, for total sample, Guards, and Forwards, respectively). Running vertical jumps and repeated jumping capacity can be used as valid measures of position-specific jumping ability in basketball. The differences between playing levels in vertical jumping achievement can be observed by assessing vertical jump scores together with differences
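The between-division differences above are reported as standardized effect sizes (ES) with 95% confidence intervals. As a hedged illustration of how such a standardized mean difference can be computed, the sketch below implements Cohen's d with a pooled standard deviation; the jump distances are invented for the example and are not data from the study.

```python
# Illustrative only: Cohen's d with pooled SD, using made-up jump scores.
from statistics import mean, variance

def cohens_d(group_a, group_b):
    """Standardized mean difference (group_a minus group_b) over pooled SD."""
    na, nb = len(group_a), len(group_b)
    pooled_var = ((na - 1) * variance(group_a)
                  + (nb - 1) * variance(group_b)) / (na + nb - 2)
    return (mean(group_a) - mean(group_b)) / pooled_var ** 0.5

# Hypothetical standing broad jump distances (cm) for two divisions
first_division = [250, 260, 255, 265, 258]
second_division = [240, 248, 245, 252, 244]
d = cohens_d(first_division, second_division)
```

A positive d indicates the first group outperformed the second; the magnitude is then interpreted against conventional thresholds, much like the "moderate differences" labels attached to the effect sizes reported above.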

  11. Logistics Operations Management Center: Maintenance Support Baseline (LOMC-MSB)

    Science.gov (United States)

    Kurrus, R.; Stump, F.

    1995-01-01

    The Logistics Operations Management Center Maintenance Support Baseline is defined. A historical record of systems, applied to and deleted from, designs in support of future management and/or technical analysis is provided. All Flight elements, Ground Support Equipment, Facility Systems and Equipment and Test Support Equipment for which LOMC has responsibilities at Kennedy Space Center and other locations are listed. International Space Station Alpha Program documentation is supplemented. The responsibility of the Space Station Launch Site Support Office is established.

  12. Program reference schedule baseline

    International Nuclear Information System (INIS)

    1986-07-01

    This Program Reference Schedule Baseline (PRSB) provides the baseline Program-level milestones and associated schedules for the Civilian Radioactive Waste Management Program. It integrates all Program-level schedule-related activities. This schedule baseline will be used by the Director, Office of Civilian Radioactive Waste Management (OCRWM), and his staff to monitor compliance with Program objectives. Chapter 1 includes brief discussions concerning the relationship of the PRSB to the Program Reference Cost Baseline (PRCB), the Mission Plan, the Project Decision Schedule, the Total System Life Cycle Cost report, the Program Management Information System report, the Program Milestone Review, annual budget preparation, and system element plans. Chapter 2 includes the identification of all Level 0, or Program-level, milestones, while Chapter 3 presents and discusses the critical path schedules that correspond to those Level 0 milestones

  13. Long Baseline Observatory (LBO)

    Data.gov (United States)

    Federal Laboratory Consortium — The Long Baseline Observatory (LBO) comprises ten radio telescopes spanning 5,351 miles. It's the world's largest, sharpest, dedicated telescope array. With an eye...

  14. Association of a culturally defined syndrome (nervios) with chest pain and DSM-IV affective disorders in Hispanic patients referred for cardiac stress testing.

    Science.gov (United States)

    Pavlik, Valory N; Hyman, David J; Wendt, Juliet A; Orengo, Claudia

    2004-01-01

    Hispanics have a high prevalence of cardiovascular risk factors, most notably type 2 diabetes. However, in a large public hospital in Houston, Texas, Hispanic patients referred for cardiac stress testing were significantly more likely to have normal test results than were Whites or non-Hispanic Blacks. We undertook an exploratory study to determine if nervios, a culturally based syndrome that shares similarities with both panic disorder and anginal symptoms, is sufficiently prevalent among Hispanics referred for cardiac testing to be considered as a possible explanation for the high probability of a normal test result. Hispanic patients were recruited consecutively when they presented for a cardiac stress test. A bilingual interviewer administered a brief medical history, the Rose Angina Questionnaire (RAQ), a questionnaire to assess a history of nervios and associated symptoms, and the PRIME-MD, a validated brief questionnaire to diagnose DSM-IV defined affective disorders. The average age of the 114 participants (38 men and 76 women) was 57 years, and the average educational attainment was 7 years. Overall, 50% of participants reported a history of chronic nervios, and 14% reported an acute subtype known as ataque de nervios. Only 2% of patients had DSM-IV defined panic disorder, and 59% of patients had a positive RAQ score (ie, Rose questionnaire angina). The acute subtype, ataque de nervios, but not chronic nervios, was related to an increased probability of having Rose questionnaire angina (P=.006). Adjusted for covariates, a positive history of chronic nervios, but not Rose questionnaire angina, was significantly associated with a normal cardiac test result (OR=2.97, P=.04). Nervios is common among Hispanics with symptoms of cardiac disease. Additional research is needed to understand how nervios symptoms differ from chest pain in Hispanics and the role of nervios in referral for cardiac workup by primary care providers and emergency room personnel.

  15. Pro-HEART - a randomized clinical trial to test the effectiveness of a high protein diet targeting obese individuals with heart failure: rationale, design and baseline characteristics.

    Science.gov (United States)

    Motie, Marjan; Evangelista, Lorraine S; Horwich, Tamara; Hamilton, Michele; Lombardo, Dawn; Cooper, Dan M; Galassetti, Pietro R; Fonarow, Gregg C

    2013-11-01

    There is ample research to support the potential benefits of a high protein diet on clinical outcomes in overweight/obese, diabetic subjects. However, nutritional management of overweight/obese individuals with heart failure (HF) and type 2 diabetes mellitus (DM) or metabolic syndrome (MS) is poorly understood and few clinical guidelines related to nutritional approaches exist for this subgroup. This article describes the design, methods, and baseline characteristics of study participants enrolled in Pro-HEART, a randomized clinical trial to determine the short term and long term effects of a high protein diet (30% protein [~110 g/day], 40% carbohydrates [150 g/day], 30% fat [~50 g/day]) versus a standard protein diet (15% protein [~55 g/day], 55% carbohydrates [~200 g/day], 30% fat [~50 g/day]) on body weight and adiposity, cardiac structure and function, functional status, lipid profile, glycemic control, and quality of life. Between August, 2009 and May, 2013, 61 individuals agreed to participate in the study; 52 (85%) - mean age 58.2 ± 9.8 years; 15.4% Blacks; 57.7% Whites; 19.2% Hispanics; 7.7% Asians; 73.1% male; weight 112.0 ± 22.6 kg - were randomized to a 3-month intensive weight management program of either a high protein or standard protein diet; data were collected at baseline, 3 months, and 15 months. This study has the potential to reveal significant details about the role of macronutrients in weight management of overweight/obese individuals with HF and DM or MS. © 2013 Elsevier Inc. All rights reserved.

  16. TAPIR--Finnish national geochemical baseline database.

    Science.gov (United States)

    Jarva, Jaana; Tarvainen, Timo; Reinikainen, Jussi; Eklund, Mikael

    2010-09-15

    In Finland, a Government Decree on the Assessment of Soil Contamination and Remediation Needs has generated a need for reliable and readily accessible data on geochemical baseline concentrations in Finnish soils. According to the Decree, baseline concentrations, referring both to the natural geological background concentrations and the diffuse anthropogenic input of substances, shall be taken into account in the soil contamination assessment process. This baseline information is provided in a national geochemical baseline database, TAPIR, that is publicly available via the Internet. Geochemical provinces with elevated baseline concentrations were delineated to provide regional geochemical baseline values. The nationwide geochemical datasets were used to divide Finland into geochemical provinces. Several metals (Co, Cr, Cu, Ni, V, and Zn) showed anomalous concentrations in seven regions that were defined as metal provinces. Arsenic did not follow a similar distribution to any other elements, and four arsenic provinces were separately determined. Nationwide geochemical datasets were not available for some other important elements such as Cd and Pb. Although these elements are included in the TAPIR system, their distribution does not necessarily follow the ones pre-defined for metal and arsenic provinces. Regional geochemical baseline values, presented as upper limit of geochemical variation within the region, can be used as trigger values to assess potential soil contamination. Baseline values have also been used to determine upper and lower guideline values that must be taken into account as a tool in basic risk assessment. If regional geochemical baseline values are available, the national guideline values prescribed in the Decree based on ecological risks can be modified accordingly. The national geochemical baseline database provides scientifically sound, easily accessible and generally accepted information on the baseline values, and it can be used in various
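As a rough sketch of how a regional baseline value, defined as the upper limit of geochemical variation, might act as a trigger value in a screening step, consider the following. The nearest-rank 95th percentile, the element, and all concentrations are assumptions for illustration; they are not TAPIR's actual method or data.

```python
# Hypothetical sketch: regional baseline as an upper percentile of measured
# concentrations, used as a trigger value for potential contamination.

def baseline_upper_limit(concentrations, quantile=0.95):
    """Upper limit of regional geochemical variation (nearest-rank percentile)."""
    ordered = sorted(concentrations)
    rank = min(len(ordered) - 1, int(quantile * len(ordered)))
    return ordered[rank]

def exceeds_baseline(sample_value, regional_baseline):
    """Flag a sample whose concentration exceeds the regional trigger value."""
    return sample_value > regional_baseline

# Invented nickel concentrations (mg/kg) for one geochemical province
region_ni = [12, 15, 9, 22, 18, 30, 14, 11, 25, 17]
trigger = baseline_upper_limit(region_ni)
suspect = exceeds_baseline(42, trigger)
```

A sample above the trigger is only a candidate for further risk assessment, matching the abstract's point that baseline values are a screening tool rather than a verdict of contamination.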

  17. Design, Development and Pre-Flight Testing of the Communications, Navigation, and Networking Reconfigurable Testbed (Connect) to Investigate Software Defined Radio Architecture on the International Space Station

    Science.gov (United States)

    Over, Ann P.; Barrett, Michael J.; Reinhart, Richard C.; Free, James M.; Cikanek, Harry A., III

    2011-01-01

    The Communication Navigation and Networking Reconfigurable Testbed (CoNNeCT) is a NASA-sponsored mission, which will investigate the usage of Software Defined Radios (SDRs) as a multi-function communication system for space missions. A software-defined radio system is a communication system in which typical components of the system (e.g., modulators) are incorporated into software. The software-defined capability allows flexibility and experimentation in different modulation, coding and other parameters to understand their effects on performance. This flexibility builds inherent redundancy into the system for improved operational efficiency, real-time changes to space missions and enhanced reliability. The CoNNeCT Project is a collaboration between industrial radio providers and NASA. The industrial radio providers are providing the SDRs and NASA is designing, building and testing the entire flight system. The flight system will be integrated on the Express Logistics Carrier (ELC) on the International Space Station (ISS) after launch on the H-IIB Transfer Vehicle in 2012. This paper provides an overview of the technology research objectives, payload description, design challenges and pre-flight testing results.

  18. Development of a rapid agglutination latex test for diagnosis of enteropathogenic and enterohemorrhagic Escherichia coli infection in developing world: defining the biomarker, antibody and method.

    Directory of Open Access Journals (Sweden)

    Letícia B Rocha

    2014-09-01

    Full Text Available Enteropathogenic and enterohemorrhagic Escherichia coli (EPEC/EHEC) are human intestinal pathogens responsible for diarrhea in both developing and industrialized countries. In research laboratories, EPEC and EHEC are defined on the basis of their pathogenic features; nevertheless, their identification in routine laboratories is expensive and laborious. Therefore, the aim of the present work was to develop a rapid and simple assay for EPEC/EHEC detection. Accordingly, the EPEC/EHEC-secreted proteins EspA and EspB were chosen as target antigens. First, we investigated the ideal conditions for EspA/EspB production/secretion by ELISA in a collection of EPEC/EHEC strains after cultivating bacterial isolates in Dulbecco's modified Eagle's medium (DMEM), DMEM containing 1% tryptone, or HEp-2 cells-preconditioned DMEM, employing either anti-EspA/anti-EspB polyclonal or monoclonal antibodies developed and characterized herein. Subsequently, a rapid agglutination latex test (RALT) was developed and tested with the same collection of bacterial isolates. EspB was defined as a biomarker and its corresponding monoclonal antibody as the tool for EPEC/EHEC diagnosis; the production of EspB was better in DMEM medium. The RALT assay has the sensitivity and specificity required for high-impact diagnosis of neglected diseases in the developing world. The RALT assay described herein can be considered an alternative assay for diarrhea diagnosis in low-income countries since it achieved 97% sensitivity, 98% specificity and 97% efficiency.
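The closing figures (97% sensitivity, 98% specificity, 97% efficiency) follow from the standard confusion-matrix definitions of diagnostic performance. A minimal sketch, with hypothetical counts chosen only to yield numbers of that order, not the study's raw data:

```python
# Illustrative diagnostic-performance metrics from a confusion matrix.

def diagnostic_metrics(tp, fp, tn, fn):
    """Return (sensitivity, specificity, efficiency) as fractions."""
    sensitivity = tp / (tp + fn)                   # true positive rate
    specificity = tn / (tn + fp)                   # true negative rate
    efficiency = (tp + tn) / (tp + fp + tn + fn)   # overall accuracy
    return sensitivity, specificity, efficiency

# Hypothetical counts: tp = true positives, fp = false positives, etc.
sens, spec, eff = diagnostic_metrics(tp=97, fp=2, tn=98, fn=3)
```

Efficiency here is simply overall accuracy, the fraction of all specimens classified correctly, which is how the term is commonly used in diagnostic-assay reports.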

  19. Life Support Baseline Values and Assumptions Document

    Science.gov (United States)

    Anderson, Molly S.; Ewert, Michael K.; Keener, John F.

    2018-01-01

    The Baseline Values and Assumptions Document (BVAD) provides analysts, modelers, and other life support researchers with a common set of values and assumptions which can be used as a baseline in their studies. This baseline, in turn, provides a common point of origin from which many studies in the community may depart, making research results easier to compare and providing researchers with reasonable values to assume for areas outside their experience. This document identifies many specific physical quantities that define life support systems, serving as a general reference for spacecraft life support system technology developers.

  20. First Grade Baseline Evaluation

    Science.gov (United States)

    Center for Innovation in Assessment (NJ1), 2013

    2013-01-01

    The First Grade Baseline Evaluation is an optional tool that can be used at the beginning of the school year to help teachers get to know the reading and language skills of each student. The evaluation is composed of seven screenings. Teachers may use the entire evaluation or choose to use those individual screenings that they find most beneficial…

  1. Lost opportunities to identify and treat HIV-positive patients: results from a baseline assessment of provider-initiated HIV testing and counselling (PITC) in Malawi.

    Science.gov (United States)

    Ahmed, Saeed; Schwarz, Monica; Flick, Robert J; Rees, Chris A; Harawa, Mwelura; Simon, Katie; Robison, Jeff A; Kazembe, Peter N; Kim, Maria H

    2016-04-01

    To assess implementation of provider-initiated testing and counselling (PITC) for HIV in Malawi. A review of PITC practices within 118 departments in 12 Ministry of Health (MoH) facilities across Malawi was conducted. Information on PITC practices was collected via a health facility survey. Data describing patient visits and HIV tests were abstracted from routinely collected programme data. Reported PITC practices were highly variable. Most providers practiced symptom-based PITC. Antenatal clinics and maternity wards reported widespread use of routine opt-out PITC. In 2014, there was approximately 1 HIV test for every 15 clinic visits. HIV status was ascertained in 94.3% (5293/5615) of patients at tuberculosis clinics, 92.6% (30,675/33,142) of patients at antenatal clinics and 49.4% (6871/13,914) of patients at sexually transmitted infection clinics. Reported challenges to delivering PITC included test kit shortages (71/71 providers), insufficient physical space (58/71) and inadequate number of HIV counsellors (32/71) while providers from inpatient units cited the inability to test on weekends. Various models of PITC currently exist at MoH facilities in Malawi. Only antenatal and maternity clinics demonstrated high rates of routine opt-out PITC. The low ratio of facility visits to HIV tests suggests missed opportunities for HIV testing. However, the high proportion of patients at TB and antenatal clinics with known HIV status suggests that routine PITC is feasible. These results underscore the need to develop clear, standardised PITC policy and protocols, and to address obstacles of limited health commodities, infrastructure and human resources. © 2016 The Authors. Tropical Medicine & International Health Published by John Wiley & Sons Ltd.

  2. Defining criteria for good environmental journalism and testing their applicability: An environmental news review as a first step to more evidence based environmental science reporting.

    Science.gov (United States)

    Rögener, Wiebke; Wormer, Holger

    2017-05-01

    While the quality of environmental science journalism has been the subject of much debate, a widely accepted benchmark to assess the quality of coverage of environmental topics is missing so far. Therefore, we have developed a set of defined criteria of environmental reporting. This instrument and its applicability are tested in a newly established monitoring project for the assessment of pieces on environmental issues, which refer to scientific sources and therefore can be regarded as a special field of science journalism. The quality is assessed in a kind of journalistic peer review. We describe the systematic development of criteria, which might also be a model procedure for other fields of science reporting. Furthermore, we present results from the monitoring of 50 environmental reports in German media. According to these preliminary data, the lack of context and the deficient elucidation of the evidence pose major problems in environmental reporting.

  3. Testing the Impact of a Multi-year, Curriculum-based Undergraduate Research Experience (MY-CURE) in the Geosciences: Baseline Observations

    Science.gov (United States)

    Allen, J. L.; Creamer, E. G.; Kuehn, S. C.

    2016-12-01

    Short-term undergraduate research experiences (URE's) provide skill and confidence enhancement to students, but it is unclear how effective they are in comparison to a dedicated, longer-term URE. This study examines the impact of a long-term URE embedded in a sequence of five courses in the geology curriculum. It begins with a sophomore course in environmental geology, and continues through mineralogy, structural geology, and petrology, before concluding at our summer geology field camp. In this sequence, students build upon individual URE's related to the structure and petrology of fault rocks from a mid-crustal shear zone. Rather than have students engage in one or more short-term URE's, they retain the same project for two calendar years so that we can assess when and how different gains, including a more sophisticated understanding of the nature of science, begin to emerge and mature. As each student progresses, we document the longitudinal development of a diverse suite of gains including: (1) technical and higher-order research skills, (2) personal gains such as self-identity as a scientist, and (3) communication skills. In this presentation, we describe the framework of the study and baseline observations recorded during the first year of a 2-year cohort. Using a Q-sort method, students were given a deck of 16 index cards with an educational outcome listed on each. They sorted the cards into three piles: those that encouraged an interest in geology, those that deterred an interest, and those with no impact. Participants discussed the top cards from the negative and positive piles. The top attractors to geology are collegial relationships with faculty, the opportunity to use scientific equipment, field work, the concreteness of geology, and the availability of jobs. Factors that deter interest include hours of tedious homework, math courses, and time invested in wrong answers or failed experiments/sample preparation. Factors not yet evident include confidence in

  4. Test documentation for converting TWRS baseline data from RDD-100 V3.0.2.2 to V4.0.3. Revision 1

    International Nuclear Information System (INIS)

    Gneiting, B.C.; Johnston, M.E.

    1996-01-01

    This document describes the test documentation required for converting between two versions of the RDD-100 software application, specifically version 3.0.2.2 and version 4.0.3. The focus is the successful conversion of the master data set between the two versions of the database tool and their corresponding data structures.

  5. Baseline Assessment of 25-Hydroxyvitamin D Reference Material and Proficiency Testing/External Quality Assurance Material Commutability: A Vitamin D Standardization Program Study.

    Science.gov (United States)

    Phinney, Karen W; Sempos, Christopher T; Tai, Susan S-C; Camara, Johanna E; Wise, Stephen A; Eckfeldt, John H; Hoofnagle, Andrew N; Carter, Graham D; Jones, Julia; Myers, Gary L; Durazo-Arvizu, Ramon; Miller, W Greg; Bachmann, Lorin M; Young, Ian S; Pettit, Juanita; Caldwell, Grahame; Liu, Andrew; Brooks, Stephen P J; Sarafin, Kurtis; Thamm, Michael; Mensink, Gert B M; Busch, Markus; Rabenberg, Martina; Cashman, Kevin D; Kiely, Mairead; Galvin, Karen; Zhang, Joy Y; Kinsella, Michael; Oh, Kyungwon; Lee, Sun-Wha; Jung, Chae L; Cox, Lorna; Goldberg, Gail; Guberg, Kate; Meadows, Sarah; Prentice, Ann; Tian, Lu; Brannon, Patsy M; Lucas, Robyn M; Crump, Peter M; Cavalier, Etienne; Merkel, Joyce; Betz, Joseph M

    2017-09-01

    The Vitamin D Standardization Program (VDSP) coordinated a study in 2012 to assess the commutability of reference materials and proficiency testing/external quality assurance materials for total 25-hydroxyvitamin D [25(OH)D] in human serum, the primary indicator of vitamin D status. A set of 50 single-donor serum samples as well as 17 reference and proficiency testing/external quality assessment materials were analyzed by participating laboratories that used either immunoassay or LC-MS methods for total 25(OH)D. The commutability test materials included National Institute of Standards and Technology Standard Reference Material 972a Vitamin D Metabolites in Human Serum as well as materials from the College of American Pathologists and the Vitamin D External Quality Assessment Scheme. Study protocols and data analysis procedures were in accordance with Clinical and Laboratory Standards Institute guidelines. The majority of the test materials were found to be commutable with the methods used in this commutability study. These results provide guidance for laboratories needing to choose appropriate reference materials and select proficiency or external quality assessment programs and will serve as a foundation for additional VDSP studies.

  6. Criteria for recognition of soil contamination in the testing of soil extracts by means of biological assays - determination of "baseline responses" of uncontaminated soils; Kriterien zur Erkennung von Bodenkontaminationen bei der Testung von Bodenextrakten mit Hilfe biologischer Wirktests - Ermittlung der "Grundlevel-Antworten" unkontaminierter Böden

    Energy Technology Data Exchange (ETDEWEB)

    Schuphan, I.; Gaipl, S.; Herlitz, E.; Schreiner, J.; Tietz, U. [Technische Hochschule Aachen (Germany). Lehrstuhl für Biologie 5

    1997-07-01

    One way to discover soil contamination in the future is biological testing of soil extracts. For this purpose, the baseline responses of the biological test systems must first be determined, as a basis for distinguishing effects of natural soil components from those of contaminants. These baseline responses were collected using water and organic extracts of 15 "uncontaminated" soils from different areas of Germany. The use of extracts for testing requires defined relations between the soil extract, the soil itself, and the test system. Starting values and a window of competence for testing are proposed. The starting value 1 SE is realized by equivalence of extract aliquots of a defined soil amount (gram soil equivalent, g SE) and the amount of test (culture) medium in grams. The effect limits should be at least twice the standard deviation of the blank (extract without soil components). In some test systems, higher effect levels have to be fixed according to screening values of "uncontaminated" soils. If no relation to the test (culture) medium can be established, the so-called window of competence has to be defined. This is necessary, e.g., for the Salmonella mutagenicity test, which uses agar as test medium. In this case, screening results of "uncontaminated" soil extracts lead to a g SE range, the window of competence, within which no natural response will be found. Extracts giving a mutation ratio (number of revertant colonies from sample extracts/number of colonies from the blank) of ≥2 were considered positively mutagenic. (orig.)
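The mutagenicity criterion at the end of the abstract (mutation ratio of at least 2 relative to the blank) reduces to a simple computation. A sketch with invented colony counts, purely for illustration:

```python
# Illustrative scoring of a Salmonella mutagenicity result: a sample extract
# is called positively mutagenic when its revertant-colony count is at least
# twice that of the blank. Counts below are hypothetical.

def mutation_ratio(sample_revertants, blank_revertants):
    """Revertant colonies from the sample extract divided by the blank's."""
    return sample_revertants / blank_revertants

def is_mutagenic(sample_revertants, blank_revertants, threshold=2.0):
    """Apply the ratio-of-two criterion described in the abstract."""
    return mutation_ratio(sample_revertants, blank_revertants) >= threshold

ratio = mutation_ratio(64, 30)   # just over twice the blank
positive = is_mutagenic(64, 30)
```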

  7. Rationing with baselines

    DEFF Research Database (Denmark)

    Hougaard, Jens Leth; Moreno-Ternero, Juan D.; Østerdal, Lars Peter Raahave

    2013-01-01

    We introduce a new operator for general rationing problems in which, besides conflicting claims, individual baselines play an important role in the rationing process. The operator builds on ideas of composition, which are frequent not only in rationing but also in related problems such as bargaining, choice, and queuing. We characterize the operator and show how it preserves some standard axioms in the literature on rationing. We also relate it to recent contributions in that literature.

  8. The TDAQ Baseline Architecture

    CERN Multimedia

    Wickens, F J

    The Trigger-DAQ community is currently busy preparing material for the DAQ, HLT and DCS TDR. Over the last few weeks a very important step has been a series of meetings to complete agreement on the baseline architecture. An overview of the architecture indicating some of the main parameters is shown in figure 1. As reported at the ATLAS Plenary during the February ATLAS week, the main area where the baseline had not yet been agreed was around the Read-Out System (ROS) and details in the DataFlow. The agreed architecture has: Read-Out Links (ROLs) from the RODs using S-Link; Read-Out Buffers (ROB) sited near the RODs, mounted in a chassis - today assumed to be a PC, using PCI bus at least for configuration, control and monitoring. The baseline assumes data aggregation, in the ROB and/or at the output (which could either be over a bus or in the network). Optimization of the data aggregation will be made in the coming months, but the current model has each ROB card receiving input from 4 ROLs, and 3 such c...

  9. Effects of prior testing lasting a full year in NCANDA adolescents: Contributions from age, sex, socioeconomic status, ethnicity, site, family history of alcohol or drug abuse, and baseline performance

    Directory of Open Access Journals (Sweden)

    Edith V. Sullivan

    2017-04-01

    Full Text Available Longitudinal study provides a robust method for tracking developmental trajectories. Yet inherent problems of retesting pose challenges in distinguishing biological developmental change from prior testing experience. We examined factors potentially influencing change scores on 16 neuropsychological test composites over 1 year in 568 adolescents in the National Consortium on Alcohol and NeuroDevelopment in Adolescence (NCANDA) project. The twice-minus-once-tested method revealed that performance gain was mainly attributable to testing experience (practice), with little contribution from predicted developmental effects. Group mean practice slopes for 13 composites indicated that 60% to ∼100% of variance was attributable to test experience; General Ability accuracy showed the least practice effect (29%). Lower baseline performance, especially in younger participants, was a strong predictor of greater gain. Contributions from age, sex, ethnicity, examination site, socioeconomic status, or family history of alcohol/substance abuse were nil to small, even where statistically significant. Recognizing that a substantial proportion of change in longitudinal testing, even over 1 year, is attributable to testing experience indicates caution against assuming that performance gain observed during periods of maturation necessarily reflects development. Estimates of testing experience, a form of learning, may be a relevant metric for detecting interim influences, such as alcohol use or traumatic episodes, on behavior.
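The twice-minus-once-tested logic can be sketched as follows: the gain shown by retested participants combines development and practice, while a comparable group tested for the first time at the second time point reflects development alone; subtracting the two isolates the practice component. All scores below are invented, and the real NCANDA analysis is considerably more involved.

```python
# Hedged sketch of a twice-minus-once-tested practice-effect estimate.
from statistics import mean

def practice_effect(twice_t1, twice_t2, once_t2):
    """Estimate the gain attributable to prior testing experience.

    twice_t1 / twice_t2: scores of the retested group at times 1 and 2.
    once_t2: scores of a comparable group tested for the first time at time 2.
    """
    observed_gain = mean(twice_t2) - mean(twice_t1)  # development + practice
    development = mean(once_t2) - mean(twice_t1)     # development alone
    return observed_gain - development               # practice component

effect = practice_effect(
    twice_t1=[50, 52, 48, 51],
    twice_t2=[58, 60, 55, 59],
    once_t2=[52, 53, 51, 54],
)
```

The subtraction assumes the two groups are comparable at baseline, which is why matched or randomized retest designs are needed for this estimate to be credible.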

  10. 2017 Annual Technology Baseline

    Energy Technology Data Exchange (ETDEWEB)

    Cole, Wesley J [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Hand, M. M [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Eberle, Annika [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Beiter, Philipp C [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Kurup, Parthiv [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Turchi, Craig S [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Feldman, David J [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Margolis, Robert M [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Augustine, Chad R [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Maness, Michael [Formerly NREL]; O'Connor, Patrick [Oak Ridge National Laboratory]

    2018-03-26

    Consistent cost and performance data for various electricity generation technologies can be difficult to find and may change frequently for certain technologies. With the Annual Technology Baseline (ATB), the National Renewable Energy Laboratory annually provides an organized and centralized set of such cost and performance data. The ATB uses the best information from the Department of Energy national laboratories' renewable energy analysts as well as information from the Energy Information Administration for fuel-based technologies. The ATB has been reviewed by experts and it includes the following electricity generation technologies: land-based wind, offshore wind, utility-scale solar photovoltaics (PV), commercial-scale solar PV, residential-scale solar PV, concentrating solar power, geothermal power, hydropower, coal, natural gas, nuclear, and conventional biopower. This webinar presentation introduces the 2017 ATB.

  11. Biofuels Baseline 2008

    Energy Technology Data Exchange (ETDEWEB)

    Hamelinck, C.; Koper, M.; Berndes, G.; Englund, O.; Diaz-Chavez, R.; Kunen, E.; Walden, D.

    2011-10-15

    The European Union is promoting the use of biofuels and other renewable energy in transport. In April 2009, the Renewable Energy Directive (2009/28/EC) was adopted, setting a 10% target for renewable energy in transport in 2020. The directive sets several requirements for the sustainability of biofuels marketed within the framework of the Directive. The Commission is required to report to the European Parliament on a regular basis on a range of sustainability impacts resulting from the use of biofuels in the EU. This report serves as a baseline of information for regular monitoring of the impacts of the Directive. Chapter 2 discusses the EU biofuels market, the production and consumption of biofuels, and international trade, and traces where the feedstock for biofuels consumed in the EU originally comes from. Chapter 3 discusses the biofuel policy framework in the EU and major third countries of supply, looking at various policy aspects that are relevant to compliance with the EU sustainability requirements. Chapter 4 discusses the environmental and social sustainability aspects associated with EU biofuels and their feedstock. Chapter 5 discusses the macro-economic effects that indirectly result from increased EU biofuels consumption, on commodity prices and land use. Chapter 6 presents country factsheets for the main third countries that supplied biofuels to the EU market in 2008.

  12. Patch test results of the European baseline series among patients with occupational contact dermatitis across Europe - analyses of the European Surveillance System on Contact Allergy network, 2002-2010.

    Science.gov (United States)

    Pesonen, Maria; Jolanki, Riitta; Larese Filon, Francesca; Wilkinson, Mark; Kręcisz, Beata; Kieć-Świerczyńska, Marta; Bauer, Andrea; Mahler, Vera; John, Swen M; Schnuch, Axel; Uter, Wolfgang

    2015-03-01

    Occupational contact dermatitis is one of the most common occupational diseases in Europe. In order to develop effective preventive measures, detailed and up-to-date data on the incidence, main causes and professions at risk of occupational contact dermatitis are needed. To describe the pattern of patch test reactivity to allergens in the European baseline series of patients with occupational contact dermatitis in different occupations. We analysed data collected by the European Surveillance System on Contact Allergy (ESSCA) network from 2002 to 2010, from 11 European countries. Allergens in the European baseline series associated with an at least doubled risk of occupational contact dermatitis include: thiuram rubber chemical accelerators, epoxy resin, and the antimicrobials methylchloroisothiazolinone/methylisothiazolinone, methyldibromo glutaronitrile, and formaldehyde. The highest risk of occupational contact dermatitis was found in occupations classified as 'other personal services workers', which includes hairdressers, nursing and other healthcare professionals, precision workers in metal and related materials, and blacksmiths, tool-makers and related trades workers. In the planning and implementation of measures aimed at preventing occupational contact dermatitis, the focus should be on the identified high-risk occupational groups and the most common occupational allergies. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  13. FED baseline engineering studies report

    Energy Technology Data Exchange (ETDEWEB)

    Sager, P.H.

    1983-04-01

    Studies were carried out on the FED Baseline to improve design definition, establish feasibility, and reduce cost. Emphasis was placed on cost reduction, but significant feasibility concerns existed in several areas, and better design definition was required to establish feasibility and provide a better basis for cost estimates. Design definition and feasibility studies included the development of a labyrinth shield ring concept to prevent radiation streaming between the torus spool and the TF coil cryostat. The labyrinth shield concept which was developed reduced radiation streaming sufficiently to permit contact maintenance of the inboard EF coils. Various concepts of preventing arcing between adjacent shield sectors were also explored. It was concluded that installation of copper straps with molybdenum thermal radiation shields would provide the most reliable means of preventing arcing. Other design studies included torus spool electrical/structural concepts, test module shielding, torus seismic response, poloidal conditions in the magnets, disruption characteristics, and eddy current effects. These additional studies had no significant impact on cost but did confirm the feasibility of the basic FED Baseline concept.

  14. FED baseline engineering studies report

    International Nuclear Information System (INIS)

    Sager, P.H.

    1983-04-01

    Studies were carried out on the FED Baseline to improve design definition, establish feasibility, and reduce cost. Emphasis was placed on cost reduction, but significant feasibility concerns existed in several areas, and better design definition was required to establish feasibility and provide a better basis for cost estimates. Design definition and feasibility studies included the development of a labyrinth shield ring concept to prevent radiation streaming between the torus spool and the TF coil cryostat. The labyrinth shield concept which was developed reduced radiation streaming sufficiently to permit contact maintenance of the inboard EF coils. Various concepts of preventing arcing between adjacent shield sectors were also explored. It was concluded that installation of copper straps with molybdenum thermal radiation shields would provide the most reliable means of preventing arcing. Other design studies included torus spool electrical/structural concepts, test module shielding, torus seismic response, poloidal conditions in the magnets, disruption characteristics, and eddy current effects. These additional studies had no significant impact on cost but did confirm the feasibility of the basic FED Baseline concept

  15. Comparison of CFD Predictions of the TCA Baseline

    Science.gov (United States)

    Cappuccio, Gelsomina

    1999-01-01

    The computational fluid dynamics (CFD) comparisons presented here are compared with each other and with wind tunnel (WT) data on the baseline TCA. Some of the CFD computations were done prior to the tests and others later. Only force data (CL vs CD) from CFD will be presented as part of this report. The WT data presented come from testing of the baseline TCA in the Langley Unitary Plan Wind Tunnel (UPWT), Test Section #2. Two sets of wind tunnel data are presented: one from test 1671 of model 2a (flapped wing) and the other from test 1679 of model 2b (solid wing). Most of the plots show only one run from each of the WT tests per configuration, but many repeat runs were taken during the tests. The WT repeat runs showed an uncertainty in drag of +/- 0.5 count; at times the uncertainty was better, +/- 0.25 count. Test 1671 data consisted of forces and pressures measured on model 2a. The wing had cutouts for installing various leading- and trailing-edge flaps at lower Mach numbers. The internal duct of the nacelles was not designed and fabricated as defined in the outer mold lines (OML) IGES file: the internal duct was fabricated with a linear transition from inlet to exhaust, whereas the IGES definition has a constant-area internal duct that quickly transitions from the inlet to the exhaust cross-sectional shape. The nacelle internal duct was fabricated this way to save time and money; the variation in cross-sectional area is less than 1% from the IGES definition. The nacelles were also installed with and without fairings, defined as the build-up of the nacelles on the upper wing surface so that the nacelles poke through the upper surface as defined in the OML IGES file. Test 1679 data consisted of forces measured on models 2a and 2b. The wing for model 2b was a solid wing. The nacelles were built the same way as for model 2a, except for the nacelle base pressure installation. The nacelles were only

  16. Hydrogeology baseline study Aurora Mine

    International Nuclear Information System (INIS)

    1996-01-01

    A baseline hydrogeologic study was conducted in the area of Syncrude's proposed Aurora Mine in order to develop a conceptual regional hydrogeologic model for the area that could be used to understand groundwater flow conditions. Geologic information was obtained from over 2,000 coreholes and from data obtained between 1980 and 1996 regarding water level for the basal aquifer. A 3-D numerical groundwater flow model was developed to provide quantitative estimates of the potential environmental impacts of the proposed mining operations on the groundwater flow system. The information was presented in the context of a regional study area which encompassed much of the Athabasca Oil Sands Region, and a local study area which was defined by the lowlands of the Muskeg River Basin. Characteristics of the topography, hydrology, climate, geology, and hydrogeology of the region are described. The conclusion is that groundwater flow in the aquifer occurs mostly in a westerly direction beneath the Aurora Mine towards its inferred discharge location along the Athabasca River. Baseflow in the Muskeg River is mostly related to discharge from shallow surficial aquifers. Water in the river under baseflow conditions was fresh, of calcium-carbonate type, with very little indication of mineralization associated with deeper groundwater in the Aurora Mine area. 44 refs., 5 tabs., 31 figs

  17. OCRWM baseline management procedure for document identifiers

    International Nuclear Information System (INIS)

    1993-03-01

    This procedure establishes a uniform numbering system (document identifier) for all Program and project technical, cost, and schedule baselines, and selected management and procurement documents developed for and controlled by the Office of Civilian Radioactive Waste Management (OCRWM) for the Civilian Radioactive Waste Management System (CRWMS). The document identifier defined in this procedure is structured to ensure that the relational integrity between configuration items (CIs) and their associated documentation and software is maintained, and that these records remain traceable, categorized, and retrievable for the life of the program

  18. How to Choose Between Measures of Tinnitus Loudness for Clinical Research? A Report on the Reliability and Validity of an Investigator-Administered Test and a Patient-Reported Measure Using Baseline Data Collected in a Phase IIa Drug Trial.

    Science.gov (United States)

    Hall, Deborah A; Mehta, Rajnikant L; Fackrell, Kathryn

    2017-09-18

    Loudness is a major auditory dimension of tinnitus and is used to diagnose severity, counsel patients, or as a measure of clinical efficacy in audiological research. There is no standard test for tinnitus loudness, but matching and rating methods are popular. This article provides important new knowledge about the reliability and validity of an audiologist-administered tinnitus loudness matching test and a patient-reported tinnitus loudness rating. Retrospective analysis of loudness data for 91 participants with stable subjective tinnitus enrolled in a randomized controlled trial of a novel drug for tinnitus. There were two baseline assessments (screening, Day 1) and a posttreatment assessment (Day 28). About 66%-70% of the variability from screening to Day 1 was attributable to the true score. But measurement error, indicated by the smallest detectable change, was high for both tinnitus loudness matching (20 dB) and tinnitus loudness rating (3.5 units). Only loudness rating captured a sensation that was meaningful to people who lived with the experience of tinnitus. The tinnitus loudness rating performed better against acceptability criteria for reliability and validity than did the tinnitus loudness matching test administered by an audiologist. But the rating question is still limited because it is a single-item instrument and is probably able to detect only large changes (at least 3.5 points).
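The smallest detectable change reported above can be related to test-retest reliability through the conventional formula SDC = 1.96·√2·SEM, with SEM = SD·√(1 − ICC). The ICC and standard deviation below are assumed for illustration (an ICC near the 66%-70% true-score share quoted above), not values taken from the trial.

```python
import math

def smallest_detectable_change(sd, icc):
    """SDC = 1.96 * sqrt(2) * SEM, where SEM = SD * sqrt(1 - ICC)."""
    sem = sd * math.sqrt(1.0 - icc)
    return 1.96 * math.sqrt(2.0) * sem

# Assumed illustrative values: ICC ~0.68 (~68% true-score variance) and a
# between-subject SD of 13 dB for the loudness matches.
sdc_db = smallest_detectable_change(sd=13.0, icc=0.68)  # roughly 20 dB
```

Even a moderately reliable measure can thus carry a measurement error of the order reported for the loudness matching test.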

  19. Defining Issues Test-2: fidedignidade da versão brasileira e ponderações acerca de seu uso em pesquisas sobre moralidade Defining Issues Test-2: reliability of the Brazilian version and considerations concerning its use in studies on morality

    Directory of Open Access Journals (Sweden)

    Alessandra de Morais Shimizu

    2004-01-01

    Full Text Available This study aimed to evaluate the reliability of the Brazilian translation and adaptation of the Defining Issues Test (DIT-2), as well as to offer some considerations concerning the use of this instrument and of the DIT-1 in studies on morality. The DIT-1 and DIT-2 were administered to 621 Brazilian youngsters, proportionally distributed according to city of origin (Floriano/PI, Erechim/RS, and Marília/SP), type of school (public or private), and school year attended (8th year of elementary school and 3rd year of high school). Regarding reliability, although the values achieved were close to the one obtained in the translation and adaptation of the DIT-1, they were much lower than those verified in the original American versions. When checking the scores achieved on the tests, distinct tendencies were observed within the investigated sample, marked by the controlled variables. Considerations are offered regarding the validity and interpretation of these tests.

  20. Long-term monitoring of coastal ecosystems at Las Cruces, Chile: Defining baselines to build ecological literacy in a world of change Monitoreo de largo plazo en el ecosistema marino costero de Las Cruces, Chile: Definiendo líneas base para construir alfabetización ecológica en un mundo que cambia

    Directory of Open Access Journals (Sweden)

    SERGIO A NAVARRETE

    2010-03-01

    Full Text Available Marine coastal habitats are being increasingly impacted by human activities. In addition, there are dramatic climatic disruptions that could generate important and irreversible shifts in coastal ecosystems. Long-term monitoring plays a fundamental and irreplaceable role in establishing general baselines from which we can better address current and future impacts and distinguish between natural and anthropogenic changes and fluctuations. Here we highlight how over 25 years of monitoring the coastal marine ecosystem within the no-take marine protected area of Las Cruces has provided critical information to understand ecological baselines and build the necessary ecological literacy for marine management and conservation. We argue that this understanding can only be gained with simultaneous monitoring of reserves and human-impacted areas, and the development of complementary experimental studies that test alternative hypotheses about driving processes and mechanisms. In this contribution we selected four examples to illustrate long-term temporal fluctuations at all trophic levels, including taxa from algae to sea birds. From these examples we draw a few general lessons: (a) rapid- and slowly-unfolding ecological responses to the exclusion of humans co-occur within the same rocky shore community, and the sharp differences in the pace at which depleted populations recover are at least partly related to differences in life history (dispersal capabilities) of the targeted species; (b) long-term monitoring of the supply side of marine communities is critical to evaluate the potential feedback effects of local changes in abundance on the arrival of new individuals, and to correctly evaluate environmental and human-induced perturbations; (c) unexpected changes in local population dynamics can occur in “independent” and apparently non-interactive modules of the marine ecosystem, such as roosting sea birds inside the reserve. In addition we discuss

  1. Performance Measurement Baseline Change Request

    Data.gov (United States)

    Social Security Administration — The Performance Measurement Baseline Change Request template is used to document changes to scope, cost, schedule, or operational performance metrics for SSA's Major...

  2. Baseline methodologies for clean development mechanism projects

    Energy Technology Data Exchange (ETDEWEB)

    Lee, M.K. (ed.); Shrestha, R.M.; Sharma, S.; Timilsina, G.R.; Kumar, S.

    2005-11-15

    The Kyoto Protocol and the Clean Development Mechanism (CDM) came into force on 16th February 2005 with its ratification by Russia. The increasing momentum of this process is reflected in the more than 100 projects submitted to the CDM Executive Board (CDM-EB) for approval of their baselines and monitoring methodologies, which is the first step in developing and implementing CDM projects. A CDM project should result in a net decrease of GHG emissions below any level that would have resulted from other activities implemented in the absence of that CDM project. The 'baseline' defines the GHG emissions of activities that would have been implemented in the absence of a CDM project. The baseline methodology is the process/algorithm for establishing that baseline. The baseline, along with the baseline methodology, is thus the most critical element of any CDM project towards meeting the important criteria of the CDM, namely that a CDM project should result in 'real, measurable, and long-term benefits related to the mitigation of climate change'. This guidebook is produced within the framework of the United Nations Environment Programme (UNEP) facilitated 'Capacity Development for the Clean Development Mechanism (CD4CDM)' project. This document is published as part of the project's effort to develop guidebooks that cover important issues such as project finance, sustainability impacts, legal framework and institutional framework. These materials are aimed at helping stakeholders better understand the CDM and are believed to eventually contribute to maximizing the effect of the CDM in achieving the ultimate goal of the UNFCCC and its Kyoto Protocol. This guidebook should be read in conjunction with the information provided in two other guidebooks entitled 'Clean Development Mechanism: Introduction to the CDM' and 'CDM Information and Guidebook' developed under the CD4CDM project. (BA)
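In a rough sketch, the crediting arithmetic implied by the baseline concept reduces to baseline emissions minus project emissions (minus any leakage). The figures below are invented for illustration and are not drawn from any approved CDM methodology.

```python
def emission_reduction(baseline_t, project_t, leakage_t=0.0):
    """Creditable reduction (tCO2e) = baseline - project - leakage."""
    return baseline_t - project_t - leakage_t

# Hypothetical annual figures for a renewable-power project (tCO2e):
baseline = 120_000.0  # emissions of the displaced activity, per the baseline
project = 5_000.0     # residual emissions of the CDM project itself
leakage = 1_000.0     # emissions shifted outside the project boundary

credits = emission_reduction(baseline, project, leakage)  # 114000.0
```

The entire crediting outcome hinges on the baseline figure, which is why the baseline and its methodology are described above as the most critical element of a CDM project.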

  3. Baseline methodologies for clean development mechanism projects

    International Nuclear Information System (INIS)

    Lee, M.K.; Shrestha, R.M.; Sharma, S.; Timilsina, G.R.; Kumar, S.

    2005-11-01

    The Kyoto Protocol and the Clean Development Mechanism (CDM) came into force on 16th February 2005 with its ratification by Russia. The increasing momentum of this process is reflected in the more than 100 projects submitted to the CDM Executive Board (CDM-EB) for approval of their baselines and monitoring methodologies, which is the first step in developing and implementing CDM projects. A CDM project should result in a net decrease of GHG emissions below any level that would have resulted from other activities implemented in the absence of that CDM project. The 'baseline' defines the GHG emissions of activities that would have been implemented in the absence of a CDM project. The baseline methodology is the process/algorithm for establishing that baseline. The baseline, along with the baseline methodology, is thus the most critical element of any CDM project towards meeting the important criteria of the CDM, namely that a CDM project should result in 'real, measurable, and long-term benefits related to the mitigation of climate change'. This guidebook is produced within the framework of the United Nations Environment Programme (UNEP) facilitated 'Capacity Development for the Clean Development Mechanism (CD4CDM)' project. This document is published as part of the project's effort to develop guidebooks that cover important issues such as project finance, sustainability impacts, legal framework and institutional framework. These materials are aimed at helping stakeholders better understand the CDM and are believed to eventually contribute to maximizing the effect of the CDM in achieving the ultimate goal of the UNFCCC and its Kyoto Protocol. This guidebook should be read in conjunction with the information provided in two other guidebooks entitled 'Clean Development Mechanism: Introduction to the CDM' and 'CDM Information and Guidebook' developed under the CD4CDM project. (BA)

  4. DairyBISS Baseline report

    NARCIS (Netherlands)

    Buizer, N.N.; Berhanu, Tinsae; Murutse, Girmay; Vugt, van S.M.

    2015-01-01

    This baseline report of the Dairy Business Information Service and Support (DairyBISS) project presents the findings of a baseline survey among 103 commercial farms and 31 firms and advisors working in the dairy value chain. Additional results from the survey among commercial dairy farms are

  5. 1993 baseline solid waste management system description

    International Nuclear Information System (INIS)

    Armacost, L.L.; Fowler, R.A.; Konynenbelt, H.S.

    1994-02-01

    Pacific Northwest Laboratory has prepared this report under the direction of Westinghouse Hanford Company. The report provides an integrated description of the system planned for managing Hanford's solid low-level waste, low-level mixed waste, transuranic waste, and transuranic mixed waste. The primary purpose of this document is to illustrate a collective view of the key functions planned at the Hanford Site to handle existing waste inventories, as well as solid wastes that will be generated in the future. By viewing this system as a whole rather than as individual projects, key facility interactions and requirements are identified and a better understanding of the overall system may be gained. The system is described so as to form a basis for modeling the system at various levels of detail. Model results provide insight into issues such as facility capacity requirements, alternative system operating strategies, and impacts of system changes (i.e., startup dates). This description of the planned Hanford solid waste processing system: defines a baseline system configuration; identifies the entering waste streams to be managed within the system; identifies basic system functions and waste flows; and highlights system constraints. This system description will evolve and be revised as issues are resolved, planning decisions are made, additional data are collected, and assumptions are tested and changed. Out of necessity, this document will also be revised and updated so that a documented system description, which reflects current system planning, is always available for use by engineers and managers. It does not provide any results generated from the many alternatives that will be modeled in the course of analyzing solid waste disposal options; such results will be provided in separate documents

  6. Software Defined Networking

    DEFF Research Database (Denmark)

    Caba, Cosmin Marius

    Network Service Providers (NSP) often choose to overprovision their networks instead of deploying proper Quality of Services (QoS) mechanisms that allow for traffic differentiation and predictable quality. This tendency of overprovisioning is not sustainable for the simple reason that network...... resources are limited. Hence, to counteract this trend, current QoS mechanisms must become simpler to deploy and operate, in order to motivate NSPs to employ QoS techniques instead of overprovisioning. Software Defined Networking (SDN) represents a paradigm shift in the way telecommunication and data...... generic perspective (e.g. service provisioning speed, resources availability). As a result, new mechanisms for providing QoS are proposed, solutions for SDN-specific QoS challenges are designed and tested, and new network management concepts are prototyped, all aiming to improve QoS for network services...

  7. Program Baseline Change Control Procedure

    International Nuclear Information System (INIS)

    1993-02-01

    This procedure establishes the responsibilities and process for approving initial issues of, and changes to, the technical, cost, and schedule baselines and selected management documents developed by the Office of Civilian Radioactive Waste Management (OCRWM) for the Civilian Radioactive Waste Management System. This procedure implements the OCRWM Baseline Management Plan and DOE Order 4700.1, Chg 1. It streamlines the change control process to enhance integration, accountability, and traceability of Level 0 and Level 1 decisions through standardized Baseline Change Proposal (BCP) forms to be used by the Level 0, 1, 2, and 3 Baseline Change Control Boards (BCCBs) and to be tracked in the OCRWM-wide Configuration Information System (CIS) database. This procedure applies to all technical, cost, and schedule baselines controlled by the Energy System Acquisition Advisory Board (ESAAB) BCCB (Level 0) and the OCRWM Program Baseline Change Control Board (PBCCB) (Level 1). All baseline BCPs initiated by Level 2 or lower BCCBs that require approval from the ESAAB or PBCCB shall be processed in accordance with this procedure. This procedure also applies to all Program-level management documents controlled by the OCRWM PBCCB

  8. Associations between baseline allergens and polysensitization

    DEFF Research Database (Denmark)

    Carlsen, B.C.; Menne, T.; Johansen, J.D.

    2008-01-01

    Background: Identification of patients at risk of developing polysensitization is not possible at present. An association between weak sensitizers and polysensitization has been hypothesized. Objectives: To examine associations of 21 allergens in the European baseline series to polysensitization....... Patients/Methods: From a database-based study with 14 998 patients patch tested with the European baseline series between 1985 and 2005, a group of 759 (5.1%) patients were polysensitized. Odds ratios were calculated to determine the relative contribution of each allergen to polysensitization. Results...... denominator for the association between the allergens and the polysensitization was apparent, and any association, whether positive or negative, was relatively low. Based on these results, sensitization to specific baseline allergens cannot be used as risk indicators for polysensitization.

  9. Associations between baseline allergens and polysensitization

    DEFF Research Database (Denmark)

    Carlsen, Berit Christina; Menné, Torkil; Johansen, Jeanne Duus

    2008-01-01

    BACKGROUND: Identification of patients at risk of developing polysensitization is not possible at present. An association between weak sensitizers and polysensitization has been hypothesized. OBJECTIVES: To examine associations of 21 allergens in the European baseline series to polysensitization....... PATIENTS/METHODS: From a database-based study with 14 998 patients patch tested with the European baseline series between 1985 and 2005, a group of 759 (5.1%) patients were polysensitized. Odds ratios were calculated to determine the relative contribution of each allergen to polysensitization. RESULTS...... denominator for the association between the allergens and the polysensitization was apparent, and any association, whether positive or negative, was relatively low. Based on these results, sensitization to specific baseline allergens cannot be used as risk indicators for polysensitization....
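The odds ratios referred to in both reports come from a 2×2 cross-tabulation of sensitization to a given allergen against polysensitization status. A minimal sketch follows; the group totals match the 759 polysensitized / 14 239 remaining patients reported above, but the allergen-positive splits are invented for illustration.

```python
def odds_ratio(a, b, c, d):
    """OR for the 2x2 table [[a, b], [c, d]]:
    a = polysensitized & allergen-positive, b = polysensitized & negative,
    c = not polysensitized & positive,      d = not polysensitized & negative."""
    return (a * d) / (b * c)

# Hypothetical allergen-positive counts within each group:
or_allergen = odds_ratio(150, 609, 1800, 12439)  # ~1.7
```

An odds ratio near 1 for every allergen is exactly the "relatively low association" pattern that led the authors to conclude baseline allergens cannot serve as risk indicators.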

  10. Oscillation Baselining and Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    2017-03-27

    PNNL developed a new tool for oscillation analysis and baselining. This tool was developed under a new DOE Grid Modernization Laboratory Consortium (GMLC) project (GM0072 - “Suite of open-source applications and models for advanced synchrophasor analysis”) and is based on the open platform for PMU analysis. The Oscillation Baselining and Analysis Tool (OBAT) performs oscillation analysis and identifies modes of oscillation (frequency, damping, energy, and shape). The tool also performs oscillation event baselining (finding correlations between oscillation characteristics and system operating conditions).
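Mode identification of the kind OBAT reports (frequency and damping) can be illustrated on a synthetic single-mode ringdown; this is a generic textbook sketch, not OBAT's actual algorithm, and the signal parameters are invented.

```python
import math

# Synthetic single-mode ringdown: y(t) = exp(-sigma*t) * cos(2*pi*f*t)
f_true, sigma_true, dt = 0.8, 0.15, 0.01  # Hz, 1/s, s
t = [k * dt for k in range(3000)]
y = [math.exp(-sigma_true * tk) * math.cos(2 * math.pi * f_true * tk) for tk in t]

# Frequency from the mean spacing of positive-going zero crossings
crossings = [t[k] for k in range(1, len(y)) if y[k - 1] < 0 <= y[k]]
periods = [b - a for a, b in zip(crossings, crossings[1:])]
f_est = 1.0 / (sum(periods) / len(periods))

# Damping from the amplitude decay between two successive positive peaks
peaks = [k for k in range(1, len(y) - 1) if y[k - 1] < y[k] > y[k + 1] and y[k] > 0]
k0, k1 = peaks[0], peaks[1]
sigma_est = math.log(y[k0] / y[k1]) / (t[k1] - t[k0])

# Damping ratio of the identified mode
zeta = sigma_est / math.sqrt(sigma_est**2 + (2 * math.pi * f_est) ** 2)
```

Production tools work from noisy PMU measurements and typically use subspace or Prony-type estimators, but the recovered quantities are the same: modal frequency, decay rate, and damping ratio.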

  11. 324 Building Baseline Radiological Characterization

    Energy Technology Data Exchange (ETDEWEB)

    R.J. Reeder, J.C. Cooper

    2010-06-24

    This report documents the analysis of radiological data collected as part of the characterization study performed in 1998. The study was performed to create a baseline of the radiological conditions in the 324 Building.

  12. Baseline-dependent averaging in radio interferometry

    Science.gov (United States)

    Wijnholds, S. J.; Willis, A. G.; Salvini, S.

    2018-05-01

    This paper presents a detailed analysis of the applicability and benefits of baseline-dependent averaging (BDA) in modern radio interferometers and in particular the Square Kilometre Array. We demonstrate that BDA does not affect the information content of the data other than a well-defined decorrelation loss for which closed form expressions are readily available. We verify these theoretical findings using simulations. We therefore conclude that BDA can be used reliably in modern radio interferometry allowing a reduction of visibility data volume (and hence processing costs for handling visibility data) by more than 80 per cent.
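The core idea of BDA, that longer baselines see faster fringe rotation and therefore tolerate shorter averaging windows, can be sketched numerically. The fringe-rate scaling and the 1% loss target below are assumed for illustration and are not SKA parameters.

```python
import cmath

def decorrelation(n_avg, phase_step):
    """Amplitude retained when averaging n unit phasors whose phase
    advances by phase_step radians per sample."""
    s = sum(cmath.exp(1j * phase_step * k) for k in range(n_avg))
    return abs(s) / n_avg

# Assumed scaling: fringe phase step per sample grows linearly with
# baseline length (illustrative value only).
step_per_metre = 1e-5  # rad / sample / m
averaging = {}
for b_m in [100.0, 1000.0, 10000.0]:
    step = step_per_metre * b_m
    n = 1
    # BDA: keep doubling the averaging window while the loss stays below 1%
    while decorrelation(2 * n, step) > 0.99:
        n *= 2
    averaging[b_m] = n

# Short baselines can be averaged far more heavily than long ones:
# averaging == {100.0: 256, 1000.0: 32, 10000.0: 4}
```

Because most baselines in a large array are short, weighting the averaging factor by baseline length is what yields the large reduction in visibility data volume reported above.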

  13. Numerical Tests of the Virtual Human Model Response Under Dynamic Load Conditions Defined in Federal Aviation Regulation Part 23.562 and 25.562 – Preliminary Study

    Directory of Open Access Journals (Sweden)

    Lindstedt Lukasz

    2016-12-01

    Full Text Available The main aim of the presented research was to check the mechanical response of a human body model under loads that can occur during airplane accidents and to compare the results of the analysis with results of experimental tests described in the literature. In the simulations, the new multi-purpose human body model VIRTHUMAN was used. The whole model, as well as its particular segments, had earlier been validated against experimental data, which proved its accuracy in simulating the human body's dynamic response under conditions typical of car crashes, but it was not validated for loads with a predominant vertical component (loads acting along the spinal column, typical of airplane crashes). Due to the limitations of available experimental data, the authors focused on the case introduced in 14 CFR Parts 23.562 and 25.562, paragraph (b)(1), known as the 60° pitch test. The analysis consists of a comparison of the compression load measured in the lumbar section of the spine of the FAA HIII Dummy (experimental model) and of the Virthuman (numerical model). The performed analyses show numerical stability of the model and satisfactory agreement between the experimental data and the simulated Virthuman responses. In that sense, the Virthuman model, although originally developed for automotive analyses, also shows great potential to become a valuable tool for aviation crashworthiness and safety analyses.

  14. Survival of the most transferable at the top of Jacob's ladder: Defining and testing the ωB97M(2) double hybrid density functional

    Science.gov (United States)

    Mardirossian, Narbe; Head-Gordon, Martin

    2018-06-01

    A meta-generalized gradient approximation, range-separated double hybrid (DH) density functional with VV10 non-local correlation is presented. The final 14-parameter functional form is determined by screening trillions of candidate fits through a combination of best subset selection, forward stepwise selection, and random sample consensus (RANSAC) outlier detection. The MGCDB84 database of 4986 data points is employed in this work, containing a training set of 870 data points, a validation set of 2964 data points, and a test set of 1152 data points. Following an xDH approach, orbitals from the ωB97M-V density functional are used to compute the second-order perturbation theory correction. The resulting functional, ωB97M(2), is benchmarked against a variety of leading double hybrid density functionals, including B2PLYP-D3(BJ), B2GPPLYP-D3(BJ), ωB97X-2(TQZ), XYG3, PTPSS-D3(0), XYGJ-OS, DSD-PBEP86-D3(BJ), and DSD-PBEPBE-D3(BJ). Encouragingly, the overall performance of ωB97M(2) on nearly 5000 data points clearly surpasses that of all of the tested density functionals. As a Rung 5 density functional, ωB97M(2) completes our family of combinatorially optimized functionals, complementing B97M-V on Rung 3, and ωB97X-V and ωB97M-V on Rung 4. The results suggest that ωB97M(2) has the potential to serve as a powerful predictive tool for accurate and efficient electronic structure calculations of main-group chemistry.
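
RANSAC, one of the selection tools named above, is easiest to see in its generic form. The sketch below fits a line robustly to data with planted outliers; it is a generic illustration of the algorithm, not the authors' functional-screening pipeline, and all data values are invented.

```python
import numpy as np

def ransac_line(x, y, trials=200, tol=0.5, seed=0):
    """Generic RANSAC: robustly fit y = a*x + b to data containing outliers."""
    rng = np.random.default_rng(seed)
    best_inliers = np.zeros(len(x), dtype=bool)
    for _ in range(trials):
        # Minimal sample: two distinct points define a candidate line.
        i, j = rng.choice(len(x), size=2, replace=False)
        if x[i] == x[j]:
            continue
        a = (y[j] - y[i]) / (x[j] - x[i])
        b = y[i] - a * x[i]
        inliers = np.abs(y - (a * x + b)) < tol
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    # Refit by least squares on the largest consensus set.
    a, b = np.polyfit(x[best_inliers], y[best_inliers], 1)
    return a, b, best_inliers

x = np.arange(20, dtype=float)
y = 2.0 * x + 1.0
y[3] += 15.0    # plant two gross outliers
y[11] -= 12.0

a, b, inliers = ransac_line(x, y)
print(round(a, 2), round(b, 2), int(inliers.sum()))  # → 2.0 1.0 18
```

The consensus step rejects the two planted outliers, so the refit recovers the true slope and intercept; in the functional-fitting context the same idea downweights benchmark data points that no candidate parameterization can fit.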

  15. Developing RESRAD-BASELINE for environmental baseline risk assessment

    International Nuclear Information System (INIS)

    Cheng, Jing-Jy.

    1995-01-01

    RESRAD-BASELINE is a computer code developed at Argonne National Laboratory for the US Department of Energy (DOE) to perform both radiological and chemical risk assessments. The code implements the baseline risk assessment guidance of the US Environmental Protection Agency (EPA 1989). It calculates (1) radiation doses and cancer risks from exposure to radioactive materials, and (2) hazard indexes and cancer risks from exposure to noncarcinogenic and carcinogenic chemicals, respectively. The user can enter measured or predicted environmental media concentrations through the graphic interface and can simulate different exposure scenarios by selecting the appropriate pathways and modifying the exposure parameters. The database used by RESRAD-BASELINE includes dose conversion factors and slope factors for radionuclides, and toxicity information and properties for chemicals. The user can modify the database for use in the calculation. Sensitivity analysis can be performed while running the code to examine the influence of the input parameters. Use of RESRAD-BASELINE for risk analysis is easy, fast, and cost-saving; furthermore, it ensures consistency in methodology between radiological and chemical risk analyses

  16. Defining Quantum Control Flow

    OpenAIRE

    Ying, Mingsheng; Yu, Nengkun; Feng, Yuan

    2012-01-01

    A remarkable difference between quantum and classical programs is that the control flow of the former can be either classical or quantum. One of the key issues in the theory of quantum programming languages is defining and understanding quantum control flow. A functional language with quantum control flow was defined by Altenkirch and Grattage [\\textit{Proc. LICS'05}, pp. 249-258]. This paper extends their work, and we introduce a general quantum control structure by defining three new quantu...

  17. Can play be defined?

    DEFF Research Database (Denmark)

    Eichberg, Henning

    2015-01-01

    Can play be defined? There is reason to raise critical questions about the established academic demand that a phenomenon – also in humanist studies – should first of all be defined, i.e. de-lineated and by neat lines limited to a “little box” that can be handled. The following chapter develops....... Human beings can very well understand play – or whatever phenomenon in human life – without defining it....

  18. Defining predictive values using three different platelet function tests for CYP2C19 phenotype status on maintenance dual antiplatelet therapy after PCI.

    Science.gov (United States)

    Zhang, Hong-Zhe; Kim, Moo Hyun; Han, Jin-Yeong; Jeong, Young-Hoon

    2014-01-01

    Published data suggest that the presence of CYP2C19*2 or *3 loss-of-function (LOF) alleles is indicative of increased platelet aggregation and a higher risk of adverse cardiovascular events after clopidogrel administration. We sought to determine cut-off values using three different assays for prediction of the CYP2C19 phenotype in Korean percutaneous coronary intervention (PCI) patients. We enrolled 244 patients with drug-eluting stent implantation who were receiving clopidogrel and aspirin maintenance therapy for one month or more. Platelet reactivity was assessed with light transmittance aggregometry (LTA), multiple electrode aggregometry (MEA) and the VerifyNow P2Y12 assay (VN). The CYP2C19 genotype was analyzed by polymerase chain reaction (PCR) and snapshot method. The frequency of CYP2C19 LOF allele carriers was 58.6%. The cut-off values from LTA, MEA and VerifyNow for the identification of LOF allele carriers were as follows: 10 µM ADP-induced LTA ≥ 48%, VN > 242 PRU and MEA ≥ 37 U. Among the three tests, correlation was higher for LTA vs. VN (r=0.69) and LTA vs. MEA (r=0.56), with moderate agreement (κ=0.46 for both), whereas the VN assay and MEA, the two devices using whole blood, showed lower correlation (r=0.42) and agreement (κ=0.3). Our results provide guidance regarding cut-off levels for LTA, VerifyNow and MEA assays to detect the CYP2C19 LOF allele in patients during dual antiplatelet maintenance therapy.
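
The agreement figures quoted above (κ between assays dichotomized at their cut-offs) follow the standard two-rater Cohen's kappa. A minimal sketch on synthetic values: the eight measurements below are invented for illustration; only the cut-offs (LTA ≥ 48%, VN > 242 PRU) come from the abstract.

```python
import numpy as np

def cohens_kappa(a, b):
    """Two-rater Cohen's kappa for binary labels (0/1 arrays)."""
    a, b = np.asarray(a), np.asarray(b)
    po = np.mean(a == b)                        # observed agreement
    p_both_1 = a.mean() * b.mean()              # chance both say 1
    p_both_0 = (1 - a.mean()) * (1 - b.mean())  # chance both say 0
    pe = p_both_1 + p_both_0                    # chance-expected agreement
    return (po - pe) / (1 - pe)

# Synthetic assay readings for eight hypothetical patients.
lta = np.array([52, 40, 60, 30, 49, 55, 20, 47])          # % aggregation
vn = np.array([250, 200, 300, 180, 260, 240, 150, 230])   # PRU

high_lta = (lta >= 48).astype(int)  # study cut-off: LTA >= 48%
high_vn = (vn > 242).astype(int)    # study cut-off: VN > 242 PRU
print(round(cohens_kappa(high_lta, high_vn), 2))  # → 0.75
```

Kappa discounts the agreement expected by chance, which is why two tests can agree on most patients yet still show only moderate κ, as in the study.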

  19. Baseline Architecture of ITER Control System

    Science.gov (United States)

    Wallander, A.; Di Maio, F.; Journeaux, J.-Y.; Klotz, W.-D.; Makijarvi, P.; Yonekawa, I.

    2011-08-01

    The control system of ITER consists of thousands of computers processing hundreds of thousands of signals. The control system, being the primary tool for operating the machine, shall integrate, control and coordinate all these computers and signals and allow a limited number of staff to operate the machine from a central location with minimum human intervention. The primary functions of the ITER control system are plant control, supervision and coordination, both during experimental pulses and during 24/7 continuous operation. The former can be split into three phases: preparing the experiment by defining all parameters; executing the experiment, including distributed feedback control; and finally collecting, archiving, analyzing and presenting all data produced by the experiment. We define the control system as a set of hardware and software components with well defined characteristics. The architecture addresses the organization of these components and their relationship to each other. We distinguish between physical and functional architecture, where the former defines the physical connections and the latter the data flow between components. In this paper, we identify the ITER control system based on the plant breakdown structure. Then, the control system is partitioned into a workable set of bounded subsystems. This partition considers both the completeness and the integration of the subsystems. The components making up subsystems are identified and defined, a naming convention is introduced, and the physical networks are defined. Special attention is given to timing and real-time communication for distributed control. Finally, we discuss baseline technologies for implementing the proposed architecture based on analysis, market surveys, prototyping and benchmarking carried out during the last year.

  20. Long baseline neutrino oscillation experiments

    International Nuclear Information System (INIS)

    Gallagher, H.

    2006-01-01

    In this paper I will review briefly the experimental results which established the existence of neutrino mixing, the current generation of long baseline accelerator experiments, and the prospects for the future. In particular I will focus on the recent analysis of the MINOS experiment. (author)

  1. Baseline Report on HB2320

    Science.gov (United States)

    State Council of Higher Education for Virginia, 2015

    2015-01-01

    Staff provides this baseline report as a summary of its preliminary considerations and initial research in fulfillment of the requirements of HB2320 from the 2015 session of the General Assembly. Codified as § 23-7.4:7, this legislation compels the Education Secretary and the State Council of Higher Education for Virginia (SCHEV) Director, in…

  2. Defining Overweight and Obesity

    Science.gov (United States)

    Defining Adult Overweight and Obesity ... weight for a given height is described as overweight or obese. Body Mass Index, or BMI, is ...

  3. Drinking Levels Defined

    Science.gov (United States)

    ... Definition of Drinking at Low Risk for Developing Alcohol Use Disorder (AUD): For women, low-risk drinking is defined ...

  4. Defining Documentary Film

    DEFF Research Database (Denmark)

    Juel, Henrik

    2006-01-01

    A discussion of various attempts at defining documentary film regarding form, content, truth, style, genre or reception – and a proposal of a positive list of essential, but non-exclusive, characteristics of documentary film.

  5. Rationale, design, and baseline findings from HIPP: A randomized controlled trial testing a home-based, individually-tailored physical activity print intervention for African American women in the Deep South.

    Science.gov (United States)

    Pekmezi, Dori; Ainsworth, Cole; Joseph, Rodney; Bray, Molly S; Kvale, Elizabeth; Isaac, Shiney; Desmond, Renee; Meneses, Karen; Marcus, Bess; Demark-Wahnefried, Wendy

    2016-03-01

    African American women report high rates of physical inactivity and related health disparities. In our previous formative research, we conducted a series of qualitative assessments to examine physical activity barriers and intervention preferences among African American women in the Deep South. These data were used to inform a 12-month Home-based, Individually-tailored Physical activity Print (HIPP) intervention, which is currently being evaluated against a wellness contact control condition among 84 post-menopausal African American women residing in the metropolitan area of Birmingham, Alabama. This paper reports the rationale, design and baseline findings of the HIPP trial. The accrued participants had an average age of 57 (SD=4.7), a BMI of 32.1 kg/m(2) (SD=5.16) with more than half (55%) having a college education and an annual household income under $50,000 (53.6%). At baseline, participants reported an average of 41.5 min/week (SD=49.7) of moderate intensity physical activity, and 94.1% were in the contemplation or preparation stages of readiness for physical activity. While social support for exercise from friends and family was low, baseline levels of self-efficacy, cognitive and behavioral processes of change, decisional balance, outcome expectations, and enjoyment appeared promising. Baseline data indicated high rates of obesity and low levels of physical activity, providing strong evidence of need for intervention. Moreover, scores on psychosocial measures suggested that such efforts may be well received. This line of research in technology-based approaches for promoting physical activity in African American women in the Deep South has great potential to address health disparities and impact public health. Copyright © 2016 Elsevier Inc. All rights reserved.

  6. Baseline budgeting for continuous improvement.

    Science.gov (United States)

    Kilty, G L

    1999-05-01

    This article is designed to introduce the techniques used to convert traditionally maintained department budgets to baseline budgets. This entails identifying key activities, evaluating for value-added, and implementing continuous improvement opportunities. Baseline Budgeting for Continuous Improvement was created as a result of a newly named company president's request to implement zero-based budgeting. The president was frustrated with the mind-set of the organization, namely, "Next year's budget should be 10 to 15 percent more than this year's spending." Zero-based budgeting was not the answer, but combining the principles of activity-based costing and the Just-in-Time philosophy of eliminating waste and continuous improvement did provide a solution to the problem.

  7. Automated baseline change detection - Phases 1 and 2. Final report

    International Nuclear Information System (INIS)

    Byler, E.

    1997-01-01

    The primary objective of this project is to apply robotic and optical sensor technology to the operational inspection of mixed toxic and radioactive waste stored in barrels, using Automated Baseline Change Detection (ABCD), based on image subtraction. Absolute change detection is based on detecting any visible physical change, regardless of cause, between a current inspection image of a barrel and an archived baseline image of the same barrel. Thus, in addition to rust, the ABCD system can also detect corrosion, leaks, dents, and bulges. The ABCD approach relies on precise camera positioning and repositioning relative to the barrel and on feature recognition in images. The ABCD image processing software was installed on a robotic vehicle developed under a related DOE/FETC contract, DE-AC21-92MC29112 Intelligent Mobile Sensor System (IMSS), and integrated with its electronics and software. This vehicle was designed especially to navigate in DOE waste storage facilities. Initial system testing was performed at Fernald in June 1996. After further development and more extensive integration, the prototype integrated system was installed and tested at the Radioactive Waste Management Complex (RWMC) at INEEL from April 1997 through the present (November 1997). The integrated system, composed of the ABCD imaging software and the IMSS mobility base, is called MISS EVE (Mobile Intelligent Sensor System--Environmental Validation Expert). Evaluation of the integrated system in RWMC Building 628, containing approximately 10,000 drums, demonstrated an easy-to-use system able to properly navigate through the facility, image all the defined drums, and process the results into a report delivered to the operator on a GUI interface and on hard copy. Further work is needed to make the brassboard system more operationally robust
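
The image-subtraction core of the ABCD approach can be sketched in a few lines. The sketch below uses synthetic images; the precise camera repositioning and feature recognition that the report emphasises are assumed to have already registered the two images, and the threshold values are illustrative, not the project's.

```python
import numpy as np

def detect_change(baseline, current, threshold=30, min_pixels=20):
    """Flag a drum image as changed if enough pixels differ from baseline.

    baseline, current: 2-D uint8 grayscale images, already registered.
    threshold: per-pixel intensity difference considered significant.
    min_pixels: number of significant pixels that constitutes a real change.
    """
    # Widen to int16 before subtracting so uint8 arithmetic cannot wrap.
    diff = np.abs(current.astype(np.int16) - baseline.astype(np.int16))
    changed = diff > threshold
    return changed.sum() >= min_pixels, changed

# Synthetic 64x64 drum images: identical except a 6x6 "rust spot".
base = np.full((64, 64), 120, dtype=np.uint8)
curr = base.copy()
curr[10:16, 10:16] = 200  # simulated corrosion patch

flagged, mask = detect_change(base, curr)
print(flagged, int(mask.sum()))  # → True 36
```

Because the comparison is against an archived baseline of the same barrel, any visible physical change, whatever its cause, shows up in the difference mask, which matches the report's notion of absolute change detection.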

  8. Establishing a store baseline during interim storage of waste packages and a review of potential technologies for base-lining

    Energy Technology Data Exchange (ETDEWEB)

    McTeer, Jennifer; Morris, Jenny; Wickham, Stephen [Galson Sciences Ltd. Oakham, Rutland (United Kingdom); Bolton, Gary [National Nuclear Laboratory Risley, Warrington (United Kingdom); McKinney, James; Morris, Darrell [Nuclear Decommissioning Authority Moor Row, Cumbria (United Kingdom); Angus, Mike [National Nuclear Laboratory Risley, Warrington (United Kingdom); Cann, Gavin; Binks, Tracey [National Nuclear Laboratory Sellafield (United Kingdom)

    2013-07-01

    Interim storage is an essential component of the waste management lifecycle, providing a safe, secure environment for waste packages awaiting final disposal. In order to be able to monitor and detect change or degradation of the waste packages, storage building or equipment, it is necessary to know the original condition of these components (the 'waste storage system'). This paper presents an approach to establishing the baseline for a waste-storage system, and provides guidance on the selection and implementation of potential base-lining technologies. The approach comprises two stages: assessment of base-lining needs, and definition of the base-lining approach. During the assessment of base-lining needs, a review of available monitoring data and store/package records should be undertaken (if the store is operational). Evolutionary processes affecting safety functions, and the corresponding indicators that can be measured to provide a baseline for the waste-storage system, should then be identified so that the most suitable indicators can be selected for base-lining. In defining the approach, opportunities to collect data, and constraints on doing so, are identified before the base-lining techniques are selected and a base-lining plan is developed. Base-lining data may be used to establish that the state of the packages is consistent with the waste acceptance criteria for the storage facility and to support the interpretation of monitoring and inspection data collected during store operations. Opportunities and constraints are identified for different store and package types. Technologies that could potentially be used to measure baseline indicators are also reviewed. (authors)

  9. Definably compact groups definable in real closed fields. I

    OpenAIRE

    Barriga, Eliana

    2017-01-01

    We study definably compact definably connected groups definable in a sufficiently saturated real closed field $R$. We introduce the notion of group-generic point for $\\bigvee$-definable groups and show the existence of group-generic points for definably compact groups definable in a sufficiently saturated o-minimal expansion of a real closed field. We use this notion along with some properties of generic sets to prove that for every definably compact definably connected group $G$ definable in...

  10. Correlation of AATCC Test Method 150 To AATCC Test Method 61 For Use With Laundering Durability Studies of Retroreflective Items As Defined in Purchase Description CO/PD-06-05A

    Science.gov (United States)

    2017-06-02

    endorsed this work under the Warfighter Improved Combat Identification Development project to explore the opportunity for reducing laundering test time and...from the TMET laundering tests. When approval is granted from Product Manager Soldier, Clothing and Individual Equipment (PdM SCIE), IR patch end...

  11. Screening for HbA1c-defined prediabetes and diabetes in an at-risk greek population: performance comparison of random capillary glucose, the ADA diabetes risk test and skin fluorescence spectroscopy.

    Science.gov (United States)

    Tentolouris, Nicholas; Lathouris, Panagiotis; Lontou, Stavroula; Tzemos, Kostas; Maynard, John

    2013-04-01

    We examined the accuracy of random capillary glucose (RCG) and two noninvasive screening methods, the ADA diabetes risk test (DRT) and skin fluorescence spectroscopy (SFS) as measured by Scout DS, for detecting HbA1c-defined dysglycemia or type 2 diabetes in an at-risk cohort. Subjects were recruited at two clinical sites for a single non-fasting visit. Each subject had measurements of height, weight and waist circumference. A diabetes score was calculated from skin fluorescence measured on the left forearm. A finger prick was done to measure RCG and HbA1c (A1C). Health questionnaires were completed for the DRT. Increasing dysglycemia was defined as A1C ≥ 5.7% (39 mmol/mol) or ≥ 6.0% (42 mmol/mol). Type 2 diabetes was defined as A1C ≥ 6.5% (47.5 mmol/mol). 398 of 409 subjects had complete data for analysis, with means for age, body mass index, and waist of 52 years, 27 kg/m(2) and 90 cm; 51% were male. Prevalences of A1C ≥ 5.7%, ≥ 6.0% and ≥ 6.5% were 54%, 34% and 12%, respectively. Areas under the curve (AUC) for detection of increasing levels of dysglycemia or diabetes were 63%, 66% and 72% for RCG; 75%, 76% and 81% for the ADA DRT; and 82%, 84% and 90% for SFS, respectively. For each level of dysglycemia or diabetes, the SFS AUC was significantly higher than that of RCG or the ADA DRT. The noninvasive skin fluorescence spectroscopy measurement outperformed both RCG and the ADA DRT for detection of A1C-defined dysglycemia or diabetes in an at-risk cohort. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
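
The AUC comparisons above can in principle be reproduced from raw scores with the rank-based (Mann–Whitney) estimator of the area under the ROC curve. A minimal sketch on invented data, not the study's:

```python
import numpy as np

def auc(scores, labels):
    """Area under the ROC curve via the Mann-Whitney U statistic.

    scores: screening-test values; labels: 1 = dysglycemic, 0 = normal.
    Equals the probability that a randomly chosen positive case
    outscores a randomly chosen negative case (ties count half).
    """
    scores, labels = np.asarray(scores, float), np.asarray(labels)
    pos, neg = scores[labels == 1], scores[labels == 0]
    wins = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (wins + 0.5 * ties) / (len(pos) * len(neg))

# Invented screening scores for seven hypothetical subjects.
labels = np.array([0, 0, 0, 0, 1, 1, 1])
scores = np.array([4.8, 5.1, 5.3, 5.9, 5.6, 6.2, 6.9])
print(round(auc(scores, labels), 3))  # → 0.917
```

A test with AUC 0.5 discriminates no better than chance, which is why the reported jump from 63–72% (RCG) to 82–90% (SFS) is a substantial difference in screening performance.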

  12. Integrated Baseline Review (IBR) Handbook

    Science.gov (United States)

    Fleming, Jon F.; Terrell, Stefanie M.

    2018-01-01

    This handbook is intended as a how-to guide for preparing for, conducting, and closing out an Integrated Baseline Review (IBR). It discusses the steps that should be considered; describes roles and responsibilities; offers tips for tailoring the IBR based on risk, cost, and the need for management insight; and provides lessons learned from past IBRs. Appendices contain example documentation typically used in connection with an IBR. Note that these appendices are examples only and should be tailored to meet the needs of individual projects and contracts.

  13. Environmental Baseline File National Transportation

    International Nuclear Information System (INIS)

    Harris, M.

    1999-01-01

    This Environmental Baseline File summarizes and consolidates information related to the national-level transportation of commercial spent nuclear fuel. Topics addressed include: shipments of commercial spent nuclear fuel based on mostly truck and mostly rail shipping scenarios; transportation routing for commercial spent nuclear fuel sites and DOE sites; radionuclide inventories for various shipping container capacities; transportation routing; populations along transportation routes; urbanized area population densities; the impacts of historical, reasonably foreseeable, and general transportation; state-level food transfer factors; Federal Guidance Report No. 11 and 12 radionuclide dose conversion factors; and national average atmospheric conditions

  14. Baseline atmospheric program Australia 1993

    International Nuclear Information System (INIS)

    Francey, R.J.; Dick, A.L.; Derek, N.

    1996-01-01

    This publication reports activities, program summaries and data from the Cape Grim Baseline Air Pollution Station in Tasmania, during the calendar year 1993. These activities represent Australia's main contribution to the Background Air Pollution Monitoring Network (BAPMoN), part of the World Meteorological Organization's Global Atmosphere Watch (GAW). The report includes 5 research reports covering trace gas sampling, ozone and radon interdependence, analysis of atmospheric dimethylsulfide and carbon-disulfide, sampling of trace gas composition of the troposphere, and sulfur aerosol/CCN relationship in marine air. Summaries of program reports for the calendar year 1993 are also included. Tabs., figs., refs

  15. Defining Game Mechanics

    DEFF Research Database (Denmark)

    Sicart (Vila), Miguel Angel

    2008-01-01

    This article defines game mechanics in relation to rules and challenges. Game mechanics are methods invoked by agents for interacting with the game world. I apply this definition to a comparative analysis of the games Rez, Every Extend Extra and Shadow of the Colossus that will show the relevance of a formal definition of game mechanics.

  16. Modal Logics and Definability

    OpenAIRE

    Kuusisto, Antti

    2013-01-01

    In recent years, research into the mathematical foundations of modal logic has become increasingly popular. One of the main reasons for this is the fact that modal logic seems to adapt well to the requirements of a wide range of different fields of application. This paper is a summary of some of the author’s contributions to the understanding of modal definability theory.

  17. Software Defined Cyberinfrastructure

    Energy Technology Data Exchange (ETDEWEB)

    Foster, Ian; Blaiszik, Ben; Chard, Kyle; Chard, Ryan

    2017-07-17

    Within and across thousands of science labs, researchers and students struggle to manage data produced in experiments, simulations, and analyses. Largely manual research data lifecycle management processes mean that much time is wasted, research results are often irreproducible, and data sharing and reuse remain rare. In response, we propose a new approach to data lifecycle management in which researchers are empowered to define the actions to be performed at individual storage systems when data are created or modified: actions such as analysis, transformation, copying, and publication. We term this approach software-defined cyberinfrastructure because users can implement powerful data management policies by deploying rules to local storage systems, much as software-defined networking allows users to configure networks by deploying rules to switches. We argue that this approach can enable a new class of responsive distributed storage infrastructure that will accelerate research innovation by allowing any researcher to associate data workflows with data sources, whether local or remote, for such purposes as data ingest, characterization, indexing, and sharing. We report on early experiments with this approach in the context of experimental science, in which a simple if-trigger-then-action (IFTA) notation is used to define rules.
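
The if-trigger-then-action (IFTA) idea mentioned above suggests a simple rule-dispatch structure. The sketch below is a generic illustration only: the decorator syntax, event fields, and the indexing action are invented here, not the paper's actual notation or the storage-system hooks it drives.

```python
# Minimal if-trigger-then-action (IFTA) rule engine sketch.
rules = []

def rule(trigger):
    """Register an action to run whenever trigger(event) is true."""
    def register(action):
        rules.append((trigger, action))
        return action
    return register

def dispatch(event):
    """Run every registered action whose trigger matches the event."""
    return [action(event) for trigger, action in rules if trigger(event)]

# Example policy: when a new .h5 file lands, queue it for indexing.
@rule(lambda e: e["type"] == "created" and e["path"].endswith(".h5"))
def index_dataset(event):
    return f"index {event['path']}"

print(dispatch({"type": "created", "path": "/data/run42.h5"}))   # → ['index /data/run42.h5']
print(dispatch({"type": "modified", "path": "/data/notes.txt"}))  # → []
```

Deploying such rules at each storage system, rather than in a central workflow engine, is what lets the infrastructure react locally to data creation and modification events.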

  18. Defining Abnormally Low Tenders

    DEFF Research Database (Denmark)

    Ølykke, Grith Skovgaard; Nyström, Johan

    2017-01-01

    The concept of an abnormally low tender is not defined in EU public procurement law. This article takes an interdisciplinary law and economics approach to examine a dataset consisting of Swedish and Danish judgments and verdicts concerning the concept of an abnormally low tender. The purpose...

  19. Software Defined Coded Networking

    DEFF Research Database (Denmark)

    Di Paola, Carla; Roetter, Daniel Enrique Lucani; Palazzo, Sergio

    2017-01-01

    the quality of each link and even across neighbouring links and using simulations to show that an additional reduction of packet transmission in the order of 40% is possible. Second, to advocate for the use of network coding (NC) jointly with software defined networking (SDN) providing an implementation...

  20. Defining depth of anesthesia.

    Science.gov (United States)

    Shafer, S L; Stanski, D R

    2008-01-01

    In this chapter, drawn largely from the synthesis of material that we first presented in the sixth edition of Miller's Anesthesia, Chap 31 (Stanski and Shafer 2005; used by permission of the publisher), we have defined anesthetic depth as the probability of non-response to stimulation, calibrated against the strength of the stimulus, the difficulty of suppressing the response, and the drug-induced probability of non-responsiveness at defined effect site concentrations. This definition requires measurement of multiple different stimuli and responses at well-defined drug concentrations. There is no one stimulus and response measurement that will capture depth of anesthesia in a clinically or scientifically meaningful manner. The "clinical art" of anesthesia requires calibration of these observations of stimuli and responses (verbal responses, movement, tachycardia) against the dose and concentration of anesthetic drugs used to reduce the probability of response, constantly adjusting the administered dose to achieve the desired anesthetic depth. In our definition of "depth of anesthesia" we define the need for two components to create the anesthetic state: hypnosis created with drugs such as propofol or the inhalational anesthetics and analgesia created with the opioids or nitrous oxide. We demonstrate the scientific evidence that profound degrees of hypnosis in the absence of analgesia will not prevent the hemodynamic responses to profoundly noxious stimuli. Also, profound degrees of analgesia do not guarantee unconsciousness. However, the combination of hypnosis and analgesia suppresses hemodynamic response to noxious stimuli and guarantees unconsciousness.
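
The probability-of-non-response framing above has a standard quantitative counterpart, the sigmoid Emax (Hill) concentration-response model. The sketch below uses purely illustrative parameter values, not clinical ones: modelling a stronger stimulus as a higher C50 captures the chapter's point that more drug is needed to suppress the response to a more noxious stimulus.

```python
def p_no_response(conc, c50, gamma):
    """Sigmoid Emax (Hill) model of the probability of non-response.

    conc: effect-site concentration; c50: concentration giving a 50%
    probability of non-response; gamma: steepness of the curve.
    All parameter values here are illustrative, not clinical guidance.
    """
    return conc**gamma / (c50**gamma + conc**gamma)

# Same hypnotic concentration, two stimuli of different strength
# (stronger stimulus modelled as a higher hypothetical C50).
for stimulus, c50 in [("verbal command", 1.0), ("skin incision", 3.0)]:
    p = p_no_response(conc=2.0, c50=c50, gamma=4.0)
    print(f"{stimulus}: P(no response) = {p:.2f}")
```

The same concentration that almost certainly suppresses response to a mild stimulus leaves response to a strong one likely, mirroring the chapter's argument that depth must be calibrated against the strength of the stimulus.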

  1. Defining and classifying syncope

    NARCIS (Netherlands)

    Thijs, Roland D.; Wieling, Wouter; Kaufmann, Horacio; van Dijk, Gert

    2004-01-01

    There is no widely adopted definition or classification of syncope and related disorders. This lack of uniformity harms patient care, research, and medical education. In this article, syncope is defined as a form of transient loss of consciousness (TLOC) due to cerebral hypoperfusion. Differences

  2. High Baseline Postconcussion Symptom Scores and Concussion Outcomes in Athletes.

    Science.gov (United States)

    Custer, Aimee; Sufrinko, Alicia; Elbin, R J; Covassin, Tracey; Collins, Micky; Kontos, Anthony

    2016-02-01

    Some healthy athletes report high levels of baseline concussion symptoms, which may be attributable to several factors (eg, illness, personality, somaticizing). However, the role of baseline symptoms in outcomes after sport-related concussion (SRC) has not been empirically examined. To determine if athletes with high symptom scores at baseline performed worse than athletes without baseline symptoms on neurocognitive testing after SRC. Cohort study. High school and collegiate athletic programs. A total of 670 high school and collegiate athletes participated in the study. Participants were divided into groups with either no baseline symptoms (Postconcussion Symptom Scale [PCSS] score = 0, n = 247) or a high level of baseline symptoms (PCSS score > 18 [top 10% of sample], n = 68). Participants were evaluated at baseline and 2 to 7 days after SRC with the Immediate Post-concussion Assessment and Cognitive Test and PCSS. Outcome measures were Immediate Post-concussion Assessment and Cognitive Test composite scores (verbal memory, visual memory, visual motor processing speed, and reaction time) and total symptom score on the PCSS. The groups were compared using repeated-measures analyses of variance with Bonferroni correction to assess interactions between group and time for symptoms and neurocognitive impairment. The no-symptoms group represented 38% of the original sample, whereas the high-symptoms group represented 11% of the sample. The high-symptoms group experienced a larger decline from preinjury to postinjury than the no-symptoms group in verbal (P = .03) and visual memory (P = .05). However, total concussion-symptom scores increased from preinjury to postinjury for the no-symptoms group (P = .001) but remained stable for the high-symptoms group. Reported baseline symptoms may help identify athletes at risk for worse outcomes after SRC. Clinicians should examine baseline symptom levels to better identify patients for earlier referral and treatment for their

  3. MRI-defined subcortical ischemic vascular disease: baseline clinical and neuropsychological findings. The LADIS Study

    DEFF Research Database (Denmark)

    Jokinen, Hanna; Kalska, Hely; Ylikoski, Raija

    2009-01-01

    of global cognitive function, psychomotor speed, attention and executive functions, verbal fluency, and working memory. CONCLUSION: In this population of nondisabled older adults with WML, SIVD was related to specific clinical and functional characteristics. Neuropsychological features included psychomotor slowing as well as deficits in attention and executive functions.

  4. Sustainable Design of EPA's Campus in Research Triangle Park, NC—Environmental Performance Specifications in Construction Contracts—Section 01445 Testing for Indoor Air Quality, Baseline IAQ, and Materials

    Science.gov (United States)

    More information on testing for maximum indoor pollutant concentrations for acceptance of the facility, as well as requirements for Independent Materials Testing of specific materials anticipated to have major impact on indoor air quality.

  5. Defining Legal Moralism

    DEFF Research Database (Denmark)

    Thaysen, Jens Damgaard

    2015-01-01

    This paper discusses how legal moralism should be defined. It is argued that legal moralism should be defined as the position that “For any X, it is always a pro tanto reason for justifiably imposing legal regulation on X that X is morally wrong (where “morally wrong” is not conceptually equivalent to “harmful”)”. Furthermore, a distinction between six types of legal moralism is made. The six types are grouped according to whether they are concerned with the enforcement of positive or critical morality, and whether they are concerned with criminalising, legally restricting, or refraining from legally protecting morally wrong behaviour. This is interesting because not all types of legal moralism are equally vulnerable to the different critiques of legal moralism that have been put forth. Indeed, I show that some interesting types of legal moralism have not been criticised at all.

  6. Defining local food

    DEFF Research Database (Denmark)

    Eriksen, Safania Normann

    2013-01-01

    Despite evolving local food research, there is no consistent definition of “local food.” Various understandings are utilized, which have resulted in a diverse landscape of meaning. The main purpose of this paper is to examine how researchers within the local food systems literature define local food, and how these definitions can be used as a starting point to identify a new taxonomy of local food based on three domains of proximity.

  7. Pinellas Plant Environmental Baseline Report

    Energy Technology Data Exchange (ETDEWEB)

    1997-06-01

    The Pinellas Plant has been part of the Department of Energy's (DOE) nuclear weapons complex since the plant opened in 1957. In March 1995, the DOE sold the Pinellas Plant to the Pinellas County Industry Council (PCIC). DOE has leased back a large portion of the plant site to facilitate transition to alternate use and safe shutdown. The current mission is to achieve a safe transition of the facility from defense production and prepare the site for alternative uses as a community resource for economic development. Toward that effort, the Pinellas Plant Environmental Baseline Report (EBR) discusses the current and past environmental conditions of the plant site. Information for the EBR is obtained from plant records. Historical process and chemical usage information for each area is reviewed during area characterizations.

  8. Integrated Baseline Review (IBR) Handbook

    Science.gov (United States)

    Fleming, Jon F.; Kehrer, Kristen C.

    2016-01-01

    This handbook is intended to be a how-to guide to prepare for, conduct, and close out an Integrated Baseline Review (IBR). It discusses the steps that should be considered, describes roles and responsibilities, offers tips for tailoring the IBR based on risk, cost, and need for management insight, and provides lessons learned from past IBRs. Appendices contain example documentation typically used in connection with an IBR. Note that these appendices are examples only, and should be tailored to meet the needs of individual projects and contracts. Following the guidance in this handbook will help customers and suppliers preparing for an IBR understand the expectations of the IBR, and ensure that the IBR meets the requirements for both in-house and contract efforts.

  9. The California Baseline Methane Survey

    Science.gov (United States)

    Duren, R. M.; Thorpe, A. K.; Hopkins, F. M.; Rafiq, T.; Bue, B. D.; Prasad, K.; Mccubbin, I.; Miller, C. E.

    2017-12-01

    The California Baseline Methane Survey is the first systematic, statewide assessment of methane point source emissions. The objectives are to reduce uncertainty in the state's methane budget and to identify emission mitigation priorities for state and local agencies, utilities and facility owners. The project combines remote sensing of large areas with airborne imaging spectroscopy and spatially resolved bottom-up data sets to detect, quantify and attribute emissions from diverse sectors including agriculture, waste management, oil and gas production and the natural gas supply chain. Phase 1 of the project surveyed nearly 180,000 individual facilities and infrastructure components across California in 2016 - achieving completeness rates ranging from 20% to 100% per emission sector at < 5 meters spatial resolution. Additionally, intensive studies of key areas and sectors were performed to assess source persistence and variability at time scales ranging from minutes to months. Phase 2 of the project continues with additional data collection in Spring and Fall 2017. We describe the survey design and measurement, modeling and analysis methods. We present initial findings regarding the spatial, temporal and sectoral distribution of methane point source emissions in California and their estimated contribution to the state's total methane budget. We provide case studies and lessons learned about key sectors including examples where super-emitters were identified and mitigated. We summarize challenges and recommendations for future methane research, inventories and mitigation guidance within and beyond California.

  10. 2016 Annual Technology Baseline (ATB)

    Energy Technology Data Exchange (ETDEWEB)

    Cole, Wesley; Kurup, Parthiv; Hand, Maureen; Feldman, David; Sigrin, Benjamin; Lantz, Eric; Stehly, Tyler; Augustine, Chad; Turchi, Craig; O'Connor, Patrick; Waldoch, Connor

    2016-09-01

    Consistent cost and performance data for various electricity generation technologies can be difficult to find and may change frequently for certain technologies. With the Annual Technology Baseline (ATB), National Renewable Energy Laboratory provides an organized and centralized dataset that was reviewed by internal and external experts. It uses the best information from the Department of Energy laboratory's renewable energy analysts and Energy Information Administration information for conventional technologies. The ATB will be updated annually in order to provide an up-to-date repository of current and future cost and performance data. Going forward, we plan to revise and refine the values using best available information. The ATB includes both a presentation with notes (PDF) and an associated Excel Workbook. The ATB includes the following electricity generation technologies: land-based wind; offshore wind; utility-scale solar PV; concentrating solar power; geothermal power; hydropower plants (upgrades to existing facilities, powering non-powered dams, and new stream-reach development); conventional coal; coal with carbon capture and sequestration; integrated gasification combined cycle coal; natural gas combustion turbines; natural gas combined cycle; conventional biopower; and nuclear.

  11. Defined contribution health benefits.

    Science.gov (United States)

    Fronstin, P

    2001-03-01

    This Issue Brief discusses the emerging issue of "defined contribution" (DC) health benefits. The term "defined contribution" is used to describe a wide variety of approaches to the provision of health benefits, all of which have in common a shift in the responsibility for payment and selection of health care services from employers to employees. DC health benefits often are mentioned in the context of enabling employers to control their outlay for health benefits by avoiding increases in health care costs. DC health benefits may also shift responsibility for choosing a health plan and the associated risks of choosing a plan from employers to employees. There are three primary reasons why some employers currently are considering some sort of DC approach. First, they are once again looking for ways to keep their health care cost increases in line with overall inflation. Second, some employers are concerned that the public "backlash" against managed care will result in new legislation, regulations, and litigation that will further increase their health care costs if they do not distance themselves from health care decisions. Third, employers have modified not only most employee benefit plans, but labor market practices in general, by giving workers more choice, control, and flexibility. DC-type health benefits have existed as cafeteria plans since the 1980s. A cafeteria plan gives each employee the opportunity to determine the allocation of his or her total compensation (within employer-defined limits) among various employee benefits (primarily retirement or health). Most types of DC health benefits currently being discussed could be provided within the existing employment-based health insurance system, with or without the use of cafeteria plans. They could also allow employees to purchase health insurance directly from insurers, or they could drive new technologies and new forms of risk pooling through which health care services are provided and financed. 

  12. On Defining Mass

    Science.gov (United States)

    Hecht, Eugene

    2011-01-01

    Though central to any pedagogical development of physics, the concept of mass is still not well understood. Properly defining mass has proven to be far more daunting than contemporary textbooks would have us believe. And yet today the origin of mass is one of the most aggressively pursued areas of research in all of physics. Much of the excitement surrounding the Large Hadron Collider at CERN is associated with discovering the mechanism responsible for the masses of the elementary particles. This paper will first briefly examine the leading definitions, pointing out their shortcomings. Then, utilizing relativity theory, it will propose—for consideration by the community of physicists—a conceptual definition of mass predicated on the more fundamental concept of energy, more fundamental in that everything that has mass has energy, yet not everything that has energy has mass.

  13. Implementing Software Defined Radio

    CERN Document Server

    Grayver, Eugene

    2013-01-01

    Software Defined Radio makes wireless communications easier, more efficient, and more reliable. This book bridges the gap between academic research and practical implementation. When beginning a project, practicing engineers, technical managers, and graduate students can save countless hours by considering the concepts presented in these pages. The author covers the myriad options and trade-offs available when selecting an appropriate hardware architecture. As demonstrated here, the choice between hardware- and software-centric architecture can mean the difference between meeting an aggressive schedule and bogging down in endless design iterations. Because of the author’s experience overseeing dozens of failed and successful developments, he is able to present many real-life examples. Some of the key concepts covered are: Choosing the right architecture for the market – laboratory, military, or commercial Hardware platforms – FPGAs, GPPs, specialized and hybrid devices Standardization efforts to ens...

  14. Defining cyber warfare

    Directory of Open Access Journals (Sweden)

    Dragan D. Mladenović

    2012-04-01

    Cyber conflicts represent a new kind of warfare that is technologically developing very rapidly. Such development results in more frequent and more intensive cyber attacks undertaken by states against adversary targets, with a wide range of diverse operations, from information operations to physical destruction of targets. Nevertheless, cyber warfare is waged through the application of the same means, techniques and methods as those used in cybercrime, terrorism and intelligence activities. Moreover, it has a very specific nature that enables states to covertly initiate attacks against their adversaries. The starting point in defining doctrines, procedures and standards in the area of cyber warfare is determining its true nature. In this paper, a contribution to this effort was made through the analysis of the existing state doctrines and international practice in the area of cyber warfare towards the determination of its nationally acceptable definition.

  15. Defining the mobilome.

    Science.gov (United States)

    Siefert, Janet L

    2009-01-01

    This chapter defines the agents that provide for the movement of genetic material which fuels the adaptive potential of life on our planet. The chapter has been structured to be broadly comprehensive, arbitrarily categorizing the mobilome into four classes: (1) transposons, (2) plasmids, (3) bacteriophage, and (4) self-splicing molecular parasites. Our increasing understanding of the mobilome is as dynamic as the mobilome itself. With continuing discovery, it is clear that nature has not confined these genomic agents of change to neat categories, but rather the classification categories overlap and intertwine. Massive sequencing efforts and their published analyses are continuing to refine our understanding of the extent of the mobilome. This chapter provides a framework to describe our current understanding of the mobilome and a foundation on which appreciation of its impact on genome evolution can be understood.

  16. 2016 Annual Technology Baseline (ATB) - Webinar Presentation

    Energy Technology Data Exchange (ETDEWEB)

    Cole, Wesley; Kurup, Parthiv; Hand, Maureen; Feldman, David; Sigrin, Benjamin; Lantz, Eric; Stehly, Tyler; Augustine, Chad; Turchi, Craig; Porro, Gian; O'Connor, Patrick; Waldoch, Connor

    2016-09-13

    This deck was presented for the 2016 Annual Technology Baseline Webinar. The presentation describes the Annual Technology Baseline, which is a compilation of current and future cost and performance data for electricity generation technologies.

  17. 75 FR 66748 - Notice of Baseline Filings

    Science.gov (United States)

    2010-10-29

    ...- 000] Notice of Baseline Filings October 22, 2010. ONEOK Gas Transportation, L.L.C Docket No. PR11-68... above submitted a revised baseline filing of their Statement of Operating Conditions for services...

  18. Human papillomavirus testing for triage of women with cytologic evidence of low-grade squamous intraepithelial lesions: baseline data from a randomized trial. The Atypical Squamous Cells of Undetermined Significance/Low-Grade Squamous Intraepithelial Lesions Triage Study (ALTS) Group.

    Science.gov (United States)

    2000-03-01

    Human papillomavirus (HPV) infections appear to be central to the development of cervical cancer. This study addresses the question of whether testing women who have low-grade squamous intraepithelial lesions (LSILs) of the uterine cervix for HPV DNA is useful as a triage strategy. Four clinical centers in different areas of the United States participated in a randomized clinical trial of the use of HPV DNA testing in women with cytologic evidence of atypical squamous cells of undetermined significance (ASCUS) or LSIL. The study sample in this article consists only of women who had LSIL at enrollment. Within 6 months of an LSIL diagnosis (based on a Pap smear read by a community-based cytopathologist), women who were 18 years of age or older completed a standardized questionnaire and underwent a pelvic examination that included collection of cervical specimens for HPV DNA testing by Hybrid Capture II (HCII)® assay. Among the 642 women referred with LSIL who had analyzable test results, the mean chronologic age and age at first coitus were similar among the four clinical centers, despite the centers' ethnic and geographic diversity. Overall, HPV DNA was detected in cervical samples from 532 (82.9%) of the 642 women (95% confidence interval = 79.7%-85.7%). This high frequency of HPV positivity was confirmed by polymerase chain reaction (PCR) assays in a subset of 210 paired specimens tested by HCII and PCR (81.4% were positive by both methods). Because a very high percentage of women with an LSIL diagnosis from Pap smears are positive for HPV DNA by HCII testing, there is limited potential for this assay to direct decisions about the clinical management of women with LSIL. The role of HPV testing in the management of women with ASCUS is still under study.
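The headline positivity rate above is easy to reproduce. The quoted interval (79.7%-85.7%) is presumably an exact binomial confidence interval; the Wald normal approximation sketched below gives a close but not identical range:

```python
import math

# 532 of 642 LSIL referrals tested HPV-positive by HCII.
positive, n = 532, 642
p = positive / n
se = math.sqrt(p * (1 - p) / n)          # standard error of the proportion
lo, hi = p - 1.96 * se, p + 1.96 * se    # approximate 95% CI (Wald)
print(f"{p:.1%} (approx. 95% CI {lo:.1%}-{hi:.1%})")
# 82.9% (approx. 95% CI 80.0%-85.8%)
```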

  19. 324 Building Baseline Radiological Characterization

    International Nuclear Information System (INIS)

    Reeder, R.J.; Cooper, J.C.

    2010-01-01

    This report documents the analysis of radiological data collected as part of the characterization study performed in 1998. The study was performed to create a baseline of the radiological conditions in the 324 Building. A total of 85 technical (100 square centimeter (cm²)) smears were collected from the Room 147 hoods, the Shielded Materials Facility (SMF), and the Radiochemical Engineering Cells (REC). Exposure rate readings (window open and window closed) were taken at a distance of 2.5 centimeters (cm) and 30 cm from the surface of each smear. Gross beta-gamma and alpha counts of each smear were also performed. The smear samples were analyzed by gamma energy analysis (GEA). Alpha energy analysis (AEA) and strontium-90 analysis were also performed on selected smears. GEA results for one or more samples reported the presence of manganese-54, cobalt-60, silver-108m, antimony-125, cesium-134, cesium-137, europium-154, europium-155, and americium-241. AEA results reported the presence of plutonium-239/240, plutonium-238/americium-241, curium-243/244, curium-242, and americium-243. Tables 5 through 9 present a summary by location of the estimated maximum removable and total contamination levels in the Room 147 hoods, the SMF, and the REC. The smear sample survey data and laboratory analytical results are presented in tabular form by sample in Appendix A. The Appendix A tables combine survey data documented in radiological survey reports found in Appendix B and laboratory analytical results reported in the 324 Building Physical and Radiological Characterization Study (Berk, Hill, and Landsman 1998), supplemented by the laboratory analytical results found in Appendix C.

  20. LTC vacuum blasting machine (concrete): Baseline report

    International Nuclear Information System (INIS)

    1997-01-01

    The LTC shot blast technology was tested and is being evaluated at Florida International University (FIU) as a baseline technology. In conjunction with FIU's evaluation of efficiency and cost, this report covers the evaluation conducted for safety and health issues. It is a commercially available technology and has been used for various projects at locations throughout the country. The LTC 1073 Vacuum Blasting Machine uses a high-capacity, direct-pressure blasting system which incorporates a continuous feed for the blast media. The blast media cleans the surface within the contained brush area of the blast. It incorporates a vacuum system which removes dust and debris from the surface as it is blasted. The safety and health evaluation during the testing demonstration focused on two main areas of exposure: dust and noise. Dust exposure during maintenance activities was minimal, but due to mechanical difficulties dust monitoring could not be conducted during operation. Noise exposure was significant. Further testing for each of these exposures is recommended because the demonstration took place outdoors, which may make the results unrepresentative; it is feasible that the dust and noise levels will be higher in an enclosed environment. In addition, other safety and health issues found were ergonomics, heat stress, tripping hazards, electrical hazards, lockout/tagout, and arm-hand vibration.

  1. Pentek metal coating removal system: Baseline report

    International Nuclear Information System (INIS)

    1997-01-01

    The Pentek coating removal technology was tested and is being evaluated at Florida International University (FIU) as a baseline technology. In conjunction with FIU's evaluation of efficiency and cost, this report covers the evaluation conducted for safety and health issues. It is a commercially available technology and has been used for various projects at locations throughout the country. The Pentek coating removal system consisted of the ROTO-PEEN Scaler, CORNER-CUTTER®, and VAC-PAC®. They are designed to remove coatings from steel, concrete, brick, and wood. The Scaler uses 3M Roto Peen tungsten carbide cutters, while the CORNER-CUTTER® uses solid needles for descaling activities. These hand tools are used with the VAC-PAC® vacuum system to capture dust and debris as removal of the coating takes place. The safety and health evaluation during the testing demonstration focused on two main areas of exposure: dust and noise. Dust exposure was minimal, but noise exposure was significant. Further testing for each exposure is recommended because the demonstration environment may not be representative; it is feasible that the dust and noise levels will be higher in an enclosed operating environment of different construction. In addition, other areas of concern found were arm-hand vibration, whole-body vibration, ergonomics, heat stress, tripping hazards, electrical hazards, machine guarding, and lockout/tagout.

  2. Program Baseline Change Control Board charter

    International Nuclear Information System (INIS)

    1993-02-01

    The purpose of this Charter is to establish the Program Baseline Change Control Board (PBCCB) for the Office of Civilian Radioactive Waste Management (OCRWM) Program, and to describe its organization, responsibilities, and basic methods of operation. Guidance for implementing this Charter is provided by the OCRWM Baseline Management Plan (BMP) and OCRWM Program Baseline Change Control Procedure

  3. 40 CFR 1042.825 - Baseline determination.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false Baseline determination. 1042.825... Provisions for Remanufactured Marine Engines § 1042.825 Baseline determination. (a) For the purpose of this... not valid. (f) Use good engineering judgment for all aspects of the baseline determination. We may...

  4. Relationship between visual field progression and baseline refraction in primary open-angle glaucoma.

    Science.gov (United States)

    Naito, Tomoko; Yoshikawa, Keiji; Mizoue, Shiro; Nanno, Mami; Kimura, Tairo; Suzumura, Hirotaka; Umeda, Yuzo; Shiraga, Fumio

    2016-01-01

    To analyze the relationship between visual field (VF) progression and baseline refraction in Japanese patients with primary open-angle glaucoma (POAG) including normal-tension glaucoma. In this retrospective study, the subjects were patients with POAG who had undergone VF tests at least ten times with a Humphrey Field Analyzer (Swedish interactive thresholding algorithm standard, Central 30-2 program). VF progression was defined as a significantly negative value of mean deviation (MD) slope at the final VF test. Multivariate logistic regression models were applied to detect an association between MD slope deterioration and baseline refraction. A total of 156 eyes of 156 patients were included in this analysis. Significant deterioration of MD slope was observed in 70 eyes of 70 patients (44.9%), whereas no significant deterioration was evident in 86 eyes of 86 patients (55.1%). The eyes with VF progression had significantly higher baseline refraction compared to those without apparent VF progression (-1.9±3.8 diopter [D] vs -3.5±3.4 D, P=0.0048) (mean ± standard deviation). When subject eyes were classified into four groups by the level of baseline refraction applying spherical equivalent (SE): no myopia (SE > -1D), mild myopia (-1D ≥ SE > -3D), moderate myopia (-3D ≥ SE > -6D), and severe myopia (-6D ≥ SE), the Cochran-Armitage trend analysis showed a decreasing trend in the proportion of MD slope deterioration with increasing severity of myopia (P=0.0002). The multivariate analysis revealed that baseline refraction (P=0.0108, odds ratio [OR]: 1.13, 95% confidence interval [CI]: 1.03-1.25) and intraocular pressure reduction rate (P=0.0150, OR: 0.97, 95% CI: 0.94-0.99) had a significant association with MD slope deterioration. In the current analysis of Japanese patients with POAG, baseline refraction was a factor significantly associated with MD slope deterioration as well as intraocular pressure reduction rate. When baseline refraction was classified into
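The four-group classification by spherical equivalent (SE) used above maps directly to a small function. A minimal sketch, with the boundary handling assumed from the stated inequalities:

```python
def myopia_group(se_diopters):
    """Classify baseline refraction by spherical equivalent (SE, diopters),
    following the abstract's four groups."""
    if se_diopters > -1.0:
        return "no myopia"         # SE > -1 D
    if se_diopters > -3.0:
        return "mild myopia"       # -1 D >= SE > -3 D
    if se_diopters > -6.0:
        return "moderate myopia"   # -3 D >= SE > -6 D
    return "severe myopia"         # -6 D >= SE

groups = [myopia_group(se) for se in (0.0, -1.9, -3.5, -7.0)]
print(groups)
# ['no myopia', 'mild myopia', 'moderate myopia', 'severe myopia']
```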

  5. A publication database for optical long baseline interferometry

    Science.gov (United States)

    Malbet, Fabien; Mella, Guillaume; Lawson, Peter; Taillifet, Esther; Lafrasse, Sylvain

    2010-07-01

    Optical long baseline interferometry is a technique that has generated almost 850 refereed papers to date. The targets span a large variety of objects from planetary systems to extragalactic studies and all branches of stellar physics. We have created a database hosted by the JMMC and connected to the Optical Long Baseline Interferometry Newsletter (OLBIN) web site using MySQL and a collection of XML or PHP scripts in order to store and classify these publications. Each entry is defined by its ADS bibcode and includes basic ADS information and metadata. The metadata are specified by tags sorted in categories: interferometric facilities, instrumentation, wavelength of operation, spectral resolution, type of measurement, target type, and paper category, for example. The whole OLBIN publication list has been processed and we present how the database is organized and can be accessed. We use this tool to generate statistical plots of interest for the community in optical long baseline interferometry.
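A tag-per-category design of this kind can be sketched with two tables. The schema, table names, and sample values below are illustrative assumptions (the abstract does not publish the actual JMMC MySQL schema), shown here with SQLite for self-containment:

```python
import sqlite3

# Assumed schema: papers keyed by ADS bibcode, plus (category, tag) pairs
# such as facility, type of measurement, or target type.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE publications (
        bibcode TEXT PRIMARY KEY,   -- ADS bibcode identifying the paper
        title   TEXT,
        year    INTEGER
    );
    CREATE TABLE tags (
        bibcode  TEXT REFERENCES publications(bibcode),
        category TEXT,              -- e.g. 'facility', 'measurement', 'target'
        tag      TEXT
    );
""")
conn.execute("INSERT INTO publications VALUES (?, ?, ?)",
             ("2010Demo....1..1M", "An OLBIN-style demo entry", 2010))  # dummy bibcode-style key
conn.executemany("INSERT INTO tags VALUES (?, ?, ?)",
                 [("2010Demo....1..1M", "facility", "VLTI"),
                  ("2010Demo....1..1M", "measurement", "visibility")])

# Tally papers per facility tag -- the kind of statistic the site plots.
rows = conn.execute(
    "SELECT tag, COUNT(*) FROM tags WHERE category = 'facility' GROUP BY tag"
).fetchall()
print(rows)  # [('VLTI', 1)]
```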

  6. Defining the Anthropocene

    Science.gov (United States)

    Lewis, Simon; Maslin, Mark

    2016-04-01

    Time is divided by geologists according to marked shifts in Earth's state. Recent global environmental changes suggest that Earth may have entered a new human-dominated geological epoch, the Anthropocene. Should the Anthropocene - the idea that human activity is a force acting upon the Earth system in ways that mean that Earth will be altered for millions of years - be defined as a geological time-unit at the level of an Epoch? Here we appraise the data to assess such claims, first in terms of changes to the Earth system, with particular focus on very long-lived impacts, as Epochs typically last millions of years. Can Earth really be said to be in transition from one state to another? Secondly, we then consider the formal criteria used to define geological time-units and move forward through time examining whether currently available evidence passes typical geological time-unit evidence thresholds. We suggest two time periods likely fit the criteria (1) the aftermath of the interlinking of the Old and New Worlds, which moved species across continents and ocean basins worldwide, a geologically unprecedented and permanent change, which is also the globally synchronous coolest part of the Little Ice Age (in Earth system terms), and the beginning of global trade and a new socio-economic "world system" (in historical terms), marked as a golden spike by a temporary drop in atmospheric CO2, centred on 1610 CE; and (2) the aftermath of the Second World War, when many global environmental changes accelerated and novel long-lived materials were increasingly manufactured, known as the Great Acceleration (in Earth system terms) and the beginning of the Cold War (in historical terms), marked as a golden spike by the peak in radionuclide fallout in 1964. We finish by noting that the Anthropocene debate is politically loaded, thus transparency in the presentation of evidence is essential if a formal definition of the Anthropocene is to avoid becoming a debate about bias. The

  7. Regional geochemical baselines for Portuguese shelf sediments

    International Nuclear Information System (INIS)

    Mil-Homens, M.; Stevens, R.L.; Cato, I.; Abrantes, F.

    2007-01-01

    Metal concentrations (Al, Cr, Cu, Ni, Pb and Zn) from the DGM-INETI archive data set have been examined for sediments collected during the 1970s from 267 sites on the Portuguese shelf. Due to the differences in the oceanographic and sedimentological settings between the western and Algarve coasts, the archive data set is split into two segments. For both shelf segments, regional geochemical baselines (RGB) are defined using aluminium as a reference element. Seabed samples recovered in 2002 from four distinct areas of the Portuguese shelf are superimposed on these models to identify and compare possible metal enrichments relative to the natural distribution. Metal enrichments associated with anthropogenic influences are identified in three samples collected near the Tejo River and are characterised by the highest enrichment factors (EF; EF_Pb, EF_Zn < 4). EF values close to 1 suggest a largely natural origin for metal distributions in sediments from the other areas included in the study. - Background metal concentrations and their natural variability must be established before assessing anthropogenic impacts.
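Enrichment factors of the kind reported above are conventionally computed as the Al-normalized metal ratio of a sample divided by the same ratio from the regional baseline. A sketch of that convention; the concentrations used are invented illustrative values, not data from this study:

```python
def enrichment_factor(metal_sample, al_sample, metal_baseline, al_baseline):
    """EF = (Me/Al)_sample / (Me/Al)_baseline.
    EF close to 1 suggests a largely natural origin; higher values
    suggest enrichment relative to the regional baseline."""
    return (metal_sample / al_sample) / (metal_baseline / al_baseline)

# Hypothetical Pb case: 60 mg/kg Pb at 6% Al in the sample, against a
# baseline of 20 mg/kg Pb at 8% Al.
ef_pb = enrichment_factor(metal_sample=60.0, al_sample=6.0,
                          metal_baseline=20.0, al_baseline=8.0)
print(ef_pb)  # 4.0
```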

  8. Gated integrator with signal baseline subtraction

    Energy Technology Data Exchange (ETDEWEB)

    Wang, X.

    1996-12-17

    An ultrafast, high precision gated integrator includes an opamp having differential inputs. A signal to be integrated is applied to one of the differential inputs through a first input network, and a signal indicative of the DC offset component of the signal to be integrated is applied to the other of the differential inputs through a second input network. A pair of electronic switches in the first and second input networks define an integrating period when they are closed. The first and second input networks are substantially symmetrically constructed of matched components so that error components introduced by the electronic switches appear symmetrically in both input circuits and, hence, are nullified by the common mode rejection of the integrating opamp. The signal indicative of the DC offset component is provided by a sample and hold circuit actuated as the integrating period begins. The symmetrical configuration of the integrating circuit improves accuracy and speed by balancing out common mode errors, by permitting the use of high speed switching elements and high speed opamps and by permitting the use of a small integrating time constant. The sample and hold circuit substantially eliminates the error caused by the input signal baseline offset during a single integrating window. 5 figs.
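A digital analogue of the scheme described (sample-and-hold the DC baseline as the gate opens, then integrate the signal minus that baseline over the gate window) can be sketched as follows; the pulse shape, offset, and gate values are hypothetical test inputs, not parameters from the patent:

```python
def gated_integral(samples, gate_start, gate_len, dt=1.0):
    """Integrate (signal - baseline) over a gate window, with the baseline
    sampled and held at the moment the gate opens."""
    baseline = samples[gate_start]                   # sample-and-hold at gate opening
    window = samples[gate_start:gate_start + gate_len]
    return sum(s - baseline for s in window) * dt    # offset-free integral

offset = 0.5   # DC offset the circuit is meant to reject
signal = [offset + (2.0 if 40 <= t < 60 else 0.0) for t in range(100)]

area = gated_integral(signal, gate_start=30, gate_len=50)
print(area)  # 40.0 -- the baseline offset cancels, leaving only the pulse area
```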

  10. A pragmatic cluster randomised controlled trial to evaluate the safety, clinical effectiveness, cost effectiveness and satisfaction with point of care testing in a general practice setting - rationale, design and baseline characteristics.

    Science.gov (United States)

    Laurence, Caroline; Gialamas, Angela; Yelland, Lisa; Bubner, Tanya; Ryan, Philip; Willson, Kristyn; Glastonbury, Briony; Gill, Janice; Shephard, Mark; Beilby, Justin

    2008-08-06

    Point of care testing (PoCT) may be a useful adjunct in the management of chronic conditions in general practice (GP). The provision of pathology test results at the time of the consultation could lead to enhanced clinical management, better health outcomes, greater convenience and satisfaction for patients and general practitioners (GPs), and savings in costs and time. It could also result in inappropriate testing, increased consultations and poor health outcomes resulting from inaccurate results. Currently there are very few randomised controlled trials (RCTs) in GP that have investigated these aspects of PoCT. The Point of Care Testing in General Practice Trial (PoCT Trial) was an Australian Government funded multi-centre, cluster randomised controlled trial to determine the safety, clinical effectiveness, cost effectiveness and satisfaction of PoCT in a GP setting.The PoCT Trial covered an 18 month period with the intervention consisting of the use of PoCT for seven tests used in the management of patients with diabetes, hyperlipidaemia and patients on anticoagulant therapy. The primary outcome measure was the proportion of patients within target range, a measure of therapeutic control. In addition, the PoCT Trial investigated the safety of PoCT, impact of PoCT on patient compliance to medication, stakeholder satisfaction, cost effectiveness of PoCT versus laboratory testing, and influence of geographic location. The paper provides an overview of the Trial Design, the rationale for the research methodology chosen and how the Trial was implemented in a GP environment. The evaluation protocol and data collection processes took into account the large number of patients, the broad range of practice types distributed over a large geographic area, and the inclusion of pathology test results from multiple pathology laboratories.The evaluation protocol developed reflects the complexity of the Trial setting, the Trial Design and the approach taken within the funding

  11. A pragmatic cluster randomised controlled trial to evaluate the safety, clinical effectiveness, cost effectiveness and satisfaction with point of care testing in a general practice setting – rationale, design and baseline characteristics

    Directory of Open Access Journals (Sweden)

    Glastonbury Briony

    2008-08-01

    Full Text Available Abstract Background Point of care testing (PoCT) may be a useful adjunct in the management of chronic conditions in general practice (GP). The provision of pathology test results at the time of the consultation could lead to enhanced clinical management, better health outcomes, greater convenience and satisfaction for patients and general practitioners (GPs), and savings in costs and time. It could also result in inappropriate testing, increased consultations and poor health outcomes resulting from inaccurate results. Currently there are very few randomised controlled trials (RCTs) in GP that have investigated these aspects of PoCT. Design/Methods The Point of Care Testing in General Practice Trial (PoCT Trial) was an Australian Government funded multi-centre, cluster randomised controlled trial to determine the safety, clinical effectiveness, cost effectiveness and satisfaction of PoCT in a GP setting. The PoCT Trial covered an 18 month period with the intervention consisting of the use of PoCT for seven tests used in the management of patients with diabetes, hyperlipidaemia and patients on anticoagulant therapy. The primary outcome measure was the proportion of patients within target range, a measure of therapeutic control. In addition, the PoCT Trial investigated the safety of PoCT, impact of PoCT on patient compliance to medication, stakeholder satisfaction, cost effectiveness of PoCT versus laboratory testing, and influence of geographic location. Discussion The paper provides an overview of the Trial Design, the rationale for the research methodology chosen and how the Trial was implemented in a GP environment. The evaluation protocol and data collection processes took into account the large number of patients, the broad range of practice types distributed over a large geographic area, and the inclusion of pathology test results from multiple pathology laboratories. The evaluation protocol developed reflects the complexity of the Trial setting
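The trial's primary outcome, the proportion of patients within target range, is a simple computation once a therapeutic target range is fixed for each test. A minimal sketch; the function is generic, and the HbA1c values and target range below are purely illustrative, not taken from the trial:

```python
def proportion_in_target(results, low, high):
    """Fraction of test results falling inside the therapeutic target
    range [low, high] -- the form of the trial's primary outcome measure."""
    in_range = [low <= r <= high for r in results]
    return sum(in_range) / len(in_range)

# Hypothetical HbA1c results (%) for one practice's diabetes patients
poct_arm = [6.8, 7.2, 8.1, 6.5, 7.9, 9.0]
proportion = proportion_in_target(poct_arm, 6.5, 7.5)  # 3 of 6 in range -> 0.5
```

In a cluster randomised design such as this one, these per-practice proportions would then be compared between arms with methods that account for clustering, rather than pooling patients directly.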

  12. Teleology and Defining Sex.

    Science.gov (United States)

    Gamble, Nathan K; Pruski, Michal

    2018-07-01

    Disorders of sexual differentiation lead to what is often referred to as an intersex state. This state has medical, as well as some legal, recognition. Nevertheless, the question remains whether intersex persons occupy a state in between maleness and femaleness or whether they are truly men or women. To answer this question, another important conundrum needs to be first solved: what defines sex? The answer seems rather simple to most people, yet when morphology does not coincide with haplotypes, and genetics might not correlate with physiology the issue becomes more complex. This paper tackles both issues by establishing where the essence of sex is located and by superimposing that framework onto the issue of the intersex. This is achieved through giving due consideration to the biology of sexual development, as well as through the use of a teleological framework of the meaning of sex. Using a range of examples, the paper establishes that sex cannot be pinpointed to one biological variable but is rather determined by how the totality of one's biology is oriented towards biological reproduction. A brief consideration is also given to the way this situation could be comprehended from a Christian understanding of sex and suffering.

  13. THE 2014 ALMA LONG BASELINE CAMPAIGN: AN OVERVIEW

    Energy Technology Data Exchange (ETDEWEB)

    Partnership, ALMA [Astrophysics Research Institute, Liverpool John Moores University, IC2, Liverpool Science Park, 146 Brownlow Hill, Liverpool L3 5RF (United Kingdom); Fomalont, E. B.; Vlahakis, C.; Corder, S.; Remijan, A.; Barkats, D.; Dent, W. R. F.; Phillips, N.; Cox, P.; Hales, A. S. [Joint ALMA Observatory, Alonso de Córdova 3107, Vitacura, Santiago (Chile); Lucas, R. [Institut de Planétologie et d’Astrophysique de Grenoble (UMR 5274), BP 53, F-38041 Grenoble Cedex 9 (France); Hunter, T. R.; Brogan, C. L.; Amestica, R.; Cotton, W. [National Radio Astronomy Observatory, 520 Edgemont Road, Charlottesville, VA 22903 (United States); Asaki, Y. [National Astronomical Observatory of Japan, 2-21-1 Osawa, Mitaka, Tokyo 181-8588 (Japan); Matsushita, S. [Institute of Astronomy and Astrophysics, Academia Sinica, P.O. Box 23-141, Taipei 106, Taiwan (China); Hills, R. E. [Astrophysics Group, Cavendish Laboratory, JJ Thomson Avenue, Cambridge CB3 0HE (United Kingdom); Richards, A. M. S. [Jodrell Bank Centre for Astrophysics, School of Physics and Astronomy, University of Manchester, Oxford Road, Manchester M13 9PL (United Kingdom); Broguiere, D., E-mail: efomalon@nrao.edu [Institut de Radioastronomie Millime´trique (IRAM), 300 rue de la Piscine, Domaine Universitaire, F-38406 Saint Martin d’Hères (France); and others

    2015-07-20

    A major goal of the Atacama Large Millimeter/submillimeter Array (ALMA) is to make accurate images with resolutions of tens of milliarcseconds, which at submillimeter (submm) wavelengths requires baselines up to ∼15 km. To develop and test this capability, a Long Baseline Campaign (LBC) was carried out from 2014 September to late November, culminating in end-to-end observations, calibrations, and imaging of selected Science Verification (SV) targets. This paper presents an overview of the campaign and its main results, including an investigation of the short-term coherence properties and systematic phase errors over the long baselines at the ALMA site, a summary of the SV targets and observations, and recommendations for science observing strategies at long baselines. Deep ALMA images of the quasar 3C 138 at 97 and 241 GHz are also compared to VLA 43 GHz results, demonstrating an agreement at a level of a few percent. As a result of the extensive program of LBC testing, the highly successful SV imaging at long baselines achieved angular resolutions as fine as 19 mas at ∼350 GHz. Observing with ALMA on baselines of up to 15 km is now possible, and opens up new parameter space for submm astronomy.
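The link between baseline length and resolution quoted above follows from the diffraction relation theta ~ lambda/B. A quick sketch of that estimate (the formula ignores array weighting and uv coverage, which is why the achieved SV resolution of 19 mas is somewhat coarser than this idealized figure):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def resolution_mas(freq_ghz, baseline_km):
    """Idealized interferometer angular resolution theta ~ lambda / B,
    converted from radians to milliarcseconds."""
    wavelength_m = C / (freq_ghz * 1e9)
    theta_rad = wavelength_m / (baseline_km * 1e3)
    return theta_rad * (180.0 / math.pi) * 3600.0 * 1000.0  # rad -> mas

theta = resolution_mas(350, 15)  # roughly 12 mas at 350 GHz on a 15 km baseline
```

The same relation shows why submillimeter wavelengths demand such long baselines: at 43 GHz (the VLA comparison band) a 15 km baseline would only reach about 96 mas.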

  14. THE 2014 ALMA LONG BASELINE CAMPAIGN: AN OVERVIEW

    International Nuclear Information System (INIS)

    Partnership, ALMA; Fomalont, E. B.; Vlahakis, C.; Corder, S.; Remijan, A.; Barkats, D.; Dent, W. R. F.; Phillips, N.; Cox, P.; Hales, A. S.; Lucas, R.; Hunter, T. R.; Brogan, C. L.; Amestica, R.; Cotton, W.; Asaki, Y.; Matsushita, S.; Hills, R. E.; Richards, A. M. S.; Broguiere, D.

    2015-01-01

    A major goal of the Atacama Large Millimeter/submillimeter Array (ALMA) is to make accurate images with resolutions of tens of milliarcseconds, which at submillimeter (submm) wavelengths requires baselines up to ∼15 km. To develop and test this capability, a Long Baseline Campaign (LBC) was carried out from 2014 September to late November, culminating in end-to-end observations, calibrations, and imaging of selected Science Verification (SV) targets. This paper presents an overview of the campaign and its main results, including an investigation of the short-term coherence properties and systematic phase errors over the long baselines at the ALMA site, a summary of the SV targets and observations, and recommendations for science observing strategies at long baselines. Deep ALMA images of the quasar 3C 138 at 97 and 241 GHz are also compared to VLA 43 GHz results, demonstrating an agreement at a level of a few percent. As a result of the extensive program of LBC testing, the highly successful SV imaging at long baselines achieved angular resolutions as fine as 19 mas at ∼350 GHz. Observing with ALMA on baselines of up to 15 km is now possible, and opens up new parameter space for submm astronomy

  15. Baseline Year for the Baseline Year Test (in 40 CFR 93.119)

    Science.gov (United States)

    EPA OTAQ's State and Local Transportation Resources are for air quality and transportation government and community leaders. Updates on the adequacy of state implementation plans (SIPs), and transportation conformity regulations and guidance

  16. Defining an emerging disease.

    Science.gov (United States)

    Moutou, F; Pastoret, P-P

    2015-04-01

    Defining an emerging disease is not straightforward, as there are several different types of disease emergence. For example, there can be a 'real' emergence of a brand new disease, such as the emergence of bovine spongiform encephalopathy in the 1980s, or a geographic emergence in an area not previously affected, such as the emergence of bluetongue in northern Europe in 2006. In addition, disease can emerge in species formerly not considered affected, e.g. the emergence of bovine tuberculosis in wildlife species since 2000 in France. There can also be an unexpected increase of disease incidence in a known area and a known species, or there may simply be an increase in our knowledge or awareness of a particular disease. What all these emerging diseases have in common is that human activity frequently has a role to play in their emergence. For example, bovine spongiform encephalopathy very probably emerged as a result of changes in the manufacturing of meat-and-bone meal, bluetongue was able to spread to cooler climes as a result of uncontrolled trade in animals, and a relaxation of screening and surveillance for bovine tuberculosis enabled the disease to re-emerge in areas that had been able to drastically reduce the number of cases. Globalisation and population growth will continue to affect the epidemiology of diseases in years to come and ecosystems will continue to evolve. Furthermore, new technologies such as metagenomics and high-throughput sequencing are identifying new microorganisms all the time. Change is the one constant, and diseases will continue to emerge, and we must consider the causes and different types of emergence as we deal with these diseases in the future.

  17. NASA Orbital Debris Baseline Populations

    Science.gov (United States)

    Krisko, Paula H.; Vavrin, A. B.

    2013-01-01

    The NASA Orbital Debris Program Office has created high fidelity populations of the debris environment. The populations include objects of 1 cm and larger in Low Earth Orbit through Geosynchronous Transfer Orbit. They were designed for the purpose of assisting debris researchers and sensor developers in planning and testing. This environment is derived directly from the newest ORDEM model populations which include a background derived from LEGEND, as well as specific events such as the Chinese ASAT test, the Iridium 33/Cosmos 2251 accidental collision, the RORSAT sodium-potassium droplet releases, and other miscellaneous events. It is the most realistic ODPO debris population to date. In this paper we present the populations in chart form. We describe derivations of the background population and the specific populations added on. We validate our 1 cm and larger Low Earth Orbit population against SSN, Haystack, and HAX radar measurements.

  18. Hanford spent fuel inventory baseline

    International Nuclear Information System (INIS)

    Bergsman, K.H.

    1994-01-01

    This document compiles technical data on irradiated fuel stored at the Hanford Site in support of the Hanford SNF Management Environmental Impact Statement. Fuel included is from the Defense Production Reactors (N Reactor and the single-pass reactors: B, C, D, DR, F, H, KE and KW), the Hanford Fast Flux Test Facility Reactor, the Shippingport Pressurized Water Reactor, and small amounts of miscellaneous fuel from several commercial, research, and experimental reactors

  19. Baseline development, economic risk, and schedule risk: An integrated approach

    International Nuclear Information System (INIS)

    Tonkinson, J.A.

    1994-01-01

    The economic and schedule risks of Environmental Restoration (ER) projects are commonly analyzed toward the end of the baseline development process. Risk analysis is usually performed as the final element of the scheduling or estimating processes for the purpose of establishing cost and schedule contingency. However, there is an opportunity for earlier assessment of risks, during development of the technical scope and Work Breakdown Structure (WBS). Integrating the processes of risk management and baselining provides for early incorporation of feedback regarding schedule and cost risk into the proposed scope of work. Much of the information necessary to perform risk analysis becomes available during development of the technical baseline, as the scope of work and WBS are being defined. The analysis of risk can actually be initiated early on during development of the technical baseline and continue throughout development of the complete project baseline. Indeed, best business practices suggest that information crucial to the success of a project be analyzed and incorporated into project planning as soon as it is available and usable

  20. Placental baseline conditions modulate the hyperoxic BOLD-MRI response.

    Science.gov (United States)

    Sinding, Marianne; Peters, David A; Poulsen, Sofie S; Frøkjær, Jens B; Christiansen, Ole B; Petersen, Astrid; Uldbjerg, Niels; Sørensen, Anne

    2018-01-01

    Human pregnancies complicated by placental dysfunction may be characterized by a high hyperoxic Blood oxygen level-dependent (BOLD) MRI response. The pathophysiology behind this phenomenon remains to be established. The aim of this study was to evaluate whether it is associated with altered placental baseline conditions, including a lower oxygenation and altered tissue morphology, as estimated by the placental transverse relaxation time (T2*). We included 49 normal pregnancies (controls) and 13 pregnancies complicated by placental dysfunction (cases), defined by a low birth weight. We measured the relative ΔBOLD response ((hyperoxic BOLD - baseline BOLD)/baseline BOLD) from a dynamic single-echo gradient-recalled echo (GRE) MRI sequence and the absolute ΔT2* (hyperoxic T2* - baseline T2*) from breath-hold multi-echo GRE sequences. In the control group, the relative ΔBOLD response increased during gestation from 5% in gestational week 20 to 20% in week 40. In the case group, the relative ΔBOLD response was significantly higher (mean Z-score 4.94; 95% CI 2.41, 7.47). The absolute ΔT2*, however, did not differ between controls and cases (p = 0.37), whereas the baseline T2* was lower among cases (mean Z-score -3.13; 95% CI -3.94, -2.32). Furthermore, we demonstrated a strong negative linear correlation between the Log 10 ΔBOLD response and the baseline T2* (r = -0.88). These findings suggest that the high hyperoxic BOLD response in placental dysfunction reflects altered baseline conditions, as the absolute increase in placental oxygenation (ΔT2*) does not differ between groups. Copyright © 2017 Elsevier Ltd. All rights reserved.
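The two response measures used in this study differ only in normalization: the BOLD response is expressed relative to its baseline, while the T2* change is absolute. A minimal sketch with purely illustrative signal values (arbitrary units for BOLD, milliseconds for T2*):

```python
def relative_delta_bold(hyperoxic_bold, baseline_bold):
    """Relative BOLD response: (hyperoxic - baseline) / baseline."""
    return (hyperoxic_bold - baseline_bold) / baseline_bold

def absolute_delta_t2star(hyperoxic_t2, baseline_t2):
    """Absolute T2* change: hyperoxic T2* minus baseline T2* (ms)."""
    return hyperoxic_t2 - baseline_t2

# Illustrative values only, not data from the study
rel = relative_delta_bold(120.0, 100.0)      # 0.20, i.e. a 20% BOLD response
abs_t2 = absolute_delta_t2star(60.0, 48.0)   # 12 ms increase
```

This distinction is what drives the paper's conclusion: a low baseline (the denominator) inflates the relative ΔBOLD even when the absolute oxygenation increase ΔT2* is unchanged.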

  1. Definably compact groups definable in real closed fields.II

    OpenAIRE

    Barriga, Eliana

    2017-01-01

    We continue the analysis of definably compact groups definable in a real closed field $\\mathcal{R}$. In [3], we proved that for every definably compact definably connected semialgebraic group $G$ over $\\mathcal{R}$ there are a connected $R$-algebraic group $H$, a definable injective map $\\phi$ from a generic definable neighborhood of the identity of $G$ into the group $H\\left(R\\right)$ of $R$-points of $H$ such that $\\phi$ acts as a group homomorphism inside its domain. The above result and o...

  2. 40 CFR 74.20 - Data for baseline and alternative baseline.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 16 2010-07-01 2010-07-01 false Data for baseline and alternative baseline. 74.20 Section 74.20 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR... baseline and alternative baseline. (a) Acceptable data. (1) The designated representative of a combustion...

  3. Cryogenics Testbed Laboratory Flange Baseline Configuration

    Science.gov (United States)

    Acuna, Marie Lei Ysabel D.

    2013-01-01

    As an intern at Kennedy Space Center (KSC), I was involved in research for the Fluids and Propulsion Division of the NASA Engineering (NE) Directorate. I was immersed in the Integrated Ground Operations Demonstration Units (IGODU) project for the majority of my time at KSC, primarily with the Ground Operations Demonstration Unit Liquid Oxygen (GODU LO2) branch of IGODU. This project was established to develop advancements in cryogenic systems as part of KSC's Advanced Exploration Systems (AES) program. The vision of AES is to develop new approaches for human exploration and operations in and beyond low Earth orbit. Advanced cryogenic systems are crucial to minimize the consumable losses of cryogenic propellants, develop higher performance launch vehicles, and decrease operations costs for future launch programs. During my internship, I conducted a flange torque tracking study that established a baseline configuration for the flanges in the Simulated Propellant Loading System (SPLS) at the KSC Cryogenics Test Laboratory (CTL), the testing environment for GODU LO2.

  4. Hazard Baseline Downgrade Effluent Treatment Facility

    International Nuclear Information System (INIS)

    Blanchard, A.

    1998-01-01

    This Hazard Baseline Downgrade reviews the Effluent Treatment Facility in accordance with Department of Energy Order 5480.23, WSRC 11Q Facility Safety Document Manual, DOE-STD-1027-92, and DOE-EM-STD-5502-94. It provides a baseline grouping based on the chemical and radiological hazards associated with the facility. The determination of the baseline grouping for ETF will aid in establishing the appropriate set of standards for the facility

  5. Integrated planning: A baseline development perspective

    International Nuclear Information System (INIS)

    Clauss, L.; Chang, D.

    1994-01-01

    The FEMP Baseline establishes the basis for integrating environmental activity technical requirements with their cost and schedule elements. The result is a path forward to successfully achieving the FERMCO mission. Specific to cost management, the FEMP Baseline has been incorporated into the FERMCO Project Control System (PCS) to provide a time-phased budget plan against which contractor performance is measured with an earned value management system. The result is the Performance Measurement Baseline (PMB), an important tool for keeping costs under control

  6. IPCC Socio-Economic Baseline Dataset

    Data.gov (United States)

    National Aeronautics and Space Administration — The Intergovernmental Panel on Climate Change (IPCC) Socio-Economic Baseline Dataset consists of population, human development, economic, water resources, land...

  7. Office of Geologic Repositories program baseline procedures notebook (OGR/B-1)

    International Nuclear Information System (INIS)

    1986-06-01

    Baseline management is typically applied to aid in the internal control of a program by providing consistent programmatic direction, control, and surveillance to an evolving system development. This fundamental concept of internal program control involves the establishment of a baseline to serve as a point of departure for consistent technical program coordination and to control subsequent changes from that baseline. The existence of a program-authorized baseline ensures that all participants are working to the same ground rules. Baseline management also ensures that, once the baseline is defined, changes are assessed and approved by a process which ensures adequate consideration of overall program impact. Baseline management also includes the consideration of exemptions from the baseline. The process of baseline management continues through all the phases of an evolving system development program. As the Program proceeds, there will be a progressive increase in the data contained in the baseline documentation. Baseline management has been selected as a management technique to aid in the internal control of the Office of Geologic Repositories (OGR) program. Specifically, an OGR Program Baseline, including technical and programmatic requirements, is used for program control of the four Mined Geologic Disposal System field projects, i.e., Basalt Waste Isolation Project, Nevada Nuclear Waste Storage Investigation, Salt Repository Project and Crystalline Repository Project. This OGR Program Baseline Procedures Notebook provides a description of the baseline management concept, establishes the OGR Program baseline itself, and provides procedures to be followed for controlling changes to that baseline. The notebook has a controlled distribution and will be updated as required

  8. 7 CFR 28.950 - Terms defined.

    Science.gov (United States)

    2010-01-01

    ... Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing..., TESTING, AND STANDARDS Cotton Fiber and Processing Tests Definitions § 28.950 Terms defined. As used... Agricultural Marketing Service of the U.S. Department of Agriculture. (c) Administrator. The Administrator of...

  9. Despite worse baseline status depressed patients achieved outcomes similar to those in nondepressed patients after surgery for cervical deformity.

    Science.gov (United States)

    Poorman, Gregory W; Passias, Peter G; Horn, Samantha R; Frangella, Nicholas J; Daniels, Alan H; Hamilton, D Kojo; Kim, Hanjo; Sciubba, Daniel; Diebo, Bassel G; Bortz, Cole A; Segreto, Frank A; Kelly, Michael P; Smith, Justin S; Neuman, Brian J; Shaffrey, Christopher I; LaFage, Virginie; LaFage, Renaud; Ames, Christopher P; Hart, Robert; Mundis, Gregory M; Eastlack, Robert

    2017-12-01

    OBJECTIVE Depression and anxiety have been demonstrated to have negative impacts on outcomes after spine surgery. In patients with cervical deformity (CD), the psychological and physiological burdens of the disease may overlap without clear boundaries. While surgery has a proven record of bringing about significant pain relief and decreased disability, the impact of depression and anxiety on recovery from cervical deformity corrective surgery has not been previously reported on in the literature. The purpose of the present study was to determine the effect of depression and anxiety on patients' recovery from and improvement after CD surgery. METHODS The authors conducted a retrospective review of a prospective, multicenter CD database. Patients with a history of clinical depression, in addition to those with current self-reported anxiety or depression, were defined as depressed (D group). The D group was compared with nondepressed patients (ND group) with a similar baseline deformity determined by propensity score matching of the cervical sagittal vertical axis (cSVA). Baseline demographic, comorbidity, clinical, and radiographic data were compared among patients using t-tests. Improvement of symptoms was recorded at 3 months, 6 months, and 1 year postoperatively. All health-related quality of life (HRQOL) scores collected at these follow-up time points were compared using t-tests. RESULTS Sixty-six patients were matched for baseline radiographic parameters: 33 with a history of depression and/or current depression, and 33 without. Depressed patients had similar age, sex, race, and radiographic alignment: cSVA, T-1 slope minus C2-7 lordosis, SVA, and T-1 pelvic angle (p > 0.05). Compared with nondepressed individuals, depressed patients had a higher incidence of osteoporosis (21.2% vs 3.2%, p = 0.028), rheumatoid arthritis (18.2% vs 3.2%, p = 0.012), and connective tissue disorders (18.2% vs 3.2%, p = 0.012). 
At baseline, the D group had greater neck pain (7.9 of
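The matching step in this study pairs depressed and nondepressed patients with similar baseline deformity (cSVA). A simplified stand-in for propensity score matching is greedy 1:1 nearest-neighbour matching on the single covariate; the sketch below uses that simplification with hypothetical cSVA values, and is not the study's actual matching procedure:

```python
def match_on_covariate(cases, controls):
    """Greedy 1:1 nearest-neighbour matching on one baseline covariate
    (e.g. cSVA in mm): each case is paired with the closest control
    not already used. A simplification of propensity score matching."""
    available = dict(enumerate(controls))
    pairs = []
    for case in cases:
        idx = min(available, key=lambda i: abs(available[i] - case))
        pairs.append((case, available.pop(idx)))
    return pairs

depressed_csva = [42.0, 55.0, 38.0]           # hypothetical case values
nondepressed_csva = [40.0, 39.0, 54.0, 70.0]  # hypothetical control pool
pairs = match_on_covariate(depressed_csva, nondepressed_csva)
# pairs cases with their nearest controls; the 70.0 control goes unused
```

Real propensity matching would first regress group membership on several covariates and match on the fitted probability, but the pairing logic is the same.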

  10. Delta Healthy Sprouts: Participants' Diet and Food Environment at Baseline

    Science.gov (United States)

    Local food environments influence the nutrition and health of area residents. This baseline analysis focuses on the food environments of women who participated in the Delta Healthy Sprouts project, a randomized, controlled, comparative trial designed to test the efficacy of two Maternal, Infant, an...

  11. 75 FR 74706 - Notice of Baseline Filings

    Science.gov (United States)

    2010-12-01

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Notice of Baseline Filings November 24, 2010. Centana Intrastate Pipeline, LLC. Docket No. PR10-84-001. Centana Intrastate Pipeline, LLC... applicants listed above submitted a revised baseline filing of their Statement of Operating Conditions for...

  12. 76 FR 8725 - Notice of Baseline Filings

    Science.gov (United States)

    2011-02-15

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Notice of Baseline Filings Enstor Grama Ridge Storage and Transportation, L.L.C. Docket No. PR10-97-002. EasTrans, LLC Docket No. PR10-30-001... revised baseline filing of their Statement of Operating Conditions for services provided under section 311...

  13. 75 FR 57268 - Notice of Baseline Filings

    Science.gov (United States)

    2010-09-20

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. PR10-103-000; Docket No. PR10-104-000; Docket No. PR10-105- 000 (Not Consolidated)] Notice of Baseline Filings September 13..., 2010, and September 10, 2010, respectively the applicants listed above submitted their baseline filing...

  14. 75 FR 65010 - Notice of Baseline Filings

    Science.gov (United States)

    2010-10-21

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. PR11-1-000; Docket No. PR11-2-000; Docket No. PR11-3-000] Notice of Baseline Filings October 14, 2010. Cranberry Pipeline Docket..., 2010, respectively the applicants listed above submitted their baseline filing of its Statement of...

  15. 76 FR 5797 - Notice of Baseline Filings

    Science.gov (United States)

    2011-02-02

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. PR10-114-001; Docket No. PR10-129-001; Docket No. PR10-131- 001; Docket No. PR10-68-002 Not Consolidated] Notice of Baseline... applicants listed above submitted a revised baseline filing of their Statement of Operating Conditions for...

  16. 75 FR 70732 - Notice of Baseline Filings

    Science.gov (United States)

    2010-11-18

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. PR11-71-000; Docket No. PR11-72-000; Docket No. PR11-73- 000] Notice of Baseline Filings November 10, 2010. Docket No. PR11-71-000..., 2010, the applicants listed above submitted their baseline filing of their Statement of Operating...

  17. Baseline mean deviation and rates of visual field change in treated glaucoma patients.

    Science.gov (United States)

    Forchheimer, I; de Moraes, C G; Teng, C C; Folgar, F; Tello, C; Ritch, R; Liebmann, J M

    2011-05-01

    To evaluate the relationships between baseline visual field (VF) mean deviation (MD) and subsequent progression in treated glaucoma. Records of patients seen in a glaucoma practice between 1999 and 2009 were reviewed. Patients with glaucomatous optic neuropathy, baseline VF damage, and ≥8 SITA-standard 24-2 VF were included. Patients were divided into tertiles based upon baseline MD. Automated pointwise linear regression determined global and localized rates (decibels (dB) per year) of change. Progression was defined when two or more adjacent test locations in the same hemifield showed a sensitivity decline at a rate of >1.0  dB per year, P0.50) and global rates of VF change of progressing eyes were -1.3±1.2, -1.01±0.7, and -0.9±0.5 dB/year (P=0.09, analysis of variance). Within these groups, intraocular pressure (IOP) in stable vs progressing eyes were 15.5±3.3 vs 17.0±3.1 (P0.50) and multivariate (P=0.26) analyses adjusting for differences in follow-up IOP. After correcting for differences in IOP in treated glaucoma patients, we did not find a relationship between the rate of VF change (dB per year) and the severity of the baseline VF MD. This finding may have been due to more aggressive IOP lowering in eyes with more severe disease. Eyes with lower IOP progressed less frequently across the spectrum of VF loss.
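The pointwise linear regression used here fits sensitivity (dB) against time at each visual field test location; a location is declining if its slope is below -1.0 dB per year. A minimal ordinary least-squares sketch with a hypothetical series for one location (the study additionally requires two or more adjacent declining locations in the same hemifield, and statistical significance, before labelling an eye as progressing):

```python
def slope_db_per_year(years, sensitivities):
    """Ordinary least-squares slope of sensitivity (dB) versus time
    (years), as in pointwise linear regression of visual field series."""
    n = len(years)
    mx = sum(years) / n
    my = sum(sensitivities) / n
    num = sum((x - mx) * (y - my) for x, y in zip(years, sensitivities))
    den = sum((x - mx) ** 2 for x in years)
    return num / den

# Hypothetical sensitivities at one test location over 8 exams
years = [0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5]
dbs = [30.0, 29.2, 28.6, 27.9, 27.1, 26.4, 25.8, 25.0]

rate = slope_db_per_year(years, dbs)   # about -1.41 dB/year
declining = rate < -1.0                # exceeds the 1.0 dB/year threshold
```

Repeating this fit over all 52 non-blind-spot locations of a 24-2 field, and checking adjacency within hemifields, yields the progression flag used to group eyes in the analysis.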

  18. Baseline ecological risk assessment Salmon Site, Lamar County, Mississippi

    International Nuclear Information System (INIS)

    1995-04-01

    The Salmon Site (SS), formerly the Tatum Dome Test Site, located in Mississippi was the site of two nuclear and two gas explosion tests conducted between 1964 and 1970. A consequence of these testing activities is that radionuclides were released into the salt dome, where they are presently contained. During reentry drilling and other site activities, incidental liquid and solid wastes that contained radioactivity were generated, resulting in some soil, ground water and equipment contamination. As part of the remedial investigation effort, a Baseline Ecological Risk Assessment was conducted at the SS. The purpose is to gauge ecological and other environmental impacts attributable to past activities at the former test facility. The results of this facility-specific baseline risk assessment are presented in this document

  19. Baseline ecological risk assessment Salmon Site, Lamar County, Mississippi

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-04-01

    The Salmon Site (SS), formerly the Tatum Dome Test Site, located in Mississippi was the site of two nuclear and two gas explosion tests conducted between 1964 and 1970. A consequence of these testing activities is that radionuclides were released into the salt dome, where they are presently contained. During reentry drilling and other site activities, incidental liquid and solid wastes that contained radioactivity were generated, resulting in some soil, ground water and equipment contamination. As part of the remedial investigation effort, a Baseline Ecological Risk Assessment was conducted at the SS. The purpose is to gauge ecological and other environmental impacts attributable to past activities at the former test facility. The results of this facility-specific baseline risk assessment are presented in this document.

  20. Defining the "normal" postejaculate urinalysis.

    Science.gov (United States)

    Mehta, Akanksha; Jarow, Jonathan P; Maples, Pat; Sigman, Mark

    2012-01-01

    Although sperm have been shown to be present in the postejaculate urinalysis (PEU) of both fertile and infertile men, the number of sperm present in the PEU of the general population has never been well defined. The objective of this study was to describe the semen and PEU findings in both the general and infertile population, in order to develop a better appreciation for "normal." Infertile men (n = 77) and control subjects (n = 71) were prospectively recruited. Exclusion criteria included azoospermia and medications known to affect ejaculation. All men underwent a history, physical examination, semen analysis, and PEU. The urine was split into 2 containers: PEU1, the initial voided urine, and PEU2, the remaining voided urine. Parametric statistical methods were applied for data analysis to compare sperm concentrations in each sample of semen and urine between the 2 groups of men. Controls had higher average semen volume (3.3 ± 1.6 vs 2.0 ± 1.4 mL, P sperm concentrations (112 million vs 56.2 million, P = .011), compared with infertile men. The presence of sperm in urine was common in both groups, but more prevalent among infertile men (98.7% vs 88.7%, P = .012), in whom it comprised a greater proportion of the total sperm count (46% vs 24%, P = .022). The majority of sperm present in PEU were seen in PEU1 of both controls (69%) and infertile men (88%). An association was noted between severe oligospermia (sperm counts in PEU (sperm in the urine compared with control, there is a large degree of overlap between the 2 populations, making it difficult to identify a specific threshold to define a positive test. Interpretation of a PEU should be directed by whether the number of sperm in the urine could affect subsequent management.

  1. How do pediatric anesthesiologists define intraoperative hypotension?

    Science.gov (United States)

    Nafiu, Olubukola O; Voepel-Lewis, Terri; Morris, Michelle; Chimbira, Wilson T; Malviya, Shobha; Reynolds, Paul I; Tremper, Kevin K

    2009-11-01

    Although blood pressure (BP) monitoring is a recommended standard of care by the ASA, and pediatric anesthesiologists routinely monitor the BP of their patients and, when appropriate, treat deviations from 'normal', there is no robust definition of hypotension in any of the pediatric anesthesia texts or journals. Consequently, what constitutes hypotension in pediatric anesthesia is currently unknown. We designed a questionnaire-based survey of pediatric anesthesiologists to determine the BP ranges and thresholds used to define intraoperative hypotension (IOH). Members of the Society for Pediatric Anesthesia (SPA) and the Association of Paediatric Anaesthetists (APA) of Great Britain and Ireland were contacted through e-mail to participate in this survey. We asked a few demographic questions and five questions about specific definitions of hypotension for different age groups of patients undergoing inguinal herniorrhaphy, a common pediatric surgical procedure. The overall response rate was 56% (483/860), of which 76% were SPA members. The majority of respondents (72%) work in academic institutions, while 8.9% work in institutions with an annual pediatric surgical caseload of fewer than 1000. About 76% of respondents indicated that a 20-30% reduction in baseline systolic blood pressure (SBP) indicates significant hypotension in children under anesthesia. Most respondents indicated that they use mean arterial pressure (86.7%) or SBP (72%) to define IOH. The mean SBP values for hypotension quoted by SPA members were about 5-7% lower across all pediatric age groups than the values quoted by APA members (P = 0.001 for all age groups). There is great variability in the BP parameters and thresholds used for defining and treating IOH among pediatric anesthesiologists. The majority of respondents considered a 20-30% reduction from baseline in SBP as indicative of significant hypotension. Lack of a consensus definition for a common clinical condition like IOH could have

  2. Baseline Bone Mineral Density Measurements Key to Future Testing Intervals

    Science.gov (United States)


  3. CryoSat SAR/SARin Level1b products: assessment of BaselineC and improvements towards BaselineD

    Science.gov (United States)

    Scagliola, Michele; Fornari, Marco; Bouffard, Jerome; Parrinello, Tommaso

    2017-04-01

    CryoSat was launched on 8 April 2010 and is the first European ice mission dedicated to the monitoring of precise changes in the thickness of polar ice sheets and floating sea ice. CryoSat carries an innovative radar altimeter called the Synthetic Aperture Interferometric Altimeter (SIRAL), which transmits pulses at a high pulse repetition frequency, making the received echoes phase coherent and suitable for azimuth processing. This makes it possible to achieve significantly improved along-track resolution with respect to traditional pulse-width-limited altimeters. CryoSat is the first altimetry mission operating in SAR mode, and continuous improvements in the Level1 Instrument Processing Facility (IPF1) are being identified, tested, and validated in order to improve the quality of the Level1b products. The current IPF, Baseline C, entered operation in April 2015, and the second CryoSat reprocessing campaign was initiated jointly, taking advantage of the upgrades implemented in the IPF1 processing chain as well as of some specific configurations for the calibration corrections. In particular, the CryoSat Level1b BaselineC products generated in the framework of the second reprocessing campaign include refined information concerning the mispointing angles and the calibration corrections. This poster will detail the evolutions currently planned for the CryoSat BaselineD SAR/SARin Level1b products and the corresponding quality improvements that are expected.

  4. TWRS technical baseline database manager definition document

    International Nuclear Information System (INIS)

    Acree, C.D.

    1997-01-01

    This document serves as a guide for using the TWRS Technical Baseline Database Management Systems Engineering (SE) support tool in performing SE activities for the Tank Waste Remediation System (TWRS). This document will provide a consistent interpretation of the relationships between the TWRS Technical Baseline Database Management software and the present TWRS SE practices. The Database Manager currently utilized is the RDD-1000 System manufactured by the Ascent Logic Corporation. In other documents, the term RDD-1000 may be used interchangeably with TWRS Technical Baseline Database Manager

  5. Neuropsychological Profiles Differentiate Alzheimer Disease from Subcortical Ischemic Vascular Dementia in an Autopsy-Defined Cohort.

    Science.gov (United States)

    Ramirez-Gomez, Liliana; Zheng, Ling; Reed, Bruce; Kramer, Joel; Mungas, Dan; Zarow, Chris; Vinters, Harry; Ringman, John M; Chui, Helena

    2017-01-01

    The aim of this study was to assess the ability of neuropsychological tests to differentiate autopsy-defined Alzheimer disease (AD) from subcortical ischemic vascular dementia (SIVD). From a sample of 175 cases followed longitudinally that underwent autopsy, we selected 23 normal controls (NC), 20 SIVD, 69 AD, and 10 mixed cases of dementia. Baseline neuropsychological tests, including the Memory Assessment Scale word-list learning test, the Controlled Oral Word Association Test, and animal fluency, were compared between the autopsy-defined groups. The NC, SIVD, and AD groups did not differ by age or education. The SIVD and AD groups did not differ on the Global Clinical Dementia Rating Scale. Subjects with AD performed worse on delayed recall (p < 0.01). A receiver operating characteristic analysis comparing the SIVD and AD groups, including age, education, the difference between categorical (animals) and phonemic fluency (letter F), and the first recall from the word-list learning test, distinguished the two groups with a sensitivity of 85%, a specificity of 67%, and a positive likelihood ratio of 2.57 (AUC = 0.789, 95% CI 0.69-0.88, p < 0.0001). In neuropathologically defined subgroups, neuropsychological profiles have modest ability to distinguish patients with AD from those with SIVD. © 2017 S. Karger AG, Basel.
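The positive likelihood ratio reported above follows directly from sensitivity and specificity (LR+ = sensitivity / (1 − specificity)); a minimal sketch checking the quoted figures:

```python
def positive_likelihood_ratio(sensitivity: float, specificity: float) -> float:
    """LR+ = sensitivity / (1 - specificity): how much a positive test result
    shifts the odds toward the target diagnosis (here, AD rather than SIVD)."""
    return sensitivity / (1.0 - specificity)

# With the reported sensitivity (85%) and specificity (67%),
# LR+ is about 2.58, matching the quoted 2.57 up to rounding.
lr_plus = positive_likelihood_ratio(0.85, 0.67)
```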

  6. Nonintrusive methodology for wellness baseline profiling

    Science.gov (United States)

    Chung, Danny Wen-Yaw; Tsai, Yuh-Show; Miaou, Shaou-Gang; Chang, Walter H.; Chang, Yaw-Jen; Chen, Shia-Chung; Hong, Y. Y.; Chyang, C. S.; Chang, Quan-Shong; Hsu, Hon-Yen; Hsu, James; Yao, Wei-Cheng; Hsu, Ming-Sin; Chen, Ming-Chung; Lee, Shi-Chen; Hsu, Charles; Miao, Lidan; Byrd, Kenny; Chouikha, Mohamed F.; Gu, Xin-Bin; Wang, Paul C.; Szu, Harold

    2007-04-01

    We develop an accumulatively effective and affordable set of smart paired devices to reduce the exorbitant healthcare expenditure for the aging population, which will not be sustainable once all the post-war baby boomers retire (78 million people, costing 1/5-1/4 of GDP in the US alone). To design an accessible test-bed for distributed points of homecare, we chose two exemplars of the set to demonstrate the possibility of translating modern military and clinical know-how, because the two exemplars share an identical noninvasive algorithm adapted to smart sensor pairs for real-world persistent surveillance. Currently, the standard diagnoses for malignant tumors and diabetes disorders are blood serum tests, X-ray CAT scans, and biopsies, used at times in physical checkups by physicians as cohort-average wellness baselines. The loss of quality of life in making second careers productive may be caused by a lack of timeliness in correct diagnoses and easier treatments, which contributes to the quarter of human errors generating lawsuits against physicians and hospitals, further escalating insurance costs and wasteful healthcare expenditure. Such a vicious cycle could be eliminated entirely by building an "individual diagnostic aid (IDA)," similar to the trend toward personalized drugs, developed from daily noninvasive intelligent databases of the "wellness baseline profile (WBP)". Since our physiological state undulates diurnally, Nyquist anti-aliasing theory dictates a minimum twice-a-day sampling of the WBP for the IDA, which must be made affordable by means of a noninvasive, unsupervised, and unbiased methodology at the convenience of the home. Thus, a pair of military infrared (IR) spectral cameras has been demonstrated for the noninvasive spectrogram ratio test of the thermal radiation spontaneously emitted from a normal human body at 37°C. This invisible self-emission spreads from 3 microns to 12 microns of the radiation wavelengths

  7. Cardiovascular risk factors associated with lower baseline cognitive performance in HIV-positive persons.

    Science.gov (United States)

    Wright, E J; Grund, B; Robertson, K; Brew, B J; Roediger, M; Bain, M P; Drummond, F; Vjecha, M J; Hoy, J; Miller, C; Penalva de Oliveira, A C; Pumpradit, W; Shlay, J C; El-Sadr, W; Price, R W

    2010-09-07

    To determine factors associated with baseline neurocognitive performance in HIV-infected participants enrolled in the Strategies for Management of Antiretroviral Therapy (SMART) neurology substudy. Participants from Australia, North America, Brazil, and Thailand were administered a 5-test neurocognitive battery. Z scores and the neurocognitive performance outcome measure, the quantitative neurocognitive performance z score (QNPZ-5), were calculated using US norms. Neurocognitive impairment was defined as z scores penetration effectiveness rank of antiretroviral regimens were not. In this HIV-positive population with high CD4 cell counts, neurocognitive impairment was associated with prior CVD. Lower neurocognitive performance was associated with prior CVD, hypertension, and hypercholesterolemia, but not conventional HAD risk factors. The contribution of CVD and cardiovascular risk factors to the neurocognition of HIV-positive populations warrants further investigation.

  8. Neurocognitive function in HIV-infected patients: comparison of two methods to define impairment.

    Directory of Open Access Journals (Sweden)

    Alejandro Arenas-Pinto

    To compare two definitions of neurocognitive impairment (NCI) in a large clinical trial of effectively treated HIV-infected adults at baseline. The Hopkins Verbal Learning Test-Revised (HVLT-R), Colour Trail Test (CTT), and Grooved Pegboard Test (GPT) were applied, exploring five cognitive domains. Raw scores were transformed into Z-scores, and NCI was defined as a summary NPZ-5 score more than one standard deviation below the mean of the normative dataset (i.e. <-1SD), or Z-scores <-1SD in at least two individual domains (categorical scale). Principal component analysis (PCA) was performed to explore the contribution of individual tests to the total variance. The mean NPZ-5 score was -0.72 (SD 0.98), and 178/548 (32%) participants had NPZ-5 scores <-1SD. When impairment was defined as <-1SD in at least two individual tests, 283 (52%) patients were impaired. Strong correlations between the two components of the HVLT-R test (learning/recall) (r = 0.73) and between the two components of the CTT (attention/executive functioning) (r = 0.66) were observed. PCA showed a clustering with three components accounting for 88% of the total variance. When patients who scored <-1SD only in two correlated tests were considered as not impaired, the prevalence of NCI was 43%. When correlated test scores were averaged, 36% of participants had NPZ-3 scores <-1SD and 32% underperformed in at least two individual tests. Controlling for the differential contribution of individual test scores to overall performance, and for the level of correlation between components of the test battery used, appears to be important when testing cognitive function. These two factors are likely to affect both summary scores and categorical scales in defining cognitive impairment. EUDRACT: 2007-006448-23 and ISRCTN04857074.
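The two impairment definitions being compared can be sketched schematically as follows; the domain z-scores and thresholds in the example are illustrative assumptions, not trial data.

```python
import numpy as np

def npz_summary_impaired(domain_z, threshold=-1.0):
    """Summary definition: mean of the domain z-scores falls below
    -1 SD of the normative dataset."""
    return float(np.mean(domain_z)) < threshold

def categorical_impaired(domain_z, threshold=-1.0, min_domains=2):
    """Categorical definition: z < -1 SD in at least `min_domains`
    individual domains."""
    return sum(z < threshold for z in domain_z) >= min_domains
```

A profile such as [-1.2, -1.1, 0.3, 0.2, 0.1] is impaired under the categorical rule (two domains below -1) but not under the summary rule (mean -0.34), which illustrates how the two definitions can yield the differing prevalence estimates reported here (32% vs 52%).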

  9. Hanford Site technical baseline database. Revision 1

    International Nuclear Information System (INIS)

    Porter, P.E.

    1995-01-01

    This report lists the Hanford specific files (Table 1) that make up the Hanford Site Technical Baseline Database. Table 2 includes the delta files that delineate the differences between this revision and revision 0 of the Hanford Site Technical Baseline Database. This information is being managed and maintained on the Hanford RDD-100 System, which uses the capabilities of RDD-100, a systems engineering software system of Ascent Logic Corporation (ALC). This revision of the Hanford Site Technical Baseline Database uses RDD-100 version 3.0.2.2 (see Table 3). Directories reflect those controlled by the Hanford RDD-100 System Administrator. Table 4 provides information regarding the platform. A cassette tape containing the Hanford Site Technical Baseline Database is available

  10. Test

    DEFF Research Database (Denmark)

    Bendixen, Carsten

    2014-01-01

    A brief, introductory, perspective-giving, and concept-clarifying presentation of the concept of the test in the educational universe.

  11. THE US LONG BASELINE NEUTRINO EXPERIMENT STUDY.

    Energy Technology Data Exchange (ETDEWEB)

    BISHAI,M.

    2007-08-06

    The US Long Baseline Neutrino Experiment Study was commissioned jointly by Brookhaven National Laboratory (BNL) and Fermi National Accelerator Laboratory (FNAL) to investigate the potential for future U.S.-based long baseline neutrino oscillation experiments using MW-class conventional neutrino beams that can be produced at FNAL. The experimental baselines are based on two possible detector locations: (1) off-axis to the existing FNAL NuMI beamline at baselines of 700 to 810 km, and (2) NSF's proposed future Deep Underground Science and Engineering Laboratory (DUSEL) at baselines greater than 1000 km. Two detector technologies are considered: a megaton-class water Cherenkov detector deployed deep underground at a DUSEL site, or a 100 kT liquid argon time-projection chamber (TPC) deployed on the surface at any of the proposed sites. The physics sensitivities of the proposed experiments are summarized. We find that conventional horn-focused wide-band neutrino beam options from FNAL aimed at a massive detector with a baseline of >1000 km have the best sensitivity to CP violation and the neutrino mass hierarchy for values of the mixing angle θ13 down to 2°.

  12. Shifted Baselines Reduce Willingness to Pay for Conservation

    Directory of Open Access Journals (Sweden)

    Loren McClenachan

    2018-02-01

    A loss of memory of past environmental degradation has resulted in shifted baselines, which may result in conservation and restoration goals that are less ambitious than if stakeholders had full knowledge of ecosystem potential. However, the link between perception of baseline states and support for conservation planning has not been tested empirically. Here, we investigate how perceptions of change in coral reef ecosystems affect stakeholders' willingness to pay (WTP) for the establishment of protected areas. Coral reefs are experiencing rapid, global change that is observable by the public, and therefore provide an ideal ecosystem in which to test links between beliefs about baseline states and willingness to support conservation. Our survey respondents perceived change to coral reef communities across six variables: coral abundance, fish abundance, fish diversity, fish size, sedimentation, and water pollution. Respondents who accurately perceived declines in reef health had significantly higher WTP for protected areas (US $256.80 vs $102.50 per year), suggesting that shifted baselines may reduce engagement with conservation efforts. If WTP translates to engagement, this suggests that goals for restoration and recovery are likely to be more ambitious if the public is aware of long-term change. Therefore, communicating the scope and depth of environmental problems is essential to engaging the public in conservation.

  13. Baseline series fragrance markers fail to predict contact allergy.

    Science.gov (United States)

    Mann, Jack; McFadden, John P; White, Jonathan M L; White, Ian R; Banerjee, Piu

    2014-05-01

    Negative patch test results with fragrance allergy markers in the European baseline series do not always predict a negative reaction to individual fragrance substances. To determine the frequencies of positive test reactions to the 26 fragrance substances for which labelling is mandatory in the EU, and how effectively reactions to fragrance markers in the baseline series predict positive reactions to the fragrance substances that are labelled. The records of 1951 eczema patients, routinely tested with the labelled fragrance substances and with an extended European baseline series in 2011 and 2012, were retrospectively reviewed. Two hundred and eighty-one (14.4%) (71.2% females) reacted to one or more allergens from the labelled-fragrance substance series and/or a fragrance marker from the European baseline series. The allergens that were positive with the greatest frequencies were cinnamyl alcohol (48; 2.46%), Evernia furfuracea (44; 2.26%), and isoeugenol (40; 2.05%). Of the 203 patients who reacted to any of the 26 fragrances in the labelled-fragrance substance series, only 117 (57.6%) also reacted to a fragrance marker in the baseline series. One hundred and seven (52.7%) reacted to either fragrance mix I or fragrance mix II, 28 (13.8%) reacted to Myroxylon pereirae, and 13 (6.4%) reacted to hydroxyisohexyl 3-cyclohexene carboxaldehyde. These findings confirm that the standard fragrance markers fail to identify patients with contact allergies to the 26 fragrances. © 2014 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  14. Pediatric Heart Transplantation: Transitioning to Adult Care (TRANSIT): Baseline Findings.

    Science.gov (United States)

    Grady, Kathleen L; Hof, Kathleen Van't; Andrei, Adin-Cristian; Shankel, Tamara; Chinnock, Richard; Miyamoto, Shelley; Ambardekar, Amrut V; Anderson, Allen; Addonizio, Linda; Latif, Farhana; Lefkowitz, Debra; Goldberg, Lee; Hollander, Seth A; Pham, Michael; Weissberg-Benchell, Jill; Cool, Nichole; Yancy, Clyde; Pahl, Elfriede

    2018-02-01

    Young adult solid organ transplant recipients who transfer from pediatric to adult care experience poor outcomes related to decreased adherence to the medical regimen. Our pilot trial for young adults who had heart transplant (HT) who transfer to adult care tests an intervention focused on increasing HT knowledge, self-management and self-advocacy skills, and enhancing support, as compared to usual care. We report baseline findings between groups regarding (1) patient-level outcomes and (2) components of the intervention. From 3/14 to 9/16, 88 subjects enrolled and randomized to intervention (n = 43) or usual care (n = 45) at six pediatric HT centers. Patient self-report questionnaires and medical records data were collected at baseline, and 3 and 6 months after transfer. For this report, baseline findings (at enrollment and prior to transfer to adult care) were analyzed using Chi-square and t-tests. Level of significance was p Baseline demographics were similar in the intervention and usual care arms: age 21.3 ± 3.2 vs 21.5 ± 3.3 years and female 44% vs 49%, respectively. At baseline, there were no differences between intervention and usual care for use of tacrolimus (70 vs 62%); tacrolimus level (mean ± SD = 6.5 ± 2.3 ng/ml vs 5.6 ± 2.3 ng/ml); average of the within patient standard deviation of the baseline mean tacrolimus levels (1.6 vs 1.3); and adherence to the medical regimen [3.6 ± 0.4 vs 3.5 ± 0.5 (1 = hardly ever to 4 = all of the time)], respectively. At baseline, both groups had a modest amount of HT knowledge, were learning self-management and self-advocacy, and perceived they were adequately supported. Baseline findings indicate that transitioning HT recipients lack essential knowledge about HT and have incomplete self-management and self-advocacy skills.

  15. Defining asthma in genetic studies

    NARCIS (Netherlands)

    Koppelman, GH; Postma, DS; Meijer, G.

    1999-01-01

    Genetic studies have been hampered by the lack of a gold standard to diagnose asthma. The complex nature of asthma makes it more difficult to identify asthma genes. Therefore, approaches to define phenotypes, which have been successful in other genetically complex diseases, may be applied to define

  16. Nickel on the market: a baseline survey of articles in 'prolonged contact' with skin.

    Science.gov (United States)

    Ringborg, Evelina; Lidén, Carola; Julander, Anneli

    2016-08-01

    In April 2014, the European Chemicals Agency defined the concept of 'prolonged contact with skin' as used in the EU nickel restriction. To establish a baseline of nickel-releasing items on the Swedish market conforming with the EU nickel restriction according to the definition of 'prolonged contact' with the skin. We performed a limited market survey in Stockholm, Sweden. Items with metallic parts that come into contact with the skin, except those explicitly mentioned in the legal text, were chosen. The dimethylglyoxime (DMG) test was used to evaluate nickel release. One hundred and forty-one items belonging to one of three categories - accessories, utensils for needlework, painting and writing (called utensils), and electronic devices - were tested in the study. Forty-four percent of all items were DMG test-positive (releasing nickel), and 9% gave a doubtful DMG test result. The large proportion of nickel-releasing items in the present study shows clearly that broader parts of industry need to take action to prevent nickel allergy. The high proportion of DMG test-positive items indicates that there is still much work to be done to reduce the nickel exposure of the population. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  17. Baseline effects on carbon footprints of biofuels: The case of wood

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, Eric, E-mail: johnsonatlantic@gmail.com [Atlantic Consulting, 8136 Gattikon (Switzerland); Tschudi, Daniel [ETH, Berghaldenstrasse 46, 8800 Thalwil (Switzerland)

    2012-11-15

    As biofuel usage has boomed over the past decade, so has research and regulatory interest in its carbon accounting. This paper examines one aspect of that carbon accounting: the baseline, i.e. the reference case against which other conditions or changes can be compared. A literature search and analysis identified four baseline types: no baseline; reference point; marginal fossil fuel; and biomass opportunity cost. The fourth one, biomass opportunity cost, is defined in more detail, because this is not done elsewhere in the literature. The four baselines are then applied to the carbon footprint of a wood-fired power plant. The footprint of the resulting wood-fired electricity varies dramatically, according to the type of baseline. Baseline type is also found to be the footprint's most significant sensitivity. Other significant sensitivities are: efficiency of the power plant; the growth (or re-growth) rate of the forest that supplies the wood; and the residue fraction of the wood. Length of the policy horizon is also an important factor in determining the footprint. The paper concludes that because of their significance and variability, baseline choices should be made very explicit in biofuel carbon footprints. - Highlights: ► Four baseline types for biofuel footprinting are identified. ► One type, 'biomass opportunity cost', is defined mathematically and graphically. ► Choice of baseline can dramatically affect the footprint result. ► The 'no baseline' approach is not acceptable. ► Choice between the other three baselines depends on the question being addressed.

  18. A long baseline global stereo matching based upon short baseline estimation

    Science.gov (United States)

    Li, Jing; Zhao, Hong; Li, Zigang; Gu, Feifei; Zhao, Zixin; Ma, Yueyang; Fang, Meiqi

    2018-05-01

    In global stereo vision, balancing matching efficiency and computing accuracy seems impossible, because the two contradict each other. In the case of a long baseline, this contradiction becomes more prominent. To solve this difficult problem, this paper proposes a novel idea for improving both the efficiency and the accuracy of global stereo matching over a long baseline. First, reference images located between the long-baseline image pair are chosen to form new image pairs with short baselines. The relationship between the disparities of pixels in image pairs with different baselines is revealed by considering the quantization error, so that the disparity search range under the long baseline can be reduced under the guidance of the short baseline to gain matching efficiency. The novel idea is then integrated into graph cuts (GCs) to form a multi-step GC algorithm based on short-baseline estimation, by which the disparity map under the long baseline can be calculated iteratively on the basis of the previous matching. Furthermore, image information from pixels that are non-occluded under the short baseline but occluded under the long baseline can be employed to improve the matching accuracy. Although the time complexity of the proposed method depends on the locations of the chosen reference images, it is usually much lower for long-baseline stereo matching than that of the traditional GC algorithm. Finally, the validity of the proposed method is examined by experiments on benchmark datasets. The results show that the proposed method is superior to the traditional GC method in terms of efficiency and accuracy, and it is thus suitable for long-baseline stereo matching.
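The guidance step described above rests on disparity being proportional to baseline length in rectified stereo (d = f·B/Z), so a short-baseline disparity estimate bounds the long-baseline search window. A minimal sketch, assuming integer-pixel quantization of the short-baseline disparity and a hypothetical one-pixel safety margin (neither taken from the paper):

```python
def long_baseline_search_range(d_short, b_short, b_long, margin=1.0):
    """Predict the long-baseline disparity search window from a
    short-baseline estimate, using d = f * B / Z, i.e. d proportional to B.

    d_short: estimated disparity (pixels) under the short baseline.
    b_short, b_long: baseline lengths in the same units.
    """
    scale = b_long / b_short
    d_pred = d_short * scale
    # Quantizing d_short to whole pixels makes d_pred uncertain by up to
    # +/- scale/2 pixels; add a safety margin on top.
    half_width = scale * 0.5 + margin
    return d_pred - half_width, d_pred + half_width
```

With b_long/b_short = 4, a short-baseline disparity of 10 px confines the long-baseline search to roughly 37-43 px rather than the full disparity range, which is where the efficiency gain of the multi-step scheme comes from.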

  19. Geochemical baseline studies of soil in Finland

    Science.gov (United States)

    Pihlaja, Jouni

    2017-04-01

    Soil element concentrations vary considerably by region in Finland. This is mostly caused by the different bedrock types, which are reflected in soil composition. The Geological Survey of Finland (GTK) is carrying out geochemical baseline studies in Finland. In the previous phase, the research focused on urban areas and mine environments. The information can, for example, be used to determine the need for soil remediation, to assess environmental impacts, or to measure the natural state of soil in industrial areas or mine districts. The field work is done by taking soil samples, typically at a depth of 0-10 cm. Sampling sites are chosen to represent the areas most vulnerable to human impacts from possibly toxic soil element contents: playgrounds, day-care centers, schools, parks, and residential areas. In the mine districts, the samples are taken from areas located outside those affected by airborne dust. Element contents of the soil samples are then analyzed with ICP-AES and ICP-MS, and Hg with CV-AAS. The results of the geochemical baseline studies are published in the Finnish national geochemical baseline database (TAPIR). The geochemical baseline map service is free for all users via an internet browser. Through this map service it is possible to calculate regional soil baseline values using geochemical data stored in the map service database. Baseline data for 17 elements in total are provided in the map service, which can be viewed on the GTK's web pages (http://gtkdata.gtk.fi/Tapir/indexEN.html).

  20. Theoretical approaches to elections defining

    OpenAIRE

    Natalya V. Lebedeva

    2011-01-01

    Theoretical approaches to elections defining develop the nature, essence and content of elections, help to determine their place and a role as one of the major national law institutions in democratic system.


  2. Defining Modules, Modularity and Modularization

    DEFF Research Database (Denmark)

    Miller, Thomas Dedenroth; Pedersen, Per Erik Elgård

    The paper describes the evolution of the concept of modularity in a historical perspective. The main reasons for modularity are: to create variety, to utilize similarities, and to reduce complexity. The paper defines the terms module, modularity, and modularization.


  4. Waste management project technical baseline description

    International Nuclear Information System (INIS)

    Sederburg, J.P.

    1997-01-01

    A systems engineering approach has been taken to describe the technical baseline under which the Waste Management Project is currently operating. The document contains a mission analysis, functional analysis, requirements analysis, interface definitions, alternatives analysis, system definition, documentation requirements, implementation definitions, and a discussion of uncertainties facing the Project.

  5. Baseline and Multimodal UAV GCS Interface Design

    Science.gov (United States)

    2013-07-01

    complete a computerized version of the NASA-TLX assessment of perceived mental workload. 2.3 Results The baseline condition ran smoothly and with...System MALE Medium-altitude, Long-endurance NASA-TLX NASA Task Load Index SA Situation Awareness TDT Tucker Davis Technologies UAV Uninhabited Aerial

  6. National Cyberethics, Cybersafety, Cybersecurity Baseline Study

    Science.gov (United States)

    Education Digest: Essential Readings Condensed for Quick Review, 2009

    2009-01-01

    This article presents findings from a study that explores the nature of the Cyberethics, Cybersafety, and Cybersecurity (C3) educational awareness policies, initiatives, curriculum, and practices currently taking place in the U.S. public and private K-12 educational settings. The study establishes baseline data on C3 awareness, which can be used…

  7. Guidance on Port Biological Baseline Surveys (PBBS)

    Digital Repository Service at National Institute of Oceanography (India)

    Awad, A.; Haag, F.; Anil, A.C.; Abdulla, A.

    This publication has been prepared by GBP, IOI, CSIR-NIO and IUCN in order to serve as guidance to those who are planning to carry out a port biological baseline survey, in particular in the context of Ballast Water Management. It has been drafted...

  8. Solid Waste Program technical baseline description

    Energy Technology Data Exchange (ETDEWEB)

    Carlson, A.B.

    1994-07-01

    A systems engineering approach has been taken to describe the technical baseline under which the Solid Waste Program is currently operating. The document contains a mission analysis, functional analysis, system definition, documentation requirements, facility and project bases, and uncertainties facing the program.

  9. Toward Baseline Software Anomalies in NASA Missions

    Science.gov (United States)

    Layman, Lucas; Zelkowitz, Marvin; Basili, Victor; Nikora, Allen P.

    2012-01-01

    In this fast abstract, we provide preliminary findings from an analysis of 14,500 spacecraft anomalies from unmanned NASA missions. We provide some baselines for the distributions of software vs. non-software anomalies in spaceflight systems, the risk ratings of software anomalies, and the corrective actions associated with software anomalies.

  10. Physics Potential of Long-Baseline Experiments

    Directory of Open Access Journals (Sweden)

    Sanjib Kumar Agarwalla

    2014-01-01

    The discovery of neutrino mixing and oscillations over the past decade provides firm evidence for new physics beyond the Standard Model. Recently, θ13 has been determined to be moderately large, quite close to its previous upper bound. This represents a significant milestone in establishing the three-flavor oscillation picture of neutrinos. It has opened up exciting prospects for current and future long-baseline neutrino oscillation experiments towards addressing the remaining fundamental questions, in particular the type of the neutrino mass hierarchy and the possible presence of a CP-violating phase. Another recent and crucial development is the indication of non-maximal 2-3 mixing angle, causing the octant ambiguity of θ23. In this paper, I will review the phenomenology of long-baseline neutrino oscillations with a special emphasis on sub-leading three-flavor effects, which will play a crucial role in resolving these unknowns. First, I will give a brief description of neutrino oscillation phenomenon. Then, I will discuss our present global understanding of the neutrino mass-mixing parameters and will identify the major unknowns in this sector. After that, I will present the physics reach of current generation long-baseline experiments. Finally, I will conclude with a discussion on the physics capabilities of accelerator-driven possible future long-baseline precision oscillation facilities.

  11. Accelerated Best Basis Inventory Baselining Task

    International Nuclear Information System (INIS)

    SASAKI, L.M.

    2001-01-01

    The baselining effort was recently proposed to bring the Best-Basis Inventory (BBI) and Question No.8 of the Tank Interpretive Report (TIR) for all 177 tanks to the current standards and protocols and to prepare a TIR Question No.8 if one is not already available. This plan outlines the objectives and methodology of the accelerated BBI baselining task. BBI baselining meetings held during December 2000 resulted in a revised BBI methodology and an initial set of BBI creation rules to be used in the baselining effort. The objectives of the BBI baselining effort are to: (1) Provide inventories that are consistent with the revised BBI methodology and new BBI creation rules. (2) Split the total tank waste in each tank into six waste phases, as appropriate (Supernatant, saltcake solids, saltcake liquid, sludge solids, sludge liquid, and retained gas). In some tanks, the solids and liquid portions of the sludge and/or saltcake may be combined into a single sludge or saltcake phase. (3) Identify sampling events that are to be used for calculating the BBIs. (4) Update waste volumes for subsequent reconciliation with the Hanlon (2001) waste tank summary. (5) Implement new waste type templates. (6) Include any sample data that might have been unintentionally omitted in the previous BBI and remove any sample data that should not have been included. Sample data to be used in the BBI must be available on TWINS. (7) Ensure that an inventory value for each standard BBI analyte is provided for each waste component. Sample based inventories for supplemental BBI analytes will be included when available. (8) Provide new means and confidence interval reports if one is not already available and include uncertainties in reporting inventory values

  12. Mercury baseline levels in Flemish soils (Belgium)

    International Nuclear Information System (INIS)

    Tack, Filip M.G.; Vanhaesebroeck, Thomas; Verloo, Marc G.; Van Rompaey, Kurt; Ranst, Eric van

    2005-01-01

    It is important to establish contaminant levels that are normally present in soils to provide baseline data for pollution studies. Mercury is a toxic element of concern. This study was aimed at assessing baseline mercury levels in soils in Flanders. In a previous study, mercury contents in soils in Oost-Vlaanderen were found to be significantly above levels reported elsewhere. For the current study, observations were extended over two more provinces, West-Vlaanderen and Antwerpen. Ranges of soil Hg contents were distinctly higher in the province Oost-Vlaanderen (interquartile range from 0.09 to 0.43 mg/kg) than in the other provinces (interquartile ranges from 0.07 to 0.13 and 0.07 to 0.15 mg/kg for West-Vlaanderen and Antwerpen, respectively). The standard threshold method was applied to separate soils containing baseline levels of Hg from the data. Baseline concentrations for Hg were characterised by a median of 0.10 mg Hg/kg dry soil, an interquartile range from 0.07 to 0.14 mg/kg and a 90% percentile value of 0.30 mg/kg. The influence of soil properties such as clay and organic carbon contents, and pH on baseline Hg concentrations was not important. Maps of the spatial distribution of Hg levels showed that the province Oost-Vlaanderen exhibited zones with systematically higher Hg soil contents. This may be related to the former presence of many small-scale industries employing mercury in that region. - Increased mercury levels may reflect human activity
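The summary statistics this record reports (median, interquartile range, 90th percentile) are straightforward to compute. The sketch below uses a purely hypothetical sample of soil Hg contents; only the statistical procedure, not the data, reflects the study:

```python
import numpy as np

def baseline_summary(hg_mg_per_kg):
    """Summarise soil-Hg baseline levels: median, interquartile range, 90th percentile."""
    a = np.asarray(hg_mg_per_kg, dtype=float)
    q1, med, q3 = np.percentile(a, [25, 50, 75])
    p90 = np.percentile(a, 90)
    return {"median": med, "iqr": (q1, q3), "p90": p90}

# Hypothetical sample of soil Hg contents (mg/kg dry soil)
sample = [0.05, 0.07, 0.08, 0.10, 0.10, 0.12, 0.14, 0.18, 0.25, 0.30]
stats = baseline_summary(sample)
```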

  13. Defining Plagiarism: A Literature Review

    Directory of Open Access Journals (Sweden)

    Akbar Akbar

    2018-02-01

    Plagiarism has repeatedly occurred in Indonesia, resulting in a focus on such academic misbehavior as a “central issue” in Indonesian higher education. One of the issues in addressing plagiarism in higher education is confusion over defining plagiarism: Indonesian academics appear to hold different perceptions when defining it. This article aims to explore the issue of plagiarism by helping define it in order to address this confusion among Indonesian academics. The article applies a literature review, first finding relevant articles after identifying databases for literature searching. After the collection of the required articles for review, the articles were synthesized before presenting the findings. This study has explored the definition of plagiarism in the context of higher education. The research found that plagiarism is defined in relation to criminal acts; the many discursive features used position plagiaristic acts as illegal deeds. The study also found that cultural backgrounds and exposure to plagiarism were influential in defining plagiarism.

  14. Active galactic nuclei cores in infrared-faint radio sources. Very long baseline interferometry observations using the Very Long Baseline Array

    Science.gov (United States)

    Herzog, A.; Middelberg, E.; Norris, R. P.; Spitler, L. R.; Deller, A. T.; Collier, J. D.; Parker, Q. A.

    2015-06-01

    Context. Infrared-faint radio sources (IFRS) form a new class of galaxies characterised by radio flux densities between tenths and tens of mJy and faint or absent infrared counterparts. It has been suggested that these objects are radio-loud active galactic nuclei (AGNs) at significant redshifts (z ≳ 2). Aims: Whereas the high redshifts of IFRS have been recently confirmed based on spectroscopic data, the evidence for the presence of AGNs in IFRS is mainly indirect. So far, only two AGNs have been unquestionably confirmed in IFRS based on very long baseline interferometry (VLBI) observations. In this work, we test the hypothesis that IFRS contain AGNs in a large sample of sources using VLBI. Methods: We observed 57 IFRS with the Very Long Baseline Array (VLBA) down to a detection sensitivity in the sub-mJy regime and detected compact cores in 35 sources. Results: Our VLBA detections increase the number of VLBI-detected IFRS from 2 to 37 and provide strong evidence that most - if not all - IFRS contain AGNs. We find that IFRS have a marginally higher VLBI detection fraction than randomly selected sources with mJy flux densities at arcsec-scales. Moreover, our data provide a positive correlation between compactness - defined as the ratio of milliarcsec- to arcsec-scale flux density - and redshift for IFRS, but suggest a decreasing mean compactness with increasing arcsec-scale radio flux density. Based on these findings, we suggest that IFRS tend to contain young AGNs whose jets have not formed yet or have not expanded, equivalent to very compact objects. We found two IFRS that are resolved into two components. The two components are spatially separated by a few hundred milliarcseconds in both cases. They might be components of one AGN, a binary black hole, or the result of gravitational lensing.

  15. EML Chester - 1982. Annual report of the Regional Baseline Station at Chester, New Jersey

    International Nuclear Information System (INIS)

    Volchok, H.L.

    1982-11-01

    The Environmental Measurements Laboratory (EML) has maintained a regional baseline station at Chester, New Jersey since 1976. The site provides EML with a remote, rural facility for carrying out regional baseline research and for testing field equipment. This report updates the various programs underway at the Chester site. Separate abstracts have been prepared for the included papers

  16. Defining safety goals. 2. Basic Consideration on Defining Safety Goals

    International Nuclear Information System (INIS)

    Hakata, T.

    2001-01-01

    The purpose of this study is to develop basic safety goals that are rational and consistent for all nuclear facilities, including nuclear power plants and fuel cycle facilities. Basic safety goals (risk limits) by an index of radiation dose are discussed, which are based on health effects of detriment and fatality and on risk levels presumably accepted by society. The contents of this paper are the personal opinions of the author. The desirable structure of safety goals is assumed to be 'basic safety goals plus specific safety goals (or supplemental safety goals) for each sort of facility, which reflects their characteristics'. The requisites of the basic safety goals must include (a) rational bases (scientific and social), (b) comprehensiveness (common to all sorts of nuclear facilities, covering normal to accidental conditions), and (c) applicability. To meet these requirements, the basic safety goals might have to be a risk profile expression by an index of radiation dose. The societal rationality is consideration of absolute risk levels (10^-6 or 10^-7/yr) and/or relative risk factors (such as 0.1% of the U.S. safety goals) that the general public accepts as tolerable. The following quantitative objectives are adopted in this study for protection of average individuals in the vicinity of a nuclear facility: 1. The additive annual radiation dose during normal operation must be below 10^-4/yr (health detriment), 2x10^-6/yr (latent cancer and severe hereditary effects), and 10^-7/yr (acute fatality), from the statistics in Japan. The radiation effects on human beings are determined by recommendations of UNSCEAR (Ref. 1) and ICRP. The health effects considered are non-severe stochastic health detriment, i.e., detectable opacities of the lens of the eye (threshold 0.5 to 2 Sv), depression of hematopoiesis of bone marrow (0.5 Sv), and depression of reproductive capability (temporary sterility of testes) (0.15 Sv). The LD50/60 of acute fatality is ~4 Sv, and fatalities by latent

  17. Regional and hemispheric influences on temporal variability in baseline carbon monoxide and ozone over the Northeast US

    Science.gov (United States)

    Interannual variability in baseline carbon monoxide (CO) and ozone (O3), defined as mixing ratios under minimal influence of recent and local emissions, was studied for seven rural sites in the Northeast US over 2001–2010. Annual baseline CO exhibited statistically signific...

  18. Modular Software-Defined Radio

    Directory of Open Access Journals (Sweden)

    Rhiemeier Arnd-Ragnar

    2005-01-01

    In view of the technical and commercial boundary conditions for software-defined radio (SDR), it is suggestive to reconsider the concept anew from an unconventional point of view. The organizational principles of signal processing (rather than the signal processing algorithms themselves) are the main focus of this work on modular software-defined radio. Modularity and flexibility are just two key characteristics of the SDR environment which extend smoothly into the modeling of hardware and software. In particular, the proposed model of signal processing software includes irregular, connected, directed, acyclic graphs with random node weights and random edges. Several approaches for mapping such software to a given hardware are discussed. Taking into account previous findings as well as new results from system simulations presented here, the paper finally concludes with the utility of pipelining as a general design guideline for modular software-defined radio.

  19. Defining and Selecting Independent Directors

    Directory of Open Access Journals (Sweden)

    Eric Pichet

    2017-10-01

    Drawing from the Enlightened Shareholder Theory that the author first developed in 2011, this theoretical paper with practical and normative ambitions achieves a better definition of independent director, while improving the understanding of the roles he fulfils on boards of directors. The first part defines constructs like firms, Governance system and Corporate governance, offering a clear distinction between the latter two concepts before explaining the four main missions of a board. The second part defines the ideal independent director by outlining the objective qualities that are necessary and adding those subjective aspects that have turned this into a veritable profession. The third part defines the ideal process for selecting independent directors, based on nominating committees that should themselves be independent. It also includes ways of assessing directors who are currently in function, as well as modalities for renewing their mandates. The paper’s conclusion presents the Paradox of the Independent Director.

  20. Defining and Classifying Interest Groups

    DEFF Research Database (Denmark)

    Baroni, Laura; Carroll, Brendan; Chalmers, Adam

    2014-01-01

    The interest group concept is defined in many different ways in the existing literature and a range of different classification schemes are employed. This complicates comparisons between different studies and their findings. One of the important tasks faced by interest group scholars engaged...... in large-N studies is therefore to define the concept of an interest group and to determine which classification scheme to use for different group types. After reviewing the existing literature, this article sets out to compare different approaches to defining and classifying interest groups with a sample...... in the organizational attributes of specific interest group types. As expected, our comparison of coding schemes reveals a closer link between group attributes and group type in narrower classification schemes based on group organizational characteristics than those based on a behavioral definition of lobbying....

  1. ON DEFINING S-SPACES

    Directory of Open Access Journals (Sweden)

    Francesco Strati

    2013-05-01

    The present work is intended to be an introduction to the Superposition Theory of David Carfì. In particular I shall depict the meaning of his brand new theory, on the one hand in an informal fashion and on the other hand by giving a formal approach to the algebraic structure of the theory: the S-linear algebra. This kind of structure underpins the notion of S-spaces (or Carfì-spaces) by defining both its properties and its nature. Thus I shall define the S-triple as the fundamental principle upon which the S-linear algebra is built up.

  2. Baseline Vascular Cognitive Impairment Predicts the Course of Apathetic Symptoms After Stroke: The CASPER Study.

    Science.gov (United States)

    Douven, Elles; Köhler, Sebastian; Schievink, Syenna H J; van Oostenbrugge, Robert J; Staals, Julie; Verhey, Frans R J; Aalten, Pauline

    2018-03-01

    To examine the influence of vascular cognitive impairment (VCI) on the course of poststroke depression (PSD) and poststroke apathy (PSA). Included were 250 stroke patients who underwent neuropsychological and neuropsychiatric assessment 3 months after stroke (baseline) and at a 6- and 12-month follow-up after baseline. Linear mixed models tested the influence of VCI in at least one cognitive domain (any VCI) or multidomain VCI (VCI in multiple cognitive domains) at baseline and domain-specific VCI at baseline on levels of depression and apathy over time, with random effects for intercept and slope. Almost half of the patients showed any VCI at baseline, and any VCI was associated with increasing apathy levels from baseline to the 12-month follow-up. Patients with multidomain VCI had higher apathy scores at the 6- and 12-month follow-up compared with patients with VCI in a single cognitive domain. Domain-specific analyses showed that impaired executive function and slowed information processing speed went together with increasing apathy levels from baseline to 6- and 12-month follow-up. None of the cognitive variables predicted the course of depressive symptoms. Baseline VCI is associated with increasing apathy levels from baseline to the chronic stroke phase, whereas no association was found between baseline VCI and the course of depressive symptoms. Health professionals should be aware that apathy might be absent early after stroke but may evolve over time in patients with VCI. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  3. Effects of Baseline Selection on Magnetocardiography: P-Q and T-P Intervals

    International Nuclear Information System (INIS)

    Lim, Hyun Kyoon; Kwon, Hyuk Chan; Kim, Tae En; Lee, Yong Ho; Kim, Jin Mok; Kim, In Seon; Kim, Ki Woong; Park, Yong Ki

    2007-01-01

    The baseline selection is the first and most important step in analyzing magnetocardiography (MCG) parameters. There is no difficulty in selecting the baseline between the P- and Q-wave peaks (P-Q interval) of MCG waves recorded from healthy subjects, because the P-Q intervals of healthy subjects do not vary much. However, patients with ischemic heart disease often show an unstable P-Q interval, which does not seem appropriate for the baseline. In this case, the T-P interval is recommended as an alternative baseline. However, there has been no study on the difference made by the baseline selection. In this study, we examined the effect of different baseline selections. MCG data were analyzed from twenty healthy subjects and twenty-one patients whose baselines were instead selected in the T-P interval because their P-Q intervals were inappropriate. A paired t-test was used to compare the two sets of data. Fifteen parameters derived from the R-wave peak, the T-wave peak, and the period Tmax/3 ~ Tmax were compared across the two baseline selections. As a result, most parameters did not show significant differences (p>0.05), apart from a few. Therefore, there will be no significant difference whichever of the two intervals is selected for the MCG baseline. However, for consistent analysis, the P-Q interval is strongly recommended for the baseline correction.
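The paired comparison described in this record can be sketched in a few lines. The data below are synthetic (generated so that the two baseline choices give nearly identical parameter values) and the parameter itself is hypothetical; only the shape of the analysis mirrors the study:

```python
import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(0)
# Hypothetical MCG parameter (e.g., an R-wave-peak-derived value) for 20 subjects,
# computed once with a P-Q-interval baseline and once with a T-P-interval baseline.
pq_baseline = rng.normal(loc=10.0, scale=1.0, size=20)
tp_baseline = pq_baseline + rng.normal(loc=0.0, scale=0.05, size=20)

# Paired t-test: the same subjects measured under the two baseline choices
t_stat, p_value = ttest_rel(pq_baseline, tp_baseline)
# A p-value above 0.05 would mirror the study's finding of no significant difference.
```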

  4. Baseline brain energy supports the state of consciousness.

    Science.gov (United States)

    Shulman, Robert G; Hyder, Fahmeed; Rothman, Douglas L

    2009-07-07

    An individual, human or animal, is defined to be in a conscious state empirically by the behavioral ability to respond meaningfully to stimuli, whereas the loss of consciousness is defined by unresponsiveness. PET measurements of glucose or oxygen consumption show a widespread approximately 45% reduction in cerebral energy consumption with anesthesia-induced loss of consciousness. Because baseline brain energy consumption has been shown by (13)C magnetic resonance spectroscopy to be almost exclusively dedicated to neuronal signaling, we propose that the high level of brain energy is a necessary property of the conscious state. Two additional neuronal properties of the conscious state change with anesthesia. The delocalized fMRI activity patterns in rat brain during sensory stimulation at a higher energy state (close to the awake state) collapse to a contralateral somatosensory response at a lower energy state (deep anesthesia). Firing rates of an ensemble of neurons in the rat somatosensory cortex shift from the gamma-band range (20-40 Hz) at the higher energy state to lower frequencies at the lower energy state. With the conscious state defined by the individual's behavior and maintained by high cerebral energy, measurable properties of that state are the widespread fMRI patterns and high-frequency neuronal activity, both of which support the extensive interregional communication characteristic of consciousness. This usage of high brain energies when the person is in the "state" of consciousness differs from most studies, which attend to the smaller energy increments observed during the stimulations that form the "contents" of that state.

  5. Defining and Differentiating the Makerspace

    Science.gov (United States)

    Dousay, Tonia A.

    2017-01-01

    Many resources now punctuate the maker movement landscape. However, some schools and communities still struggle to understand this burgeoning movement. How do we define these spaces and differentiate them from previous labs and shops? Through a multidimensional framework, stakeholders should consider how the structure, access, staffing, and tools…

  6. Indico CONFERENCE: Define the Programme

    CERN Multimedia

    CERN. Geneva; Ferreira, Pedro

    2017-01-01

    In this tutorial you are going to learn how to define the programme of a conference in Indico. The program of your conference is divided in different “tracks”. Tracks represent the subject matter of the conference, such as “Online Computing”, “Offline Computing”, and so on.

  7. Predicting Baseline for Analysis of Electricity Pricing

    Energy Technology Data Exchange (ETDEWEB)

    Kim, T. [Ulsan National Inst. of Science and Technology (Korea, Republic of); Lee, D. [Ulsan National Inst. of Science and Technology (Korea, Republic of); Choi, J. [Ulsan National Inst. of Science and Technology (Korea, Republic of); Spurlock, A. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Sim, A. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Todd, A. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Wu, K. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2016-05-03

    To understand the impact of a new pricing structure on residential electricity demand, we need a baseline model that captures every factor other than the new price. The standard baseline is a randomized control group; however, a good control group is hard to design. This motivates us to develop data-driven approaches. We explored many techniques and designed a strategy, named LTAP, that can predict hourly usage years ahead. The key challenge in this process is that the daily cycle of electricity demand peaks a few hours after the temperature reaches its peak. Existing methods rely on lagged variables of recent past usage to enforce this daily cycle; these methods have trouble making predictions years ahead. LTAP avoids this trouble by assuming the daily usage profile is determined by temperature and other factors. In a comparison against a well-designed control group, LTAP is found to produce accurate predictions.
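The modeling idea, fitting usage to temperature and time-of-day rather than to lagged usage, can be sketched in a few lines. Everything below (the functional form, the synthetic data, the evening indicator) is an invented illustration, not the actual LTAP implementation:

```python
import numpy as np

rng = np.random.default_rng(42)
hours = np.arange(24 * 30)          # 30 days of hourly observations
hod = hours % 24                    # hour of day
# Synthetic temperature peaking in mid-afternoon, plus noise
temp = 20 + 8 * np.sin(2 * np.pi * (hod - 8) / 24) + rng.normal(0, 0.5, hours.size)
# Synthetic usage driven by temperature and an evening-demand bump
usage = 1.0 + 0.05 * temp + 0.3 * (hod >= 18) + rng.normal(0, 0.05, hours.size)

# Design matrix: intercept, temperature, evening indicator -- no lagged usage
X = np.column_stack([np.ones_like(temp), temp, (hod >= 18).astype(float)])
coef, *_ = np.linalg.lstsq(X, usage, rcond=None)

# Predict a future hour from temperature and time-of-day alone,
# which is what lets this style of model look years ahead.
x_future = np.array([1.0, 25.0, 1.0])   # evening hour at 25 degrees
pred = x_future @ coef
```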

  8. CASA Uno GPS orbit and baseline experiments

    Science.gov (United States)

    Schutz, B. E.; Ho, C. S.; Abusali, P. A. M.; Tapley, B. D.

    1990-01-01

    CASA Uno data from sites distributed in longitude from Australia to Europe have been used to determine orbits of the GPS satellites. The characteristics of the orbits determined from double difference phase have been evaluated through comparisons of two-week solutions with one-week solutions and by comparisons of predicted and estimated orbits. Evidence of unmodeled effects is demonstrated, particularly associated with the orbit planes that experience solar eclipse. The orbit accuracy has been assessed through the repeatability of unconstrained estimated baseline vectors ranging from 245 km to 5400 km. Both the baseline repeatability and the comparison with independent space geodetic methods give results at the level of 1-2 parts in 100,000,000. In addition, the Mojave/Owens Valley (245 km) and Kokee Park/Ft. Davis (5409 km) estimates agree with VLBI and SLR to better than 1 part in 100,000,000.
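To translate the quoted repeatability into length: N parts in 100,000,000 of a baseline is the baseline length times N x 1e-8. A small helper (just unit arithmetic, using the two baselines named above) makes the scale concrete:

```python
def repeatability_mm(baseline_km, parts_in_1e8):
    """Length corresponding to N parts in 1e8 of a baseline, in millimetres."""
    return baseline_km * 1e6 * parts_in_1e8 * 1e-8  # km -> mm, then apply the ratio

short_mm = repeatability_mm(245, 1)    # Mojave/Owens Valley: about 2.5 mm
long_mm = repeatability_mm(5409, 2)    # Kokee Park/Ft. Davis: about 108 mm (~11 cm)
```

So 1-2 parts in 1e8 corresponds to millimetre-level scatter on the 245 km baseline and roughly decimetre-level scatter on the 5409 km one.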

  9. Baseline composition of solar energetic particles

    International Nuclear Information System (INIS)

    Meyer, J.

    1985-01-01

    We analyze all existing spacecraft observations of the highly variable heavy element composition of solar energetic particles (SEP) during non-3He-rich events. All data show the imprint of an ever-present basic composition pattern (dubbed "mass-unbiased baseline" SEP composition) that differs from the photospheric composition by a simple bias related to first ionization potential (FIP). In each particular observation, this mass-unbiased baseline composition is being distorted by an additional bias, which is always a monotonic function of mass (or Z). This latter bias varies in amplitude and even sign from observation to observation. To first order, it seems related to differences in the A/Z* ratio between elements (Z* = mean effective charge).

  10. Information Technology Sector Baseline Risk Assessment

    Science.gov (United States)

    2009-08-01

    alternative root be economically advantageous, an actor's ability to exploit market forces and create an alternative root would be significantly improved...conduct their operations. Therefore, a loss or disruption to Internet services would not be advantageous for the desired outcomes of these syndicates...eCommerce Service loss or disruption [C] Traffic Redirection [C] ([C] = undesired consequence)

  11. Measuring cognitive change with ImPACT: the aggregate baseline approach.

    Science.gov (United States)

    Bruce, Jared M; Echemendia, Ruben J; Meeuwisse, Willem; Hutchison, Michael G; Aubry, Mark; Comper, Paul

    2017-11-01

    The Immediate Post-Concussion Assessment and Cognitive Test (ImPACT) is commonly used to assess baseline and post-injury cognition among athletes in North America. Despite this, several studies have questioned the reliability of ImPACT when given at intervals employed in clinical practice. Poor test-retest reliability reduces test sensitivity to cognitive decline, increasing the likelihood that concussed athletes will be returned to play prematurely. We recently showed that the reliability of ImPACT can be increased when using a new composite structure and the aggregate of two baselines to predict subsequent performance. The purpose of the present study was to confirm our previous findings and determine whether the addition of a third baseline would further increase the test-retest reliability of ImPACT. Data from 97 English speaking professional hockey players who had received at least 4 ImPACT baseline evaluations were extracted from a National Hockey League Concussion Program database. Linear regression was used to determine whether each of the first three testing sessions accounted for unique variance in the fourth testing session. Results confirmed that the aggregate baseline approach improves the psychometric properties of ImPACT, with most indices demonstrating adequate or better test-retest reliability for clinical use. The aggregate baseline approach provides a modest clinical benefit when recent baselines are available - and a more substantial benefit when compared to approaches that obtain baseline measures only once during the course of a multi-year playing career. Pending confirmation in diverse samples, neuropsychologists are encouraged to use the aggregate baseline approach to best quantify cognitive change following sports concussion.
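The aggregate-baseline analysis, regressing a later session on earlier ones, can be sketched as below. The data are simulated (a stable latent score plus independent measurement noise), not the NHL program's data; the code shows only the shape of the regression comparison:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 97  # matches the study's sample size; the scores themselves are simulated
latent = rng.normal(50, 10, n)  # stable underlying cognitive ability
# Four noisy baseline testing sessions measuring the same latent score
s1, s2, s3, s4 = (latent + rng.normal(0, 5, n) for _ in range(4))

def r_squared(cols, y):
    """In-sample R^2 of an OLS regression of y on the given predictor columns."""
    X = np.column_stack([np.ones(len(y)), *cols])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1.0 - resid.var() / y.var()

r2_single = r_squared([s3], s4)             # predict session 4 from the latest baseline
r2_aggregate = r_squared([s1, s2, s3], s4)  # predict from the aggregate of three baselines
# If earlier sessions carry unique variance, r2_aggregate exceeds r2_single,
# which is the study's rationale for aggregating baselines.
```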

  12. The LIFE Cognition Study: design and baseline characteristics

    Science.gov (United States)

    Sink, Kaycee M; Espeland, Mark A; Rushing, Julia; Castro, Cynthia M; Church, Timothy S; Cohen, Ronald; Gill, Thomas M; Henkin, Leora; Jennings, Janine M; Kerwin, Diana R; Manini, Todd M; Myers, Valerie; Pahor, Marco; Reid, Kieran F; Woolard, Nancy; Rapp, Stephen R; Williamson, Jeff D

    2014-01-01

    Observational studies have shown beneficial relationships between exercise and cognitive function. Some clinical trials have also demonstrated improvements in cognitive function in response to moderate–high intensity aerobic exercise; however, these have been limited by relatively small sample sizes and short durations. The Lifestyle Interventions and Independence for Elders (LIFE) Study is the largest and longest randomized controlled clinical trial of physical activity with cognitive outcomes, in older sedentary adults at increased risk for incident mobility disability. One LIFE Study objective is to evaluate the effects of a structured physical activity program on changes in cognitive function and incident all-cause mild cognitive impairment or dementia. Here, we present the design and baseline cognitive data. At baseline, participants completed the modified Mini Mental Status Examination, Hopkins Verbal Learning Test, Digit Symbol Coding, Modified Rey–Osterrieth Complex Figure, and a computerized battery, selected to be sensitive to changes in speed of processing and executive functioning. During follow up, participants completed the same battery, along with the Category Fluency for Animals, Boston Naming, and Trail Making tests. The description of the mild cognitive impairment/dementia adjudication process is presented here. Participants with worse baseline Short Physical Performance Battery scores (prespecified at ≤7) had significantly lower median cognitive test scores compared with those having scores of 8 or 9 with modified Mini Mental Status Examination score of 91 versus (vs) 93, Hopkins Verbal Learning Test delayed recall score of 7.4 vs 7.9, and Digit Symbol Coding score of 45 vs 48, respectively (all P<0.001). The LIFE Study will contribute important information on the effects of a structured physical activity program on cognitive outcomes in sedentary older adults at particular risk for mobility impairment. In addition to its importance in the

  13. AIDS defining disease: Disseminated cryptococcosis

    Directory of Open Access Journals (Sweden)

    Roshan Anupama

    2006-01-01

    Disseminated cryptococcosis is one of the acquired immune deficiency syndrome defining criteria and the most common cause of life threatening meningitis. Disseminated lesions in the skin manifest as papules or nodules that mimic molluscum contagiosum (MC). We report here a human immunodeficiency virus positive patient who presented with MC-like lesions. Disseminated cryptococcosis was confirmed by India ink preparation and histopathology. The condition of the patient improved with amphotericin B.

  14. How do people define moderation?

    Science.gov (United States)

    vanDellen, Michelle R; Isherwood, Jennifer C; Delose, Julie E

    2016-06-01

    Eating in moderation is considered to be sound and practical advice for weight maintenance or prevention of weight gain. However, the concept of moderation is ambiguous, and the effect of moderation messages on consumption has yet to be empirically examined. The present manuscript examines how people define moderate consumption. We expected that people would define moderate consumption in ways that justified their current or desired consumption rather than view moderation as an objective standard. In Studies 1 and 2, moderate consumption was perceived to involve greater quantities of an unhealthy food (chocolate chip cookies, gummy candies) than perceptions of how much one should consume. In Study 3, participants generally perceived themselves to eat in moderation and defined moderate consumption as greater than their personal consumption. Furthermore, definitions of moderate consumption were related to personal consumption behaviors. Results suggest that the endorsement of moderation messages allows for a wide range of interpretations of moderate consumption. Thus, we conclude that moderation messages are unlikely to be effective messages for helping people maintain or lose weight. Copyright © 2016 Elsevier Ltd. All rights reserved.

  15. Baseline and cognition activated brain SPECT imaging in depression

    International Nuclear Information System (INIS)

    Zhao Jinhua; Lin Xiangtong; Jiang Kaida; Liu Yongchang; Xu Lianqin

    1998-01-01

    Purpose: To evaluate regional cerebral blood flow (rCBF) abnormalities through semiquantitative analysis of baseline and cognition-activated rCBF imaging in unmedicated depressed patients. Methods: 27 depressed patients not medicated with antidepressants were enrolled. The diagnosis (depression of moderate degree with somatization) was confirmed by the ICD-10 criteria. 15 age-matched normal controls were studied under identical conditions. Baseline and cognition-activated 99mTc-ECD SPECT were performed on 21 of the 27 patients with depression and 13 of the 15 normal controls; baseline 99mTc-ECD SPECT alone was performed on the remaining 6 patients with depression and 2 normal controls. Cognitive activation was achieved with the Wisconsin Card Sorting Test (WCST). 1110 MBq of 99mTc-ECD was administered by intravenous bolus injection 5 minutes after the onset of the WCST. Semiquantitative analysis was conducted on the 7th through 11th transaxial slices. The rCBF ratio of each ROI was calculated as the average tissue activity in the region divided by the maximum activity in the cerebellum. Results: 1) Baseline rCBF of the left frontal lobe (0.720) and left temporal lobe (0.720) was significantly decreased in depressed patients compared with control subjects. 2) Activated rCBF of the left frontal lobe (0.719), left temporal lobe (0.690), and left parietal lobe (0.701) was clearly decreased compared with controls. Conclusions: 1) Hypoperfusion of the left frontal and left temporal cortexes was identified in patients with depression. 2) This hypoperfusion may be the cause of the cognitive disorder and depressed mood in patients with depression. 3) Cognition-activated brain perfusion imaging is helpful for making a more accurate diagnosis of depression.
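
    The ROI normalization described in this abstract (average regional activity divided by maximum cerebellar activity) can be sketched as follows; the function name and the example count arrays are illustrative, not taken from the original analysis software.

```python
import numpy as np

def rcbf_ratio(roi_counts, cerebellum_counts):
    """Semi-quantitative rCBF index: average tissue activity in the ROI
    divided by the maximum activity in the cerebellum."""
    roi = np.asarray(roi_counts, dtype=float)
    cereb = np.asarray(cerebellum_counts, dtype=float)
    return float(roi.mean() / cereb.max())

# Example with hypothetical counts: mean ROI activity 72, peak cerebellar
# activity 100, giving a ratio of 0.72.
```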

  16. Base-line studies for DAE establishments

    International Nuclear Information System (INIS)

    Puranik, V.D.

    2012-01-01

    The Department of Atomic Energy has establishments located in various regions of the country and they include front-end fuel cycle facilities, nuclear power stations, back-end fuel cycle facilities and facilities for research and societal applications. These facilities handle naturally occurring radionuclides such as uranium, thorium and a variety of man-made radionuclides. These radionuclides are handled with utmost care so that they do not affect adversely the occupational workers or the members of public residing nearby. There is safety culture of the highest standard existing in all DAE establishments and it matches with the international standards. In addition, there is a perpetual environmental monitoring program carried out by the Environmental Survey Laboratories (ESLs) located at all DAE establishments. The environmental data generated by such program is studied regularly by experts to ensure compliance with the regulatory requirements. The regulatory requirements in the country are of international standards and ensure adequate protection of workers and members of public. In addition to such continued monitoring program and studies being carried out for the ongoing projects, base-line studies are carried out for all the new projects of the DAE. The purpose of the base-line studies is to establish a detailed base-line data set for a new DAE location well before the foundation stone is laid, so that the data collected when there is no departmental activity can be compared with the data generated later by the ESL. The data so generated is site specific and it varies from place to place depending upon the location of the site, e.g., inland or coastal, the presence of water bodies and pattern of irrigation, the geological characteristics of the location, the local culture and habits of the people, population density and urban or rural background. 
The data to be recorded as base-line data is generated over a period of at least one year covering all the seasons.

  17. Pipeline integrity: ILI baseline data for QRA

    Energy Technology Data Exchange (ETDEWEB)

    Porter, Todd R. [Tuboscope Pipeline Services, Houston, TX (United States)]. E-mail: tporter@varco.com; Silva, Jose Augusto Pereira da [Pipeway Engenharia, Rio de Janeiro, RJ (Brazil)]. E-mail: guto@pipeway.com; Marr, James [MARR and Associates, Calgary, AB (Canada)]. E-mail: jmarr@marr-associates.com

    2003-07-01

    The initial phase of a pipeline integrity management program (IMP) is conducting a baseline assessment of the pipeline system and segments as part of Quantitative Risk Assessment (QRA). This gives the operator's integrity team the opportunity to identify critical areas and deficiencies in the protection, maintenance, and mitigation strategies. As a part of data gathering and integration of a wide variety of sources, in-line inspection (ILI) data is a key element. In order to move forward in the integrity program development and execution, the baseline geometry of the pipeline must be determined with accuracy and confidence. From this, all subsequent analysis and conclusions will be derived. Tuboscope Pipeline Services (TPS), in conjunction with Pipeway Engenharia of Brazil, operate ILI inertial navigation system (INS) and Caliper geometry tools, to address this integrity requirement. This INS and Caliper ILI tool data provides pipeline trajectory at centimeter-level resolution and sub-metre 3D position accuracy along with internal geometry - ovality, dents, misalignment, and wrinkle/buckle characterization. Global strain can be derived from precise INS curvature measurements and departure from the initial pipeline state. Accurate pipeline elevation profile data is essential in the identification of sag/over bend sections for fluid dynamic and hydrostatic calculations. This data, along with pipeline construction, operations, direct assessment and maintenance data is integrated in LinaViewPRO™, a pipeline data management system for decision support functions, and subsequent QRA operations. This technology provides the baseline for an informed, accurate and confident integrity management program. This paper/presentation will detail these aspects of an effective IMP, and experience will be presented, showing the benefits for liquid and gas pipeline systems. (author)
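
    Deriving strain from curvature, as mentioned above, follows the standard beam-bending relation for the outer fiber of a pipe, ε = κ·D/2. This is a generic sketch of that relation under stated assumptions, not the vendor's proprietary algorithm; the 0.5 m diameter in the example is illustrative.

```python
def bending_strain(curvature_per_m: float, outer_diameter_m: float) -> float:
    """Outer-fiber bending strain of a pipe: eps = kappa * D / 2,
    where kappa is the centerline curvature (1/m) measured by the INS tool
    and D is the pipe outer diameter (m)."""
    return curvature_per_m * outer_diameter_m / 2.0

# A bend radius of 500 m (kappa = 0.002 1/m) on a 0.5 m pipe gives
# an outer-fiber strain of 5e-4, i.e. 0.05%.
```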

  18. SRP Baseline Hydrogeologic Investigation, Phase 3

    Energy Technology Data Exchange (ETDEWEB)

    Bledsoe, H.W.

    1988-08-01

    The SRP Baseline Hydrogeologic Investigation was implemented for the purpose of updating and improving the knowledge and understanding of the hydrogeologic systems underlying the SRP site. Phase III, which is discussed in this report, includes the drilling of 7 deep coreholes (sites P-24 through P-30) and the installation of 53 observation wells ranging in depth from approximately 50 ft to more than 970 ft below the ground surface. In addition to the collection of geologic cores for lithologic and stratigraphic study, samples were also collected for the determination of physical characteristics of the sediments and for the identification of microorganisms.

  19. Environmental Baseline File for National Transportation

    International Nuclear Information System (INIS)

    1999-01-01

    This Environmental Baseline File summarizes and consolidates information related to the national-level transportation of commercial spent nuclear fuel. Topics addressed include: shipments of commercial spent nuclear fuel based on mostly truck and mostly rail shipping scenarios; transportation routing for commercial spent nuclear fuel sites and DOE sites; radionuclide inventories for various shipping container capacities; transportation routing; populations along transportation routes; urbanized area population densities; the impacts of historical, reasonably foreseeable, and general transportation; state-level food transfer factors; Federal Guidance Reports No. 11 and 12 radionuclide dose conversion factors; and national average atmospheric conditions.

  20. Spectrometer Baseline Control Via Spatial Filtering

    Science.gov (United States)

    Burleigh, M. R.; Richey, C. R.; Rinehart, S. A.; Quijada, M. A.; Wollack, E. J.

    2016-01-01

    An absorptive half-moon aperture mask is experimentally explored as a broad-bandwidth means of eliminating spurious spectral features arising from reprocessed radiation in an infrared Fourier transform spectrometer. In the presence of the spatial filter, an order of magnitude improvement in the fidelity of the spectrometer baseline is observed. The method is readily accommodated within the context of commonly employed instrument configurations and leads to a factor of two reduction in optical throughput. A detailed discussion of the underlying mechanism and limitations of the method is provided.

  1. Rationing in the presence of baselines

    DEFF Research Database (Denmark)

    Hougaard, Jens Leth; Moreno-Ternero, Juan D.; Østerdal, Lars Peter

    2013-01-01

    We analyze a general model of rationing in which agents have baselines, in addition to claims against the (insufficient) endowment of the good to be allocated. Many real-life problems fit this general model (e.g., bankruptcy with prioritized claims, resource allocation in the public health care...... sector, water distribution in drought periods). We introduce (and characterize) a natural class of allocation methods for this model. Any method within the class is associated with a rule in the standard rationing model, and we show that if the latter obeys some focal properties, the former obeys them...

  2. Baseline scenarios of global environmental change

    International Nuclear Information System (INIS)

    Alcamo, J.; Kreileman, G.J.J.; Bollen, J.C.; Born, G.J. van den; Krol, M.S.; Toet, A.M.C.; Vries, H.J.M. de; Gerlagh, R.

    1996-01-01

    This paper presents three baseline scenarios of no policy action computed by the IMAGE2 model. These scenarios cover a wide range of coupled global change indicators, including: energy demand and consumption; food demand, consumption, and production; changes in land cover including changes in extent of agricultural land and forest; emissions of greenhouse gases and ozone precursors; and climate change and its impacts on sea level rise, crop productivity and natural vegetation. Scenario information is available for the entire world with regional and grid-scale detail, and covers the period from 1970 to 2100. (author)

  3. SRP baseline hydrogeologic investigation: Aquifer characterization

    Energy Technology Data Exchange (ETDEWEB)

    Strom, R.N.; Kaback, D.S.

    1992-03-31

    An investigation of the mineralogy and chemistry of the principal hydrogeologic units and the geochemistry of the water in the principal aquifers at Savannah River Site (SRS) was undertaken as part of the Baseline Hydrogeologic Investigation. This investigation was conducted to provide background data for future site studies and reports and to provide a site-wide interpretation of the geology and geochemistry of the Coastal Plain Hydrostratigraphic province. Ground water samples were analyzed for major cations and anions, minor and trace elements, gross alpha and beta, tritium, stable isotopes of hydrogen, oxygen, and carbon, and carbon-14. Sediments from the well borings were analyzed for mineralogy and major and minor elements.

  4. Gravity sensing using Very Long Baseline Atom Interferometry

    Science.gov (United States)

    Schlippert, D.; Wodey, E.; Meiners, C.; Tell, D.; Schubert, C.; Ertmer, W.; Rasel, E. M.

    2017-12-01

    Very Long Baseline Atom Interferometry (VLBAI) has applications in high-accuracy absolute gravimetry, gravity-gradiometry, and tests of fundamental physics. Thanks to the quadratic scaling of the phase shift with increasing free evolution time, extending the baseline of atomic gravimeters from tens of centimeters to meters puts resolutions of 10⁻¹³ g and beyond in reach. We present the design and progress of key elements of the VLBAI test stand: a dual-species source of Rb and Yb, a high-performance two-layer magnetic shield, and an active vibration isolation system allowing for unprecedented stability of the mirror acting as an inertial reference. We envisage a vibration-limited short-term sensitivity to gravitational acceleration of 1×10⁻⁸ m s⁻² Hz⁻¹/² and up to a factor of 25 improvement when including additional correlation with a broadband seismometer. Here, the supreme long-term stability of atomic gravity sensors opens the route towards competition with superconducting gravimeters. The operation of VLBAI as a differential dual-species gravimeter using ultracold mixtures of Yb and Rb atoms enables quantum tests of the universality of free fall (UFF) at an unprecedented level of <10⁻¹³, potentially surpassing the best experiments to date.
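
    The quadratic scaling of the phase shift with free evolution time T mentioned above can be illustrated with the leading-order Mach-Zehnder gravimeter phase, φ = k_eff·g·T². The effective wave number below assumes a two-photon transition on the Rb 780 nm line; that choice, and the 1 mrad phase resolution in the example, are illustrative assumptions, not parameters from the VLBAI design.

```python
import math

K_EFF = 4 * math.pi / 780e-9   # effective two-photon wave number (rad/m), assumed Rb 780 nm
G = 9.81                       # local gravitational acceleration (m/s^2)

def mz_phase(T: float) -> float:
    """Leading-order Mach-Zehnder gravimeter phase: phi = k_eff * g * T**2."""
    return K_EFF * G * T ** 2

def accel_resolution(T: float, dphi: float = 1e-3) -> float:
    """Acceleration resolution for a resolvable phase dphi: da = dphi / (k_eff * T**2).
    Doubling T improves the resolution fourfold, which is why a longer
    baseline (allowing longer free fall) pays off quadratically."""
    return dphi / (K_EFF * T ** 2)
```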

  5. Defining enthesitis in spondyloarthritis by ultrasound

    DEFF Research Database (Denmark)

    Terslev, Lene; Naredo, E; Iagnocco, A

    2014-01-01

    Objective: To standardize ultrasound (US) in enthesitis. Methods: An Initial Delphi exercise was undertaken to define US detected enthesitis and its core components. These definitions were subsequently tested on static images taken from Spondyloarthritis (SpA) patients in order to evaluate...... elementary component. On static images the intra-observer reliability showed a high degree of variability for the detection of elementary lesions with kappa coefficients ranging from 0.14 - 1. The inter-observer kappa value was variable with the lowest kappa for enthesophytes (0.24) and the best for Doppler...... activity at the enthesis (0.63). Conclusion: This is the first consensus based definition of US enthesitis and its elementary components and the first step performed to ensure a higher degree of homogeneity and comparability of results between studies and in daily clinical work. Defining Enthesitis...

  6. Network Coded Software Defined Networking

    DEFF Research Database (Denmark)

    Krigslund, Jeppe; Hansen, Jonas; Roetter, Daniel Enrique Lucani

    2015-01-01

    Software Defined Networking (SDN) and Network Coding (NC) are two key concepts in networking that have garnered a large attention in recent years. On the one hand, SDN's potential to virtualize services in the Internet allows a large flexibility not only for routing data, but also to manage....... This paper advocates for the use of SDN to bring about future Internet and 5G network services by incorporating network coding (NC) functionalities. The inherent flexibility of both SDN and NC provides a fertile ground to envision more efficient, robust, and secure networking designs, that may also...

  7. Network Coded Software Defined Networking

    DEFF Research Database (Denmark)

    Hansen, Jonas; Roetter, Daniel Enrique Lucani; Krigslund, Jeppe

    2015-01-01

    Software defined networking has garnered large attention due to its potential to virtualize services in the Internet, introducing flexibility in the buffering, scheduling, processing, and routing of data in network routers. SDN breaks the deadlock that has kept Internet network protocols stagnant...... for decades, while applications and physical links have evolved. This article advocates for the use of SDN to bring about 5G network services by incorporating network coding (NC) functionalities. The latter constitutes a major leap forward compared to the state-of-the- art store and forward Internet paradigm...

  8. (Re)Defining Salesperson Motivation

    DEFF Research Database (Denmark)

    Khusainova, Rushana; de Jong, Ad; Lee, Nick

    2018-01-01

    The construct of motivation is one of the central themes in selling and sales management research. Yet, to-date no review article exists that surveys the construct (both from an extrinsic and intrinsic motivation context), critically evaluates its current status, examines various key challenges...... apparent from the extant research, and suggests new research opportunities based on a thorough review of past work. The authors explore how motivation is defined, major theories underpinning motivation, how motivation has historically been measured, and key methodologies used over time. In addition......, attention is given to principal drivers and outcomes of salesperson motivation. A summarizing appendix of key articles in salesperson motivation is provided....

  9. Defining Usability of PN Services

    DEFF Research Database (Denmark)

    Nicolajsen, Hanne Westh; Ahola, Titta; Fleury, Alexandre

    In this deliverable usability and user experience are defined in relation to MAGNET Beyond technologies, and it is described how the main MAGNET Beyond concepts can be evaluated through the involvement of users. The concepts include the new "Activity based communication approach" for interacting...... with the MAGNET Beyond system, as well as the core concepts: Personal Network, Personal Network-Federation, Service Discovery, User Profile Management, Personal Network Management, Privacy and Security and Context Awareness. The overall plans for the final usability evaluation are documented based on the present...

  10. Phased Startup Initiative Phases 3 and 4 Test Plan and Test Specification (OCRWM)

    International Nuclear Information System (INIS)

    PITNER, A.L.

    2000-01-01

    Construction for the Spent Nuclear Fuel (SNF) Project facilities is continuing per the Level III Baseline Schedule, and installation of the Fuel Retrieval System (FRS) and Integrated Water Treatment System (IWTS) in K West Basin is now complete. In order to accelerate the project, a phased startup strategy to initiate testing of the FRS and IWTS early in the overall project schedule was proposed (Williams 1999). Wilkinson (1999) expands the definition of the original proposal into four functional testing phases of the Phased Startup Initiative (PSI). Phases 1 and 2 are based on performing functional tests using dummy fuel. These tests are described in separate planning documents. This test plan provides overall guidance for Phase 3 and 4 tests, which are performed using actual irradiated N fuel assemblies. The overall objective of the Phase 3 and 4 testing is to verify how the FRS and IWTS respond while processing actual fuel. Conducting these tests early in the project schedule will allow identification and resolution of equipment and process problems before they become activities on the start-up critical path. The specific objectives of this test plan are to: (1) Define the test scope for the FRS and IWTS; (2) Provide detailed test requirements that can be used to write the specific test procedures; (3) Define data required and measurements to be taken (where existing methods to obtain these do not exist, enough detail will be provided to define required additional equipment); and (4) Define specific test objectives and acceptance criteria.

  11. Analysis of baseline gene expression levels from ...

    Science.gov (United States)

    The use of gene expression profiling to predict chemical mode of action would be enhanced by better characterization of variance due to individual, environmental, and technical factors. Meta-analysis of microarray data from untreated or vehicle-treated animals within the control arm of toxicogenomics studies has yielded useful information on baseline fluctuations in gene expression. A dataset of control animal microarray expression data was assembled by a working group of the Health and Environmental Sciences Institute's Technical Committee on the Application of Genomics in Mechanism Based Risk Assessment in order to provide a public resource for assessments of variability in baseline gene expression. Data from over 500 Affymetrix microarrays from control rat liver and kidney were collected from 16 different institutions. Thirty-five biological and technical factors were obtained for each animal, describing a wide range of study characteristics, and a subset were evaluated in detail for their contribution to total variability using multivariate statistical and graphical techniques. The study factors that emerged as key sources of variability included gender, organ section, strain, and fasting state. These and other study factors were identified as key descriptors that should be included in the minimal information about a toxicogenomics study needed for interpretation of results by an independent source. Genes that are the most and least variable, gender-selectiv

  12. Structure of a traditional baseline data system

    Energy Technology Data Exchange (ETDEWEB)

    1976-12-01

    Research was conducted to determine whether appropriate data exist for the development of a comprehensive statistical baseline data system on the human environment in the Athabasca oil sands region of Alberta. The existing data sources pertinent to the target area were first reviewed and discussed. Criteria were selected to assist the evaluation of data, including type of data collected, source, degree of detail, geographic identification, accessibility, and time frame. These criteria allowed assessing whether the data would be amenable to geographically-coded, continuous monitoring systems. It was found that the Statistics Canada Census provided the most detail, the most complete coverage of the target area, the smallest statistical areas, the greatest consistency in data and data collection, and the most regular collection. The local agency collection efforts were generally oriented toward specific goals and the data intended primarily for intra-agency use. The smallest statistical units in these efforts may be too large to be of value to a common small-area system, and data collection agencies did not generally use coterminous boundaries. Recommendations were made to give primary consideration to Statistics Canada data in the initial development of the baseline data system. Further development of such a system depends on the adoption by local agencies of a common small-area system for data collection. 38 refs., 6 figs.

  13. Long-baseline neutrino oscillation experiments

    International Nuclear Information System (INIS)

    Crane, D.; Goodman, M.

    1994-01-01

    There is no unambiguous definition of long-baseline neutrino oscillation experiments. The term is generally used for accelerator neutrino oscillation experiments that are sensitive to small Δm², and for which the detector is not on the accelerator site. The Snowmass N2L working group met to discuss the issues facing such experiments. The Fermilab Program Advisory Committee adopted several recommendations concerning the Fermilab neutrino program at their Aspen meeting immediately prior to the Snowmass Workshop. This heightened the attention given at the workshop to the proposals to use Fermilab for a long-baseline neutrino experiment. The plan for a neutrino oscillation program at Brookhaven was also thoroughly discussed. Opportunities at CERN were considered, particularly the use of detectors at the Gran Sasso laboratory. The idea of building a neutrino beam from KEK towards Superkamiokande was not discussed at the Snowmass meeting, but there has been considerable development of this idea since then. Brookhaven and KEK would use low energy neutrino beams, while FNAL and CERN would have medium energy beams. This report summarizes a few topics common to LBL proposals and attempts to give a snapshot of where things stand in this fast developing field.
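
    The link between baseline length and Δm² sensitivity is visible in the standard two-flavor oscillation probability, P = sin²(2θ)·sin²(1.27·Δm²·L/E), with L in km, E in GeV, and Δm² in eV²: for a fixed beam energy, a smaller Δm² requires a longer L before the oscillation develops. A minimal sketch of this textbook formula (the example parameter values are illustrative):

```python
import math

def p_osc(L_km: float, E_GeV: float, dm2_eV2: float, sin2_2theta: float) -> float:
    """Two-flavor oscillation probability:
    P = sin^2(2*theta) * sin^2(1.27 * dm^2 * L / E)."""
    return sin2_2theta * math.sin(1.27 * dm2_eV2 * L_km / E_GeV) ** 2

# First oscillation maximum occurs where 1.27 * dm^2 * L / E = pi/2;
# halving dm^2 doubles the baseline L needed to reach it.
```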

  14. Baseline response rates affect resistance to change.

    Science.gov (United States)

    Kuroda, Toshikazu; Cook, James E; Lattal, Kennon A

    2018-01-01

    The effect of response rates on resistance to change, measured as resistance to extinction, was examined in two experiments. In Experiment 1, responding in transition from a variable-ratio schedule and its yoked-interval counterpart to extinction was compared with pigeons. Following training on a multiple variable-ratio yoked-interval schedule of reinforcement, in which response rates were higher in the former component, reinforcement was removed from both components during a single extended extinction session. Resistance to extinction in the yoked-interval component was always either greater or equal to that in the variable-ratio component. In Experiment 2, resistance to extinction was compared for two groups of rats that exhibited either high or low response rates when maintained on identical variable-interval schedules. Resistance to extinction was greater for the lower-response-rate group. These results suggest that baseline response rate can contribute to resistance to change. Such effects, however, can only be revealed when baseline response rate and reinforcement rate are disentangled (Experiments 1 and 2) from the more usual circumstance where the two covary. Furthermore, they are more cleanly revealed when the programmed contingencies controlling high and low response rates are identical, as in Experiment 2. © 2017 Society for the Experimental Analysis of Behavior.

  15. Baseline Estimation and Outlier Identification for Halocarbons

    Science.gov (United States)

    Wang, D.; Schuck, T.; Engel, A.; Gallman, F.

    2017-12-01

    The aim of this paper is to build a baseline model for halocarbons and to statistically identify outliers under specific conditions. Time series of regional CFC-11 and chloromethane measurements taken over the last 4 years at two locations are discussed: a monitoring station northwest of Frankfurt am Main (Germany) and the Mace Head station (Ireland). In addition to analyzing the time series of CFC-11 and chloromethane, a statistical approach to outlier identification is introduced in order to make a better estimation of the baseline. A second-order polynomial plus harmonics is fitted to the CFC-11 and chloromethane mixing ratio data. Measurements at a large distance from the fitted curve are regarded as outliers and flagged. The routine is applied iteratively without the flagged measurements until no additional outliers are found. Both the model fitting and the proposed outlier identification method are implemented in the Python programming language. During the period, CFC-11 shows a gradual downward trend, while the mixing ratios of chloromethane trend slightly upward. The concentration of chloromethane also has a strong seasonal variation, mostly due to the seasonal cycle of OH. The use of this statistical method has a considerable effect on the results: it efficiently identifies a series of outliers according to the standard deviation requirements, and after removing the outliers, the fitted curves and trend estimates are more reliable.
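
    The fit-and-flag procedure described above (polynomial plus harmonics, iterative rejection of points far from the curve) can be sketched as follows. The annual and semi-annual harmonic periods and the 3-sigma rejection threshold are illustrative assumptions, not the authors' exact settings.

```python
import numpy as np

def fit_baseline(t, y, n_sigma=3.0, max_iter=10):
    """Fit a 2nd-order polynomial plus annual/semi-annual harmonics to a
    mixing-ratio time series (t in years), iteratively flagging points more
    than n_sigma residual standard deviations from the curve as outliers."""
    def design(t):
        return np.column_stack([
            np.ones_like(t), t, t ** 2,                    # polynomial baseline
            np.sin(2 * np.pi * t), np.cos(2 * np.pi * t),  # annual cycle
            np.sin(4 * np.pi * t), np.cos(4 * np.pi * t),  # semi-annual cycle
        ])
    keep = np.ones(len(t), dtype=bool)
    for _ in range(max_iter):
        # Least-squares fit using only the currently kept (unflagged) points
        coef, *_ = np.linalg.lstsq(design(t[keep]), y[keep], rcond=None)
        resid = y - design(t) @ coef
        new_keep = np.abs(resid) < n_sigma * resid[keep].std()
        if np.array_equal(new_keep, keep):  # converged: no new outliers
            break
        keep = new_keep
    return coef, keep  # fit coefficients and inlier mask
```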

  16. NV Diamond Micro-Magnetometer Baseline Studies

    Science.gov (United States)

    2009-08-12

    Only figure-caption fragments survive in this record: circular masks with diameters ranging from 100-250 nm were defined on the surface, and an anisotropic etch was used to transfer the pattern into the crystal; further fragments describe coupling between the NV center and a nearby 13C nuclear spin, and a pulse sequence for transferring electron spin coherence to the nuclear spin with repetitive readout.

  17. The geobiosphere emergy baseline: A synthesis.

    Science.gov (United States)

    The concept of emergy defined as the available energy (or exergy) of one form used up directly and indirectly to produce an item or action (Odum, Environmental Accounting Emergy and Environmental Decision Making, John Wiley & Sons, Inc., 1996) requires the specification of a unif...

  18. Baseline Response Levels Are a Nuisance in Infant Contingency Learning

    Science.gov (United States)

    Millar, W. S.; Weir, Catherine

    2015-01-01

    The impact of differences in level of baseline responding on contingency learning in the first year was examined by considering the response acquisition of infants classified into baseline response quartiles. Whereas the three lower baseline groups showed the predicted increment in responding to a contingency, the highest baseline responders did…

  19. 33 CFR 2.20 - Territorial sea baseline.

    Science.gov (United States)

    2010-07-01

    33 CFR, Navigation and Navigable Waters, Jurisdictional Terms, § 2.20 Territorial sea baseline: Territorial sea baseline means the line... Normally, the territorial sea baseline is the mean low water line along the coast of the United States...

  20. Comparison of Three Methods Estimating Baseline Creatinine For Acute Kidney Injury in Hospitalized Patients: a Multicentre Survey in Third-Level Urban Hospitals of China.

    Science.gov (United States)

    Lang, Xia-Bing; Yang, Yi; Yang, Ju-Rong; Wan, Jian-Xin; Yu, Sheng-Qiang; Cui, Jiong; Tang, Xiao-Jing; Chen, Jianghua

    2018-01-01

    A lack of baseline serum creatinine (SCr) data leads to underestimation of the burden caused by acute kidney injury (AKI) in developing countries. The goal of this study was to investigate the effects of various baseline SCr analysis methods on the current diagnosis of AKI in hospitalized patients. Patients with at least one SCr value during their hospital stay between January 1, 2011 and December 31, 2012 were retrospectively included in the study. The baseline SCr was determined either by the minimum SCr (SCrMIN) or the estimated SCr using the MDRD formula (SCrGFR-75). We also used the dynamic baseline SCr (SCrdynamic) in accordance with the 7 day/48 hour time window. AKI was defined based on the KDIGO SCr criteria. Of 562,733 hospitalized patients, 350,458 (62.3%) had at least one SCr determination, and 146,185 (26.0%) had repeat SCr tests. AKI was diagnosed in 13,883 (2.5%) patients using the SCrMIN, 21,281 (3.8%) using the SCrGFR-75 and 9,288 (1.7%) using the SCrdynamic. Compared with the non-AKI patients, AKI patients had a higher in-hospital mortality rate regardless of the baseline SCr analysis method. Because of the scarcity of SCr data, imputation of the baseline SCr is necessary to remedy the missing data. The detection rate of AKI varies depending on the different imputation methods. SCrGFR-75 can identify more AKI cases than the other two methods. © 2018 The Author(s). Published by S. Karger AG, Basel.
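
    The SCrGFR-75 imputation described above back-solves the 4-variable MDRD equation for creatinine at an assumed normal eGFR of 75 mL/min/1.73 m². The sketch below uses the IDMS-traceable coefficient 175; which exact MDRD variant the authors used is an assumption here, and the function name is illustrative.

```python
def baseline_scr_mdrd(age_years: float, female: bool = False, black: bool = False,
                      egfr: float = 75.0) -> float:
    """Back-calculated baseline serum creatinine (mg/dL) from the 4-variable
    MDRD study equation:
        eGFR = 175 * SCr**-1.154 * age**-0.203 * (0.742 if female) * (1.212 if black)
    solved for SCr at an assumed eGFR (default 75 mL/min/1.73 m^2)."""
    k = 175.0 * age_years ** -0.203
    if female:
        k *= 0.742
    if black:
        k *= 1.212
    return (k / egfr) ** (1.0 / 1.154)
```

    For a 60-year-old male this yields a baseline SCr of roughly 1 mg/dL; the female coefficient lowers the imputed value, reflecting lower muscle mass.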

  1. Comparison of Three Methods Estimating Baseline Creatinine For Acute Kidney Injury in Hospitalized Patients: a Multicentre Survey in Third-Level Urban Hospitals of China

    Directory of Open Access Journals (Sweden)

    Xia-bing Lang

    2018-02-01

    Full Text Available Background/Aims: A lack of baseline serum creatinine (SCr) data leads to underestimation of the burden caused by acute kidney injury (AKI) in developing countries. The goal of this study was to investigate the effects of various baseline SCr analysis methods on the current diagnosis of AKI in hospitalized patients. Methods: Patients with at least one SCr value during their hospital stay between January 1, 2011 and December 31, 2012 were retrospectively included in the study. The baseline SCr was determined either by the minimum SCr (SCrMIN) or the estimated SCr using the MDRD formula (SCrGFR-75). We also used the dynamic baseline SCr (SCrdynamic) in accordance with the 7 day/48 hour time window. AKI was defined based on the KDIGO SCr criteria. Results: Of 562,733 hospitalized patients, 350,458 (62.3%) had at least one SCr determination, and 146,185 (26.0%) had repeat SCr tests. AKI was diagnosed in 13,883 (2.5%) patients using the SCrMIN, 21,281 (3.8%) using the SCrGFR-75 and 9,288 (1.7%) using the SCrdynamic. Compared with the non-AKI patients, AKI patients had a higher in-hospital mortality rate regardless of the baseline SCr analysis method. Conclusions: Because of the scarcity of SCr data, imputation of the baseline SCr is necessary to remedy the missing data. The detection rate of AKI varies depending on the different imputation methods. SCrGFR-75 can identify more AKI cases than the other two methods.

  2. Expressiveness and definability in circumscription

    Directory of Open Access Journals (Sweden)

    Francicleber Martins Ferreira

    2011-06-01

    Full Text Available We investigate expressiveness and definability issues with respect to minimal models, particularly in the scope of Circumscription. First, we give a proof of the failure of the Löwenheim-Skolem Theorem for Circumscription. Then we show that, if the class of P; Z-minimal models of a first-order sentence is Δ-elementary, then it is elementary. That is, whenever the circumscription of a first-order sentence is equivalent to a first-order theory, then it is equivalent to a finitely axiomatizable one. This means that classes of models of circumscribed theories are either elementary or not Δ-elementary. Finally, using the previous result, we prove that, whenever a relation Pi is defined in the class of P; Z-minimal models of a first-order sentence Φ and whenever such class of P; Z-minimal models is Δ-elementary, then there is an explicit definition ψ for Pi such that the class of P; Z-minimal models of Φ is the class of models of Φ ∧ ψ. In other words, the circumscription of P in Φ with Z varied can be replaced by Φ plus this explicit definition ψ for Pi.

  3. Defining Quality in Undergraduate Education

    Directory of Open Access Journals (Sweden)

    Alison W. Bowers

    2018-01-01

    Full Text Available Objectives: This research brief explores the literature addressing quality in undergraduate education to identify what previous research has said about quality and to offer future directions for research on quality in undergraduate education. Method: We conducted a scoping review to provide a broad overview of existing research. Using targeted search terms in academic databases, we identified and reviewed relevant academic literature to develop emergent themes and implications for future research. Results: The exploratory review of the literature revealed a range of thoughtful discussions and empirical studies attempting to define quality in undergraduate education. Many publications highlighted the importance of including different stakeholder perspectives and presented some of the varying perceptions of quality among different stakeholders. Conclusions: While a number of researchers have explored and written about how to define quality in undergraduate education, there is not a general consensus regarding a definition of quality in undergraduate education. Past research offers a range of insights, models, and data to inform future research. Implication for Theory and/or Practice: We provide four recommendations for future research to contribute to a high quality undergraduate educational experience. We suggest more comprehensive systematic reviews of the literature as a next step.

  4. Baseline PSA in a Spanish male population aged 40-49 years anticipates detection of prostate cancer.

    Science.gov (United States)

    Angulo, J C; Viñas, M A; Gimbernat, H; Fata, F Ramón de; Granados, R; Luján, M

    2015-12-01

    We researched the usefulness of optimizing prostate cancer (PC) screening in our community using baseline PSA readings in men between 40-49 years of age. A retrospective study was performed that analyzed baseline PSA in the fifth decade of life and its ability to predict the development of PC in a population of Madrid (Spain). An ROC curve was created and a cutoff was proposed. We compared the evolution of PSA from baseline in patients with consecutive readings using the Friedman test. We established baseline PSA ranges with different risks of developing cancer and assessed the diagnostic utility of the annual PSA velocity (PSAV) in this population. Some 4,304 men aged 40-49 years underwent opportunistic screening over the course of 17 years, with at least one serum PSA reading (6,001 readings) and a mean follow-up of 57.1±36.8 months. Of these, 768 underwent biopsy of some organ, and 104 underwent prostate biopsy. Fourteen patients (.33%) were diagnosed with prostate cancer. The median baseline PSA was .74 (.01-58.5) ng/mL for patients without PC and 4.21 (.76-47.4) ng/mL for those with PC. The median time from the reading to diagnosis was 26.8 (1.5-143.8) months. The optimal cutoff for detecting PC was 1.9ng/mL (sensitivity, 92.86%; specificity, 92.54%; PPV, 3.9%; NPV, 99.97%), and the area under the curve was 92.8%. In terms of the repeated reading, the evolution of the PSA showed no statistically significant differences between the patients without cancer (p=.56) and those with cancer (P=.64). However, a PSAV value >.3ng/mL/year revealed high specificity for detecting cancer in this population. A baseline PSA level ≥1.9ng/mL in Spanish men aged 40-49 years predicted the development of PC. This value could therefore be of use for opportunistic screening at an early age. An appropriate follow-up adapted to the risk of this population needs to be defined, but an annual PSAV ≥.3ng/mL/year appears of use for reaching an early diagnosis. Copyright © 2015 AEU
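
    The cutoffs reported above translate directly into a screening check. A sketch in Python; the least-squares slope for PSAV and the helper names are assumptions (the paper does not state how PSAV was computed), while the 1.9 ng/mL and 0.3 ng/mL/year thresholds are the values reported:

```python
def annual_psav(t_years, psa):
    """PSA velocity (ng/mL/year) as the least-squares slope of PSA
    readings taken at times t_years (in years)."""
    n = len(psa)
    mt, mp = sum(t_years) / n, sum(psa) / n
    num = sum((t - mt) * (p - mp) for t, p in zip(t_years, psa))
    den = sum((t - mt) ** 2 for t in t_years)
    return num / den

def flag_for_followup(baseline_psa, psav):
    # Thresholds from the study: baseline PSA >= 1.9 ng/mL or
    # PSAV >= 0.3 ng/mL/year in men aged 40-49.
    return baseline_psa >= 1.9 or psav >= 0.3
```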

  5. Classification of baseline toxicants for QSAR predictions to replace fish acute toxicity studies.

    Science.gov (United States)

    Nendza, Monika; Müller, Martin; Wenzel, Andrea

    2017-03-22

    Fish acute toxicity studies are required for environmental hazard and risk assessment of chemicals by national and international legislations such as REACH, the regulations of plant protection products and biocidal products, or the GHS (globally harmonised system) for classification and labelling of chemicals. Alternative methods like QSARs (quantitative structure-activity relationships) can replace many ecotoxicity tests. However, complete substitution of in vivo animal tests by in silico methods may not be realistic. For the so-called baseline toxicants, it is possible to predict the fish acute toxicity with sufficient accuracy from log Kow and, hence, valid QSARs can replace in vivo testing. In contrast, excess toxicants and chemicals not reliably classified as baseline toxicants require further in silico, in vitro or in vivo assessments. Thus, the critical task is to discriminate between baseline and excess toxicants. For fish acute toxicity, we derived a scheme based on structural alerts and physicochemical property thresholds to classify chemicals as either baseline toxicants (=predictable by QSARs) or as potential excess toxicants (=not predictable by baseline QSARs). The step-wise approach identifies baseline toxicants (true negatives) in a precautionary way to avoid false negative predictions. Therefore, a certain fraction of false positives can be tolerated, i.e. baseline toxicants without specific effects that may be tested instead of predicted. Application of the classification scheme to a new heterogeneous dataset for diverse fish species results in 40% baseline toxicants, 24% excess toxicants and 36% compounds not classified. Thus, we can conclude that replacing about half of the fish acute toxicity tests by QSAR predictions is realistic to be achieved in the short term. The long-term goals are classification criteria also for further groups of toxicants and to replace as many in vivo fish acute toxicity tests as possible with valid QSAR
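
    The discrimination step described above can be sketched as a small decision function. The structure follows the paper's scheme (structural alerts plus physicochemical thresholds), but the particular log Kow window and the linear baseline-QSAR coefficients below are illustrative placeholders, not the validated values:

```python
def baseline_log_lc50(log_kow, slope=-0.87, intercept=-1.13):
    """Baseline (narcosis) QSAR sketch: predicted fish acute log LC50
    (mol/L) as a linear function of log Kow. The coefficients here are
    placeholders for illustration only."""
    return slope * log_kow + intercept

def classify_fish_acute(log_kow, has_structural_alert):
    """Step-wise classification sketch: only compounds with no structural
    alert and properties inside the applicability window are treated as
    baseline toxicants (i.e. QSAR-predictable)."""
    if has_structural_alert:
        return "potential excess toxicant"
    if not 0.0 <= log_kow <= 6.0:   # illustrative applicability window
        return "not classified"
    return "baseline toxicant"
```

    More hydrophobic baseline toxicants are predicted to be more acutely toxic (lower LC50), which is the log Kow dependence the scheme relies on.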

  6. Software Defined Radio: Basic Principles and Applications

    Directory of Open Access Journals (Sweden)

    José Raúl Machado-Fernández

    2014-12-01

    Full Text Available The author makes a review of the SDR (Software Defined Radio) technology, including hardware schemes and application fields. A low-performance device is presented and several tests are executed with it using free software. With the acquired experience, SDR employment opportunities are identified for low-cost solutions that can solve significant problems. In addition, a list of the most important frameworks related to the technology that have been developed in recent years is offered, recommending the use of three of them.

  7. Phase Startup Initiative Phases 3 and 4 Test Plan and Test Specification (OCRWM)

    International Nuclear Information System (INIS)

    PAJUNEN, A.L.; LANGEVIN, M.J.

    2000-01-01

    Construction for the Spent Nuclear Fuel (SNF) Project facilities is continuing per the Level III Baseline Schedule, and installation of the Fuel Retrieval System (FRS) and Integrated Water Treatment System (IWTS) in K West Basin is now complete. In order to accelerate the project, a phased start up strategy to initiate testing of the FRS and IWTS early in the overall project schedule was proposed (Williams 1999). Wilkinson (1999) expands the definition of the original proposal into four functional testing phases of the Phased Startup Initiative (PSI). Phases 1 and 2 are based on performing functional tests using dummy fuel. This test plan provides overall guidance for Phase 3 and 4 tests, which are performed using actual irradiated N fuel assemblies. The overall objective of the Phase 3 and 4 testing is to verify how the FRS and IWTS respond while processing actual fuel. Conducting these tests early in the project schedule will allow identification and resolution of equipment and process problems before they become activities on the start-up critical path. The specific objectives of this test plan are to: Define the Phase 3 and 4 test scope for the FRS and IWTS; Provide detailed test requirements that can be used to write the specific test procedures; Define data required and measurements to be taken. Where existing methods to obtain these do not exist, enough detail will be provided to define required additional equipment; and Define specific test objectives and acceptance criteria

  8. Phase Startup Initiative Phases 3 and 4 Test Plan and Test Specification (OCRWM)

    Energy Technology Data Exchange (ETDEWEB)

    PAJUNEN, A.L.; LANGEVIN, M.J.

    2000-08-07

    Construction for the Spent Nuclear Fuel (SNF) Project facilities is continuing per the Level III Baseline Schedule, and installation of the Fuel Retrieval System (FRS) and Integrated Water Treatment System (IWTS) in K West Basin is now complete. In order to accelerate the project, a phased start up strategy to initiate testing of the FRS and IWTS early in the overall project schedule was proposed (Williams 1999). Wilkinson (1999) expands the definition of the original proposal into four functional testing phases of the Phased Startup Initiative (PSI). Phases 1 and 2 are based on performing functional tests using dummy fuel. This test plan provides overall guidance for Phase 3 and 4 tests, which are performed using actual irradiated N fuel assemblies. The overall objective of the Phase 3 and 4 testing is to verify how the FRS and IWTS respond while processing actual fuel. Conducting these tests early in the project schedule will allow identification and resolution of equipment and process problems before they become activities on the start-up critical path. The specific objectives of this test plan are to: Define the Phase 3 and 4 test scope for the FRS and IWTS; Provide detailed test requirements that can be used to write the specific test procedures; Define data required and measurements to be taken. Where existing methods to obtain these do not exist, enough detail will be provided to define required additional equipment; and Define specific test objectives and acceptance criteria.

  9. The WITCH Model. Structure, Baseline, Solutions.

    Energy Technology Data Exchange (ETDEWEB)

    Bosetti, V.; Massetti, E.; Tavoni, M.

    2007-07-01

    WITCH - World Induced Technical Change Hybrid - is a regionally disaggregated hard link hybrid global model with a neoclassical optimal growth structure (top down) and an energy input detail (bottom up). The model endogenously accounts for technological change, both through learning curves affecting prices of new vintages of capital and through R and D investments. The model features the main economic and environmental policies in each world region as the outcome of a dynamic game. WITCH belongs to the class of Integrated Assessment Models as it possesses a climate module that feeds climate changes back into the economy. In this paper we provide a thorough discussion of the model structure and baseline projections. We report detailed information on the evolution of energy demand, technology and CO2 emissions. Finally, we explicitly quantify the role of free riding in determining the emissions scenarios. (auth)

  10. Global Nuclear Energy Partnership Waste Treatment Baseline

    International Nuclear Information System (INIS)

    Gombert, Dirk; Ebert, William; Marra, James; Jubin, Robert; Vienna, John

    2008-01-01

    The Global Nuclear Energy Partnership (GNEP) program is designed to demonstrate that a proliferation-resistant and sustainable integrated nuclear fuel cycle can be commercialized and used internationally. Alternative stabilization concepts for byproducts and waste streams generated by fuel recycling processes were evaluated and a baseline set of waste forms was recommended for the safe disposition of waste streams. Specific waste forms are recommended based on the demonstrated or expected commercial practicability and technical maturity of the processes needed to make the waste forms, and expected performance of the waste form materials when disposed. Significant issues remain in developing technologies to process some of the wastes into the recommended waste forms, and a detailed analysis of technology readiness may lead to the choice of a different waste form than what is recommended herein. Evolving regulations could also affect the selection of waste forms. (authors)

  11. In-Space Manufacturing Baseline Property Development

    Science.gov (United States)

    Stockman, Tom; Schneider, Judith; Prater, Tracie; Bean, Quincy; Werkheiser, Nicki

    2016-01-01

    The In-Space Manufacturing (ISM) project at NASA Marshall Space Flight Center currently operates a 3D FDM (fused deposition modeling) printer onboard the International Space Station. In order to enable utilization of this capability by designers, the project needs to establish characteristic material properties for materials produced using the process. This is difficult for additive manufacturing since standards and specifications do not yet exist for these technologies. Limited crew time restricts the sample size, which in turn limits the application of traditional design-allowables approaches to developing a materials property database for designers. In this study, various approaches to the development of material databases were evaluated for use by designers of space systems who wish to leverage in-space manufacturing capabilities. The study focuses on alternative statistical techniques for baseline property development to support in-space manufacturing.

  12. Global Nuclear Energy Partnership Waste Treatment Baseline

    Energy Technology Data Exchange (ETDEWEB)

    Gombert, Dirk; Ebert, William; Marra, James; Jubin, Robert; Vienna, John [Idaho National laboratory, 2525 Fremont Ave., Idaho Falls, ID 83402 (United States)

    2008-07-01

    The Global Nuclear Energy Partnership (GNEP) program is designed to demonstrate that a proliferation-resistant and sustainable integrated nuclear fuel cycle can be commercialized and used internationally. Alternative stabilization concepts for byproducts and waste streams generated by fuel recycling processes were evaluated and a baseline set of waste forms was recommended for the safe disposition of waste streams. Specific waste forms are recommended based on the demonstrated or expected commercial practicability and technical maturity of the processes needed to make the waste forms, and expected performance of the waste form materials when disposed. Significant issues remain in developing technologies to process some of the wastes into the recommended waste forms, and a detailed analysis of technology readiness may lead to the choice of a different waste form than what is recommended herein. Evolving regulations could also affect the selection of waste forms. (authors)

  13. Global Nuclear Energy Partnership Waste Treatment Baseline

    Energy Technology Data Exchange (ETDEWEB)

    Dirk Gombert; William Ebert; James Marra; Robert Jubin; John Vienna

    2008-05-01

    The Global Nuclear Energy Partnership program (GNEP) is designed to demonstrate a proliferation-resistant and sustainable integrated nuclear fuel cycle that can be commercialized and used internationally. Alternative stabilization concepts for byproducts and waste streams generated by fuel recycling processes were evaluated and a baseline of waste forms was recommended for the safe disposition of waste streams. Waste forms are recommended based on the demonstrated or expected commercial practicability and technical maturity of the processes needed to make the waste forms, and performance of the waste form materials when disposed. Significant issues remain in developing technologies to process some of the wastes into the recommended waste forms, and a detailed analysis of technology readiness and availability may lead to the choice of a different waste form than what is recommended herein. Evolving regulations could also affect the selection of waste forms.

  14. Miniature EVA Software Defined Radio

    Science.gov (United States)

    Pozhidaev, Aleksey

    2012-01-01

    As NASA embarks upon developing the Next-Generation Extra Vehicular Activity (EVA) Radio for deep space exploration, the demands on EVA battery life will substantially increase. The number of modes and frequency bands required will continue to grow in order to enable efficient and complex multi-mode operations including communications, navigation, and tracking applications. Whether supporting astronaut excursions, soldiers in the field, or first responders at emergency hazards, NASA has developed an innovative, affordable, miniaturized, power-efficient software defined radio that offers unprecedented flexibility. This lightweight, programmable, S-band, multi-service, frequency-agile EVA software defined radio (SDR) supports data, telemetry, voice, and both standard and high-definition video. Features include a modular design and an easily scalable architecture, and the EVA SDR allows for both stationary and mobile battery-powered handheld operations. Currently, the radio is equipped with an S-band RF section. However, its scalable architecture can accommodate multiple RF sections simultaneously to cover multiple frequency bands. The EVA SDR also supports multiple network protocols. It currently implements a Hybrid Mesh Network based on the 802.11s open standard protocol. The radio targets RF channel data rates up to 20 Mbps and can be equipped with a real-time operating system (RTOS) that can be switched off for power-aware applications. The EVA SDR's modular design permits use of the same hardware at every network node. This approach assures the portability of the same software into any radio in the system. It also brings several benefits to the entire system, including reduced system maintenance, system complexity, and development cost.

  15. Mechanical Thrombectomy in Elderly Stroke Patients with Mild-to-Moderate Baseline Disability.

    Science.gov (United States)

    Slawski, Diana E; Salahuddin, Hisham; Shawver, Julie; Kenmuir, Cynthia L; Tietjen, Gretchen E; Korsnack, Andrea; Zaidi, Syed F; Jumaa, Mouhammad A

    2018-04-01

    The number of elderly patients suffering from ischemic stroke is rising. Randomized trials of mechanical thrombectomy (MT) generally exclude patients over the age of 80 years with baseline disability. The aim of this study was to understand the efficacy and safety of MT in elderly patients, many of whom may have baseline impairment. Between January 2015 and April 2017, 96 patients ≥80 years old who underwent MT for stroke were selected for a chart review. The data included baseline characteristics, time to treatment, the rate of revascularization, procedural complications, mortality, and 90-day good outcome defined as a modified Rankin Scale (mRS) score of 0-2 or return to baseline. Of the 96 patients, 50 had mild baseline disability (mRS score 0-1) and 46 had moderate disability (mRS score 2-4). Recanalization was achieved in 84% of the patients, and the rate of symptomatic hemorrhage was 6%. At 90 days, 34% of the patients had a good outcome. There were no significant differences in good outcome between those with mild and those with moderate baseline disability (43 vs. 24%, p = 0.08), between those aged ≤85 and those aged >85 years (40.8 vs. 26.1%, p = 0.19), and between those treated within and those treated beyond 8 h (39 vs. 20%, p = 0.1). The mortality rate was 38.5% at 90 days. The Alberta Stroke Program Early CT Score (ASPECTS) and the National Institutes of Health Stroke Scale (NIHSS) predicted good outcome regardless of baseline disability (p ...). ... baseline disability, and delayed treatment are associated with sub-optimal outcomes after MT. However, redefining good outcome to include return to baseline functioning demonstrates that one-third of this patient population benefits from MT, suggesting the real-life utility of this treatment.

  16. Forecasting Sensorimotor Adaptability from Baseline Inter-Trial Correlations

    Science.gov (United States)

    Beaton, K. H.; Bloomberg, J. J.

    2014-01-01

    measured in the frequency domain. Therefore, we use the power spectrum (PS), which is the Fourier transform of the ACF, to describe our inter-trial correlations. The decay of the PS yields a straight line on a log-log frequency plot, which we quantify by Beta = - (slope of PS on log-log axes). Hence, Beta is a measure of the strength of inter- trial correlations in the baseline data. Larger Beta values are indicative of longer inter-trial correlations. Experimental Approach: We will begin by performing a retrospective analysis of treadmill-gait adaptation data previously collected by Dr. Bloomberg and colleagues. Specifically, we will quantify the strength of inter-trial correlations in the baseline step cadence and heart rate data and compare it to the locomotor adaptability performance results already described by these investigators. Incorporating these datasets will also allow us to explore the applicability of (and potential limitations surrounding) the use of Beta in forecasting physiological performance. We will also perform a new experiment, in which Beta will be derived from baseline data collected during over-ground (non-treadmill) walking, which will enable us to consider locomotor performance, through the parameter Beta, under the most functionallyrelevant, natural gait condition. This experiment will incorporate two baseline and five post-training over-ground locomotion tests to explore the consistency and potential adaptability of the Beta values themselves. HYPOTHESES: We hypothesize that the strength of baseline inter-trial correlations of step cadence and heart rate will relate to locomotor adaptability. Specifically, we anticipate that individuals who show weaker longer-term inter-trial correlations in baseline step cadence data will be the better adaptors, as step cadence can be modified in real-time (i.e., online corrections are an inherent property of the locomotor system; analogous to results observed in the VOR). 
Conversely, because heart rate is not
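
    The Beta statistic defined above (the negative log-log slope of the power spectrum) is straightforward to estimate. A minimal NumPy sketch, without the windowing or detrending refinements a full analysis might use; white noise and a random walk serve as reference signals with weak and strong inter-trial correlations, respectively:

```python
import numpy as np

def beta_from_series(x):
    """Beta = -(slope of the power spectrum on log-log axes), estimated
    from a baseline inter-trial series x via the periodogram."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    ps = np.abs(np.fft.rfft(x)) ** 2   # power spectrum (Fourier transform of the ACF)
    f = np.fft.rfftfreq(len(x))
    mask = f > 0                       # drop the DC bin before taking logs
    slope, _ = np.polyfit(np.log(f[mask]), np.log(ps[mask] + 1e-12), 1)
    return -slope

rng = np.random.default_rng(0)
white = rng.standard_normal(4096)              # uncorrelated trials: Beta near 0
walk = np.cumsum(rng.standard_normal(4096))    # strongly correlated trials: Beta near 2
```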

  17. Acceptance test report: Field test of mixer pump for 241-AN-107 caustic addition project

    International Nuclear Information System (INIS)

    Leshikar, G.A.

    1997-01-01

    The field acceptance test of a 75 HP mixer pump (Hazleton serial number N-20801) installed in Tank 241-AN-107 was conducted from October 1995 through February 1996. The objectives defined in the acceptance test were successfully met, with two exceptions recorded. The acceptance test encompassed field verification of mixer pump turntable rotation set-up and operation, verification that the pump instrumentation functions within established limits, facilitation of baseline data collection from the mixer pump mounted ultrasonic instrumentation, verification of mixer pump water flush system operation and validation of a procedure for its operation, and several brief test runs (bump) of the mixer pump

  18. 29 CFR 779.247 - “Goods” defined.

    Science.gov (United States)

    2010-07-01

    ... OR SERVICES; Employment to Which the Act May Apply; Enterprise Coverage; Interstate Inflow Test Under... "Goods" is defined in section 3(i) of the Act. The statutory definition is quoted in § 779.14, and is...

  19. 40 CFR 80.915 - How are the baseline toxics value and baseline toxics volume determined?

    Science.gov (United States)

    2010-07-01

    ... baseline toxics value if it can determine an applicable toxics value for every batch of gasoline produced... of gasoline batch i produced or imported between January 1, 1998 and December 31, 2000, inclusive. i = Individual batch of gasoline produced or imported between January 1, 1998 and December 31, 2000, inclusive. n...

  20. Pakistan, Sindh Province - Baseline Indicators System : Baseline Procurement Performance Assessment Report

    OpenAIRE

    World Bank

    2009-01-01

    This document provides an assessment of the public procurement system in Sindh province using the baseline indicators system developed by the Development Assistance Committee of the Organization for Economic Cooperation and Development (OECD-DAC). For this assessment, interviews and discussions were held with stakeholders from the public and private sectors as well as civil society. Developing...

  1. Systemic inflammatory response syndrome criteria in defining severe sepsis.

    Science.gov (United States)

    Kaukonen, Kirsi-Maija; Bailey, Michael; Pilcher, David; Cooper, D Jamie; Bellomo, Rinaldo

    2015-04-23

    The consensus definition of severe sepsis requires suspected or proven infection, organ failure, and signs that meet two or more criteria for the systemic inflammatory response syndrome (SIRS). We aimed to test the sensitivity, face validity, and construct validity of this approach. We studied data from patients from 172 intensive care units in Australia and New Zealand from 2000 through 2013. We identified patients with infection and organ failure and categorized them according to whether they had signs meeting two or more SIRS criteria (SIRS-positive severe sepsis) or less than two SIRS criteria (SIRS-negative severe sepsis). We compared their characteristics and outcomes and assessed them for the presence of a step increase in the risk of death at a threshold of two SIRS criteria. Of 1,171,797 patients, a total of 109,663 had infection and organ failure. Among these, 96,385 patients (87.9%) had SIRS-positive severe sepsis and 13,278 (12.1%) had SIRS-negative severe sepsis. Over a period of 14 years, these groups had similar characteristics and changes in mortality (SIRS-positive group: from 36.1% [829 of 2296 patients] to 18.3% [2037 of 11,119], P<0.001; SIRS-negative group: from 27.7% [100 of 361] to 9.3% [122 of 1315], P<0.001). Moreover, this pattern remained similar after adjustment for baseline characteristics (odds ratio in the SIRS-positive group, 0.96; 95% confidence interval [CI], 0.96 to 0.97; odds ratio in the SIRS-negative group, 0.96; 95% CI, 0.94 to 0.98; P=0.12 for between-group difference). In the adjusted analysis, mortality increased linearly with each additional SIRS criterion (odds ratio for each additional criterion, 1.13; 95% CI, 1.11 to 1.15; P<0.001) without any transitional increase in risk at a threshold of two SIRS criteria. The need for two or more SIRS criteria to define severe sepsis excluded one in eight otherwise similar patients with infection, organ failure, and substantial mortality and failed to define a transition point in
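
    The two-criteria threshold examined above is a simple count over the four consensus SIRS criteria. A sketch in Python using the standard thresholds; parameter names and units are illustrative:

```python
def sirs_count(temp_c, heart_rate, resp_rate, paco2_mmhg, wbc_thousands, bands_pct):
    """Number of SIRS criteria met: temperature >38 or <36 C; heart rate
    >90/min; respiratory rate >20/min or PaCO2 <32 mmHg; WBC >12,000 or
    <4,000 per mm^3 or >10% immature bands."""
    return sum([
        temp_c > 38.0 or temp_c < 36.0,
        heart_rate > 90,
        resp_rate > 20 or paco2_mmhg < 32,
        wbc_thousands > 12 or wbc_thousands < 4 or bands_pct > 10,
    ])

def sirs_positive_severe_sepsis(infection, organ_failure, n_sirs):
    # The consensus definition studied above: suspected or proven infection,
    # organ failure, and at least two SIRS criteria.
    return infection and organ_failure and n_sirs >= 2
```

    The study's point is that patients failing only the `n_sirs >= 2` gate (SIRS-negative) were otherwise similar and had substantial mortality, so the gate excludes about one in eight such patients.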

  2. The benefits of defining "snacks".

    Science.gov (United States)

    Hess, Julie M; Slavin, Joanne L

    2018-04-18

    Whether eating a "snack" is considered a beneficial or detrimental behavior is largely based on how "snack" is defined. The term "snack food" tends to connote energy-dense, nutrient-poor foods high in nutrients to limit (sugar, sodium, and/or saturated fat) like cakes, cookies, chips and other salty snacks, and sugar-sweetened beverages. Eating a "snack food" is often conflated with eating a "snack," however, leading to an overall perception of snacks as a dietary negative. Yet the term "snack" can also refer simply to an eating occasion outside of breakfast, lunch, or dinner. With this definition, the evidence to support health benefits or detriments to eating a "snack" remains unclear, in part because relatively few well-designed studies that specifically focus on the impact of eating frequency on health have been conducted. Despite these inconsistencies and research gaps, in much of the nutrition literature, "snacking" is still referred to as detrimental to health. As discussed in this review, however, there are multiple factors that influence the health impacts of snacking, including the definition of "snack" itself, the motivation to snack, body mass index of snack eaters, and the food selected as a snack. Without a definition of "snack" and a body of research using methodologically rigorous protocols, determining the health impact of eating a "snack" will continue to elude the nutrition research community and prevent the development of evidence-based policies about snacking that support public health. Copyright © 2018 Elsevier Inc. All rights reserved.

  3. The London low emission zone baseline study.

    Science.gov (United States)

    Kelly, Frank; Armstrong, Ben; Atkinson, Richard; Anderson, H Ross; Barratt, Ben; Beevers, Sean; Cook, Derek; Green, Dave; Derwent, Dick; Mudway, Ian; Wilkinson, Paul

    2011-11-01

    On February 4, 2008, the world's largest low emission zone (LEZ) was established. At 2644 km2, the zone encompasses most of Greater London. It restricts the entry of the oldest and most polluting diesel vehicles, including heavy-goods vehicles (haulage trucks), buses and coaches, larger vans, and minibuses. It does not apply to cars or motorcycles. The LEZ scheme will introduce increasingly stringent Euro emissions standards over time. The creation of this zone presented a unique opportunity to estimate the effects of a stepwise reduction in vehicle emissions on air quality and health. Before undertaking such an investigation, robust baseline data were gathered on air quality and the oxidative activity and metal content of particulate matter (PM) from air pollution monitors located in Greater London. In addition, methods were developed for using databases of electronic primary-care records in order to evaluate the zone's health effects. Our study began in 2007, using information about the planned restrictions in an agreed-upon LEZ scenario and year-on-year changes in the vehicle fleet in models to predict air pollution concentrations in London for the years 2005, 2008, and 2010. Based on this detailed emissions and air pollution modeling, the areas in London were then identified that were expected to show the greatest changes in air pollution concentrations and population exposures after the implementation of the LEZ. Using these predictions, the best placement of a pollution monitoring network was determined and the feasibility of evaluating the health effects using electronic primary-care records was assessed. To measure baseline pollutant concentrations before the implementation of the LEZ, a comprehensive monitoring network was established close to major roadways and intersections. 
Output-difference plots from statistical modeling for 2010 indicated seven key areas likely to experience the greatest change in concentrations of nitrogen dioxide (NO2) (at least 3

  4. Baseline requirements for assessment of mining impact using biological monitoring

    International Nuclear Information System (INIS)

    Humphrey, C.L.; Dostine, P.L.

    1995-01-01

    Biological monitoring programmes for environmental protection should provide for both early detection of possible adverse effects, and assessment of the ecological significance of these effects. Monitoring techniques are required that include responses sensitive to the impact, that can be subjected to rigorous statistical analysis, and for which statistical power is high. Such issues in baseline research of 'what and how to measure?' and 'for how long?' have been the focus of a programme being developed to monitor and assess effects of mining operations on the essentially pristine, freshwater ecosystems of the Alligator Rivers Region (ARR) in tropical northern Australia. Application of the BACIP (Before, After, Control, Impact, Paired differences) design, utilizing a form of temporal replication, to univariate (single species) and multivariate (community) data is described. The BACIP design incorporates data from single control and impact sites. We argue for modification of the design for particular studies conducted in streams, to incorporate additional independent control sites from adjacent catchments. Inferential power, by way of (i) more confidently attributing cause to an observed change and (ii) providing information about the ecological significance of the change, will be enhanced using a modified BACIP design. In highly valued environments such as the ARR, monitoring programmes require application of statistical tests with high power to guarantee that no impact greater than a prescribed amount has gone undetected. A minimum number of baseline years using the BACIP approach would therefore be required in order to achieve some desired level of statistical power. This paper describes the results of power analyses conducted on 2-5 years (depending upon the technique) of baseline data from streams of the ARR and discusses the implications of these results for management. 44 refs., 1 tab., 3 figs
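The paired-differences logic behind BACIP can be sketched in a few lines (this is an illustrative simplification with made-up numbers, not the authors' actual analysis): control-minus-impact differences sampled on several occasions before the impact are compared with differences sampled after it, using a pooled two-sample t statistic.

```python
from statistics import mean, variance

def bacip_t(before, after):
    """Pooled two-sample t statistic comparing control-minus-impact
    paired differences sampled before vs. after a putative impact.
    A large |t| suggests change beyond baseline variability."""
    n1, n2 = len(before), len(after)
    sp2 = ((n1 - 1) * variance(before) + (n2 - 1) * variance(after)) / (n1 + n2 - 2)
    return (mean(after) - mean(before)) / (sp2 * (1 / n1 + 1 / n2)) ** 0.5

# Made-up control-minus-impact differences for one species response,
# over five baseline and five post-impact sampling occasions.
before = [0.1, -0.2, 0.0, 0.3, -0.1]
after = [0.9, 1.1, 0.8, 1.2, 1.0]
t_stat = bacip_t(before, after)
```

The number of baseline occasions (here five) drives the power of this test, which is exactly why the abstract asks how many baseline years are enough.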

  5. Defining Auditory-Visual Objects: Behavioral Tests and Physiological Mechanisms.

    Science.gov (United States)

    Bizley, Jennifer K; Maddox, Ross K; Lee, Adrian K C

    2016-02-01

    Crossmodal integration is a term applicable to many phenomena in which one sensory modality influences task performance or perception in another sensory modality. We distinguish the term binding as one that should be reserved specifically for the process that underpins perceptual object formation. To unambiguously differentiate binding from other types of integration, behavioral and neural studies must investigate perception of a feature orthogonal to the features that link the auditory and visual stimuli. We argue that supporting true perceptual binding (as opposed to other processes such as decision-making) is one role for cross-sensory influences in early sensory cortex. These early multisensory interactions may therefore form a physiological substrate for the bottom-up grouping of auditory and visual stimuli into auditory-visual (AV) objects. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  6. Defining Integrated Science Education and Putting It to Test

    OpenAIRE

    Åström, Maria

    2008-01-01

    The thesis is made up by four studies, on the comprehensive theme of integrated and subject-specific science education in Swedish compulsory school. A literature study on the matter is followed by an expert survey, then a case study and ending with two analyses of students' science results from PISA 2003 and PISA 2006. The first two studies explore similarities and differences between integrated and subject-specific science education, i.e. Science education and science taught as Biology, Chem...

  7. 1995 Baseline solid waste management system description

    International Nuclear Information System (INIS)

    Anderson, G.S.; Konynenbelt, H.S.

    1995-09-01

    This document provides a detailed solid waste system description that documents the treatment, storage, and disposal (TSD) strategy for managing Hanford's solid low-level waste, low-level mixed waste, transuranic and transuranic mixed waste, and greater-than-Class III waste. This system description is intended for use by managers of the solid waste program, facility and system planners, as well as system modelers. The system description identifies the TSD facilities that constitute the solid waste system and defines these facilities' interfaces, schedules, and capacities. It also provides the strategy for treating each of the waste streams generated or received by the Hanford Site from generation or receipt through final destination

  8. The baseline pressure of intracranial pressure (ICP) sensors can be altered by electrostatic discharges.

    Science.gov (United States)

    Eide, Per K; Bakken, André

    2011-08-22

    The monitoring of intracranial pressure (ICP) has a crucial role in the surveillance of patients with brain injury. During long-term monitoring of ICP, we have seen spontaneous shifts in baseline pressure (ICP sensor zero point), which are of technical and not physiological origin. The aim of the present study was to explore whether or not baseline pressures of ICP sensors can be affected by electrostatic discharges (ESDs), when ESDs are delivered at clinically relevant magnitudes. We performed bench-testing of a set of commercial ICP sensors. In our experimental setup, the ICP sensor was placed in a container with 0.9% NaCl solution. A test person was charged to 0.5-10 kV, and then delivered ESDs to the sensor by touching a metal rod that was located in the container. The continuous pressure signals were recorded before and after the ESDs, and the pressure readings were stored digitally using a computerized system. A total of 57 sensors were tested, including 25 Codman ICP sensors and 32 Raumedic sensors. When charging the test person in the range 0.5-10 kV, typically ESDs in the range 0.5-5 kV peak pulse were delivered to the ICP sensor. Alterations in baseline pressure ≥ 2 mmHg were seen in 24 of 25 (96%) Codman sensors and in 17 of 32 (53%) Raumedic sensors. Lasting changes in baseline pressure > 10 mmHg, which in the clinical setting would affect patient management, were seen frequently for both sensor types. The changes in baseline pressure were characterized either by sudden shifts or by gradual drifts in baseline pressure. The baseline pressures of commercial solid ICP sensors can be altered by ESDs at discharge magnitudes that are clinically relevant. Shifts in baseline pressure change the ICP levels visualised to the physician on the monitor screen, and thereby display incorrect ICP values, which likely represents a severe risk to the patient.
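The shift detection described here can be illustrated with a minimal sketch (hypothetical readings, not the study's bench data): compare the median pressure in windows before and after a discharge event, and flag a lasting zero-point shift at the study's 2 mmHg threshold.

```python
from statistics import median

def baseline_shift(pre, post, threshold=2.0):
    """Estimate a zero-point shift as the difference in median
    pressure (mmHg) between windows before and after an event;
    flag it if the magnitude reaches the threshold."""
    shift = median(post) - median(pre)
    return shift, abs(shift) >= threshold

pre = [10.1, 10.0, 9.9, 10.2, 10.0]    # mmHg before the ESD (made-up)
post = [13.0, 13.2, 12.9, 13.1, 13.0]  # mmHg after the ESD (made-up)
shift, flagged = baseline_shift(pre, post)
```

Medians are used rather than means so that brief transients during the discharge itself do not masquerade as lasting baseline changes.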

  9. Camera Trajectory from Wide Baseline Images

    Science.gov (United States)

    Havlena, M.; Torii, A.; Pajdla, T.

    2008-09-01

    Camera trajectory estimation, which is closely related to the structure from motion computation, is one of the fundamental tasks in computer vision. Reliable camera trajectory estimation plays an important role in 3D reconstruction, self localization, and object recognition. There are essential issues for a reliable camera trajectory estimation, for instance, choice of the camera and its geometric projection model, camera calibration, image feature detection and description, and robust 3D structure computation. Most approaches rely on classical perspective cameras because of the simplicity of their projection models and ease of their calibration. However, classical perspective cameras offer only a limited field of view, and thus occlusions and sharp camera turns may cause consecutive frames to look completely different when the baseline becomes longer. This makes the image feature matching very difficult (or impossible) and the camera trajectory estimation fails under such conditions. These problems can be avoided if omnidirectional cameras, e.g. a fish-eye lens convertor, are used. The hardware which we are using in practice is a combination of Nikon FC-E9 mounted via a mechanical adaptor onto a Kyocera Finecam M410R digital camera. Nikon FC-E9 is a megapixel omnidirectional add-on convertor with 180° view angle which provides images of photographic quality. Kyocera Finecam M410R delivers 2272×1704 images at 3 frames per second. The resulting combination yields a circular view of diameter 1600 pixels in the image. Since consecutive frames of the omnidirectional camera often share a common region in 3D space, the image feature matching is often feasible. On the other hand, the calibration of these cameras is non-trivial and is crucial for the accuracy of the resulting 3D reconstruction. We calibrate omnidirectional cameras off-line using the state-of-the-art technique and Mičušík's two-parameter model, which links the radius of the image point r to the

  10. Integrating risk management into the baselining process

    International Nuclear Information System (INIS)

    Jennett, N.; Tonkinson, A.

    1994-01-01

    These processes work together in building the project baseline (comprising the technical, schedule, and cost baselines) against which performance is measured and changes to the scope, schedule, and cost of a project are managed and controlled. Risk analysis is often performed as the final element of the scheduling or estimating processes, a precursor to establishing cost and schedule contingency. However, best business practices dictate that information that may be crucial to the success of a project be analyzed and incorporated into project planning as soon as it is available and usable. The purpose of risk management is not to eliminate risk. Neither is it intended to suggest wholesale re-estimating and re-scheduling of a project. Rather, the intent is to make provisions to reduce and control the schedule and/or cost ramifications of risk by anticipating events and conditions that cannot be reliably planned for and which have the potential to negatively impact accomplishment of the technical objectives and requirements of the project

  11. Shifting environmental baselines in the Red Sea.

    Science.gov (United States)

    Price, A R G; Ghazi, S J; Tkaczynski, P J; Venkatachalam, A J; Santillan, A; Pancho, T; Metcalfe, R; Saunders, J

    2014-01-15

    The Red Sea is among the world's top marine biodiversity hotspots. We re-examined coastal ecosystems at sites surveyed during the 1980s using the same methodology. Coral cover increased significantly towards the north, mirroring the reverse pattern for mangroves and other sedimentary ecosystems. Latitudinal patterns are broadly consistent across both surveys and with results from independent studies. Coral cover showed greatest change, declining significantly from a median score of 4 (1000-9999 m(2)) to 2 (10-99 m(2)) per quadrat in 2010/11. This may partly reflect impact from coastal construction, which was evident at 40% of sites and has significantly increased in magnitude over 30 years. Beach oil has significantly declined, but shore debris has increased significantly. Although substantial, levels are lower than at some remote ocean atolls. While earlier reports have suggested that the Red Sea is generally healthy, shifting environmental baselines are evident from the current study. Copyright © 2013 Elsevier Ltd. All rights reserved.

  12. FAIR - Baseline technical report. Executive summary

    International Nuclear Information System (INIS)

    Gutbrod, H.H.; Augustin, I.; Eickhoff, H.; Gross, K.D.; Henning, W.F.; Kraemer, D.; Walter, G.

    2006-09-01

    This document presents the Executive Summary, the first of six volumes comprising the 2006 Baseline Technical Report (BTR) for the international FAIR project (Facility for Antiproton and Ion Research). The BTR provides the technical description, cost, schedule, and assessments of risk for the proposed new facility. The purpose of the BTR is to provide a reliable basis for the construction, commissioning and operation of FAIR. The BTR is one of the central documents requested by the FAIR International Steering Committee (ISC) and its working groups, in order to prepare the legal process and the decisions on the construction and operation of FAIR in an international framework. It provides the technical basis for legal contracts on contributions to be made by, so far, 13 countries within the international FAIR Consortium. The BTR begins with this extended Executive Summary as Volume 1, which is also intended for use as a stand-alone document. The Executive Summary provides brief summaries of the accelerator facilities, the scientific programs and experimental stations, civil construction and safety, and of the work/project structure, costs and schedule. (orig.)

  13. Defining Tobacco Regulatory Science Competencies.

    Science.gov (United States)

    Wipfli, Heather L; Berman, Micah; Hanson, Kacey; Kelder, Steven; Solis, Amy; Villanti, Andrea C; Ribeiro, Carla M P; Meissner, Helen I; Anderson, Roger

    2017-02-01

    In 2013, the National Institutes of Health and the Food and Drug Administration funded a network of 14 Tobacco Centers of Regulatory Science (TCORS) with a mission that included research and training. A cross-TCORS Panel was established to define tobacco regulatory science (TRS) competencies to help harmonize and guide their emerging educational programs. The purpose of this paper is to describe the Panel's work to develop core TRS domains and competencies. The Panel developed the list of domains and competencies using a semistructured Delphi method divided into four phases occurring between November 2013 and August 2015. The final proposed list included a total of 51 competencies across six core domains and 28 competencies across five specialized domains. There is a need for continued discussion to establish the utility of the proposed set of competencies for emerging TRS curricula and to identify the best strategies for incorporating these competencies into TRS training programs. Given the field's broad multidisciplinary nature, further experience is needed to refine the core domains that should be covered in TRS training programs versus knowledge obtained in more specialized programs. Regulatory science to inform the regulation of tobacco products is an emerging field. The paper provides an initial list of core and specialized domains and competencies to be used in developing curricula for new and emerging training programs aimed at preparing a new cohort of scientists to conduct critical TRS research. © The Author 2016. Published by Oxford University Press on behalf of the Society for Research on Nicotine and Tobacco. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  14. A baseline for the multivariate comparison of resting state networks

    Directory of Open Access Journals (Sweden)

    Elena A Allen

    2011-02-01

    Full Text Available As the size of functional and structural MRI datasets expands, it becomes increasingly important to establish a baseline from which diagnostic relevance may be determined, a processing strategy that efficiently prepares data for analysis, and a statistical approach that identifies important effects in a manner that is both robust and reproducible. In this paper, we introduce a multivariate analytic approach that optimizes sensitivity and reduces unnecessary testing. We demonstrate the utility of this mega-analytic approach by identifying the effects of age and gender on the resting state networks of 603 healthy adolescents and adults (mean age: 23.4 years, range: 12 to 71 years). Data were collected on the same scanner, preprocessed using an automated analysis pipeline based in SPM, and studied using group independent component analysis. Resting state networks were identified and evaluated in terms of three primary outcome measures: time course spectral power, spatial map intensity, and functional network connectivity. Results revealed robust effects of age on all three outcome measures, largely indicating decreases in network coherence and connectivity with increasing age. Gender effects were of smaller magnitude but suggested stronger intra-network connectivity in females and more inter-network connectivity in males, particularly with regard to sensorimotor networks. These findings, along with the analysis approach and statistical framework described here, provide a useful baseline for future investigations of brain networks in health and disease.

  15. Baseline metal concentrations in Paramoera walkeri from East Antarctica

    International Nuclear Information System (INIS)

    Palmer, Anne S. E-mail: Anne.Palmer@utas.edu.au; Snape, Ian; Stark, Jonathan S.; Johnstone, Glenn J.; Townsend, Ashley T.

    2006-01-01

    Remediation of the Thala Valley waste disposal site near Casey Station, East Antarctica was conducted in the austral summer of 2003/2004. Biomonitoring of the adjacent marine environment was undertaken using the gammaridean amphipod Paramoera walkeri as a sentinel species [Stark, J.S., Johnstone, G.J., Palmer, A.S., Snape, I., Larner, B.L., Riddle, M.J., in press, doi:10.1016/j.marpolbul.2006.05.020. Monitoring the remediation of a near shore waste disposal site in Antarctica using the amphipod Paramoera walkeri and diffusive gradients in thin films (DGTs). Marine Pollution Bulletin and references therein]. Determination of uptake of metals and hypothesis testing for differences that could be attributed to contamination required the establishment of baseline metal concentrations in P. walkeri. Baseline metal concentrations from two reference locations in the Windmill Islands are presented here. P. walkeri was found to be a sensitive bioaccumulating organism that recorded spatial and temporal variability at the reference sites. Measurement of metals in P. walkeri required the development of a simple digestion procedure that used concentrated nitric acid. For the first time, rare earth metals were determined, with additional clean procedures required to measure ultra-low concentrations using magnetic sector ICP-MS. Certified and in-house reference materials were employed to ensure method reliability

  16. On the baseline evolution of automobile fuel economy in Europe

    International Nuclear Information System (INIS)

    Zachariadis, Theodoros

    2006-01-01

    'Business as usual' scenarios in long-term energy forecasts are crucial for scenario-based policy analyses. This article focuses on fuel economy of passenger cars and light trucks, a long-disputed issue with serious implications for worldwide energy use and CO2 emissions. The current status in Europe is explained and future developments are analysed with the aid of historical data of the last three decades from the United States and Europe. As a result of this analysis, fuel economy values are proposed for use as assumptions in baseline energy/transport scenarios in the 15 'old' European Union Member States. Proposed values are given for new gasoline and diesel cars and for the years 2010, 2020 and 2030. The increasing discrepancy between vehicle fuel consumption measured under test conditions and that in the real world is also considered. One main conclusion is that the European Commission's voluntary agreement with the automobile industry should not be assumed to fully achieve its target under baseline conditions, nor should it be regarded as a major stimulus for autonomous vehicle efficiency improvements after 2010. A second conclusion is that three very recent studies enjoying authority across the EU tend to be overly optimistic as regards the technical progress for conventional and alternative vehicle propulsion technologies under 'business as usual' conditions

  17. Esophageal Baseline Impedance Reflects Mucosal Integrity and Predicts Symptomatic Outcome With Proton Pump Inhibitor Treatment.

    Science.gov (United States)

    Xie, Chenxi; Sifrim, Daniel; Li, Yuwen; Chen, Minhu; Xiao, Yinglian

    2018-01-30

    Esophageal baseline impedance, which is decreased in gastroesophageal reflux disease (GERD) patients, is related to the severity of acid reflux and the integrity of the esophageal mucosa. The study aims to compare the baseline impedance and the dilated intercellular spaces (DIS) within patients with typical reflux symptoms and to evaluate the correlation of baseline impedance with DIS, esophageal acid exposure, as well as the efficacy of proton pump inhibitor (PPI) treatment. Ninety-two patients and 10 healthy controls were included in the study. Erosive esophagitis (EE) was defined by esophageal mucosal erosion under upper endoscopy. Patients without mucosa erosion were divided into groups with pathologic acid reflux (non-erosive reflux disease [NERD]) or with hypersensitive esophagus. The biopsies of esophageal mucosa were taken 2-4 cm above the gastroesophageal junction Z-line during upper endoscopy for DIS measurement. All the patients received esomeprazole 20 mg twice-daily treatment for 8 weeks. The efficacy of esomeprazole was evaluated among all patients. The intercellular spaces were dilated in both EE and NERD patients. The baseline impedance was decreased in both EE patients and NERD patients, and was negatively correlated with the acid exposure time (r = -0.527) and with DIS (r = -0.230). "Baseline impedance > 1764 Ω" was an independent predictor of PPI failure (OR, 11.9; 95% CI, 2.4-58.9). A low baseline impedance was observed in patients with mucosa erosion or pathological acid reflux. The baseline impedance reflected mucosal integrity and was sensitive to esophageal acid exposure. Patients with high impedance might not benefit from the PPI treatment.
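To illustrate how figures of the form "OR, 11.9; 95% CI, 2.4-58.9" arise (the counts below are hypothetical and are not the study's data), an odds ratio with a Woolf-type confidence interval can be computed from a 2×2 table of outcome by predictor status:

```python
from math import exp, log, sqrt

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio with a Woolf 95% CI from a 2x2 table:
    a/b = failures/successes in the exposed group,
    c/d = failures/successes in the unexposed group."""
    or_ = (a * d) / (b * c)
    se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo, hi = exp(log(or_) - z * se), exp(log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: PPI failure vs. success, split by
# high vs. low baseline impedance.
or_, lo, hi = odds_ratio_ci(12, 8, 5, 40)
```

Note the wide interval that results from small cell counts, which is why the study's reported CI (2.4-58.9) spans more than an order of magnitude; the paper's "independent predictor" claim comes from multivariable logistic regression, of which this is only the single-table analogue.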

  18. 100-D Area technical baseline report

    International Nuclear Information System (INIS)

    Carpenter, R.W.

    1993-01-01

    This document is prepared in support of the 100 Area Environmental Restoration activity at the US Department of Energy's Hanford Site near Richland, Washington. It provides a technical baseline of waste sites located at the 100-D Area. The report is based on an environmental investigation undertaken by the Westinghouse Hanford Company (WHC) History Office in support of the Environmental Restoration Engineering Function and on review and evaluation of numerous Hanford Site current and historical reports, drawings, and photographs, supplemented by site inspections and employee interviews. No intrusive field investigation or sampling was conducted. All Hanford coordinate locations are approximate locations taken from several different maps and drawings of the 100-D Area. Every effort was made to derive coordinate locations for the center of each facility or waste site, except where noted, using standard measuring devices. Units of measure are shown as they appear in reference documents. The 100-D Area is made up of three operable units: 100-DR-1, 100-DR-2, and 100-DR-3. All three are addressed in this report. These operable units include liquid and solid waste disposal sites in the vicinity of, and related to, the 100-D and 100-DR Reactors. A fourth operable unit, 100-HR-3, is concerned with groundwater and is not addressed here. This report describes waste sites which include cribs, trenches, pits, french drains, retention basins, solid waste burial grounds, septic tanks, and drain fields. Each waste site is described separately and photographs are provided where available. A complete list of photographs can be found in Appendix A. A comprehensive environmental summary is not provided here but may be found in Hanford Site National Environmental Policy Act Characterization (Cushing 1988), which describes the geology and soils, meteorology, hydrology, land use, population, and air quality of the area

  19. The very-long-baseline array

    International Nuclear Information System (INIS)

    Kellermann, K.I.; Thompson, A.R.

    1988-01-01

    The development of radio technology in World War II opened a completely new window on the universe. When astronomers turned radio antennas to the heavens, they began to find a previously unknown universe of solar and planetary radio bursts, quasars, pulsars, radio galaxies, giant molecular clouds and cosmic masers. Not only do the radio waves reveal a new world of astronomical phenomena but also, because they are much longer than light waves, they are not as severely distorted by atmospheric turbulence or small imperfections in the telescope. About 25 years ago radio astronomers became aware that they could synthesize a resolution equivalent to that of a large aperture by combining data from smaller radio antennas that are widely separated. The effective aperture size would be about equal to the largest separation between the antennas. The technique is called synthesis imaging and is based on the principles of interferometry. Radio astronomers in the U.S. are now building a synthesis radio telescope called the Very-Long-Baseline Array, or VLBA. With 10 antennas sited across the country from the Virgin Islands to Hawaii, it will synthesize a radio antenna 8,000 kilometers across, nearly the diameter of the earth. The VLBA's angular resolution will be less than a thousandth of an arc-second, about three orders of magnitude better than that of the largest conventional ground-based optical telescopes. Astronomers eagerly await the completion early in the next decade of the VLBA, which is expected, among other things, to give an unprecedentedly clear view into the cores of quasars and galactic nuclei and to reveal details of the processes, thought to be powered by black holes, that drive them

  20. Drawing a baseline in aesthetic quality assessment

    Science.gov (United States)

    Rubio, Fernando; Flores, M. Julia; Puerta, Jose M.

    2018-04-01

    Aesthetic classification of images is an inherently subjective task. There does not exist a validated collection of images/photographs labeled as having good or bad quality by experts. Nowadays, the closest approximation to that is to use databases of photos where a group of users rate each image. Hence, there is not a unique good/bad label but a rating distribution given by users voting. Due to this peculiarity, it is not possible to state the problem of binary aesthetic supervised classification in such a direct mode as other Computer Vision tasks. Recent literature follows an approach where researchers use the average rates from the users for each image and establish an arbitrary threshold to determine their class or label. In this way, images above the threshold are considered of good quality, while images below the threshold are seen as bad quality. This paper analyzes current literature, and it reviews those attributes able to represent an image, differentiating three families: specific, general and deep features. Among those which have proved more competitive, we have selected a representative subset, our main goal being to establish a clear experimental framework. Finally, once features were selected, we have used them for the full AVA dataset. We have to remark that to perform validation we report not only accuracy values, which are not very informative in this case, but also metrics able to evaluate classification power within imbalanced datasets. We have conducted a series of experiments so that distinct well-known classifiers are learned from data. Like that, this paper provides what we could consider valuable and valid baseline results for the given problem.
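The point about accuracy being uninformative on imbalanced datasets can be made concrete with a small sketch (illustrative labels only, not the AVA data): a degenerate classifier that always predicts the majority class scores high plain accuracy but only chance-level balanced accuracy.

```python
def confusion(y_true, y_pred):
    """Counts for a binary confusion matrix (1 = positive class)."""
    tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
    tn = sum(t == 0 and p == 0 for t, p in zip(y_true, y_pred))
    fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
    return tp, tn, fp, fn

def balanced_accuracy(y_true, y_pred):
    """Mean of sensitivity and specificity; 0.5 is chance level."""
    tp, tn, fp, fn = confusion(y_true, y_pred)
    return 0.5 * (tp / (tp + fn) + tn / (tn + fp))

# 90% of images labeled "good" (1): always predicting 1 gives
# 0.9 plain accuracy but only 0.5 balanced accuracy.
y_true = [1] * 9 + [0]
y_pred = [1] * 10
acc = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)
bal = balanced_accuracy(y_true, y_pred)
```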

  1. Pentek concrete scabbling system: Baseline report; Greenbook (chapter)

    International Nuclear Information System (INIS)

    1997-01-01

    The Pentek scabbling technology was tested at Florida International University (FIU) and is being evaluated as a baseline technology. This report evaluates it for safety and health issues. It is a commercially available technology and has been used for various projects at locations throughout the country. The Pentek concrete scabbling system consisted of the MOOSE, SQUIRREL-I, and SQUIRREL-III scabblers. The scabblers are designed to scarify concrete floors and slabs using cross-section, tungsten carbide tipped bits. The bits are designed to remove concrete in 3/8 inch increments. The bits are either 9-tooth or demolition type. The scabblers are used with a vacuum system designed to collect and filter the concrete dust and contamination that is removed from the surface. The safety and health evaluation conducted during the testing demonstration focused on two main areas of exposure: dust and noise. Dust exposure was minimal, but noise exposure was significant. Further testing for each of these exposures is recommended. Because of the outdoor environment where the testing demonstration took place, results may be inaccurate. It is feasible that the dust and noise levels will be higher in an enclosed operating environment. Other areas of concern were arm-hand vibration, whole-body vibration, ergonomics, heat stress, tripping hazards, electrical hazards, machine guarding, and lockout/tagout

  2. Pentek metal coating removal system: Baseline report; Greenbook (chapter)

    International Nuclear Information System (INIS)

    1997-01-01

    The Pentek coating removal technology was tested and is being evaluated at Florida International University (FIU) as a baseline technology. In conjunction with FIU's evaluation of efficiency and cost, this report covers the evaluation conducted for safety and health issues. It is a commercially available technology and has been used for various projects at locations throughout the country. The Pentek coating removal system consisted of the ROTO-PEEN Scaler, CORNER-CUTTER®, and VAC-PAC®. They are designed to remove coatings from steel, concrete, brick, and wood. The Scaler uses 3M Roto Peen tungsten carbide cutters while the CORNER-CUTTER® uses solid needles for descaling activities. These hand tools are used with the VAC-PAC® vacuum system to capture dust and debris as removal of the coating takes place. The safety and health evaluation during the testing demonstration focused on two main areas of exposure: dust and noise. Dust exposure was minimal, but noise exposure was significant. Further testing for each exposure is recommended because of the environment where the testing demonstration took place. It is feasible that the dust and noise levels will be higher in an enclosed operating environment of different construction. In addition, other areas of concern found were arm-hand vibration, whole-body vibration, ergonomics, heat stress, tripping hazards, electrical hazards, machine guarding, and lockout/tagout

  3. 10 CFR 850.20 - Baseline beryllium inventory.

    Science.gov (United States)

    2010-01-01

    ... 10 Energy 4 2010-01-01 2010-01-01 false Baseline beryllium inventory. 850.20 Section 850.20 Energy... Baseline beryllium inventory. (a) The responsible employer must develop a baseline inventory of the... inventory, the responsible employer must: (1) Review current and historical records; (2) Interview workers...

  4. A study of man made radioactivity baseline in dietary materials

    International Nuclear Information System (INIS)

    de la Paz, L.; Estacio, J.; Palattao, M.V.; Anden, A.

    1986-01-01

    This paper describes the radioactivity baseline from literature data coming from various countries where data are available. 1979-1985 were chosen as the baseline years for the following: milk (fresh and powdered), meat and meat products, cereals, fruits, coffee and tea, fish and vegetables. Pre- and post-Chernobyl baseline data are given. (ELC). 21 figs; 17 refs

  5. 40 CFR 80.92 - Baseline auditor requirements.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 16 2010-07-01 2010-07-01 false Baseline auditor requirements. 80.92... (CONTINUED) REGULATION OF FUELS AND FUEL ADDITIVES Anti-Dumping § 80.92 Baseline auditor requirements. (a... determination methodology, resulting baseline fuel parameter, volume and emissions values verified by an auditor...

  6. Reconsidering Cluster Bias in Multilevel Data: A Monte Carlo Comparison of Free and Constrained Baseline Approaches.

    Science.gov (United States)

    Guenole, Nigel

    2018-01-01

    The test for item-level cluster bias examines the improvement in model fit that results from freeing an item's between-level residual variance from a baseline model with equal within- and between-level factor loadings and between-level residual variances fixed at zero. A potential problem is that this approach may include a misspecified unrestricted model if any non-invariance is present, but the log-likelihood difference test requires that the unrestricted model is correctly specified. A free baseline approach, where the unrestricted model includes only the restrictions needed for model identification, should lead to better decision accuracy, but no studies have examined this yet. We ran a Monte Carlo study to investigate this issue. When the referent item is unbiased, compared to the free baseline approach, the constrained baseline approach led to similar true positive (power) rates but much higher false positive (Type I error) rates. The free baseline approach should be preferred when the referent indicator is unbiased. When the referent assumption is violated, the false positive rate was unacceptably high for both free and constrained baseline approaches, and the true positive rate was poor regardless of whether the free or constrained baseline approach was used. Neither the free nor the constrained baseline approach can be recommended when the referent indicator is biased. We recommend paying close attention to ensuring the referent indicator is unbiased in tests of cluster bias. All Mplus input and output files, along with the R and short Python scripts used to execute this simulation study, are uploaded to an open access repository.
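
Both baseline strategies ultimately come down to a log-likelihood difference (chi-square) test between a restricted and an unrestricted model. A minimal sketch of that comparison for a single freed parameter (df = 1), with hypothetical log-likelihood values; for df = 1 the chi-square survival function reduces to erfc(sqrt(stat/2)), so only the standard library is needed:

```python
import math

def lr_test_df1(ll_restricted, ll_unrestricted):
    """Log-likelihood difference test for one freed parameter (df = 1).

    stat = 2 * (llU - llR) is asymptotically chi-square(1) when the
    unrestricted model is correctly specified; p uses the erfc identity.
    """
    stat = 2.0 * (ll_unrestricted - ll_restricted)
    p = math.erfc(math.sqrt(stat / 2.0))
    return stat, p

# Hypothetical fit results: freeing one item's between-level residual
# variance improves the log-likelihood from -1052.3 to -1048.1.
stat, p = lr_test_df1(-1052.3, -1048.1)
```

As the abstract notes, the p-value from this test is only trustworthy when the unrestricted model is correctly specified, which is exactly what the constrained baseline approach risks violating.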

  7. Baseline risk and marginal willingness to pay for health risk reduction.

    Science.gov (United States)

    Gerking, Shelby; Adamowicz, Wiktor; Dickie, Mark; Veronesi, Marcella

    2017-01-01

    Empirical results presented in this paper suggest that parents' marginal willingness to pay (MWTP) for a reduction in morbidity risk from heart disease is inversely related to baseline risk (i.e., the amount of risk initially faced) both for themselves and for their children. For instance, a 40% reduction from the mean of baseline risk results in an increase in MWTP by 70% or more. Thus, estimates of monetary benefits of public programs to reduce heart disease risk would be understated if the standard practice is followed of evaluating MWTP at initial risk levels and then multiplying this value by the number of cases avoided. Estimates are supported by: (1) unique quantitative information on perceptions of the risk of getting heart disease that allow baseline risk to be defined at an individual level and (2) improved econometric procedures to control for well-known difficulties associated with stated preference data.

  8. Evaluation of the accuracy of estimated baseline serum creatinine for acute kidney injury diagnosis.

    Science.gov (United States)

    Hatakeyama, Yutaka; Horino, Taro; Nagata, Keitaro; Kataoka, Hiromi; Matsumoto, Tatsuki; Terada, Yoshio; Okuhara, Yoshiyasu

    2018-04-01

    Modern epidemiologic studies of acute kidney injury (AKI) have been facilitated by the increasing availability of electronic medical records. However, pre-morbid reference serum creatinine (SCr) data are often unavailable in such records. Investigators substitute estimated baseline SCr with the eGFR75 approach, instead of using actually measured baseline SCr. Here, we evaluated the accuracy of estimated baseline SCr for AKI diagnosis in the Japanese population. Inpatients and outpatients aged 18-80 years were retrospectively enrolled. AKI was diagnosed according to the Kidney Disease Improving Global Outcomes (KDIGO) criteria, using SCr levels. The non-AKI and AKI groups were selected using the following criteria: an increase to ≥1.5 times baseline SCr ("baseline SCr") or an increase of ≥0.3 mg/dL above baseline SCr within 48 h ("increase in 48 h"). AKI diagnostic accuracy defined by the estimated reference SCr, using either the average SCr value of the non-AKI population (eb-GFR-A approach) or the SCr back-calculated from a fixed eGFR = 75 mL/min/1.73 m² (eGFR75 approach, or eb-GFR-B approach in this study), was evaluated. We analyzed data from 131,358 Japanese patients. The numbers of patients with a reference baseline SCr among the non-AKI and AKI patients were 29,834 and 8952, respectively. For AKI patients diagnosed using "baseline SCr", the AKI diagnostic accuracy rates as defined by eb-GFR-A and eb-GFR-B were 63.5 and 57.7%, respectively, while for AKI diagnosed using "increase in 48 h", the AKI diagnostic accuracy rates as defined by eb-GFR-A and eb-GFR-B were 78.7 and 75.1%, respectively. In non-AKI patients, false-positive rates of AKI misdiagnosed via eb-GFR-A and eb-GFR-B were 7.4 and 6.8%, respectively. AKI diagnosis using the average SCr value of the general population may yield more accurate results than diagnosis using the eGFR75 approach when the reference SCr is unavailable.
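
The eGFR75 back-calculation and the KDIGO creatinine criteria can be sketched as follows. This is a sketch assuming the 4-variable IDMS-traceable MDRD equation (the usual basis of the eGFR75 approach); the function names are ours, not the paper's:

```python
def baseline_scr_egfr75(age, female=False, black=False):
    """Back-calculate baseline serum creatinine (mg/dL) from the 4-variable
    MDRD equation, fixing eGFR at 75 mL/min/1.73 m2 (the 'eGFR75' approach).

    MDRD: eGFR = 175 * SCr^-1.154 * age^-0.203 * 0.742[female] * 1.212[black]
    """
    k = 175.0 * age ** -0.203
    if female:
        k *= 0.742
    if black:
        k *= 1.212
    return (k / 75.0) ** (1.0 / 1.154)

def kdigo_aki(baseline_scr, current_scr, rise_48h=0.0):
    """KDIGO SCr criteria: current SCr >= 1.5x baseline, or a rise of
    >= 0.3 mg/dL within 48 h."""
    return current_scr >= 1.5 * baseline_scr or rise_48h >= 0.3
```

For a 60-year-old non-black man this yields a baseline of roughly 1.0 mg/dL, which is then compared against the measured SCr to classify AKI.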

  9. 1st DeepWind 5 MW Baseline design

    DEFF Research Database (Denmark)

    Schmidt Paulsen, Uwe; Vita, Luca; Aagaard Madsen, Helge

    2012-01-01

    The first 5MW baseline design of the DeepWind concept is presented for a Darrieus type floating wind turbine system for water depths of more than 150 m. This design will be used as design reference to test the next technological improvements of sub-component level, being based as much as possible...... trajectory on the water plane. The generator is placed at the bottom of the platform and uses 5MW direct drive technology.The conceptual design is evaluated with numerical simulations in the time domain using the aero-elastic code HAWC2. In order to investigate the concept, a double-disc blade element....... A site has been chosen for the floating turbine off Norway as representative for external conditions. The structure is verified according to an ultimate strength analysis, including loads from wind, waves and currents. The stability of the platform is investigated, considering the displacements...

  10. Nonlinear Dynamic Inversion Baseline Control Law: Architecture and Performance Predictions

    Science.gov (United States)

    Miller, Christopher J.

    2011-01-01

    A model reference dynamic inversion control law has been developed to provide a baseline control law for research into adaptive elements and other advanced flight control law components. This controller has been implemented and tested in a hardware-in-the-loop simulation; the simulation results show excellent handling qualities throughout the limited flight envelope. A simple angular momentum formulation was chosen because it can be included in the stability proofs for many basic adaptive theories, such as model reference adaptive control. Many design choices and implementation details reflect the requirements placed on the system by the nonlinear flight environment and the desire to keep the system as basic as possible to simplify the addition of the adaptive elements. Those design choices are explained, along with their predicted impact on the handling qualities.

  11. Determination of the Territorial Sea Baseline - Measurement Aspect

    Science.gov (United States)

    Specht, Cezary; Weintrit, Adam; Specht, Mariusz; Dabrowski, Pawel

    2017-12-01

    Determining the course of the territorial sea baseline (TSB) of the coastal state is the basis for establishing its maritime boundaries, thus becoming an indirect part of the maritime policy of the state. Besides the legal and methodological aspects described in conventions, acts, standards and regulations, equally important is the issue of measurement methodology with respect to the boundaries of the territorial sea. The publication discusses the accuracy requirements for TSB measurement, the relationship between sea level and the choice of the method of its determination, and the implementation of such a measurement on a selected example. A 400-meter stretch of the public beach in Gdynia was used as the test area. The measurements used a GNSS geodetic receiver operating in real time against the geodetic network VRSnet.pl. Additionally, the applied method was compared with analogous TSB measurements performed in 1999.

  12. Defining clogging potential for permeable concrete.

    Science.gov (United States)

    Kia, Alalea; Wong, Hong S; Cheeseman, Christopher R

    2018-08-15

    Permeable concrete is used to reduce urban flooding as it allows water to flow through normally impermeable infrastructure. It is prone to clogging by particulate matter, and predicting the long-term performance of permeable concrete is challenging as there is currently no reliable means of characterising clogging potential. This paper reports on the performance of a range of laboratory-prepared and commercial permeable concretes, close packed glass spheres and aggregate particles of varying size, exposed to different clogging methods to understand this phenomenon. New methods were developed to study clogging and define clogging potential. The tests involved applying flowing water containing sand and/or clay in cycles, and measuring the change in permeability. Substantial permeability reductions were observed in all samples, particularly when exposed to sand and clay simultaneously. Three methods were used to define clogging potential based on measuring the initial permeability decay, half-life cycle and number of cycles to full clogging. We show for the first time strong linear correlations between these parameters for a wide range of samples, indicating their use for service-life prediction. Copyright © 2018 Elsevier Ltd. All rights reserved.
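
The three clogging-potential parameters can be read directly off a permeability-versus-cycle series. A minimal sketch under our own assumptions (permeability recorded after each exposure cycle; "full clogging" taken here as a drop below 1% of the initial value — the paper's exact threshold may differ):

```python
def clogging_potential(perm):
    """perm: permeability after each clogging cycle, perm[0] = initial value.

    Returns (initial_decay, half_life_cycle, cycles_to_full):
      initial_decay   - fractional permeability loss over the first cycle
      half_life_cycle - first cycle at which permeability <= 50% of initial
      cycles_to_full  - first cycle at which permeability <= 1% of initial
    The cycle counts are None if the threshold is never reached.
    """
    k0 = perm[0]
    initial_decay = (perm[0] - perm[1]) / k0
    half = next((i for i, k in enumerate(perm) if k <= 0.5 * k0), None)
    full = next((i for i, k in enumerate(perm) if k <= 0.01 * k0), None)
    return initial_decay, half, full

# e.g. a sample clogged with sand + clay (hypothetical mm/s readings):
decay, half, full = clogging_potential([15.0, 9.0, 6.0, 3.0, 0.5, 0.1])
```

The linear correlations the paper reports are between exactly these per-sample quantities.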

  13. Generating synthetic baseline populations from register data

    DEFF Research Database (Denmark)

    Rich, Jeppe; Mulalic, Ismir

    2012-01-01

    . A test on historical census data shows that a 2006 population could be predicted by a 1994 population with an overall percentage deviation of 5–6% given that targets were known. It is also indicated that the deviation is approximately a linear function of the length of the forecast period....... algorithm. The solution strategy consists in establishing a harmonisation process for the population targets, which combined with a linear programming approach, is applied to generate a consistent target representation. The model approach is implemented and tested on Danish administrative register data...

  14. Quantum computing. Defining and detecting quantum speedup.

    Science.gov (United States)

    Rønnow, Troels F; Wang, Zhihui; Job, Joshua; Boixo, Sergio; Isakov, Sergei V; Wecker, David; Martinis, John M; Lidar, Daniel A; Troyer, Matthias

    2014-07-25

    The development of small-scale quantum devices raises the question of how to fairly assess and detect quantum speedup. Here, we show how to define and measure quantum speedup and how to avoid pitfalls that might mask or fake such a speedup. We illustrate our discussion with data from tests run on a D-Wave Two device with up to 503 qubits. By using random spin glass instances as a benchmark, we found no evidence of quantum speedup when the entire data set is considered and obtained inconclusive results when comparing subsets of instances on an instance-by-instance basis. Our results do not rule out the possibility of speedup for other classes of problems and illustrate the subtle nature of the quantum speedup question. Copyright © 2014, American Association for the Advancement of Science.
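
Fair comparisons of this kind rest on a time-to-solution style metric: the total run time needed to observe the correct answer at least once with high (e.g. 99%) probability, rather than single-run timings. A sketch of that quantity (our formulation; the paper's exact protocol also involves scaling with problem size):

```python
import math

def time_to_solution(t_run, p_success, target=0.99):
    """Total time to see the correct answer at least once with probability
    `target`, given a single-run time t_run and per-run success probability
    p_success (0 < p_success < 1).

    The repetition count R solves 1 - (1 - p)^R = target; at least one
    full run is always charged.
    """
    repetitions = math.log(1.0 - target) / math.log(1.0 - p_success)
    return t_run * max(1.0, repetitions)
```

A solver that succeeds on half its runs needs ln(0.01)/ln(0.5) ≈ 6.64 repetitions; speedup is then assessed from how this quantity scales with problem size across many random instances.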

  15. Radiotherapy for brain metastases: defining palliative response

    International Nuclear Information System (INIS)

    Bezjak, Andrea; Adam, Janice; Panzarella, Tony; Levin, Wilfred; Barton, Rachael; Kirkbride, Peter; McLean, Michael; Mason, Warren; Wong, Chong Shun; Laperriere, Normand

    2001-01-01

    Background and purpose: Most patients with brain metastases are treated with palliative whole brain radiotherapy (WBRT). There is no established definition of palliative response. The aim of this study was to develop and test clinically useful criteria for response following palliative WBRT. Materials and methods: A prospective study was conducted of patients with symptomatic brain metastases treated with WBRT (20 Gy/5 fractions) and standardised steroid tapering. Assessments included observer rating of neurological symptoms, patient-completed symptom checklist and performance status (PS). Response criteria were operationally defined based on a combination of neurological symptoms, PS and steroid dose. Results: Seventy-five patients were accrued. At 1 month, presenting neurological symptoms were improved in 14 patients, stable in 17, and worse in 21; 23 patients were not assessed, mainly due to death or frailty. Using response criteria defined a priori, 15% (95% CI 7-23%) of patients were classified as having a response to RT, 25% no response, and 29% progression; 27% were deceased at or soon after 1 month. A revised set of criteria was tested, with less emphasis on complete tapering of steroids: it increased the proportion of patients responding to 39% (95% CI 27-50%) but did not change the large proportion who did not benefit (44%). Conclusions: Clinical response to RT of patients with brain metastases is multifactorial, comprising symptoms, PS and other factors. Assessment of the degree of palliation depends on the exact definition used. More research is needed in this important area, to help validate criteria for assessing palliation after WBRT

  16. Patient baseline interpersonal problems as moderators of outcome in two psychotherapies for bulimia nervosa.

    Science.gov (United States)

    Gomez Penedo, Juan Martin; Constantino, Michael J; Coyne, Alice E; Bernecker, Samantha L; Smith-Hansen, Lotte

    2018-01-19

    We tested an aptitude-by-treatment interaction; namely, whether patients' baseline interpersonal problems moderated the comparative efficacy of cognitive-behavioral therapy (CBT) vs. interpersonal psychotherapy (IPT) for bulimia nervosa (BN). Data derived from a randomized controlled trial. Patients reported on their interpersonal problems at baseline; purge frequency at baseline, midtreatment, and posttreatment; and global eating disorder severity at baseline and posttreatment. We estimated the rate of change in purge frequency across therapy, and the likelihood of attaining clinically meaningful improvement (recovery) in global eating disorder severity by posttreatment. We then tested the interpersonal problem by treatment interactions as predictors of both outcomes. Patients with more baseline overly communal/friendly problems showed a steeper reduction in the likelihood of purging when treated with CBT vs. IPT. Patients with more problems of being under-communal/cold had similar reductions in the likelihood of purging across both treatments. Patients with more baseline problems of being overly agentic were more likely to recover when treated with IPT vs. CBT, whereas patients with more problems of being under-agentic were more likely to recover when treated with CBT vs. IPT. Interpersonal problems related to communion and agency may inform treatment fit among two empirically supported therapies for BN.

  17. A proposal to create an extension to the European baseline series.

    Science.gov (United States)

    Wilkinson, Mark; Gallo, Rosella; Goossens, An; Johansen, Jeanne D; Rustemeyer, Thomas; Sánchez-Pérez, Javier; Schuttelaar, Marie L; Uter, Wolfgang

    2018-02-01

    The current European baseline series consists of 30 allergens, and was last updated in 2015. To use data from the European Surveillance System on Contact Allergies (ESSCA) to propose an extension to the European baseline series in response to changes in environmental exposures. Data from departmental and national extensions to the baseline series, together with some temporary additions from departments contributing to the ESSCA, were collated during 2013-2014. In total, 31689 patients were patch tested in 46 European departments. Many departments and national groups already consider the current European baseline series to be a suboptimal screen, and use their own extensions to it. The haptens tested are heterogeneous, although there are some consistent themes. Potential haptens to include in an extension to the European baseline series comprise sodium metabisulfite, formaldehyde-releasing preservatives, additional markers of fragrance allergy, propolis, Compositae mix, and 2-hydroxyethyl methacrylate. In combination with other published work from the ESSCA, changes to the current European baseline series are proposed for discussion. As well as addition of the allergens listed above, it is suggested that primin and clioquinol should be deleted from the series, owing to reduced environmental exposure. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  18. High Rates of Baseline Drug Resistance and Virologic Failure Among ART-naive HIV-infected Children in Mali.

    Science.gov (United States)

    Crowell, Claudia S; Maiga, Almoustapha I; Sylla, Mariam; Taiwo, Babafemi; Kone, Niaboula; Oron, Assaf P; Murphy, Robert L; Marcelin, Anne-Geneviève; Traore, Ban; Fofana, Djeneba B; Peytavin, Gilles; Chadwick, Ellen G

    2017-11-01

    Limited data exist on drug resistance and antiretroviral treatment (ART) outcomes in HIV-1-infected children in West Africa. We determined the prevalence of baseline resistance and correlates of virologic failure (VF) in a cohort of ART-naive HIV-1-infected children, with resistance testing performed at baseline (before ART) and at 6 months. Resistance was defined according to the Stanford HIV Genotypic Resistance database. VF was defined as viral load ≥1000 copies/mL after 6 months of ART. Logistic regression was used to evaluate factors associated with VF or death >1 month after enrollment. Post hoc, antiretroviral concentrations were assayed on baseline samples of participants with baseline resistance. One hundred twenty children with a median age of 2.6 years (interquartile range: 1.6-5.0) were included. Eighty-eight percent reported no prevention of mother-to-child transmission exposure. At baseline, 27 (23%), 4 (3%) and none had non-nucleoside reverse transcriptase inhibitor (NNRTI), nucleoside reverse transcriptase inhibitor or protease inhibitor resistance, respectively. Thirty-nine (33%) developed VF and 4 died >1 month post-ART initiation. In multivariable analyses, poor adherence [odds ratio (OR): 6.1, P = 0.001], baseline NNRTI resistance among children receiving NNRTI-based ART (OR: 22.9) and baseline NNRTI resistance (OR: 5.8, P = 0.018) were significantly associated with VF/death. Ten (38%) with baseline resistance had detectable levels of nevirapine or efavirenz at baseline; 7 were currently breastfeeding, but only 2 reported maternal antiretroviral use. Baseline NNRTI resistance was common in children without reported NNRTI exposure and was associated with increased risk of treatment failure. Detectable NNRTI concentrations were present despite few reports of maternal/infant antiretroviral use.

  19. Physics with a very long neutrino factory baseline

    International Nuclear Information System (INIS)

    Gandhi, Raj; Winter, Walter

    2007-01-01

    We discuss the neutrino oscillation physics of a very long neutrino factory baseline over a broad range of lengths (between 6000 km and 9000 km), centered on the 'magic baseline' (∼7500 km) where correlations with the leptonic CP phase are suppressed by matter effects. Since the magic baseline depends only on the density, we study the impact of matter density profile effects and density uncertainties over this range, and the impact of detector locations off the optimal baseline. We find that the optimal constant density describing the physics over this entire baseline range is about 5% higher than the average matter density. This implies that the magic baseline is significantly shorter than previously inferred. However, while a single detector optimization requires fine-tuning of the (very long) baseline length, its combination with a near detector at a shorter baseline is much less sensitive to the far detector location and to uncertainties in the matter density. In addition, we point out different applications of this baseline which go beyond its excellent correlation and degeneracy resolution potential. We demonstrate that such a long baseline assists in the improvement of the θ13 precision and in the resolution of the octant degeneracy. Moreover, we show that the neutrino data from such a baseline could be used to extract the matter density along the profile up to 0.24% at 1σ for large sin²2θ13, providing a useful discriminator between different geophysical models

  20. An Improved Rank Correlation Effect Size Statistic for Single-Case Designs: Baseline Corrected Tau.

    Science.gov (United States)

    Tarlow, Kevin R

    2017-07-01

    Measuring treatment effects when an individual's pretreatment performance is improving poses a challenge for single-case experimental designs. It may be difficult to determine whether improvement is due to the treatment or due to the preexisting baseline trend. Tau-U is a popular single-case effect size statistic that purports to control for baseline trend. However, despite its strengths, Tau-U has substantial limitations: Its values are inflated and not bound between -1 and +1, it cannot be visually graphed, and its relatively weak method of trend control leads to unacceptable levels of Type I error wherein ineffective treatments appear effective. An improved effect size statistic based on rank correlation and robust regression, Baseline Corrected Tau, is proposed and field-tested with both published and simulated single-case time series. A web-based calculator for Baseline Corrected Tau is also introduced for use by single-case investigators.
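
The core of the proposed statistic — detrend all observations with a robust (Theil-Sen) fit to the baseline phase, then rank-correlate phase membership with the corrected scores — can be sketched as follows. This is a simplified illustration (Tarlow's full procedure applies the correction only when the baseline trend is itself statistically significant); all function names are ours:

```python
from itertools import combinations
from statistics import median
import math

def theil_sen(t, y):
    """Robust line fit: median of pairwise slopes, median-based intercept."""
    slopes = [(y[j] - y[i]) / (t[j] - t[i])
              for i, j in combinations(range(len(t)), 2)]
    slope = median(slopes)
    intercept = median([yi - slope * ti for ti, yi in zip(t, y)])
    return slope, intercept

def kendall_tau_b(x, y):
    """Kendall's tau-b (tie-corrected rank correlation)."""
    n = len(x)
    conc = disc = ties_x = ties_y = 0
    for i, j in combinations(range(n), 2):
        dx, dy = x[j] - x[i], y[j] - y[i]
        if dx == 0:
            ties_x += 1
        if dy == 0:
            ties_y += 1
        if dx != 0 and dy != 0:
            if dx * dy > 0:
                conc += 1
            else:
                disc += 1
    n0 = n * (n - 1) // 2
    denom = math.sqrt((n0 - ties_x) * (n0 - ties_y))
    return 0.0 if denom == 0 else (conc - disc) / denom

def baseline_corrected_tau(baseline, treatment):
    """Detrend all observations using a Theil-Sen fit to the baseline phase,
    then correlate a 0/1 phase dummy with the corrected scores."""
    slope, intercept = theil_sen(list(range(len(baseline))), list(baseline))
    y = list(baseline) + list(treatment)
    corrected = [yi - (slope * ti + intercept) for ti, yi in enumerate(y)]
    phase = [0] * len(baseline) + [1] * len(treatment)
    return kendall_tau_b(phase, corrected)
```

A treatment phase that merely continues the baseline trend yields a corrected tau near zero — precisely the situation in which uncorrected trend control inflates the apparent effect.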

  1. Reinforced Carbon Carbon (RCC) oxidation resistant material samples - Baseline coated, and baseline coated with tetraethyl orthosilicate (TEOS) impregnation

    Science.gov (United States)

    Gantz, E. E.

    1977-01-01

    Reinforced carbon-carbon material specimens were machined from 19- and 33-ply flat panels which were fabricated and processed in accordance with the specifications and procedures accepted for the fabrication and processing of the leading edge structural subsystem (LESS) elements for the space shuttle orbiter. The specimens were then baseline coated and tetraethyl orthosilicate impregnated, as applicable, in accordance with the procedures and requirements of the appropriate LESS production specifications. Three heater bars were ATJ graphite, silicon carbide coated with the Vought 'pack cementation' coating process, and three were Stackpole grade 2020 graphite, silicon carbide coated with the chemical vapor deposition process utilized by Vought in coating the LESS shell development program entry heater elements. Nondestructive test results are reported.

  2. Ultra-high pressure water jet: Baseline report

    International Nuclear Information System (INIS)

    1997-01-01

    The ultra-high pressure waterjet technology was being evaluated at Florida International University (FIU) as a baseline technology. In conjunction with FIU's evaluation of efficiency and cost, this report covers the evaluation conducted for safety and health issues. It is a commercially available technology and has been used for various projects at locations throughout the country. The ultra-high pressure waterjet technology acts as a cutting tool for the removal of surface substrates. The Husky trademark pump feeds water to a lance that directs the high pressure water at the surface to be removed. The safety and health evaluation during the testing demonstration focused on two main areas of exposure. These were dust and noise. The dust exposure was found to be minimal, which would be expected due to the wet environment inherent in the technology, but noise exposure was at a significant level. Further testing for noise is recommended because of the outdoor environment where the testing demonstration took place. In addition, other areas of concern found were arm-hand vibration, ergonomics, heat stress, tripping hazards, electrical hazards, lockout/tagout, fall hazards, slipping hazards, hazards associated with the high pressure water, and hazards associated with air pressure systems

  3. Early antihypertensive treatment and clinical outcomes in acute ischemic stroke: subgroup analysis by baseline blood pressure.

    Science.gov (United States)

    He, William J; Zhong, Chongke; Xu, Tan; Wang, Dali; Sun, Yingxian; Bu, Xiaoqing; Chen, Chung-Shiuan; Wang, Jinchao; Ju, Zhong; Li, Qunwei; Zhang, Jintao; Geng, Deqin; Zhang, Jianhui; Li, Dong; Li, Yongqiu; Yuan, Xiaodong; Zhang, Yonghong; Kelly, Tanika N

    2018-06-01

    We studied the effect of early antihypertensive treatment on death, major disability, and vascular events among patients with acute ischemic stroke according to their baseline SBP. We randomly assigned 4071 acute ischemic stroke patients with SBP between 140 and less than 220 mmHg to receive antihypertensive treatment or to discontinue all antihypertensive medications during hospitalization. A composite primary outcome of death and major disability and secondary outcomes were compared between treatment and control stratified by baseline SBP levels of less than 160, 160-179, and at least 180 mmHg. At 24 h after randomization, differences in SBP reductions were 8.8, 8.6 and 7.8 mmHg between the antihypertensive treatment and control groups among patients with baseline SBP less than 160, 160-179, and at least 180 mmHg, respectively. There was a significant interaction between treatment and baseline SBP subgroups on death (P = 0.02): odds ratio (95% CI) of 2.42 (0.74-7.89) in patients with baseline SBP less than 160 mmHg and 0.34 (0.11-1.09) in those with baseline SBP at least 180 mmHg. At the 3-month follow-up, the primary and secondary clinical outcomes were not significantly different between the treatment and control groups by baseline SBP levels. Early antihypertensive treatment had a neutral effect on clinical outcomes among acute ischemic stroke patients with various baseline SBP levels. Future clinical trials are warranted to test BP-lowering effects in acute ischemic stroke patients by baseline SBP levels. ClinicalTrials.gov Identifier: NCT01840072.

  4. Intrafractional baseline drift during free breathing breast cancer radiation therapy.

    Science.gov (United States)

    Jensen, Christer Andre; Acosta Roa, Ana María; Lund, Jo-Åsmund; Frengen, Jomar

    2017-06-01

    Intrafraction motion in breast cancer radiation therapy (BCRT) has not yet been thoroughly described in the literature. It has been observed that baseline drift occurs as part of the intrafraction motion. This study aims to measure baseline drift and its incidence in free-breathing BCRT patients using an in-house developed laser system for tracking the position of the sternum. Baseline drift was monitored in 20 right-sided breast cancer patients receiving free-breathing 3D-conformal RT by using an in-house developed laser system which measures one-dimensional distance in the AP direction. A total of 357 patient respiratory traces from treatment sessions were logged and analysed. Baseline drift was compared to patient positioning error measured from in-field portal imaging. The mean overall baseline drift at the end of treatment sessions was -1.3 mm for the patient population. Relatively small baseline drift was observed during the first fraction; however, it was clearly detected already at the second fraction. Over 90% of the baseline drift occurs during the first 3 min of each treatment session. The baseline drift rate for the population was -0.5 ± 0.2 mm/min in the posterior direction during the first minute after localization. Only 4% of the treatment sessions had a 5 mm or larger baseline drift at 5 min, all in the posterior direction. Mean baseline drift in the posterior direction in free-breathing BCRT was observed in 18 of 20 patients over all treatment sessions. This study shows that there is a substantial baseline drift in free-breathing BCRT patients. No clear baseline drift was observed during the first treatment session; however, baseline drift was markedly present in the rest of the sessions. Intrafraction motion due to baseline drift should be accounted for in margin calculations.
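
The reported drift rate is simply the slope of the AP position trace over a time window. A minimal least-squares sketch (function name and sample trace are ours), applied to the first minute after localization:

```python
def drift_rate_mm_per_min(t_seconds, pos_mm):
    """Least-squares slope of an AP position trace, converted to mm/min.
    Negative values indicate drift in the posterior direction."""
    n = len(t_seconds)
    mt = sum(t_seconds) / n
    mp = sum(pos_mm) / n
    num = sum((t - mt) * (p - mp) for t, p in zip(t_seconds, pos_mm))
    den = sum((t - mt) ** 2 for t in t_seconds)
    return (num / den) * 60.0

# Hypothetical trace sampled every 10 s over the first minute:
rate = drift_rate_mm_per_min([0, 10, 20, 30, 40, 50, 60],
                             [0.0, -0.1, -0.2, -0.25, -0.35, -0.45, -0.5])
```

For this made-up trace the slope comes out near -0.5 mm/min, i.e. on the scale of the population value the abstract reports.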

  5. Baseliner: An open-source, interactive tool for processing sap flux data from thermal dissipation probes

    Directory of Open Access Journals (Sweden)

    A. Christopher Oishi

    2016-01-01

    Estimating transpiration from woody plants using thermal dissipation sap flux sensors requires careful data processing. Currently, researchers accomplish this using spreadsheets, or by personally writing scripts for statistical software programs (e.g., R, SAS). We developed the Baseliner software to help establish a standardized protocol for processing sap flux data. Baseliner enables users to QA/QC data and process data using a combination of automated steps, visualization, and manual editing. Data processing requires establishing a zero-flow reference value, or “baseline”, which varies among sensors and with time. Since no set of algorithms currently exists to reliably QA/QC and estimate the zero-flow baseline, Baseliner provides a graphical user interface to allow visual inspection and manipulation of data. Data are first automatically processed using a set of user defined parameters. The user can then view the data for additional, manual QA/QC and baseline identification using mouse and keyboard commands. The open-source software allows for user customization of data processing algorithms as improved methods are developed.
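
The underlying computation has two steps: pick a zero-flow baseline ΔTmax, then convert each probe reading with the Granier thermal dissipation calibration. A stripped-down sketch (the real software adds QA/QC and interactive baseline editing; the per-day maximum used here is a crude automatic baseline choice that assumes flow actually reaches zero each night):

```python
def sap_flux_density(dT, dT_max):
    """Granier thermal dissipation calibration.
    dT: probe temperature difference (deg C); dT_max: zero-flow baseline.
    K = (dT_max - dT) / dT; returns sap flux density in m3 m-2 s-1."""
    k = (dT_max - dT) / dT
    return 118.99e-6 * k ** 1.231

def daily_zero_flow_baseline(dT_readings):
    """Crude automatic baseline: the maximum dT over one day's readings."""
    return max(dT_readings)

day = [9.8, 10.0, 9.9, 7.5, 6.0, 5.5, 6.2, 8.0]   # hypothetical dT trace
dT_max = daily_zero_flow_baseline(day)
flux = [sap_flux_density(dT, dT_max) for dT in day]
```

Because the baseline varies among sensors and over time, Baseliner's value lies in letting the user inspect and override exactly this ΔTmax choice.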

  6. Infrared thermometry and the crop water stress index. I. History, theory, and baselines

    International Nuclear Information System (INIS)

    Gardner, B.R.; Nielsen, D.C.; Shock, C.C.

    1992-01-01

    Development of portable infrared thermometers and the definition of the Crop Water Stress Index (CWSI) have led to widespread interest in infrared thermometry to monitor water stress and schedule irrigations. But the CWSI concept is still new and poorly understood by many. The purpose of this paper is to review the definition of CWSI, and the determination and interpretation of the non-water-stressed baselines used to compute CWSI. The non-water-stressed baseline equation normalizes the canopy minus air temperature differential for variations in vapor pressure deficit. Non-water-stressed baselines can be determined empirically from measurements of canopy and air temperatures and vapor pressure deficit, made diurnally on a single day, or at a single time of day over many days, on well-watered plants. The value of the maximum canopy minus air temperature differential under maximum water stress should also be determined empirically. Causes for CWSI values falling outside of the defined 0 to 1.0 unit range are reviewed. Non-water-stressed baselines may shift with plant growth stage. Effective use of CWSI is dependent on understanding the definition of CWSI, and the proper determination and use of non-water-stressed baselines. (author)
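
Given an empirically fitted non-water-stressed baseline, the CWSI itself is a one-line normalization. A sketch with hypothetical baseline coefficients (the intercept/slope and the fully stressed differential must be determined empirically per crop, as the review stresses):

```python
def cwsi(canopy_t, air_t, vpd_kpa, a, b, dT_stressed):
    """Crop Water Stress Index.

    Non-water-stressed baseline: (Tc - Ta) = a + b * VPD (b < 0, per kPa).
    dT_stressed: canopy minus air differential under maximum water stress.
    Returns ~0 for a well-watered crop, ~1 at maximum stress.
    """
    dT = canopy_t - air_t
    dT_lower = a + b * vpd_kpa
    return (dT - dT_lower) / (dT_stressed - dT_lower)

# Hypothetical baseline a = 2.0 degC, b = -2.0 degC/kPa, dT_stressed = 5.0 degC:
stress = cwsi(canopy_t=31.0, air_t=30.0, vpd_kpa=2.0,
              a=2.0, b=-2.0, dT_stressed=5.0)
```

Values outside 0 to 1.0 arise exactly when the measured differential falls below the fitted lower baseline or above the stressed differential, which is why the review devotes attention to those cases.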

  7. Spent Nuclear Fuel Project technical baseline document. Fiscal year 1995: Volume 1, Baseline description

    International Nuclear Information System (INIS)

    Womack, J.C.; Cramond, R.; Paedon, R.J.

    1995-01-01

    This document is a revision of WHC-SD-SNF-SD-002 and is issued to support the individual projects that make up the Spent Nuclear Fuel Project in the lower-tier functions, requirements, interfaces, and technical baseline items. It presents the results of engineering analyses performed since September 1994. The mission of the SNFP on the Hanford site is to provide safe, economic, and environmentally sound management of Hanford SNF in a manner that stages it to final disposition. This particularly involves K Basin fuel, although other SNF is also involved.

  8. Baseline susceptibility to alpha-cypermethrin in Lutzomyia longipalpis (Lutz & Neiva, 1912) from Lapinha Cave (Brazil).

    Science.gov (United States)

    Pessoa, Grasielle Caldas Davila; Lopes, Josiane Valadão; Rocha, Marília Fonseca; Pinheiro, Letícia C; Rosa, Aline Cristine Luiz; Michalsky, Érika Monteiro; Dias, Edelberto Santos

    2015-09-17

    Given the increase in cases of visceral leishmaniasis in recent years, the socio-economic impact of this disease, the wide distribution of Lutzomyia longipalpis in Brazil, and the likelihood that this vector may develop resistance to the insecticides used for control, the Ministry of Health considers it crucial to create a network to study and monitor the resistance of this vector to those insecticides. In this sense, this study aimed: 1) to characterize the susceptibility of L. longipalpis from Lapinha Cave (Lagoa Santa, MG - Brazil) to Alfateck SC200 in field bioassays, and 2) to define the susceptibility baseline to alpha-cypermethrin in laboratory bioassays, checking the possibility of using it as a susceptibility reference lineage (SRL). The field bioassays revealed that the tested population was highly susceptible to alpha-cypermethrin, with high mortality (~100%) on all treated surfaces at all time points up to six months after spraying. In the laboratory bioassays, the studied population presented LD50, LD95, and LD99 values of 0.78013, 10.5580, and 31.067 mg/m², respectively. The slope was 1.454121. The studied population of L. longipalpis was considered adequate as an SRL according to the criterion recommended by the Pan American Health Organization and has proven susceptibility to the tested insecticide in the field. One cannot rule out the possibility of finding populations of L. longipalpis that are more susceptible to alpha-cypermethrin; therefore, further research is necessary on other populations with potential use as an SRL.
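    The reported dose-response figures are internally consistent with a log-dose probit model: assuming the slope is expressed in probits per log10 dose, LD95 and LD99 follow directly from LD50 and the slope. A minimal check:

```python
# Check: under a log10-dose probit model, LD_p = LD50 * 10^(z_p / slope),
# where z_p is the standard normal quantile at mortality fraction p.
# Using the reported LD50 = 0.78013 mg/m^2 and slope = 1.454121
# reproduces the reported LD95 (10.5580) and LD99 (31.067) to within
# rounding.
import statistics

def ld_p(ld50, slope, p):
    z = statistics.NormalDist().inv_cdf(p)   # probit deviation from the median
    return ld50 * 10 ** (z / slope)

ld50, slope = 0.78013, 1.454121
print(round(ld_p(ld50, slope, 0.95), 3))
print(round(ld_p(ld50, slope, 0.99), 3))
```

    The log10-dose probit parameterization is an assumption on my part; it is the conventional one for insecticide susceptibility bioassays, and the agreement with the published LD95/LD99 supports it.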

  9. Fragrance mix II in the baseline series contributes significantly to detection of fragrance allergy

    DEFF Research Database (Denmark)

    Heisterberg, Maria V; Andersen, Klaus E; Avnstorp, Christian

    2010-01-01

    Fragrance mix II (FM II) is a relatively new screening marker for fragrance contact allergy. It was introduced in the patch test baseline series in Denmark in 2005 and contains six different fragrance chemicals commonly present in cosmetic products and which are known allergens.

  10. Farm elders define health as the ability to work.

    Science.gov (United States)

    Reed, Deborah B; Rayens, Mary Kay; Conley, Christina K; Westneat, Susan; Adkins, Sarah M

    2012-08-01

    Thirty percent of America's 2.2 million farms are operated by individuals older than 65 years. This study examined how older farmers define health and determined whether demographic characteristics, farm work, and physical and mental health status predict health definition. Data were collected via telephone and mailed surveys during the baseline wave of data collection in a longitudinal study of family farmers residing in two southern states (n=1,288). Nearly 42% defined health as the "ability to work" compared to a physical health-related definition. Predictors of defining health as the ability to work included being White, performing more farm tasks in the past week, taking prescription medications daily, and having minimal health-related limitations to farm work. Health behaviors are centered on the individual's perception of health. Understanding the defining attributes of health can support better approaches to health care and health promotion, particularly among rural subcultures such as farmers, whose identity is rooted in their work. Copyright 2012, SLACK Incorporated.

  11. A baseline metabolomic signature is associated with immunological CD4+ T-cell recovery after 36 months of antiretroviral therapy in HIV-infected patients.

    Science.gov (United States)

    Rodríguez-Gallego, Esther; Gómez, Josep; Pacheco, Yolanda M; Peraire, Joaquim; Viladés, Consuelo; Beltrán-Debón, Raúl; Mallol, Roger; López-Dupla, Miguel; Veloso, Sergi; Alba, Verónica; Blanco, Julià; Cañellas, Nicolau; Rull, Anna; Leal, Manuel; Correig, Xavier; Domingo, Pere; Vidal, Francesc

    2018-03-13

    Poor immunological recovery in treated HIV-infected patients is associated with greater morbidity and mortality. To date, predictive biomarkers of this incomplete immune reconstitution have not been established. We aimed to identify a baseline metabolomic signature associated with a poor immunological recovery after antiretroviral therapy (ART) to envisage the underlying mechanistic pathways that influence the treatment response. This was a multicentre, prospective cohort study in ART-naive patients with a low pre-ART CD4 nadir. Immunological recovery was defined as reaching a CD4+ T-cell count of at least 250 cells/μl after 36 months of virologically successful ART. We used univariate comparisons, the Random Forest test, and receiver-operating characteristic curves to identify and evaluate the predictive factors of immunological recovery after treatment. HIV-infected patients with a baseline metabolic pattern characterized by high levels of large high-density lipoprotein (HDL) particles, HDL cholesterol, and larger sizes of low-density lipoprotein particles had a better immunological recovery after treatment. Conversely, patients with high ratios of non-HDL lipoprotein particles did not experience this full recovery. Medium very-low-density lipoprotein particles and glucose increased the classification power of the multivariate model despite not showing any significant differences between the two groups. In HIV-infected patients, a healthier baseline metabolomic profile is related to a better response to ART, where the lipoprotein profile, mainly large HDL particles, may play a key role.
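    The discrimination analysis mentioned above (receiver-operating characteristic curves) reduces to ranking patients by a candidate baseline marker. A minimal, dependency-free sketch under invented data; the study itself used Random Forests over a full metabolomic panel:

```python
# Sketch: area under the ROC curve (AUC) for one hypothetical baseline
# marker (e.g. large-HDL particle level) versus immunological recovery.
# AUC equals the probability that a randomly chosen recoverer has a
# higher marker value than a randomly chosen non-recoverer. Data are
# invented for illustration.
def auc(scores_pos, scores_neg):
    wins = sum((p > n) + 0.5 * (p == n) for p in scores_pos for n in scores_neg)
    return wins / (len(scores_pos) * len(scores_neg))

recovered     = [5.1, 4.8, 6.0, 5.5]   # marker levels, recovery group
not_recovered = [4.0, 4.9, 3.8]        # marker levels, poor-recovery group
print(round(auc(recovered, not_recovered), 3))  # -> 0.917
```

    An AUC near 0.5 means the marker carries no predictive information; values approaching 1.0 indicate the kind of separation the study sought between recoverers and non-recoverers.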

  12. Utilities and offsites design baseline. Outside Battery Limits Facility 6000 tpd SRC-I Demonstration Plant. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    None

    1984-05-25

    As part of the overall Solvent Refined Coal (SRC-1) project baseline being prepared by International Coal Refining Company (ICRC), the RUST Engineering Company is providing necessary input for the Outside Battery Limits (OSBL) Facilities. The project baseline is comprised of: design baseline - technical definition of work; schedule baseline - detailed and management level 1 schedules; and cost baseline - estimates and cost/manpower plan. The design baseline (technical definition) for the OSBL Facilities has been completed and is presented in Volumes I, II, III, IV, V and VI. The OSBL technical definition is based on, and compatible with, the ICRC defined statement of work, design basis memorandum, master project procedures, process and mechanical design criteria, and baseline guidance documents. The design basis memorandum is included in Paragraph 1.3 of Volume I. The baseline design data is presented in 6 volumes. Volume I contains the introduction section and utility systems data through steam and feedwater. Volume II continues with utility systems data through fuel system, and contains the interconnecting systems and utility system integration information. Volume III contains the offsites data through water and waste treatment. Volume IV continues with offsites data, including site development and buildings, and contains raw materials and product handling and storage information. Volume V contains wastewater treatment and solid wastes landfill systems developed by Catalytic, Inc. to supplement the information contained in Volume III. Volume VI contains proprietary information of Resources Conservation Company related to the evaporator/crystallizer system of the wastewater treatment area.

  13. Vitamin D production after UVB exposure depends on baseline vitamin D and total cholesterol but not on skin pigmentation.

    Science.gov (United States)

    Bogh, Morten K B; Schmedes, Anne V; Philipsen, Peter A; Thieden, Elisabeth; Wulf, Hans C

    2010-02-01

    UVB radiation increases serum vitamin D level expressed as 25-hydroxyvitamin-D(3) (25(OH)D), but the influence of skin pigmentation, baseline 25(OH)D level, and total cholesterol has not been well characterized. To determine the importance of skin pigmentation, baseline 25(OH)D level, and total cholesterol on 25(OH)D production after UVB exposure, 182 persons were screened for 25(OH)D level. A total of 50 participants with a wide range in baseline 25(OH)D levels were selected to define the importance of baseline 25(OH)D level. Of these, 28 non-sun worshippers with limited past sun exposure were used to investigate the influence of skin pigmentation and baseline total cholesterol. The participants had 24% of their skin exposed to UVB (3 standard erythema doses) four times every second or third day. Skin pigmentation and 25(OH)D levels were measured before and after the irradiations. Total cholesterol was measured at baseline. The increase in 25(OH)D level after UVB exposure was negatively correlated with baseline 25(OH)D level, but not with skin pigmentation. In addition, we paired a dark-skinned group with a fair-skinned group according to baseline 25(OH)D levels and found no differences in 25(OH)D increase after identical UVB exposure.

  14. CHIMES-I: sub-group analyses of the effects of NeuroAiD according to baseline brain imaging characteristics among patients randomized in the CHIMES study.

    Science.gov (United States)

    Navarro, Jose C; Chen, Christopher Li Hsian; Lagamayo, Pedro Danilo J; Geslani, Melodia B; Eow, Gaik Bee; Poungvarin, Niphon; de Silva, Asita; Wong, Lawrence K S; Venketasubramanian, N

    2013-08-01

    The clinical effects of neuroprotective and/or neurorestorative therapies may vary according to location and size of the ischemic injury. Imaging techniques can be useful in stratifying patients for trials that may be beneficial against particular ischemic lesion characteristics. To test the hypothesis that the efficacy of NeuroAiD compared with placebo in improving functional outcome and reducing neurological deficit in patients with cerebral infarction of intermediate severity varies between sub-groups of patients randomized in the main Chinese Medicine Neuroaid Efficacy on Stroke study when categorized according to baseline imaging characteristics. This is a retrospective cohort sub-group analysis of patients who participated in the main Chinese Medicine Neuroaid Efficacy on Stroke study, a multicenter, double-blind, placebo-controlled trial that recruited 1100 patients within 72 h of ischemic stroke onset with National Institutes of Health Stroke Scale 6-14 and were randomized to either NeuroAiD or placebo taken four capsules three times daily for three months. Review of the baseline images to classify the acute stroke lesions in terms of size, location, and extent of involvement will be performed retrospectively by two readers who will remain blinded as to treatment allocation and outcomes of the subjects. The primary efficacy end-point in the main Chinese Medicine Neuroaid Efficacy on Stroke study is the modified Rankin Scale grades at three-months. Secondary efficacy end-points are the National Institutes of Health Stroke Scale score at three-months; difference of National Institutes of Health Stroke Scale scores between baseline and 10 days and between baseline and three-months; difference of National Institutes of Health Stroke Scale sub-scores between baseline and 10 days and between baseline and three-months; modified Rankin Scale at 10 days, one-month, and three-months; Barthel index at three-months; and Mini Mental State Examination at 10 days and

  15. Indico CONFERENCE: Define the Call for Abstracts

    CERN Multimedia

    CERN. Geneva; Ferreira, Pedro

    2017-01-01

    In this tutorial, you will learn how to define and open a call for abstracts. When defining a call for abstracts, you will be able to define settings related to the type of questions asked during a review of an abstract, select the users who will review the abstracts, decide when to open the call for abstracts, and more.

  16. On defining semantics of extended attribute grammars

    DEFF Research Database (Denmark)

    Madsen, Ole Lehrmann

    1980-01-01

    Knuth has introduced attribute grammars (AGs) as a tool to define the semantics of context-free languages. The use of AGs in connection with programming language definitions has mostly been to define the context-sensitive syntax of the language and to define a translation in code for a hypothetic...

  17. Languages for Software-Defined Networks

    Science.gov (United States)

    2013-02-01

    switches, firewalls, and middleboxes) with closed and proprietary configuration interfaces. Recent years, however, have seen growing interest in Software-Defined Networks (SDNs), in which a logically-centralized controller manages the packet-processing

  18. The LIFE Cognition Study: design and baseline characteristics

    Directory of Open Access Journals (Sweden)

    Sink KM

    2014-08-01

    adults at increased risk for incident mobility disability. One LIFE Study objective is to evaluate the effects of a structured physical activity program on changes in cognitive function and incident all-cause mild cognitive impairment or dementia. Here, we present the design and baseline cognitive data. At baseline, participants completed the modified Mini Mental Status Examination, Hopkins Verbal Learning Test, Digit Symbol Coding, Modified Rey–Osterrieth Complex Figure, and a computerized battery, selected to be sensitive to changes in speed of processing and executive functioning. During follow up, participants completed the same battery, along with the Category Fluency for Animals, Boston Naming, and Trail Making tests. The description of the mild cognitive impairment/dementia adjudication process is presented here. Participants with worse baseline Short Physical Performance Battery scores (prespecified at ≤7) had significantly lower median cognitive test scores compared with those having scores of 8 or 9: modified Mini Mental Status Examination score of 91 versus (vs) 93, Hopkins Verbal Learning Test delayed recall score of 7.4 vs 7.9, and Digit Symbol Coding score of 45 vs 48, respectively (all P<0.001). The LIFE Study will contribute important information on the effects of a structured physical activity program on cognitive outcomes in sedentary older adults at particular risk for mobility impairment. In addition to its importance in the area of prevention of cognitive decline, the LIFE Study will also likely serve as a model for exercise and other behavioral intervention trials in older adults. Keywords: exercise, physical activity, older adults, dementia

  19. Large short-baseline νμ disappearance

    International Nuclear Information System (INIS)

    Giunti, Carlo; Laveder, Marco

    2011-01-01

    We analyze the LSND, KARMEN, and MiniBooNE data on short-baseline ν_μ→ν_e oscillations and the data on short-baseline ν_e disappearance obtained in the Bugey-3 and CHOOZ reactor experiments in the framework of 3+1 antineutrino mixing, taking into account the MINOS observation of long-baseline ν_μ disappearance and the KamLAND observation of very-long-baseline ν_e disappearance. We show that the fit of the data implies that the short-baseline disappearance of ν_μ is relatively large. We obtain a prediction of an effective amplitude sin²2θ_μμ ≳ 0.1 for short-baseline ν_μ disappearance generated by 0.2 eV² ≲ Δm² ≲ 2 eV², which could be measured in future experiments.
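    The quoted effective amplitude enters the standard two-flavour short-baseline survival probability, P(ν_μ→ν_μ) = 1 − sin²2θ_μμ · sin²(1.27 Δm² L/E), with Δm² in eV², L in km, and E in GeV. A minimal numerical sketch with illustrative parameter values:

```python
# Two-flavour survival probability for short-baseline nu_mu disappearance.
# dm2 in eV^2, L in km, E in GeV; 1.27 is the usual unit-conversion factor.
import math

def p_survival(sin2_2theta, dm2_ev2, L_km, E_GeV):
    phase = 1.27 * dm2_ev2 * L_km / E_GeV
    return 1.0 - sin2_2theta * math.sin(phase) ** 2

# At the first oscillation maximum the disappearance equals the amplitude:
L_over_E = math.pi / 2 / 1.27               # km/GeV putting the phase at pi/2
print(p_survival(0.1, 1.0, L_over_E, 1.0))  # -> 0.9
```

    With sin²2θ_μμ ≈ 0.1, up to about 10% of the ν_μ flux disappears at the oscillation maximum, which is the effect future short-baseline experiments would look for.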

  20. Investigation of Baseline Antioxidant Enzyme Expression in Pocillopora damicornis

    Science.gov (United States)

    Murphy, J.; Richmond, R. H.

    2016-02-01

    Coral reefs are some of the most diverse and valuable ecosystems in the world. Vital for maintaining ecological balance in coastal tropical environments, they also stand as the foundation for enormous cultural and economic resources. However, the continued degradation of coral reefs around the world, particularly within NOAA's Hawaii Marine Sanctuary, is an alarming call for action towards the identification of stressors and subsequent rehabilitation of these national treasures. Aligned with the goals of NOAA's National Marine Sanctuary to protect areas of the marine environment that are of special national significance to cultural, scientific, educational, and ecological values, this research targets addressing and standardizing antioxidant enzyme stress levels in Hawaiian coral over reproductive cycles in order to increase management aptitude and efficiency. By developing a greater understanding for biochemical biomarkers of stress in corals, specifically through the study of superoxide dismutase, catalase, glutathione peroxidase, and glutathione reductase activity and expression, my research will aid in the adaptation and further development of biochemical tests to understand baseline thresholds of stress on coral reefs within Sanctuary waters. Slight, but significant variations in enzyme expression over reproductive time points alert us to modifications that must be made to consider fluctuating levels of coral susceptibility when sampling corals under stress. These findings will be applied to diagnostic tests describing the effect of different chemical pollutants on coral health in order to identify ecological issues and expand the knowledge of local communities and NOAA, so that steps can be taken to mitigate human Sanctuary impacts.

  1. Baseline inventory data recommendations for National Wildlife Refuges

    Data.gov (United States)

    Department of the Interior — The Baseline Inventory Team recommends that each refuge have available abiotic “data layers” for topography, aerial photography, hydrography, soils, boundaries, and...

  2. Testing Testing Testing.

    Science.gov (United States)

    Deville, Craig; O'Neill, Thomas; Wright, Benjamin D.; Woodcock, Richard W.; Munoz-Sandoval, Ana; Gershon, Richard C.; Bergstrom, Betty

    1998-01-01

    Articles in this special section consider (1) flow in test taking (Craig Deville); (2) testwiseness (Thomas O'Neill); (3) test length (Benjamin Wright); (4) cross-language test equating (Richard W. Woodcock and Ana Munoz-Sandoval); (5) computer-assisted testing and testwiseness (Richard Gershon and Betty Bergstrom); and (6) Web-enhanced testing…

  3. Site Outcomes Baseline Multi Year Work Plan Volume 1, River Corridor Restoration Baseline

    International Nuclear Information System (INIS)

    Wintczak, T.M.

    2001-01-01

    The River Corridor Restoration volume is a compilation of Hanford Site scope, which excludes the approximately 194 km² Central Plateau. The River Corridor scope is currently contractually assigned to Fluor Hanford, Bechtel Hanford, Inc., DynCorp, Pacific Northwest National Laboratory, and others. The purpose of this project specification is to provide an overall scoping document for the River Corridor Restoration volume, and to provide a link with the overall Hanford Site River Corridor scope. Additionally, this specification provides an integrated and consolidated source of information for the various scopes, by current contract, for the River Corridor Restoration Baseline. It identifies the vision, mission, and goals, as well as the operational history of the Hanford Site, along with the environmental setting and hazards.

  4. Business-as-Unusual: Existing policies in energy model baselines

    International Nuclear Information System (INIS)

    Strachan, Neil

    2011-01-01

    Baselines are generally accepted as a key input assumption in long-term energy modelling, but energy models have traditionally been poor at identifying baseline assumptions. Notably, transparency on the current policy content of model baselines is now especially critical, as long-term climate mitigation policies have been underway for a number of years. This paper argues that the range of existing energy and emissions policies is an integral part of any long-term baseline, and hence already represents a 'with-policy' baseline, termed here a Business-as-Unusual (BAuU). Crucially, existing energy policies are not a sunk effort; as the impacts of existing policy initiatives are targeted at future years, they may be revised through iterative policy making, and their quantitative effectiveness requires ex-post verification. To assess the long-term role of existing policies in energy modelling, currently identified UK policies are explicitly stripped out of the UK MARKAL Elastic Demand (MED) optimisation energy system model, to generate a BAuU (with-policy) and a REF (without-policy) baseline. In terms of long-term mitigation costs, policy-baseline assumptions are comparable to another key exogenous modelling assumption - that of global fossil fuel prices. Therefore, best practice in energy modelling would be to have both a no-policy reference baseline and a current-policy reference baseline (BAuU). At a minimum, energy modelling studies should include a transparent assessment of the current policy contained within the baseline. Clearly identifying and comparing policy-baseline assumptions is required for cost-effective and objective policy making; otherwise, energy models will underestimate the true cost of long-term emissions reductions.

  5. [Effects of Square-Stepping Exercise inducing activation of the brain's cognitive function in community-dwelling older Japanese females--Focus on the baseline cognitive function level and age].

    Science.gov (United States)

    Abe, Takumi; Tsuji, Taishi; Kitano, Naruki; Muraki, Toshiaki; Hotta, Kazushi; Okura, Tomohiro

    2015-01-01

    The purpose of this study was to investigate whether the degree of improvement in cognitive function achieved with an exercise intervention in community-dwelling older Japanese women is affected by the participant's baseline cognitive function and age. Eighty-eight women (mean age: 70.5±4.2 years) participated in a prevention program for long-term care. They completed the Square-Stepping Exercise (SSE) program once a week, 120 minutes/session, for 11 weeks. We assessed participants' cognitive function using 5 cognitive tests (5-Cog) before and after the intervention. We defined cognitive function as the 5-Cog total score and defined the change in cognitive function as the 5-Cog post-score minus the pre-score. We divided participants into four groups based on age (≤69 years or ≥70 years) and baseline cognitive function level (above vs. below the median cognitive function level). We conducted two-way analysis of variance. All 4 groups improved significantly in cognitive function after the intervention. There were no baseline cognitive function level×age interactions and no significant main effects of age, although significant main effects of baseline cognitive function level (P=0.004, η²=0.09) were observed. Square-Stepping Exercise is an effective exercise for improving cognitive function. These results suggest that older adults with cognitive decline are more likely to improve their cognitive function with exercise than if they start the intervention with high cognitive function. Furthermore, during an exercise intervention, baseline cognitive function level may have more of an effect than a participant's age on the degree of cognitive improvement.

  6. Single-baseline RTK GNSS Positioning for Hydrographic Surveying

    Science.gov (United States)

    Metin Alkan, Reha; Murat Ozulu, I.; Ilçi, Veli; Kahveci, Muzaffer

    2015-04-01

    for a user to determine his/her position with a few cm accuracy in real time in Turkey. Besides, there are some province municipalities in Turkey which have established their own local CORS networks such as Istanbul (with 9 reference stations) and Ankara (with 10 reference stations). There is also a local RTK base station which disseminates real time position corrections for surveyors in Çorum province and is operated by Çorum Municipality. This is the first step of establishing a complete local CORS network in Çorum (the municipality has plans to increase this number and establish a CORS network within a few years). At the time of this study, unfortunately, national CORS-TR stations in Çorum Province were under maintenance and thus we could not receive corrections from our national CORS network. Instead, Çorum Province's local RTK reference station's corrections were used during the study. The main purpose of this study is to investigate the accuracy performance of the Single-baseline RTK GNSS system operated by Çorum Municipality in marine environment. For this purpose, a kinematic test measurement was carried out at Obruk Dam, Çorum, Turkey. During the test measurement, a small vessel equipped with a dual-frequency geodetic-grade GNSS receiver, Spectra Precision ProMark 500, was used. The coordinates of the vessel were obtained from the Single-baseline RTK system in ITRF datum in real-time with fix solutions. At the same time, the raw kinematic GNSS data were also recorded to the receiver in order to estimate the known coordinates of the vessel with post-processed differential kinematic technique. In this way, GPS data were collected under the same conditions, which allowed precise assessment of the used system. The measurements were carried out along the survey profiles for about 1 hour. During the kinematic test, another receiver was set up on a geodetic point at the shore and data were collected in static mode to calculate the coordinates of the vessel

  7. A proposal to create an extension to the European baseline series

    DEFF Research Database (Denmark)

    Wilkinson, Mark; Gallo, Rosella; Goossens, An

    2018-01-01

    exposures. METHODS: Data from departmental and national extensions to the baseline series, together with some temporary additions from departments contributing to the ESSCA, were collated during 2013-2014. RESULTS: In total, 31689 patients were patch tested in 46 European departments. Many departments...

  8. Contact allergy to preservatives : ESSCA* results with the baseline series, 2009-2012

    NARCIS (Netherlands)

    Gimenez-Arnau, A. M.; Deza, G.; Bauer, A.; Johnston, G. A.; Mahler, V.; Schuttelaar, M. -L.; Sanchez-Perez, J.; Silvestre, J. F.; Wilkinson, M.; Uter, W.

    Background: Allergic contact dermatitis caused by biocides is common and causes significant patient morbidity. Objective: To describe the current frequency and pattern of patch test reactivity to biocide allergens included in the baseline series of most European countries. Methods: Data collected by the

  9. Schema therapy for personality disorders in older adults : A multiple-baseline study

    NARCIS (Netherlands)

    Videler, A.C.; van Alphen, S.P.J.; Van Royen, R.J.J.; van der Feltz-Cornelis, C.M.; Rossi, G.; Arntz, A.

    2018-01-01

    No studies have been conducted yet into the effectiveness of treatment of personality disorders in later life. This study is a first test of the effectiveness of schema therapy for personality disorders in older adults. Multiple-baseline design with eight cluster C personality disorder patients,

  10. Study on the calibration and optimization of double theodolites baseline

    Science.gov (United States)

    Ma, Jing-yi; Ni, Jin-ping; Wu, Zhi-chao

    2018-01-01

    Because the baseline of a double-theodolite measurement system serves as the benchmark for the scale of the measurement system and affects the accuracy of the system, this paper puts forward a method for calibrating and optimizing the double-theodolite baseline. The two theodolites measure a reference ruler of known length, and the baseline formula is then inverted. Based on the law of error propagation, the analyses show that the baseline error function is an important index of the accuracy of the system, and that the position, posture, and other properties of the reference ruler have an impact on the baseline error. An optimization model is established with the baseline error function as the objective function, and the position and posture of the reference ruler are optimized. The simulation results show that the height of the reference ruler has no effect on the baseline error; the effect of posture is not uniform; and when the reference ruler is placed at x=500 mm and y=1000 mm in the measurement space, the baseline error is smallest. The experimental results are consistent with the theoretical analyses in the measurement space. This study of reference-ruler placement provides a reference for improving the accuracy of the double-theodolite measurement system.
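    The baseline's role as the scale benchmark comes from triangulation: each target point is fixed by intersecting rays from the two theodolites across the baseline. The following is a hedged sketch of the underlying triangle solution, not the paper's actual baseline-inversion formula; the station geometry and angles are invented:

```python
# Forward intersection by the law of sines: stations A and B are the two
# theodolites separated by the baseline; alpha and beta are the measured
# horizontal angles at A and B to the same target. Values illustrative.
import math

def target_range(baseline, alpha, beta):
    """Distance from station A to the target (angles in radians)."""
    gamma = math.pi - alpha - beta            # angle at the target
    return baseline * math.sin(beta) / math.sin(gamma)

# Equilateral check: with 60-degree angles at both stations, the target
# lies one baseline-length from each station.
d = target_range(1000.0, math.radians(60), math.radians(60))
print(round(d, 3))  # -> 1000.0
```

    Because every measured distance is proportional to the baseline, any baseline error propagates directly into the scale of all derived coordinates, which is why the baseline error function is treated as the key accuracy index.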

  11. 75 FR 30014 - Consumers Energy Company; Notice of Baseline Filing

    Science.gov (United States)

    2010-05-28

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. PR10-25-000] Consumers Energy Company; Notice of Baseline Filing May 21, 2010. Take notice that on May 17, 2010, Consumers Energy Company (Consumers) submitted a baseline filing of its Statement of Operating Conditions for the...

  12. 77 FR 31841 - Hope Gas, Inc.; Notice of Baseline Filing

    Science.gov (United States)

    2012-05-30

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. PR12-23-001] Hope Gas, Inc.; Notice of Baseline Filing Take notice that on May 16, 2012, Hope Gas, Inc. (Hope Gas) submitted a revised baseline filing of their Statement of Operating Conditions for services provided under Section 311 of the...

  13. 77 FR 26535 - Hope Gas, Inc.; Notice of Baseline Filing

    Science.gov (United States)

    2012-05-04

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. PR12-23-000] Hope Gas, Inc.; Notice of Baseline Filing Take notice that on April 26, 2012, Hope Gas, Inc. (Hope Gas) submitted a baseline filing of their Statement of Operating Conditions for services provided under Section 311 of the...

  14. Esophageal acid exposure decreases intraluminal baseline impedance levels

    NARCIS (Netherlands)

    Kessing, Boudewijn F.; Bredenoord, Albert J.; Weijenborg, Pim W.; Hemmink, Gerrit J. M.; Loots, Clara M.; Smout, A. J. P. M.

    2011-01-01

    Intraluminal baseline impedance levels are determined by the conductivity of the esophageal wall and can be decreased in gastroesophageal reflux disease (GERD) patients. The aim of this study was to investigate the baseline impedance in GERD patients, on and off proton pump inhibitor (PPI), and in

  15. Tank waste remediation system technical baseline summary description

    International Nuclear Information System (INIS)

    Raymond, R.E.

    1998-01-01

    This document is one of the tools used to develop and control the mission work as depicted in the included figure. This Technical Baseline Summary Description document is the top-level tool for management of the Technical Baseline for waste storage operations

  16. Predicting Coronary Artery Aneurysms in Kawasaki Disease at a North American Center: An Assessment of Baseline z Scores.

    Science.gov (United States)

    Son, Mary Beth F; Gauvreau, Kimberlee; Kim, Susan; Tang, Alexander; Dedeoglu, Fatma; Fulton, David R; Lo, Mindy S; Baker, Annette L; Sundel, Robert P; Newburger, Jane W

    2017-05-31

Accurate risk prediction of coronary artery aneurysms (CAAs) in North American children with Kawasaki disease remains a clinical challenge. We sought to determine the predictive utility of baseline coronary dimensions adjusted for body surface area (z scores) for future CAAs in Kawasaki disease and explored the extent to which addition of established Japanese risk scores to baseline coronary artery z scores improved discrimination for CAA development. We explored the relationships of CAA with baseline z scores; with Kobayashi, Sano, Egami, and Harada risk scores; and with the combination of baseline z scores and risk scores. We defined CAA as a maximum z score (zMax) ≥2.5 of the left anterior descending or right coronary artery at 4 to 8 weeks of illness. Of 261 patients, 77 patients (29%) had a baseline zMax ≥2.0. CAAs occurred in 15 patients (6%). CAAs were strongly associated with baseline zMax ≥2.0 versus <2.0. Baseline zMax ≥2.0 had a C statistic of 0.77, good sensitivity (80%), and excellent negative predictive value (98%). None of the risk scores alone had adequate discrimination. When high-risk status per the Japanese risk scores was added to models containing baseline zMax ≥2.0, none were significantly better than baseline zMax ≥2.0 alone. In a North American center, baseline zMax ≥2.0 in children with Kawasaki disease demonstrated high predictive utility for later development of CAA. Future studies should validate the utility of our findings. © 2017 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley.

  17. Quantification of baseline pupillary response and task-evoked pupillary response during constant and incremental task load.

    Science.gov (United States)

    Mosaly, Prithima R; Mazur, Lukasz M; Marks, Lawrence B

    2017-10-01

The methods employed to quantify the baseline pupil size and task-evoked pupillary response (TEPR) may affect overall study results. To test this hypothesis, the objective of this study was to assess variability in baseline pupil size and TEPR during two basic working memory tasks, a constant load of 3-letter memorisation-recall (10 trials) and an incremental-load memorisation-recall (two trials at each load level), using two commonly used methods: (1) change from a trial/load-specific baseline, and (2) change from a constant baseline. Results indicated that there was a significant shift in baseline between the trials for the constant load, and between the load levels for the incremental load. The TEPR was independent of shifts in baseline using method 1 only for the constant load, and using method 2 only for higher levels of the incremental-load condition. These important findings suggest that the assessment of both the baseline and the method used to quantify TEPR is critical in ergonomics applications, especially in studies with a small number of trials per subject per condition. Practitioner Summary: Quantification of TEPR can be affected by shifts in baseline pupil size, which are most likely driven by non-cognitive factors when other external factors are kept constant. Therefore, the quantification methods employed to compute both the baseline and the TEPR are critical to understanding human information processing in practical ergonomics settings.
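The two quantification methods contrasted above can be sketched with synthetic data (a minimal illustration with made-up pupil values, not the study's data):

```python
import numpy as np

# Synthetic pupil traces (mm) for 3 trials: 5 baseline samples, then 5 task samples.
# A baseline that shifts upward between trials is simulated, as the study reports.
trials = []
for k in range(3):
    base = 3.0 + 0.1 * k
    trials.append(np.concatenate([np.full(5, base), np.full(5, base + 0.4)]))
trials = np.array(trials)

# Method 1: TEPR as change from each trial's own pre-task baseline.
per_trial_baseline = trials[:, :5].mean(axis=1)
tepr_method1 = trials[:, 5:].mean(axis=1) - per_trial_baseline

# Method 2: TEPR as change from a single constant baseline (here, trial 1's).
constant_baseline = trials[0, :5].mean()
tepr_method2 = trials[:, 5:].mean(axis=1) - constant_baseline

print(tepr_method1)  # ~[0.4, 0.4, 0.4]: unaffected by the baseline shift
print(tepr_method2)  # ~[0.4, 0.5, 0.6]: the shift leaks into the "response"
```

With a drifting baseline, method 1 yields a stable TEPR, while method 2 absorbs the drift into the measured response, which is the sensitivity the abstract describes.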

  18. Pulmonary hypertension associated with rheumatic diseases: baseline characteristics from the Korean registry.

    Science.gov (United States)

    Jeon, Chan Hong; Chai, Ji-Young; Seo, Young-Il; Jun, Jae-Bum; Koh, Eun-Mi; Lee, Soo-Kon

    2012-10-01

The REgistry of Pulmonary Hypertension Associated with Rheumatic Disease (REOPARD) was established in Korea. The baseline data are described from the second year of the registry's operation. Patients with a connective tissue disease (CTD) who met the modified definition of the WHO group I pulmonary arterial hypertension (PAH) were enrolled. PAH was defined as a systolic pulmonary arterial pressure > 40 mmHg by echocardiography or a mean pulmonary arterial pressure > 25 mmHg by right heart catheterization. Hemodynamic parameters and clinical data such as demographics, functional class, underlying disease, organ involvement, laboratory tests and current treatment were recorded. A total of 321 patients were enrolled during the 2-year study period from 2008 to 2010. The mean age of the patients at registration was 51.9 years and 87.5% were female. Most patients were diagnosed by echocardiography and only 24 patients (7.5%) underwent cardiac catheterization. Exertional dyspnea was present in 63.6% of patients and 31.8% were New York Heart Association class III or IV. Among the patients, systemic lupus erythematosus accounted for 35.3%, systemic sclerosis 28.3%, rheumatoid arthritis 7.8%, overlap syndrome 9.0%, and mixed connective tissue disease 5.9%. There were no significant differences in hemodynamics, functional class, diffusing capacity and N-terminal pro-brain natriuretic peptide levels between the disease subgroups. Treatments consisted of calcium antagonists (57.0%), endothelin antagonists (32.7%), prostanoids (27.1%), phosphodiesterase-5 inhibitors (14.3%) and combinations (37.4%). Compared with previous studies, the results showed some differences in underlying diseases, functional status and treatments. This may be due to differences in ethnic background and diagnostic methods in our study. © 2012 The Authors International Journal of Rheumatic Diseases © 2012 Asia Pacific League of Associations for Rheumatology and Wiley Publishing Asia Pty Ltd.

  19. Vegetation Parameter Extraction Using Dual Baseline Polarimetric SAR Interferometry Data

    Science.gov (United States)

    Zhang, H.; Wang, C.; Chen, X.; Tang, Y.

    2009-04-01

For vegetation parameter inversion, single-baseline polarimetric SAR interferometry (POLinSAR) techniques, such as the three-stage method and the ESPRIT algorithm, are limited by observations with the minimum ground-to-volume amplitude ratio, which affects the estimation of the effective phase center of the vegetation canopy or the surface and thus results in underestimated vegetation height. In order to remove this limitation of the single-baseline inversion techniques to some extent, data from a second POLinSAR baseline are added to the vegetation parameter estimation in this paper, and a dual-baseline POLinSAR technique for the extraction of vegetation parameters is investigated and improved to reduce the dynamic bias in the vegetation parameter estimation. Finally, simulated data and real data are used to validate this dual-baseline technique.

  20. A method to establish seismic noise baselines for automated station assessment

    Science.gov (United States)

    McNamara, D.E.; Hutt, C.R.; Gee, L.S.; Benz, H.M.; Buland, R.P.

    2009-01-01

We present a method for quantifying station noise baselines and characterizing the spectral shape of out-of-nominal noise sources. Our intent is to automate this method in order to ensure that only the highest-quality data are used in rapid earthquake products at NEIC. In addition, the station noise baselines provide a valuable tool to support the quality control of GSN and ANSS backbone data and metadata. The procedures addressed here are currently in development at the NEIC, and work is underway to understand how quickly changes from nominal can be observed and used within the NEIC processing framework. The spectral methods and software used to compute station baselines and described herein (PQLX) can be useful to both permanent and portable seismic station operators. Applications include: general seismic station and data quality control (QC), evaluation of instrument responses, assessment of near real-time communication system performance, characterization of site cultural noise conditions, and evaluation of sensor vault design, as well as assessment of gross network capabilities (McNamara et al. 2005). Future PQLX development plans include incorporating station baselines for automated QC methods and automating station status report generation and notification based on user-defined QC parameters. The PQLX software is available through the USGS (http://earthquake.usgs.gov/research/software/pqlx.php) and IRIS (http://www.iris.edu/software/pqlx/).
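A percentile-based noise baseline of the kind described here can be sketched as follows; the nominal noise model, the daily PSD values, and the thresholds are all synthetic illustrations, not GSN/ANSS data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Stack of daily PSD estimates for one station: 40 days x 50 periods, in dB.
# The "nominal" model and scatter are hypothetical, for illustration only.
periods = np.logspace(-1, 2, 50)                 # 0.1 s to 100 s
nominal = -140.0 + 10.0 * np.log10(periods)
psds = nominal + rng.normal(0.0, 2.0, size=(40, 50))

# Station noise baseline: percentiles of the observed PSD distribution per period.
p10, p50, p90 = np.percentile(psds, [10, 50, 90], axis=0)

# A new PSD is "out of nominal" wherever it leaves the baseline envelope.
new_psd = nominal + 15.0                         # e.g., strong cultural noise
out_of_nominal = (new_psd > p90) | (new_psd < p10)
print(out_of_nominal.mean())                     # fraction of periods flagged
```

Flagging against a per-period percentile envelope, rather than a single global threshold, captures the frequency-dependent character of out-of-nominal noise sources.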

  1. Education Organization Baseline Control Protection and Trusted Level Security

    Directory of Open Access Journals (Sweden)

    Wasim A. Al-Hamdani

    2007-12-01

Full Text Available Many education organizations have adopted enterprise best practices for security implementation on their campuses, while others focus on ISO standards and/or those of the National Institute of Standards and Technology. All these adoptions depend on IT personnel and their experience or knowledge of the standard. On top of this is the size of the education organization: the larger the population, the clearer the problem of information security becomes, and such organizations have been obliged to address information security issues and adopt national or international standards. The case is quite different when the population of the education organization is smaller. Such organizations may use social security numbers as student IDs and issue administrative rights to faculty and lab managers, or they may be unaware of the Family Educational Rights and Privacy Act (FERPA) and release personal information. The problem of education organization security is thus wide open and depends on the IT staff and their information security knowledge; in addition, the education culture (education, scholarship, and services) has very special characteristics compared with an enterprise or comparable organization. This paper is part of a research effort to develop an "Education Organization Baseline Control Protection and Trusted Level Security." The research has three parts: adopting (standards), testing, and modifying (if needed).

  2. Organic Contamination Baseline Study: In NASA JSC Astromaterials Curation Laboratories. Summary Report

    Science.gov (United States)

    Calaway, Michael J.

    2013-01-01

In preparation for OSIRIS-REx and other future sample return missions concerned with analyzing organics, we conducted an Organic Contamination Baseline Study for the JSC Curation Laboratories in FY12. The FY12 testing focused only on molecular organic contamination in JSC curation gloveboxes, since future collections (i.e., lunar, Mars, and asteroid missions) would presumably use isolation containment systems rather than cleanrooms alone for primary sample storage. This decision was made due to limited historical data on curation gloveboxes, limited IR&D funds, and the fact that Genesis routinely monitors organics in its ISO class 4 cleanrooms.

  3. Fragrance mix II in the baseline series contributes significantly to detection of fragrance allergy

    DEFF Research Database (Denmark)

    Heisterberg, Maria S Vølund; Andersen, Klaus E.; Avnstorp, Christian

    2010-01-01

Background: Fragrance mix II (FM II) is a relatively new screening marker for fragrance contact allergy. It was introduced into the patch test baseline series in Denmark in 2005 and contains six fragrance chemicals that are commonly present in cosmetic products and are known allergens. Aim: To investigate the diagnostic contribution of including FM II in the baseline series by comparing it with other screening markers of fragrance allergy: fragrance mix I (FM I), Myroxylon pereirae, and hydroxyisohexyl 3-cyclohexene carboxaldehyde (HICC). Method: Retrospective study of 12 302 patients consecutively...

  4. Wind power projects in the CDM: Methodologies and tools for baselines, carbon financing and sustainability analysis

    International Nuclear Information System (INIS)

    Ringius, L.; Grohnheit, P.E.; Nielsen, L.H.; Olivier, A.L.; Painuly, J.; Villavicencio, A.

    2002-12-01

The report is intended to be a guidance document for project developers, investors, lenders, and CDM host countries involved in wind power projects in the CDM. The report explores in particular those issues that are important in CDM project assessment and development, that is, baseline development, carbon financing, and environmental sustainability. It does not deal in detail with issues that are routinely covered in a standard wind power project assessment. The report tests, compares, and recommends methodologies for and approaches to baseline development. To present the application and implications of the various methodologies and approaches in a concrete context, Africa's largest wind farm, the 60 MW wind farm located at Zafarana, Egypt, is examined as a hypothetical CDM wind power project. The report shows that for the present case example there is a difference of about 25% between the lowest (0.5496 tCO2/MWh) and the highest emission rate (0.6868 tCO2/MWh) estimated in accordance with the three standardized approaches to baseline development under the Marrakesh Accord. This difference in emission factors comes about partly as a result of including hydroelectric power in the baseline scenario. Hydroelectric resources constitute around 21% of the generation capacity in Egypt, and, if hydropower is excluded, the difference between the lowest and the highest baseline is reduced to 18%. Furthermore, since the two variations of the 'historical' baseline option examined result in the highest and the lowest baselines, disregarding this baseline option altogether reduces the difference between the lowest and the highest to 16%. The ES3 model, which the Systems Analysis Department at Risoe National Laboratory has developed, makes it possible for this report to explore the project-specific approach to baseline development in some detail. Based on quite disaggregated data on the Egyptian electricity system, including the wind power production
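The reported spread of about 25% between the lowest and highest standardized baselines can be checked directly from the two emission rates quoted above:

```python
low, high = 0.5496, 0.6868          # tCO2/MWh, as quoted in the report
spread = (high - low) / low
print(f"{spread:.1%}")              # -> 25.0%
```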

  5. 22 CFR 92.36 - Authentication defined.

    Science.gov (United States)

    2010-04-01

    ... 22 Foreign Relations 1 2010-04-01 2010-04-01 false Authentication defined. 92.36 Section 92.36... Notarial Acts § 92.36 Authentication defined. An authentication is a certification of the genuineness of... recognized in another jurisdiction. Documents which may require authentication include legal instruments...

  6. A definability theorem for first order logic

    NARCIS (Netherlands)

    Butz, C.; Moerdijk, I.

    1997-01-01

In this paper we will present a definability theorem for first order logic. This theorem is very easy to state, and its proof only uses elementary tools. To explain the theorem, let us first observe that if M is a model of a theory T in a language L, then clearly any definable subset S ⊆ M (i.e., a subset S

  7. Dilution Confusion: Conventions for Defining a Dilution

    Science.gov (United States)

    Fishel, Laurence A.

    2010-01-01

    Two conventions for preparing dilutions are used in clinical laboratories. The first convention defines an "a:b" dilution as "a" volumes of solution A plus "b" volumes of solution B. The second convention defines an "a:b" dilution as "a" volumes of solution A diluted into a final volume of "b". Use of the incorrect dilution convention could affect…

  8. Defining Hardwood Veneer Log Quality Attributes

    Science.gov (United States)

    Jan Wiedenbeck; Michael Wiemann; Delton Alderman; John Baumgras; William Luppold

    2004-01-01

    This publication provides a broad spectrum of information on the hardwood veneer industry in North America. Veneer manufacturers and their customers impose guidelines in specifying wood quality attributes that are very discriminating but poorly defined (e.g., exceptional color, texture, and/or figure characteristics). To better understand and begin to define the most...

  9. Development of an automated desktop procedure for defining macro ...

    African Journals Online (AJOL)

    methods (Von Neumann mean square error, CUSUM plots or unweighted values and the Worsley Likelihood Ratio Test (WLRT)) were used to define macro-reach breaks for four South African rivers (Crocodile, Olifants, Mhlathuze and Seekoei Rivers) and were compared to ... Water SA Vol.32 (3) 2006: pp.395-402 ...

  10. On the analytic continuation of functions defined by Legendre series

    International Nuclear Information System (INIS)

    Grinstein, F.F.

    1981-07-01

An infinite diagonal sequence of Punctual Padé Approximants is considered for the approximate analytical continuation of a function defined by a formal Legendre series. The technique is tested in the case of two series with exactly known analytical sums: the generating function for Legendre polynomials and the Coulombian scattering amplitude. (author)

  11. Baseline factors that influence ASAS 20 response in patients with ankylosing spondylitis treated with etanercept.

    Science.gov (United States)

    Davis, John C; Van der Heijde, Désirée M F M; Dougados, Maxime; Braun, Jurgen; Cush, John J; Clegg, Daniel O; Inman, Robert D; de Vries, Todd; Tsuji, Wayne H

    2005-09-01

    To examine the baseline demographic and disease characteristics that might influence improvement as measured by the Assessment in Ankylosing Spondylitis Response Criteria (ASAS 20) in patients with ankylosing spondylitis (AS). A multicenter Phase 3 study was performed to compare the safety and efficacy of 24 weeks of etanercept 25 mg subcutaneous injection twice weekly (n = 138) and placebo (n = 139) in patients with AS. The ASAS 20 was measured at multiple time points. Using a significance level of 0.05, a repeated measures logistic regression model was used to determine which baseline factors influenced response in the etanercept-treated patients during the 24-week double blind portion of the trial. The following baseline factors were used in the model: demographic and disease severity variables, concomitant medications, extra-articular manifestations, and HLA-B27 status. The predictive capability of the model was then tested on the patients receiving placebo after they had received open-label etanercept treatment. Baseline factors that were significant predictors of an ASAS 20 response in etanercept-treated patients were C-reactive protein (CRP), back pain score, and Bath Ankylosing Spondylitis Functional Index (BASFI) score. Although clinical response to etanercept was seen at all levels of baseline disease activity, responses were consistently more likely with higher CRP levels or back pain scores and less likely with increased BASFI scores at baseline. Higher CRP values and back pain scores and lower BASFI scores at baseline were significant predictors of a higher ASAS 20 response in patients with AS receiving etanercept but predictive value was of insufficient magnitude to determine treatment in individual patients.

  12. The impact of GPS receiver modifications and ionospheric activity on Swarm baseline determination

    Science.gov (United States)

    Mao, X.; Visser, P. N. A. M.; van den IJssel, J.

    2018-05-01

The European Space Agency (ESA) Swarm mission is a satellite constellation launched on 22 November 2013 aiming at observing the Earth's geomagnetic field and its temporal variations. The three identical satellites are equipped with high-precision dual-frequency Global Positioning System (GPS) receivers, which make the constellation an ideal test bed for baseline determination. From October 2014 to August 2016, a number of GPS receiver modifications and a new GPS Receiver Independent Exchange Format (RINEX) converter were implemented. Moreover, the on-board GPS receiver performance has been influenced by ionospheric scintillations. The impact of these factors is assessed for baseline determination of the pendulum formation flying Swarm-A and -C satellites. In total, 30 months of data, from 15 July 2014 to the end of 2016, are analyzed. The assessment includes analysis of observation residuals, the success rate of GPS carrier phase ambiguity fixing, a consistency check between the so-called kinematic and reduced-dynamic baseline solutions, and validation of orbits by comparison with Satellite Laser Ranging (SLR) observations. External baseline solutions from the German Space Operations Center (GSOC) and the Astronomisches Institut - Universität Bern (AIUB) are also included in the comparison. Results indicate that the GPS receiver modifications and RINEX converter changes are effective in improving the baseline determination. This research eventually shows a consistency level of 9.3/4.9/3.0 mm between kinematic and reduced-dynamic baselines in the radial/along-track/cross-track directions. On average, 98.3% of the epochs have kinematic solutions. Consistency between the TU Delft and external reduced-dynamic baseline solutions is at the 1 mm level in all directions.

  13. Mobile Robots Path Planning Using the Overall Conflict Resolution and Time Baseline Coordination

    Directory of Open Access Journals (Sweden)

    Yong Ma

    2014-01-01

Full Text Available This paper aims at resolving the path planning problem in a time-varying environment based on the idea of overall conflict resolution and the algorithm of time baseline coordination. The basic task of the introduced path planning algorithms is the automatic generation of the shortest paths from the defined start poses to the end poses, with consideration of the various constraints, for multiple mobile robots. Building on this, using overall conflict resolution within the polynomial-based paths, we take into account all the constraints together, including smoothness, motion boundaries, kinematics, obstacle avoidance, and safety constraints among the robots. A time baseline coordination algorithm is then proposed to solve the formulated problem. The foremost strength is that considerable time can be saved with our approach. Numerical simulations verify the effectiveness of our approach.
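As an illustration of polynomial-based paths satisfying boundary constraints (a 1-D sketch only, not the paper's full algorithm, which also handles obstacles and multi-robot coordination), a single cubic segment with fixed endpoint positions and zero endpoint velocities can be computed as:

```python
import numpy as np

def cubic_segment(p0, p1, T):
    """Coefficients a0..a3 of x(t) = a0 + a1*t + a2*t^2 + a3*t^3 with
    x(0) = p0, x(T) = p1 and zero velocity at both endpoints, a smoothness
    constraint of the kind the paper imposes on its polynomial paths."""
    A = np.array([[1.0, 0.0, 0.0,    0.0],       # x(0)  = p0
                  [0.0, 1.0, 0.0,    0.0],       # x'(0) = 0
                  [1.0, T,   T**2,   T**3],      # x(T)  = p1
                  [0.0, 1.0, 2 * T,  3 * T**2]]) # x'(T) = 0
    return np.linalg.solve(A, np.array([p0, 0.0, p1, 0.0]))

coef = cubic_segment(0.0, 10.0, 5.0)          # start pose 0, end pose 10, 5 s
t = np.linspace(0.0, 5.0, 6)
x = np.polyval(coef[::-1], t)                 # position samples along the path
print(x)                                      # starts at 0, ends smoothly at 10
```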

  14. An Automated Baseline Correction Method Based on Iterative Morphological Operations.

    Science.gov (United States)

    Chen, Yunliang; Dai, Liankui

    2018-05-01

Raman spectra usually suffer from baseline drift caused by fluorescence or other reasons. Therefore, baseline correction is a necessary and crucial step that must be performed before subsequent processing and analysis of Raman spectra. An automated baseline correction method based on iterative morphological operations is proposed in this work. The method can adaptively determine the structuring element first and then gradually remove the spectral peaks during iteration to get an estimated baseline. Experiments on simulated data and real-world Raman data show that the proposed method is accurate, fast, and flexible for handling different kinds of baselines in various practical situations. The comparison of the proposed method with some state-of-the-art baseline correction methods demonstrates its advantages over the existing methods in terms of accuracy, adaptability, and flexibility. Although only Raman spectra are investigated in this paper, the proposed method can hopefully be applied to the baseline correction of other analytical instrument signals as well, such as IR spectra and chromatograms.
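A simplified sketch of morphological baseline estimation, an opening (erosion, then dilation) followed by smoothing; the paper's method additionally chooses the structuring element adaptively and iterates, which this fixed-window version omits:

```python
import numpy as np

def _sliding(y, w, fn):
    """Apply fn over a centred window of width w (edge-padded)."""
    pad = np.pad(y, w // 2, mode="edge")
    return np.array([fn(pad[i:i + w]) for i in range(len(y))])

def morph_baseline(y, w=51):
    """Baseline estimate via a morphological opening, which removes peaks
    narrower than the window, followed by a moving-average smoothing."""
    opened = _sliding(_sliding(y, w, np.min), w, np.max)
    return _sliding(opened, w, np.mean)

# Synthetic Raman-like spectrum: slow curved baseline plus two narrow peaks.
x = np.linspace(0.0, 1.0, 500)
true_baseline = 2.0 + x**2
peaks = 5.0 * np.exp(-((x - 0.3) / 0.01)**2) + 3.0 * np.exp(-((x - 0.7) / 0.01)**2)
y = true_baseline + peaks

corrected = y - morph_baseline(y)   # peaks survive, drift is removed
```

Subtracting the estimated baseline leaves the narrow peaks largely intact while removing the slow drift, which is the core behaviour the abstract describes.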

  15. Extracting Baseline Electricity Usage Using Gradient Tree Boosting

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Taehoon [Ulsan Nat. Inst. of Sci. & Tech., Ulsan (South Korea); Lee, Dongeun [Ulsan Nat. Inst. of Sci. & Tech., Ulsan (South Korea); Choi, Jaesik [Ulsan Nat. Inst. of Sci. & Tech., Ulsan (South Korea); Spurlock, Anna [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Sim, Alex [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Todd, Annika [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Wu, Kesheng [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2016-05-05

    To understand how specific interventions affect a process observed over time, we need to control for the other factors that influence outcomes. Such a model that captures all factors other than the one of interest is generally known as a baseline. In our study of how different pricing schemes affect residential electricity consumption, the baseline would need to capture the impact of outdoor temperature along with many other factors. In this work, we examine a number of different data mining techniques and demonstrate Gradient Tree Boosting (GTB) to be an effective method to build the baseline. We train GTB on data prior to the introduction of new pricing schemes, and apply the known temperature following the introduction of new pricing schemes to predict electricity usage with the expected temperature correction. Our experiments and analyses show that the baseline models generated by GTB capture the core characteristics over the two years with the new pricing schemes. In contrast to the majority of regression based techniques which fail to capture the lag between the peak of daily temperature and the peak of electricity usage, the GTB generated baselines are able to correctly capture the delay between the temperature peak and the electricity peak. Furthermore, subtracting this temperature-adjusted baseline from the observed electricity usage, we find that the resulting values are more amenable to interpretation, which demonstrates that the temperature-adjusted baseline is indeed effective.
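The baseline idea can be illustrated with a minimal, self-contained gradient tree boosting implementation (squared loss, stump base learners); the temperature response, the 10% pricing effect, and all parameters are made up for illustration and are not the study's data or model:

```python
import numpy as np

rng = np.random.default_rng(7)

def fit_stump(x, r):
    """Best single-split regression stump (threshold, left mean, right mean)."""
    best_err, best = np.inf, None
    for s in np.quantile(x, np.linspace(0.05, 0.95, 19)):
        left, right = r[x <= s], r[x > s]
        if len(left) == 0 or len(right) == 0:
            continue
        lm, rm = left.mean(), right.mean()
        err = ((left - lm) ** 2).sum() + ((right - rm) ** 2).sum()
        if err < best_err:
            best_err, best = err, (s, lm, rm)
    return best

def gtb_fit(x, y, n_rounds=150, lr=0.1):
    """Minimal gradient tree boosting: each round fits a stump to the
    current residuals (the negative gradient of the squared loss)."""
    f0, stumps = y.mean(), []
    pred = np.full_like(y, f0)
    for _ in range(n_rounds):
        s, lm, rm = fit_stump(x, y - pred)
        pred = pred + lr * np.where(x <= s, lm, rm)
        stumps.append((s, lr * lm, lr * rm))
    return f0, stumps

def gtb_predict(x, f0, stumps):
    pred = np.full_like(x, f0, dtype=float)
    for s, l, r in stumps:
        pred += np.where(x <= s, l, r)
    return pred

# Hypothetical pre-intervention data: usage driven by outdoor temperature.
def usage(temp):
    return 1.0 + 0.02 * (temp - 20.0) ** 2      # U-shaped heating/cooling load

temp_pre = rng.uniform(0.0, 35.0, 2000)
y_pre = usage(temp_pre) + rng.normal(0.0, 0.05, 2000)
f0, stumps = gtb_fit(temp_pre, y_pre)           # temperature-adjusted baseline

# Post-intervention period: a 10% conservation effect on top of the weather response.
temp_post = rng.uniform(0.0, 35.0, 500)
y_post = 0.9 * usage(temp_post) + rng.normal(0.0, 0.05, 500)

effect = (y_post - gtb_predict(temp_post, f0, stumps)).mean()
print(effect)   # clearly negative: usage fell below the baseline
```

Subtracting the boosted baseline from post-period usage isolates the intervention effect from the temperature response, mirroring the workflow the abstract describes.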

  16. Baseline simple and complex reaction times in female compared to male boxers.

    Science.gov (United States)

    Bianco, M; Ferri, M; Fabiano, C; Giorgiano, F; Tavella, S; Manili, U; Faina, M; Palmieri, V; Zeppilli, P

    2011-06-01

The aim of the study was to compare the baseline cognitive performance of female and male amateur boxers. The study population included 28 female amateur boxers. Fifty-six male boxers, matched for age, employment, and competitive level to the female athletes, formed the control group. All boxers had no history of head concussions (except from boxing). Each boxer was requested to: 1) fill in a questionnaire collecting demographic data, level of education, occupational status, boxing record, and number of head concussions during boxing; 2) undergo a baseline computerized neuropsychological (NP) test (CogSport) measuring simple and complex reaction times (RT). Female boxers were lighter than male boxers (56±7 vs. 73.1±9.8 kg). No boxing variables (number of bouts, knock-outs, etc.) correlated with NP scores. Female and male Olympic-style boxers have no (or minimal) differences in baseline cognitive performance. Further research with larger series of female boxers is required to confirm these findings.

  17. Dynamic baseline detection method for power data network service

    Science.gov (United States)

    Chen, Wei

    2017-08-01

This paper proposes a dynamic baseline traffic detection method for power data networks that is based on historical traffic data. The method uses Cisco's NetFlow acquisition tool to collect the original historical traffic data from network elements at fixed intervals, and works with three dimensions of information: communication port, time, and traffic (number of bytes or number of packets). By filtering, removing deviating values, calculating the dynamic baseline value, and comparing the actual value with the baseline value, the method can detect whether the current network traffic is abnormal.
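A minimal sketch of the described pipeline (filter, remove deviating values, compute a per-interval baseline, compare); the median/MAD filter, the 1.5x alarm threshold, and the traffic values are illustrative choices, not taken from the paper:

```python
import numpy as np

def dynamic_baseline(history, k=3.0):
    """Per-interval baseline from historical traffic.
    history: (days, intervals) array of byte counts at fixed intervals.
    Deviating values are removed with a median/MAD outlier test (an
    illustrative filter), then the surviving values are averaged."""
    med = np.median(history, axis=0)
    mad = np.median(np.abs(history - med), axis=0) + 1e-9
    keep = np.abs(history - med) <= k * 1.4826 * mad
    return np.where(keep, history, 0.0).sum(axis=0) / keep.sum(axis=0)

history = np.array([
    [100, 200, 300, 250],
    [110, 190, 310, 240],
    [ 90, 210, 290, 260],
    [105, 205, 900, 255],   # an anomalous spike polluting interval 2's history
], dtype=float)

base = dynamic_baseline(history)
current = np.array([120.0, 195.0, 700.0, 250.0])
abnormal = current > 1.5 * base    # flag traffic far above the dynamic baseline
print(base)                        # the 900 outlier does not distort interval 2
print(abnormal)                    # only interval 2 is flagged
```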

  18. The prognostic utility of baseline alpha-fetoprotein for hepatocellular carcinoma patients.

    Science.gov (United States)

    Silva, Jack P; Gorman, Richard A; Berger, Nicholas G; Tsai, Susan; Christians, Kathleen K; Clarke, Callisia N; Mogal, Harveshp; Gamblin, T Clark

    2017-12-01

Alpha-fetoprotein (AFP) has a valuable role in postoperative surveillance for hepatocellular carcinoma (HCC) recurrence. The utility of pretreatment or baseline AFP remains controversial. The present study hypothesized that elevated baseline AFP levels are associated with worse overall survival in HCC patients. Adult HCC patients were identified using the National Cancer Database (2004-2013). Patients were stratified according to baseline AFP measurements into four groups: Negative, Borderline, Elevated, and Highly Elevated (>2000). The primary outcome was overall survival (OS), which was analyzed by log-rank test and graphed using the Kaplan-Meier method. Multivariate regression modeling was used to determine hazard ratios (HR) for OS. Of 41 107 patients identified, 15 809 (33.6%) were Negative. Median overall survival was highest in the Negative group, followed by the Borderline, Elevated, and Highly Elevated groups (28.7 vs 18.9 vs 8.8 vs 3.2 months; P < 0.001). On multivariate analysis, overall survival hazard ratios for the Borderline, Elevated, and Highly Elevated groups were 1.18 (P = 0.267), 1.94 (P < 0.001), and 1.77 (P = 0.007), respectively (reference: Negative). Baseline AFP independently predicted overall survival in HCC patients regardless of treatment plan. A baseline AFP value is a simple and effective method to assist in estimating expected survival for HCC patients. © 2017 Wiley Periodicals, Inc.

  19. Change Analysis in Structural Laser Scanning Point Clouds: The Baseline Method.

    Science.gov (United States)

    Shen, Yueqian; Lindenbergh, Roderik; Wang, Jinhu

    2016-12-24

A method is introduced for detecting changes from point clouds that avoids registration. For many applications, changes are detected between two scans of the same scene obtained at different times. Traditionally, these scans are aligned to a common coordinate system, with the disadvantage that this registration step introduces additional errors. In addition, registration requires stable targets or features. To avoid these issues, we propose a change detection method based on so-called baselines. Baselines connect feature points within one scan. To analyze changes, baselines connecting corresponding points in two scans are compared. As feature points, either targets or virtual points corresponding to some reconstructable feature in the scene are used. The new method is applied to two scans sampling a masonry laboratory building before and after seismic testing that resulted in damage on the order of several centimeters. The centres of the bricks of the laboratory building are automatically extracted to serve as virtual points. Baselines connecting virtual points and/or target points are extracted and compared with respect to a suitable structural coordinate system. Changes detected from the baseline analysis are compared to a traditional cloud-to-cloud change analysis, demonstrating the potential of the new method for structural analysis.
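The core idea can be sketched as follows: distances between feature points within one scan are invariant to the scanner pose, so baseline lengths from two epochs can be compared without any registration (the coordinates below are illustrative, not the masonry-building data):

```python
import numpy as np
from itertools import combinations

def baseline_lengths(pts):
    """Lengths of all baselines (segments between feature points) in one scan."""
    return {(i, j): float(np.linalg.norm(pts[i] - pts[j]))
            for i, j in combinations(range(len(pts)), 2)}

def baseline_changes(pts_epoch1, pts_epoch2):
    """Per-baseline length change between two epochs. Lengths are invariant
    to rigid motion, so no registration of the scans is needed."""
    l1, l2 = baseline_lengths(pts_epoch1), baseline_lengths(pts_epoch2)
    return {k: l2[k] - l1[k] for k in l1}

# Epoch 1: three feature points, e.g. brick centres (illustrative, in metres).
e1 = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])

# Epoch 2: same scene seen from a different scanner pose (rigid motion),
# plus a 3 cm displacement of point 2 standing in for seismic damage.
th = np.deg2rad(30.0)
R = np.array([[np.cos(th), -np.sin(th), 0.0],
              [np.sin(th),  np.cos(th), 0.0],
              [0.0,         0.0,        1.0]])
e2 = (R @ e1.T).T + np.array([5.0, 2.0, 0.3])
e2[2] += np.array([0.0, 0.03, 0.0])

changes = baseline_changes(e1, e2)
print(changes)   # baseline (0,1) unchanged; baselines involving point 2 changed
```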

  20. Change Analysis in Structural Laser Scanning Point Clouds: The Baseline Method

    Directory of Open Access Journals (Sweden)

    Yueqian Shen

    2016-12-01

Full Text Available A method is introduced for detecting changes from point clouds that avoids registration. For many applications, changes are detected between two scans of the same scene obtained at different times. Traditionally, these scans are aligned to a common coordinate system, with the disadvantage that this registration step introduces additional errors. In addition, registration requires stable targets or features. To avoid these issues, we propose a change detection method based on so-called baselines. Baselines connect feature points within one scan. To analyze changes, baselines connecting corresponding points in two scans are compared. As feature points, either targets or virtual points corresponding to some reconstructable feature in the scene are used. The new method is applied to two scans sampling a masonry laboratory building before and after seismic testing that resulted in damage on the order of several centimeters. The centres of the bricks of the laboratory building are automatically extracted to serve as virtual points. Baselines connecting virtual points and/or target points are extracted and compared with respect to a suitable structural coordinate system. Changes detected from the baseline analysis are compared to a traditional cloud-to-cloud change analysis, demonstrating the potential of the new method for structural analysis.

  1. GNSS Single Frequency, Single Epoch Reliable Attitude Determination Method with Baseline Vector Constraint

    Directory of Open Access Journals (Sweden)

    Ang Gong

    2015-12-01

    Full Text Available For Global Navigation Satellite System (GNSS) single frequency, single epoch attitude determination, this paper proposes a new reliable method with a baseline vector constraint. First, prior knowledge of baseline length, heading, and pitch obtained from other navigation equipment or sensors is used to reconstruct the objective function rigorously. Then, the search strategy is improved: a gradually enlarged ellipsoidal search space is substituted for the non-ellipsoidal search space, ensuring that the correct ambiguity candidates lie within it and allowing the search to be carried out directly by the least-squares ambiguity decorrelation adjustment (LAMBDA) method. Some of the candidate vectors are further eliminated by a derived approximate inequality, which accelerates the search. Experimental results show that, compared to the traditional method with only a baseline length constraint, the new method can use a priori three-dimensional baseline knowledge to fix ambiguities reliably and achieve a high success rate. Experimental tests also verify that the method is not very sensitive to baseline vector error and performs robustly when the angular error is moderate.
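    The baseline-length part of the constraint can be illustrated with a toy candidate-pruning step. This is only a hedged sketch of the idea, not the LAMBDA search itself (which additionally decorrelates the integer ambiguities); the candidate vectors and the 1 m antenna separation below are made up.

```python
import numpy as np

def prune_by_baseline_length(candidates, known_length, tol):
    """Keep only candidate baseline vectors whose length is consistent
    with the a priori known baseline length: | ||b|| - L | <= tol.

    Illustrates how a baseline constraint rejects wrong integer-ambiguity
    candidates; the full method also constrains heading and pitch.
    """
    c = np.asarray(candidates, float)
    lengths = np.linalg.norm(c, axis=1)
    keep = np.abs(lengths - known_length) <= tol
    return c[keep]

# Hypothetical candidate baselines from different integer-ambiguity fixes;
# the antennas are known to be 1.00 m apart.
cands = [[0.99, 0.10, 0.02],   # length ~0.995 m: plausible
         [1.80, 0.30, 0.00],   # wrong fix, far too long
         [0.50, 0.86, 0.05]]   # length ~0.996 m: plausible
kept = prune_by_baseline_length(cands, known_length=1.0, tol=0.02)
```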

  2. Characterizing baseline shift with a 4th-order polynomial function for a portable biomedical near-infrared spectroscopy device

    Science.gov (United States)

    Zhao, Ke; Ji, Yaoyao; Pan, Boan; Li, Ting

    2018-02-01

    Continuous-wave near-infrared spectroscopy (NIRS) devices have been highlighted for their clinical and health care applications in noninvasive hemodynamic measurements. The baseline shift of these measurements attracts much attention because of its clinical importance, yet currently published correction methods have low reliability or high variability. In this study, we found a well-suited polynomial fitting function for baseline removal in NIRS. Unlike previous studies on baseline correction for near-infrared spectroscopic evaluation of non-hemodynamic particles, we focused on baseline fitting and the corresponding correction method for NIRS and found that a 4th-order polynomial fitting function outperforms the 2nd-order function reported in previous research. Through experimental tests of hemodynamic parameters on a solid phantom, we compared the fitting performance of the 4th-order and 2nd-order polynomials by recording and analyzing the R values and the SSE (sum of squares due to error) values. The R values of the 4th-order polynomial fits are all higher than 0.99, significantly higher than the corresponding 2nd-order values, while the SSE values of the 4th order are significantly smaller than those of the 2nd order. Using this highly reliable, low-variability 4th-order polynomial fitting function, we are able to remove the baseline online and obtain more accurate NIRS measurements.
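    The comparison of 2nd- versus 4th-order fits can be reproduced in outline with synthetic data. The drift curve below is a made-up stand-in for a recorded NIRS baseline segment, not the phantom data from the study; it simply shows how SSE is computed for each order and why a 4th-order fit wins when the drift has quartic structure.

```python
import numpy as np

# Synthetic slow baseline drift sampled over time (hypothetical stand-in
# for a recorded NIRS baseline segment).
t = np.linspace(0.0, 1.0, 200)
baseline = 0.3 * t**4 - 0.8 * t**3 + 0.5 * t**2 + 0.1 * t
rng = np.random.default_rng(0)
signal = baseline + rng.normal(0.0, 0.002, t.size)  # drift + sensor noise

def fit_sse(order):
    """Fit a polynomial of the given order and return its SSE."""
    coeffs = np.polyfit(t, signal, order)
    residuals = signal - np.polyval(coeffs, t)
    return float(np.sum(residuals**2))

sse2 = fit_sse(2)
sse4 = fit_sse(4)
# Online correction: subtract the 4th-order fit from the raw signal.
corrected = signal - np.polyval(np.polyfit(t, signal, 4), t)
```

With a quartic drift, the 2nd-order fit is left with systematic residuals, so its SSE is larger; the 4th-order residuals are at the noise floor.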

  3. [Diagnostic value of baseline serum luteinizing hormone level for central precocious puberty in girls].

    Science.gov (United States)

    Ou-Yang, Li-Xue; Yang, Fan

    2017-07-01

    To evaluate the diagnostic value of the baseline serum luteinizing hormone (LH) level for central precocious puberty (CPP) in girls. A total of 279 girls with precocious puberty were subjected to assessment of growth and development, bone age determination, baseline LH and follicle-stimulating hormone (FSH) tests, gonadotropin-releasing hormone stimulation testing, and other related examinations. Of the 279 patients, 175 were diagnosed with CPP and 104 with premature thelarche (PT). The receiver operating characteristic (ROC) curve was used to evaluate the diagnostic value of baseline LH and FSH levels and their peak levels for CPP, and the correlation between the baseline and peak LH levels was analyzed. The CPP group had significantly higher bone age, baseline LH and FSH levels, peak LH and FSH levels, and ratio of peak LH to peak FSH than the PT group. Both the baseline LH level and the peak LH level had good diagnostic value for CPP. Among the three bone age subgroups in the CPP group (7.0-9.0 years, 9.0-11.0 years, and >11.0 years), the baseline LH level showed the best diagnostic value in the >11.0 years subgroup, with the largest area under the ROC curve. At a baseline LH level of 0.45 IU/L, the Youden index reached its peak value, with a sensitivity of 66.7% and a specificity of 80% for the diagnosis of CPP. At a peak LH level of 9.935 IU/L, the Youden index reached its peak value, with a sensitivity of 74.8% and a specificity of 100%. The baseline LH level was positively correlated with the peak LH level (r=0.440). The baseline LH level can be used as a primary screening index for the diagnosis of CPP. It has a certain diagnostic value at different bone ages and may be used as a monitoring index during treatment and follow-up.
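    The Youden-index cutoff selection used in such ROC analyses is easy to state in code. The sample values and labels below are invented for illustration; the study's 0.45 IU/L and 9.935 IU/L cutoffs come from its own clinical data.

```python
import numpy as np

def best_cutoff(values, labels):
    """Pick the cutoff maximizing the Youden index J = sens + spec - 1.

    values: test measurements (e.g., baseline LH in IU/L); labels: 1 for
    cases (CPP), 0 for non-cases (PT). A subject tests positive when
    value >= cutoff. Returns (cutoff, J).
    """
    values = np.asarray(values, float)
    labels = np.asarray(labels)
    best = (None, -1.0)
    for cut in np.unique(values):
        pred = values >= cut
        sens = np.mean(pred[labels == 1])    # true-positive rate
        spec = np.mean(~pred[labels == 0])   # true-negative rate
        j = sens + spec - 1.0
        if j > best[1]:
            best = (float(cut), j)
    return best

# Tiny made-up sample: cases tend to have higher values.
vals = [0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.8, 1.0]
labs = [0,   0,   0,   1,   0,   1,   1,   1]
cutoff, youden = best_cutoff(vals, labs)
```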

  4. Defining the critical hurdles in cancer immunotherapy

    Science.gov (United States)

    2011-01-01

    Scientific discoveries that provide strong evidence of antitumor effects in preclinical models often encounter significant delays before being tested in patients with cancer. While some of these delays have a scientific basis, others do not. We need to do better. Innovative strategies need to move into early-stage clinical trials as quickly as is safe, and if successful, these therapies should efficiently obtain regulatory approval and widespread clinical application. In late 2009 and 2010 the Society for Immunotherapy of Cancer (SITC) convened an "Immunotherapy Summit" with representatives from immunotherapy organizations representing Europe, Japan, China and North America to discuss collaborations to improve development and delivery of cancer immunotherapy. One of the concepts raised by SITC and defined as critical by all parties was the need to identify hurdles that impede effective translation of cancer immunotherapy. With consensus on these hurdles, international working groups could be developed to make recommendations vetted by the participating organizations. These recommendations could then be considered by regulatory bodies, governmental and private funding agencies, pharmaceutical companies and academic institutions to facilitate changes necessary to accelerate clinical translation of novel immune-based cancer therapies. The critical hurdles identified by representatives of the collaborating organizations, now organized as the World Immunotherapy Council, are presented and discussed in this report. Some of the identified hurdles impede all investigators; others hinder investigators only in certain regions or institutions or are more relevant to specific types of immunotherapy or first-in-humans studies. Each of these hurdles can significantly delay clinical translation of promising advances in immunotherapy yet, if overcome, each has the potential to improve outcomes of patients with cancer. PMID:22168571

  5. Defining the critical hurdles in cancer immunotherapy

    Directory of Open Access Journals (Sweden)

    Fox Bernard A

    2011-12-01

    Full Text Available Scientific discoveries that provide strong evidence of antitumor effects in preclinical models often encounter significant delays before being tested in patients with cancer. While some of these delays have a scientific basis, others do not. We need to do better. Innovative strategies need to move into early-stage clinical trials as quickly as is safe, and if successful, these therapies should efficiently obtain regulatory approval and widespread clinical application. In late 2009 and 2010 the Society for Immunotherapy of Cancer (SITC) convened an "Immunotherapy Summit" with representatives from immunotherapy organizations representing Europe, Japan, China and North America to discuss collaborations to improve development and delivery of cancer immunotherapy. One of the concepts raised by SITC and defined as critical by all parties was the need to identify hurdles that impede effective translation of cancer immunotherapy. With consensus on these hurdles, international working groups could be developed to make recommendations vetted by the participating organizations. These recommendations could then be considered by regulatory bodies, governmental and private funding agencies, pharmaceutical companies and academic institutions to facilitate changes necessary to accelerate clinical translation of novel immune-based cancer therapies. The critical hurdles identified by representatives of the collaborating organizations, now organized as the World Immunotherapy Council, are presented and discussed in this report. Some of the identified hurdles impede all investigators; others hinder investigators only in certain regions or institutions or are more relevant to specific types of immunotherapy or first-in-humans studies. Each of these hurdles can significantly delay clinical translation of promising advances in immunotherapy yet, if overcome, each has the potential to improve outcomes of patients with cancer.

  6. Author Response to Sabour (2018), "Comment on Hall et al. (2017), 'How to Choose Between Measures of Tinnitus Loudness for Clinical Research? A Report on the Reliability and Validity of an Investigator-Administered Test and a Patient-Reported Measure Using Baseline Data Collected in a Phase IIa Drug Trial'".

    Science.gov (United States)

    Hall, Deborah A; Mehta, Rajnikant L; Fackrell, Kathryn

    2018-03-08

    The authors respond to a letter to the editor (Sabour, 2018) concerning the interpretation of validity in the context of evaluating treatment-related change in tinnitus loudness over time. The authors refer to several landmark methodological publications and an international standard concerning the validity of patient-reported outcome measurement instruments. The tinnitus loudness rating performed better against our reported acceptability criteria for (face and convergent) validity than did the tinnitus loudness matching test. It is important to distinguish between tests that evaluate the validity of measuring treatment-related change over time and tests that quantify the accuracy of diagnosing tinnitus as a case and non-case.

  7. Association of baseline bleeding pattern on amenorrhea with levonorgestrel intrauterine system use.

    Science.gov (United States)

    Mejia, Manuela; McNicholas, Colleen; Madden, Tessa; Peipert, Jeffrey F

    2016-11-01

    This study aims to evaluate the effect of baseline bleeding patterns on rates of amenorrhea reported at 12 months in levonorgestrel (LNG) 52 mg intrauterine system (IUS) users. We also assessed the effect of baseline bleeding patterns on amenorrhea at 3 and 6 months postinsertion. In this secondary analysis of the Contraceptive CHOICE Project, we included participants who had an LNG-IUS inserted within 1 month of enrollment and continued use for 12 months. Using 12-month telephone survey data, we defined amenorrhea at 12 months of use as no bleeding or spotting during the previous 6 months. We used chi-square tests and multivariable logistic regression to assess the association of baseline bleeding pattern with amenorrhea while controlling for confounding variables. Of 1802 continuous 12-month LNG-IUS users, amenorrhea was reported by 4.9%, 14.8% and 15.4% of participants at 3, 6 and 12 months, respectively. Participants with light baseline bleeding or a short duration of flow reported higher rates of amenorrhea at 3 and 6 months postinsertion. Participants who reported heavy baseline bleeding were less likely to report amenorrhea at 12 months than those who reported moderate bleeding (adjusted OR, 0.36; 95% CI, 0.16-0.69). Women with heavier menstrual bleeding are less likely than women with moderate flow to report amenorrhea following 12 months of LNG-IUS use. Baseline heavy menstrual flow reduces the likelihood of amenorrhea with LNG-IUS use, information that could impact contraceptive counseling. Anticipatory counseling can improve method satisfaction and continuation, an important strategy to continue to reduce unintended pregnancy and abortion rates. Copyright © 2016 Elsevier Inc. All rights reserved.
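    For readers unfamiliar with the reported effect size, the arithmetic of an odds ratio with a Woolf 95% confidence interval is sketched below. The 2x2 counts are hypothetical, and the paper's OR of 0.36 is an adjusted estimate from logistic regression, which this crude calculation does not reproduce.

```python
import math

def odds_ratio_ci(a, b, c, d):
    """Crude odds ratio with Woolf 95% CI for a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases.
    """
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# Hypothetical counts: amenorrhea at 12 months among heavy (exposed)
# versus moderate (unexposed) baseline bleeders.
or_, lo, hi = odds_ratio_ci(a=12, b=88, c=30, d=70)
```

An OR below 1 with a CI excluding 1 would indicate that heavy baseline bleeders are significantly less likely to report amenorrhea.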

  8. Bilayer graphene quantum dot defined by topgates

    Energy Technology Data Exchange (ETDEWEB)

    Müller, André; Kaestner, Bernd; Hohls, Frank; Weimann, Thomas; Pierz, Klaus; Schumacher, Hans W., E-mail: hans.w.schumacher@ptb.de [Physikalisch-Technische Bundesanstalt, Bundesallee 100, 38116 Braunschweig (Germany)

    2014-06-21

    We investigate the application of nanoscale topgates on exfoliated bilayer graphene to define quantum dot devices. At temperatures below 500 mK, the conductance underneath the grounded gates is suppressed, which we attribute to nearest neighbour hopping and strain-induced piezoelectric fields. The gate-layout can thus be used to define resistive regions by tuning into the corresponding temperature range. We use this method to define a quantum dot structure in bilayer graphene showing Coulomb blockade oscillations consistent with the gate layout.

  9. Comment on Hall et al. (2017), "How to Choose Between Measures of Tinnitus Loudness for Clinical Research? A Report on the Reliability and Validity of an Investigator-Administered Test and a Patient-Reported Measure Using Baseline Data Collected in a Phase IIa Drug Trial".

    Science.gov (United States)

    Sabour, Siamak

    2018-03-08

    The purpose of this letter, in response to Hall, Mehta, and Fackrell (2017), is to provide important knowledge about methodology and statistical issues in assessing the reliability and validity of an audiologist-administered tinnitus loudness matching test and a patient-reported tinnitus loudness rating. The author uses reference textbooks and published articles regarding scientific assessment of the validity and reliability of a clinical test to discuss the statistical test and the methodological approach in assessing validity and reliability in clinical research. Depending on the type of the variable (qualitative or quantitative), well-known statistical tests can be applied to assess reliability and validity. The qualitative variables of sensitivity, specificity, positive predictive value, negative predictive value, false positive and false negative rates, likelihood ratio positive and likelihood ratio negative, as well as odds ratio (i.e., ratio of true to false results), are the most appropriate estimates to evaluate validity of a test compared to a gold standard. In the case of quantitative variables, depending on distribution of the variable, Pearson r or Spearman rho can be applied. Diagnostic accuracy (validity) and diagnostic precision (reliability or agreement) are two completely different methodological issues. Depending on the type of the variable (qualitative or quantitative), well-known statistical tests can be applied to assess validity.

  10. Baseline Biomarkers for Outcome of Melanoma Patients Treated with Pembrolizumab

    NARCIS (Netherlands)

    Weide, Benjamin; Martens, Alexander; Hassel, Jessica C.; Berking, Carola; Postow, Michael A.; Bisschop, Kees; Simeone, Ester; Mangana, Johanna; Schilling, Bastian; Di Giacomo, Anna Maria; Brenner, Nicole; Kaehler, Katharina; Heinzerling, Lucie; Gutzmer, Ralf; Bender, Armin; Gebhardt, Christoffer; Romano, Emanuela; Meier, Friedegund; Martus, Peter; Maio, Michele; Blank, Christian; Schadendorf, Dirk; Dummer, Reinhard; Ascierto, Paolo A.; Hospers, Geke; Garbe, Claus; Wolchok, Jedd D.

    2016-01-01

    Purpose: Biomarkers for outcome after immune-checkpoint blockade are strongly needed as these may influence individual treatment selection or sequence. We aimed to identify baseline factors associated with overall survival (OS) after pembrolizumab treatment in melanoma patients. Experimental Design:

  11. Baseline assessment of fish communities of the Flower Garden Banks

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The work developed baseline information on fish and benthic communities within the Flower Garden Banks National Marine Sanctuary (FGBNMS). Surveys employed diving,...

  12. Information architecture. Volume 2, Part 1: Baseline analysis summary

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-12-01

    The Department of Energy (DOE) Information Architecture, Volume 2, Baseline Analysis, is a collaborative and logical next-step effort in the processes required to produce a Departmentwide information architecture. The baseline analysis serves a diverse audience of program management and technical personnel and provides an organized way to examine the Department's existing or de facto information architecture. A companion document to Volume 1, The Foundations, it furnishes the rationale for establishing a Departmentwide information architecture. This volume, consisting of the Baseline Analysis Summary (part 1), Baseline Analysis (part 2), and Reference Data (part 3), is of interest to readers who wish to understand how the Department's current information architecture technologies are employed. The analysis identifies how and where current technologies support business areas, programs, sites, and corporate systems.

  13. Magical properties of a 2540 km baseline superbeam experiment

    International Nuclear Information System (INIS)

    Raut, Sushant K.; Singh, Ravi Shanker; Uma Sankar, S.

    2011-01-01

    Lack of any information on the CP-violating phase δ_CP weakens our ability to determine the neutrino mass hierarchy. A magic baseline of 7500 km was proposed to overcome this problem. However, to obtain large enough fluxes at this very long baseline, one needs new techniques for generating high-intensity neutrino beams. In this Letter, we highlight the magical properties of a 2540 km baseline. At such a baseline, using a narrow-band neutrino superbeam whose no-oscillation event rate peaks around an energy of 3.5 GeV, we can determine the neutrino mass hierarchy independently of the CP phase. For sin^2 2θ_13 ≥ 0.05, a very modest exposure of 10 kiloton-years is sufficient to determine the hierarchy. For 0.02 ≤ sin^2 2θ_13 ≤ 0.05, an exposure of about 100 kiloton-years is needed.

  14. Parametric estimation of time varying baselines in airborne interferometric SAR

    DEFF Research Database (Denmark)

    Mohr, Johan Jacob; Madsen, Søren Nørvang

    1996-01-01

    A method for estimation of time-varying spatial baselines in airborne interferometric synthetic aperture radar (SAR) is described. The range and azimuth distortions between two images acquired with a non-linear baseline are derived. A parametric model of the baseline is then estimated, in a least-squares sense, from image shifts obtained by cross correlation of numerous small patches throughout the image. The method has been applied to airborne EMISAR imagery from the 1995 campaign over the Storstrommen Glacier in North East Greenland conducted by the Danish Center for Remote Sensing. This has reduced the baseline uncertainties from several meters to the centimeter level in a 36 km scene. Though developed for airborne SAR, the method can easily be adapted to satellite data.
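    The least-squares step can be sketched as fitting a low-order parametric baseline model to noisy per-patch shift observations. Everything below (the quadratic model, noise level, and sample count) is invented for illustration and is far simpler than the EMISAR processing chain.

```python
import numpy as np

# Hypothetical along-track samples: at each azimuth time we "measured"
# an image shift proportional to the (unknown) baseline there, corrupted
# by correlation noise. A low-order parametric baseline model is then
# recovered by linear least squares.
rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 50)                   # normalized azimuth time
true_coeffs = np.array([0.02, -0.05, 5.0])       # quadratic baseline [m]
true_baseline = np.polyval(true_coeffs, t)
observed = true_baseline + rng.normal(0.0, 0.01, t.size)  # noisy shifts

# Design matrix for a quadratic model b(t) = c2*t^2 + c1*t + c0
A = np.vander(t, 3)
est_coeffs, *_ = np.linalg.lstsq(A, observed, rcond=None)
est_baseline = A @ est_coeffs
rms_err = float(np.sqrt(np.mean((est_baseline - true_baseline) ** 2)))
```

Averaging over many patches is what pushes the model error well below the per-patch measurement noise.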

  15. STATUS OF THE US LONG BASELINE NEUTRINO EXPERIMENT STUDY.

    Energy Technology Data Exchange (ETDEWEB)

    BISHAI,M.

    2006-09-21

    The US Long Baseline Neutrino Experiment Study was commissioned jointly by Brookhaven National Laboratory and Fermi National Accelerator Laboratory to investigate the potential for future U.S. based long baseline neutrino oscillation experiments beyond the currently planned program. The Study focused on MW-class conventional neutrino beams that can be produced at Fermilab or BNL. The experimental baselines are based on two possible detector locations: (1) off-axis to the existing Fermilab NuMI beamline at baselines of 700 to 810 km and (2) NSF's proposed future Deep Underground Science and Engineering Laboratory (DUSEL) at baselines greater than 1000 km. Two detector technologies are considered: a megaton-class water Cherenkov detector deployed deep underground at a DUSEL site, or a 100 kT liquid argon time-projection chamber (TPC) deployed on the surface at any of the proposed sites. The physics sensitivities of the proposed experiments are summarized. We find that conventional horn-focused wide-band neutrino beam options from Fermilab or BNL aimed at a massive detector with a baseline of > 1000 km have the best sensitivity to CP violation and the neutrino mass hierarchy for values of the mixing angle θ_13 down to 2.2°.

  16. MALDI-TOF Baseline Drift Removal Using Stochastic Bernstein Approximation

    Directory of Open Access Journals (Sweden)

    Howard Daniel

    2006-01-01

    Full Text Available Stochastic Bernstein (SB) approximation can tackle the problem of baseline drift correction of instrumentation data. This is demonstrated for spectral data: matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF) data. Two SB schemes for removing the baseline drift are presented: iterative and direct. Following an explanation of the origin of the MALDI-TOF baseline drift, which sheds light on the inherent difficulty of its removal by chemical means, SB baseline drift removal is illustrated for both proteomics and genomics MALDI-TOF data sets. SB is an elegant signal processing approach that yields a numerically straightforward baseline shift removal method, as it includes a free parameter that can be optimized for different baseline drift removal applications. Therefore, research that determines putative biomarkers from the spectral data might benefit from a sensitivity analysis, made possible by varying the SB free parameter, to the underlying spectral measurement. This parameter can be tuned manually or with evolutionary computation.
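    As a hedged illustration of why a Bernstein-type operator suits baseline drift, the snippet below applies the classical (deterministic) Bernstein approximation, with the degree n playing the role of the free parameter: a low degree yields a curve smooth enough to track slow drift while ignoring narrow peaks. The paper's stochastic variant is not reproduced here, and the spectrum is synthetic.

```python
import numpy as np
from math import comb

def bernstein_smooth(y, n):
    """Evaluate the degree-n Bernstein approximation of samples y on [0, 1].

    Low n gives a very smooth curve that tracks slow baseline drift while
    ignoring narrow peaks; n is the free parameter discussed in the paper.
    This is the classical Bernstein operator, not the stochastic variant.
    """
    y = np.asarray(y, float)
    m = y.size
    x = np.linspace(0.0, 1.0, m)
    # Sample the signal at (approximately) the Bernstein nodes k/n.
    nodes = y[np.round(np.linspace(0, m - 1, n + 1)).astype(int)]
    out = np.zeros(m)
    for k in range(n + 1):
        out += nodes[k] * comb(n, k) * x**k * (1 - x)**(n - k)
    return out

# Synthetic spectrum: slow linear drift plus two narrow peaks.
x = np.linspace(0.0, 1.0, 400)
drift = 2.0 + 1.5 * x
peaks = (np.exp(-((x - 0.3) / 0.01) ** 2)
         + np.exp(-((x - 0.7) / 0.01) ** 2))
spectrum = drift + peaks
baseline_est = bernstein_smooth(spectrum, n=4)
corrected = spectrum - baseline_est
```

Because the degree-4 operator reproduces (near-)linear node values almost exactly, the estimate hugs the drift and the corrected spectrum retains the peaks.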

  17. A Fully Customized Baseline Removal Framework for Spectroscopic Applications.

    Science.gov (United States)

    Giguere, Stephen; Boucher, Thomas; Carey, C J; Mahadevan, Sridhar; Dyar, M Darby

    2017-07-01

    The task of proper baseline or continuum removal is common to nearly all types of spectroscopy. Its goal is to remove any portion of a signal that is irrelevant to features of interest while preserving any predictive information. Despite the importance of baseline removal, median or guessed default parameters are commonly employed, often using commercially available software supplied with instruments. Several published baseline removal algorithms have been shown to be useful for particular spectroscopic applications but their generalizability is ambiguous. The new Custom Baseline Removal (Custom BLR) method presented here generalizes the problem of baseline removal by combining operations from previously proposed methods to synthesize new correction algorithms. It creates novel methods for each technique, application, and training set, discovering new algorithms that maximize the predictive accuracy of the resulting spectroscopic models. In most cases, these learned methods either match or improve on the performance of the best alternative. Examples of these advantages are shown for three different scenarios: quantification of components in near-infrared spectra of corn and laser-induced breakdown spectroscopy data of rocks, and classification/matching of minerals using Raman spectroscopy. Software to implement this optimization is available from the authors. By removing subjectivity from this commonly encountered task, Custom BLR is a significant step toward completely automatic and general baseline removal in spectroscopic and other applications.

  18. Application-Defined Decentralized Access Control

    Science.gov (United States)

    Xu, Yuanzhong; Dunn, Alan M.; Hofmann, Owen S.; Lee, Michael Z.; Mehdi, Syed Akbar; Witchel, Emmett

    2014-01-01

    DCAC is a practical OS-level access control system that supports application-defined principals. It allows normal users to perform administrative operations within their privilege, enabling isolation and privilege separation for applications. It does not require centralized policy specification or management, giving applications freedom to manage their principals while the policies are still enforced by the OS. DCAC uses hierarchically-named attributes as a generic framework for user-defined policies such as groups defined by normal users. For both local and networked file systems, its execution time overhead is between 0%–9% on file system microbenchmarks, and under 1% on applications. This paper shows the design and implementation of DCAC, as well as several real-world use cases, including sandboxing applications, enforcing server applications’ security policies, supporting NFS, and authenticating user-defined sub-principals in SSH, all with minimal code changes. PMID:25426493

  19. Software Defined Multiband EVA Radio, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective of this research is to propose a reliable, lightweight, programmable, multi-band, multi-mode, miniaturized frequency-agile EVA software defined radio...

  20. Reconfigurable, Cognitive Software Defined Radio, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — IAI is actively developing Software Defined Radio platforms that can adaptively switch between different modes of operation by modifying both transmit waveforms and...

  1. Software Defined Multiband EVA Radio, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective of Phase 2 is to build a reliable, lightweight, programmable, multi-mode, miniaturized EVA Software Defined Radio (SDR) that supports data telemetry,...

  2. Reconfigurable, Cognitive Software Defined Radio, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Intelligent Automation Inc, (IAI) is currently developing a software defined radio (SDR) platform that can adaptively switch between different modes of operation for...

  3. A baseline correction algorithm for Raman spectroscopy by adaptive knots B-spline

    International Nuclear Information System (INIS)

    Wang, Xin; Fan, Xian-guang; Xu, Ying-jie; Wang, Xiu-fen; He, Hao; Zuo, Yong

    2015-01-01

    The Raman spectroscopy technique is a powerful and non-invasive technique for molecular fingerprint detection which has been widely used in many areas, such as food safety, drug safety, and environmental testing. But Raman signals can easily be corrupted by a fluorescent background; therefore, we present a baseline correction algorithm to suppress the fluorescent background in this paper. In this algorithm, the background of the Raman signal is suppressed by fitting a curve, called a baseline, using a cyclic approximation method. Instead of the traditional polynomial fitting, we use the B-spline as the fitting algorithm due to its advantages of low order and smoothness, which can avoid under-fitting and over-fitting effectively. In addition, we also present an automatic adaptive knot generation method to replace traditional uniform knots. This algorithm can achieve the desired performance for most Raman spectra with varying baselines without any user input or preprocessing step. In the simulation, three kinds of fluorescent background lines were introduced to test the effectiveness of the proposed method. We show that two real Raman spectra (parathion-methyl and colza oil) can be detected and their baselines corrected by the proposed method. (paper)
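    The cyclic approximation idea (fit a smooth curve, clip the signal down to the fit, refit) can be sketched compactly. A low-order polynomial stands in below for the paper's adaptive-knot B-spline, so this is the general iterative-clipping scheme rather than the authors' algorithm; the fluorescence background and Raman band are synthetic.

```python
import numpy as np

def iterative_baseline(y, order=5, n_iter=50):
    """Cyclic-approximation baseline fit: repeatedly fit a smooth curve
    and clip the working signal to it, so narrow peaks stop pulling the
    fit upward. A low-order polynomial stands in for the B-spline.
    """
    y = np.asarray(y, float)
    x = np.linspace(0.0, 1.0, y.size)
    work = y.copy()
    for _ in range(n_iter):
        fit = np.polyval(np.polyfit(x, work, order), x)
        work = np.minimum(work, fit)   # clip points above the current fit
    return fit

x = np.linspace(0.0, 1.0, 500)
fluor = 1.0 + 2.0 * x - 1.5 * x**2                   # broad fluorescence
raman = 0.8 * np.exp(-((x - 0.5) / 0.005) ** 2)      # narrow Raman band
baseline = iterative_baseline(fluor + raman)
corrected = fluor + raman - baseline
```

After the clipping loop converges, the narrow band survives subtraction while the broad background is removed.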

  4. Optimum Criteria for Developing Defined Structures

    Directory of Open Access Journals (Sweden)

    Ion IVAN

    2008-01-01

    Full Text Available Basic aspects concerning distributed applications are presented: definition, particularities, and importance. For distributed applications, linear, arborescent, and graph structures are defined, with different versions and aggregation methods. Distributed applications have associated structures which, through their characteristics, influence the costs of the stages in the development cycle and the exploitation costs transferred to each user. The complexity of the defined structures is analyzed. Minimum and maximum criteria are enumerated for optimizing distributed application structures.

  5. Precise baseline determination for the TanDEM-X mission

    Science.gov (United States)

    Koenig, Rolf; Moon, Yongjin; Neumayer, Hans; Wermuth, Martin; Montenbruck, Oliver; Jäggi, Adrian

    The TanDEM-X mission will strive to generate a global precise Digital Elevation Model (DEM) by way of bi-static SAR in a close formation of the TerraSAR-X satellite, already launched on June 15, 2007, and the TanDEM-X satellite to be launched in May 2010. Both satellites carry the Tracking, Occultation and Ranging (TOR) payload supplied by the GFZ German Research Centre for Geosciences. The TOR consists of a high-precision dual-frequency GPS receiver, called Integrated GPS Occultation Receiver (IGOR), and a Laser retro-reflector (LRR) for precise orbit determination (POD) and atmospheric sounding. The IGOR is of vital importance for the TanDEM-X mission objectives, as millimeter-level determination of the baseline, or distance, between the two spacecraft is needed to derive meter-level accurate DEMs. Within the TanDEM-X ground segment, GFZ is responsible for the operational provision of precise baselines. For this GFZ uses two software chains, first its Earth Parameter and Orbit System (EPOS) software and second the BERNESE software, for backup purposes and quality control. In a concerted effort, the German Aerospace Center (DLR) also generates precise baselines independently with a dedicated Kalman filter approach realized in its FRNS software. Using GRACE as an example, the generation of baselines with millimeter accuracy from on-board GPS data can be validated directly by comparing them to the intersatellite K-band range measurements. The K-band ranges are accurate down to the micrometer level and therefore may be considered as truth. Both TanDEM-X baseline providers are able to generate GRACE baselines with sub-millimeter accuracy. By merging the independent baselines by GFZ and DLR, the accuracy can even be increased. The K-band validation however covers solely the along-track component, as the K-band data measure just the distance between the two GRACE satellites.
In addition, they exhibit an unknown bias which must be modelled in the comparison, so the

  6. Methylphenidate during early consolidation affects long-term associative memory retrieval depending on baseline catecholamines.

    Science.gov (United States)

    Wagner, Isabella C; van Buuren, Mariët; Bovy, Leonore; Morris, Richard G; Fernández, Guillén

    2017-02-01

    Synaptic memory consolidation is thought to rely on catecholaminergic signaling. Eventually, it is followed by systems consolidation, which embeds memories in a neocortical network. Although this sequence was demonstrated in rodents, it is unclear how catecholamines affect memory consolidation in humans. Here, we tested the effects of catecholaminergic modulation on synaptic and subsequent systems consolidation. We expected enhanced memory performance and increased neocortical engagement during delayed retrieval. Additionally, we tested if this effect was modulated by individual differences in a cognitive proxy measure of baseline catecholamine synthesis capacity. Fifty-three healthy males underwent a between-subjects, double-blind, placebo-controlled procedure across 2 days. On day 1, subjects studied and retrieved object-location associations and received 20 mg of methylphenidate or placebo. Drug intake was timed so that methylphenidate was expected to affect early consolidation but not encoding or retrieval. Memory was tested again while subjects were scanned three days later. Methylphenidate did not facilitate memory performance, and there was no significant group difference in activation during delayed retrieval. However, memory representations differed between groups depending on baseline catecholamines. The placebo group showed increased activation in occipito-temporal regions but decreased connectivity with the hippocampus, associated with lower baseline catecholamine synthesis capacity. The methylphenidate group showed stronger activation in the postcentral gyrus, associated with higher baseline catecholamine synthesis capacity. Altogether, methylphenidate during early consolidation did not foster long-term memory performance, but it affected retrieval-related neural processes depending on individual levels of baseline catecholamines.

  7. Parent's Guide to Understanding Tests.

    Science.gov (United States)

    CTB / McGraw-Hill, Monterey, CA.

    This brief introduction to testing is geared to parents. Types of tests are defined, such as standardized tests, achievement tests, norm-referenced tests, criterion-referenced tests, and aptitude tests. Various types of scores (grade equivalent, percentile rank, and stanine) are also defined, and the uses made of tests by administrators, teachers,…
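    The score types mentioned in the abstract are related by fixed conventions; for instance, a stanine is derived from a percentile rank using standard cut points. A minimal sketch (not from the guide itself; the function name is illustrative):

    ```python
    # Stanine cut points as cumulative percentile ranks; stanines follow the
    # standard 4/7/12/17/20/17/12/7/4 percent distribution across scores 1-9.
    STANINE_CUTS = [4, 11, 23, 40, 60, 77, 89, 96]

    def stanine(percentile_rank):
        """Map a percentile rank (0-100) to a stanine score (1-9)."""
        s = 1
        for cut in STANINE_CUTS:
            if percentile_rank >= cut:
                s += 1
        return s

    print(stanine(50))  # a median score falls in stanine 5
    print(stanine(97))  # the top few percent fall in stanine 9
    ```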

  8. Deficient motion-defined and texture-defined figure-ground segregation in amblyopic children.

    Science.gov (United States)

    Wang, Jane; Ho, Cindy S; Giaschi, Deborah E

    2007-01-01

    Motion-defined form deficits in the fellow eye and the amblyopic eye of children with amblyopia implicate possible direction-selective motion processing or static figure-ground segregation deficits. Deficient motion-defined form perception in the fellow eye of amblyopic children may not be fully accounted for by a general motion processing deficit. This study investigates the contribution of figure-ground segregation deficits to the motion-defined form perception deficits in amblyopia. Performances of 6 amblyopic children (5 anisometropic, 1 anisostrabismic) and 32 control children with normal vision were assessed on motion-defined form, texture-defined form, and global motion tasks. Performance on motion-defined and texture-defined form tasks was significantly worse in amblyopic children than in control children. Performance on global motion tasks was not significantly different between the 2 groups. Faulty figure-ground segregation mechanisms are likely responsible for the observed motion-defined form perception deficits in amblyopia.

  9. Defining and determining the significance of impacts: concepts and methods

    International Nuclear Information System (INIS)

    Christensen, S.W.; Van Winkle, W.; Mattice, J.S.

    1975-01-01

    The term impact is conceptually and mathematically defined to be the difference in the state or value of an ecosystem with versus without the source of impact. Some resulting problems associated with the measurement of impacts based on comparisons of baseline and operational data are discussed briefly. The concept of a significant adverse impact on a biological system is operationally defined in terms of an adverse impact which, according to a proposed decision-tree, justifies rejection of a project or a change in its site, design, or mode of operation. A gradient of increasing difficulty in the prediction of impacts exists as the scope of the assessment is expanded to consider long-term, far-field impacts with respect to higher levels of biological organization (e.g., communities or ecosystems). The analytical methods available for predicting short-term, near-field impacts are discussed. Finally, the role of simulation modeling as an aid to professional judgment in predicting the long-term, far-field consequences of impacts is considered, and illustrated with an example. (U.S.)
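    The with-versus-without definition above can be sketched numerically. A minimal hypothetical example (all values invented for illustration):

    ```python
    # Impact defined as the difference in an ecosystem metric (here, a yearly
    # population index) with versus without the source of impact, over the
    # same period. Values are hypothetical.
    with_source = [100, 95, 90, 88]     # observed with the source operating
    without_source = [100, 99, 98, 97]  # projected baseline without the source

    impact = [w - wo for w, wo in zip(with_source, without_source)]
    print(impact)  # negative values indicate an adverse impact
    ```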

  10. Predicting clinical concussion measures at baseline based on motivation and academic profile.

    Science.gov (United States)

    Trinidad, Katrina J; Schmidt, Julianne D; Register-Mihalik, Johna K; Groff, Diane; Goto, Shiho; Guskiewicz, Kevin M

    2013-11-01

    The purpose of this study was to predict baseline neurocognitive and postural control performance using a measure of motivation, high school grade point average (hsGPA), and Scholastic Aptitude Test (SAT) score. Cross-sectional. Clinical research center. Eighty-eight National Collegiate Athletic Association Division I incoming student-athletes (freshmen and transfers). Participants completed baseline clinical concussion measures, including a neurocognitive test battery (CNS Vital Signs), a balance assessment [Sensory Organization Test (SOT)], and motivation testing (Rey Dot Counting). Participants granted permission to access hsGPA and SAT total score. Standard scores for each CNS Vital Signs domain and SOT composite score. Baseline motivation, hsGPA, and SAT explained a small percentage of the variance in complex attention (11%), processing speed (12%), and composite SOT score (20%). Motivation, hsGPA, and total SAT score do not explain a significant amount of the variance in neurocognitive and postural control measures but may still be valuable to consider when interpreting these measures.
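    The explained-variance figures above come from ordinary multiple regression. A hedged sketch with synthetic data (none of the study's actual values), showing how a small R^2 arises when predictors relate only weakly to the outcome:

    ```python
    import numpy as np

    # Illustrative only: three standardized predictors (think motivation,
    # hsGPA, SAT) weakly related to a baseline outcome measure.
    rng = np.random.default_rng(0)
    n = 88                                  # sample size matching the study
    X = rng.normal(size=(n, 3))
    y = 0.3 * X[:, 0] + rng.normal(size=n)  # outcome mostly noise

    # Ordinary least squares with an intercept column
    A = np.column_stack([np.ones(n), X])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ beta
    r2 = 1 - resid.var() / y.var()
    print(f"R^2 = {r2:.2f}")  # small, like the 11-20% figures in the abstract
    ```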

  11. Relationship between plasma analytes and SPARE-AD defined brain atrophy patterns in ADNI.

    Directory of Open Access Journals (Sweden)

    Jon B Toledo

    Full Text Available Different inflammatory and metabolic pathways have been associated with Alzheimer's disease (AD). However, multi-analyte panels that measure a large number of molecules in well-characterized cohorts have only recently become available. These panels could help identify molecules that point to the affected pathways. We studied the relationship between a panel of plasma biomarkers (Human DiscoveryMAP) and the presence of AD-like brain atrophy patterns, defined by a previously published index (SPARE-AD), at baseline in subjects of the ADNI cohort. 818 subjects had MRI-derived SPARE-AD scores; of these subjects, 69% had plasma biomarkers and 51% had CSF tau and Aβ measurements. Significant analyte-SPARE-AD associations and inter-analyte correlations were studied in adjusted models. Plasma cortisol and chromogranin A showed a significant association that did not remain significant in the CSF signature adjusted model. Plasma macrophage inhibitory protein-1α and insulin-like growth factor binding protein 2 showed a significant association with brain atrophy in the adjusted model. Cortisol levels showed an inverse association with tests measuring processing speed. Our results indicate that stress and insulin responses and cytokines associated with recruitment of inflammatory cells in MCI-AD are associated with its characteristic AD-like brain atrophy pattern and correlate with clinical changes or CSF biomarkers.

  12. Technical support for GEIS: radioactive waste isolation in geologic formations. Volume 7. Baseline rock properties-basalt

    International Nuclear Information System (INIS)

    1978-04-01

    This volume, Y/OWI/TM-36/7 Baseline Rock Properties--Basalt, is one of a 23-volume series, "Technical Support for GEIS: Radioactive Waste Isolation in Geologic Formations, Y/OWI/TM-36," which supplements a "Contribution to Draft Generic Environmental Impact Statement on Commercial Waste Management: Radioactive Waste Isolation in Geologic Formations, Y/OWI/TM-44." The series provides a more complete technical basis for the preconceptual designs, resource requirements, and environmental source terms associated with isolating commercial LWR wastes in underground repositories in salt, granite, shale and basalt. Wastes are considered from three fuel cycles: uranium and plutonium recycling, no recycling of spent fuel and uranium-only recycling. This report contains an evaluation of the results of a literature survey to define the rock mass properties of a generic basalt, which could be considered as a geological medium for storing radioactive waste. The general formation and structure of basaltic rocks is described. This is followed by specific descriptions and rock property data for the Dresser Basalt, the Amchitka Island Basalt, the Nevada Test Site Basalt and the Columbia River Group Basalt. Engineering judgment has been used to derive the rock mass properties of a typical basalt from the relevant intact rock property data and the geological information pertaining to structural defects, such as joints and faults

  13. Radiographic Test

    Energy Technology Data Exchange (ETDEWEB)

    Lee, H.J; Yang, S.H. [Korea Electric Power Research Institute, Taejon (Korea)

    2002-07-01

    This report contains the theory, procedures, techniques, and interpretation of radiographic examination, and is written for those preparing for the radiographic test Level II examination. To determine this baseline of technical competence in the examination, the individual must demonstrate a knowledge of radiography physics, radiation safety, technique development, radiation detection and measurement, facility design, and the characteristics of radiation-producing devices and their principles of operation. (author) 98 figs., 23 tabs.

  14. Atmospheric pressure loading parameters from very long baseline interferometry observations

    Science.gov (United States)

    Macmillan, D. S.; Gipson, John M.

    1994-01-01

    Atmospheric mass loading produces a primarily vertical displacement of the Earth's crust. This displacement is correlated with surface pressure and is large enough to be detected by very long baseline interferometry (VLBI) measurements. Using the measured surface pressure at VLBI stations, we have estimated the atmospheric loading term for each station location directly from VLBI data acquired from 1979 to 1992. Our estimates of the vertical sensitivity to change in pressure range from 0 to -0.6 mm/mbar depending on the station. These estimates agree with inverted barometer model calculations (Manabe et al., 1991; vanDam and Herring, 1994) of the vertical displacement sensitivity computed by convolving actual pressure distributions with loading Green's functions. The pressure sensitivity tends to be smaller for stations near the coast, which is consistent with the inverted barometer hypothesis. Applying this estimated pressure loading correction in standard VLBI geodetic analysis improves the repeatability of estimated lengths of 25 out of 37 baselines that were measured at least 50 times. In a root-sum-square (rss) sense, the improvement generally increases with baseline length at a rate of about 0.3 to 0.6 ppb depending on whether the baseline stations are close to the coast. For the 5998-km baseline from Westford, Massachusetts, to Wettzell, Germany, the rss improvement is about 3.6 mm out of 11.0 mm. The average rss reduction of the vertical scatter for inland stations ranges from 2.7 to 5.4 mm.
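    The ppb-scale improvements quoted above are straightforward to convert to absolute units: an rss improvement of r ppb over a baseline of length L corresponds to r × 10⁻⁹ × L. A small sketch (the function name is ours):

    ```python
    # Convert a parts-per-billion repeatability improvement over a baseline
    # of given length into millimeters: rate * 1e-9 (ppb) * length, with
    # km -> mm being a factor of 1e6.
    def rss_improvement_mm(rate_ppb, baseline_km):
        """Absolute rss improvement in mm for a ppb-level rate."""
        return rate_ppb * 1e-9 * baseline_km * 1e6

    # For the 5998-km Westford-Wettzell baseline, a 0.6 ppb rate gives
    # roughly the few-mm improvement quoted in the abstract.
    print(f"{rss_improvement_mm(0.6, 5998):.1f} mm")  # prints "3.6 mm"
    ```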

  15. Is CP violation observable in long baseline neutrino oscillation experiments?

    International Nuclear Information System (INIS)

    Tanimoto, M.

    1997-01-01

    We have studied CP violation originating from the phase of the neutrino-mixing matrix in long baseline neutrino oscillation experiments. The direct measurement of CP violation is the difference of the transition probabilities between CP-conjugate channels. In those experiments, the CP-violating effect is not suppressed if the highest neutrino mass scale is taken to be 1–5 eV, which is appropriate for the cosmological hot dark matter. Assuming a hierarchy for the neutrino masses, the upper bounds of CP violation have been calculated for three cases, in which the mixings are constrained by the recent short baseline experiments. The calculated upper bounds are larger than 10^-2, which will be observable in the long baseline accelerator experiments. The matter effect, which is not CP invariant, has also been estimated for those experiments. copyright 1997 The American Physical Society
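    For context (not from the paper itself): the L/E scale at which long baseline experiments are sensitive is set by the standard two-flavor vacuum oscillation formula; CP violation proper requires the full three-flavor mixing matrix discussed in the abstract. A minimal sketch:

    ```python
    import math

    # Standard two-flavor vacuum oscillation probability:
    #   P(nu_a -> nu_b) = sin^2(2*theta) * sin^2(1.27 * dm2[eV^2] * L[km] / E[GeV])
    def p_osc(sin2_2theta, dm2_ev2, L_km, E_GeV):
        return sin2_2theta * math.sin(1.27 * dm2_ev2 * L_km / E_GeV) ** 2

    # An eV-scale mass-squared difference, as considered in the abstract,
    # oscillates appreciably already at km-scale baselines for GeV neutrinos.
    print(p_osc(0.1, 1.0, 1.0, 1.0))
    ```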

  16. Carbon tetrachloride ERA soil-gas baseline monitoring

    International Nuclear Information System (INIS)

    Fancher, J.D.

    1994-01-01

    From December 1991 through December 1993, Westinghouse Hanford Company performed routine baseline monitoring of selected wells and soil-gas points twice weekly in the 200 West Area of the Hanford Site. This work supported the Carbon Tetrachloride Expedited Response Action (ERA) and provided a solid baseline of volatile organic compound (VOC) concentrations in wells and in the subsurface at the ERA site. As site remediation continues, comparisons to this baseline can be one means of measuring the success of carbon tetrachloride vapor extraction. This report contains observations of the patterns and trends associated with data obtained during soil-gas monitoring at the 200 West Area. Monitoring performed since late 1991 includes monitoring soil-gas probes and wellheads for VOCs. This report reflects monitoring data collected from December 1991 through December 1993

  17. Environmental baselines: preparing for shale gas in the UK

    Science.gov (United States)

    Bloomfield, John; Manamsa, Katya; Bell, Rachel; Darling, George; Dochartaigh, Brighid O.; Stuart, Marianne; Ward, Rob

    2014-05-01

    Groundwater is a vital source of freshwater in the UK. It provides almost 30% of public water supply on average, but locally, for example in south-east England, it constitutes nearly 90% of public supply. In addition to public supply, groundwater has a number of other uses including agriculture, industry, and food and drink production. It is also vital for maintaining river flows, especially during dry periods, and so is essential for maintaining ecosystem health. Recently, there have been concerns expressed about the potential impacts of shale gas development on groundwater. The UK has abundant shales and clays which are currently the focus of considerable interest, and there is active research into their characterisation, resource evaluation and exploitation risks. The British Geological Survey (BGS) is undertaking research to provide information to address some of the environmental concerns related to the potential impacts of shale gas development on groundwater resources and quality. The aim of much of this initial work is to establish environmental baselines, such as a baseline survey of methane occurrence in groundwater (National methane baseline study) and the spatial relationships between potential sources and groundwater receptors (iHydrogeology project), prior to any shale gas exploration and development. The poster describes these two baseline studies and presents preliminary findings. BGS are currently undertaking a national survey of baseline methane concentrations in groundwater across the UK. This work will enable any potential future changes in methane in groundwater associated with shale gas development to be assessed. Measurements of methane in potable water from the Cretaceous, Jurassic and Triassic carbonate and sandstone aquifers are variable and reveal methane concentrations of up to 500 micrograms per litre, although the mean value is relatively low. The geological modelling process will be presented and discussed

  18. Evidence of the shifting baseline syndrome in ethnobotanical research.

    Science.gov (United States)

    Hanazaki, Natalia; Herbst, Dannieli Firme; Marques, Mel Simionato; Vandebroek, Ina

    2013-11-14

    The shifting baseline syndrome is a concept from ecology that can be analyzed in the context of ethnobotanical research. Evidence of shifting baseline syndrome can be found in studies dealing with intracultural variation of knowledge, when knowledge from different generations is compared and combined with information about changes in the environment and/or natural resources. We reviewed 84 studies published between 1993 and 2012 that made comparisons of ethnobotanical knowledge according to different age classes. After analyzing these studies for evidence of the shifting baseline syndrome (lower knowledge levels in younger generations and mention of declining abundance of local natural resources), we searched within these studies for the use of the expressions "cultural erosion", "loss of knowledge", or "acculturation". The studies focused on different groups of plants (e.g. medicinal plants, foods, plants used for general purposes, or the uses of specific important species). More than half of all 84 studies (57%) mentioned a concern towards cultural erosion or knowledge loss; 54% of the studies showed evidence of the shifting baseline syndrome; and 37% of the studies did not provide any evidence of shifting baselines (intergenerational knowledge differences but no information available about the abundance of natural resources). The general perception of knowledge loss among young people when comparing ethnobotanical repertoires among different age groups should be analyzed with caution. Changes in the landscape or in the abundance of plant resources may be associated with changes in ethnobotanical repertoires held by people of different age groups. Also, the relationship between the availability of resources and current plant use practices rely on a complexity of factors. Fluctuations in these variables can cause changes in the reference (baseline) of different generations and consequently be responsible for differences in intergenerational knowledge. 

  19. Constraining proposed combinations of ice history and Earth rheology using VLBI determined baseline length rates in North America

    Science.gov (United States)

    Mitrovica, J. X.; Davis, J. L.; Shapiro, I. I.

    1993-01-01

    We predict the present-day rates of change of the lengths of 19 North American baselines due to the glacial isostatic adjustment process. Contrary to previously published research, we find that the three-dimensional motion of each of the sites defining a baseline, rather than only the radial motions of these sites, needs to be considered to obtain an accurate estimate of the rate of change of the baseline length. Predictions are generated using a suite of Earth models and late Pleistocene ice histories; these include specific combinations of the two that have been proposed in the literature as satisfying a variety of rebound-related geophysical observations from the North American region. A number of these published models are shown to predict rates which differ significantly from the VLBI observations.

  20. Effect of Baseline Nutritional Status on Long-term Multivitamin Use and Cardiovascular Disease Risk

    Science.gov (United States)

    Rautiainen, Susanne; Gaziano, J. Michael; Christen, William G.; Bubes, Vadim; Kotler, Gregory; Glynn, Robert J.; Manson, JoAnn E.; Buring, Julie E.

    2017-01-01

    Importance Long-term multivitamin use had no effect on risk of cardiovascular disease (CVD) in the Physicians’ Health Study II. Baseline nutritional status may have modified the lack of effect. Objective To investigate effect modification by various baseline dietary factors on CVD risk in the Physicians’ Health Study II. Design, Setting, and Participants The Physicians’ Health Study II was a randomized, double-blind, placebo-controlled trial testing multivitamin use (multivitamin [Centrum Silver] or placebo daily) among US male physicians. The Physicians’ Health Study II included 14 641 male physicians 50 years or older, 13 316 of whom (91.0%) completed a baseline 116-item semiquantitative food frequency questionnaire and were included in the analyses. This study examined effect modification by baseline intake of key foods, individual nutrients, dietary patterns (Alternate Healthy Eating Index and Alternate Mediterranean Diet Score), and dietary supplement use. The study began in 1997, with continued treatment and follow-up through June 1, 2011. Interventions Multivitamin or placebo daily. Main Outcomes and Measures Major cardiovascular events, including nonfatal myocardial infarction, nonfatal stroke, and CVD mortality. Secondary outcomes included myocardial infarction, total stroke, CVD mortality, and total mortality individually. Results In total, 13 316 male physicians (mean [SD] age at randomization, 64.0 [9.0] years in those receiving the active multivitamin and 64.0 [9.1] years in those receiving the placebo) were observed for a mean (SD) follow-up of 11.4 (2.3) years. There was no consistent evidence of effect modification by various foods, nutrients, dietary patterns, or baseline supplement use on the effect of multivitamin use on CVD end points. Statistically significant interaction effects were observed between multivitamin use and vitamin B6 intake on myocardial infarction, between multivitamin use and vitamin D intake on CVD mortality