WorldWideScience

Sample records for models including baseline

  1. A comparison of different ways of including baseline counts in negative binomial models for data from falls prevention trials.

    Science.gov (United States)

    Zheng, Han; Kimber, Alan; Goodwin, Victoria A; Pickering, Ruth M

    2018-01-01

    A common design for a falls prevention trial is to assess falling at baseline, randomize participants into an intervention or control group, and ask them to record the number of falls they experience during a follow-up period of time. This paper addresses how best to include the baseline count in the analysis of the follow-up count of falls in negative binomial (NB) regression. We examine the performance of various approaches in simulated datasets where both counts are generated from a mixed Poisson distribution with shared random subject effect. Including the baseline count after log-transformation as a regressor in NB regression (NB-logged) or as an offset (NB-offset) resulted in greater power than including the untransformed baseline count (NB-unlogged). Cook and Wei's conditional negative binomial (CNB) model replicates the underlying process generating the data. In our motivating dataset, a statistically significant intervention effect resulted from the NB-logged, NB-offset, and CNB models, but not from NB-unlogged, and large, outlying baseline counts were overly influential in NB-unlogged but not in NB-logged. We conclude that there is little to lose by including the log-transformed baseline count in standard NB regression compared to CNB for moderate to larger sized datasets. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
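
    The data-generating process assumed in this abstract (baseline and follow-up counts drawn from a mixed Poisson distribution with a shared random subject effect) can be sketched as follows. All rates, effect sizes, and the gamma frailty parameters below are illustrative assumptions, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 5000
# Shared gamma frailty per subject: mixing a Poisson over a gamma frailty
# makes the marginal counts negative binomial (overdispersed)
frailty = rng.gamma(shape=2.0, scale=0.5, size=n)   # mean 1, variance 0.5
treat = rng.integers(0, 2, size=n)                  # randomised group indicator
base_rate = 4.0                                     # hypothetical baseline fall rate
effect = 0.7                                        # hypothetical intervention rate ratio

baseline = rng.poisson(base_rate * frailty)
followup = rng.poisson(base_rate * frailty * np.where(treat == 1, effect, 1.0))

# Overdispersion check: marginal variance exceeds the mean (the NB signature),
# and the shared frailty induces correlation between the two counts
print(baseline.mean(), baseline.var())
print(np.corrcoef(baseline, followup)[0, 1])
```

    In an analysis of data like this, the log-transformed `baseline` count would enter the NB regression of `followup` as a regressor or offset, as the abstract compares.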

  2. Improving the accuracy of energy baseline models for commercial buildings with occupancy data

    International Nuclear Information System (INIS)

    Liang, Xin; Hong, Tianzhen; Shen, Geoffrey Qiping

    2016-01-01

    Highlights: • We evaluated several baseline models for predicting energy use in buildings. • Including occupancy data improved the accuracy of baseline model predictions. • Occupancy is highly correlated with energy use in buildings. • This simple approach can be used in decision making for energy retrofit projects. - Abstract: More than 80% of energy is consumed during the operation phase of a building's life cycle, so energy-efficiency retrofits of existing buildings are considered a promising way to reduce energy use. Retrofit investment strategies depend on the ability to quantify energy savings by "measurement and verification" (M&V), which compares actual energy consumption to how much energy would have been used without the retrofit (the "baseline" energy use). Although numerous models exist for predicting baseline energy use, a critical limitation is that occupancy has not been included as a variable, even though occupancy is a key driver of energy consumption and was emphasized by previous studies. This study develops a new baseline model that builds on the Lawrence Berkeley National Laboratory (LBNL) model but adds building occupancy data. The study also proposes metrics to quantify prediction accuracy and the impacts of individual variables. However, the results show that including occupancy data does not significantly improve the accuracy of the baseline model, especially for the HVAC load; the reasons are discussed further. In addition, a sensitivity analysis is conducted to show the influence of the parameters in the baseline models. The results from this study can help us understand the influence of occupancy on energy use, improve energy baseline prediction by including the occupancy factor, reduce M&V risk, and inform investment strategies for energy-efficiency retrofits.
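
    As a toy illustration of adding occupancy as a regressor to a linear energy baseline (the kind of extension this study makes to the LBNL model), the sketch below fits ordinary least squares with and without an occupancy term on synthetic data. All coefficients are invented, and note that, unlike this study's finding, the synthetic occupancy term is constructed so that it does matter:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1440                                     # hypothetical hourly records
occupancy = rng.uniform(0.0, 1.0, n)         # invented occupancy fraction
outdoor_t = 15.0 + 10.0 * rng.standard_normal(n)
# Synthetic energy use: temperature- and occupancy-driven plus noise
energy = 50.0 + 2.0 * outdoor_t + 30.0 * occupancy + rng.normal(0.0, 5.0, n)

def fit_rmse(X, y):
    """Least-squares fit, returning in-sample RMSE."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return float(np.sqrt(np.mean((y - X @ beta) ** 2)))

ones = np.ones(n)
rmse_without = fit_rmse(np.column_stack([ones, outdoor_t]), energy)
rmse_with = fit_rmse(np.column_stack([ones, outdoor_t, occupancy]), energy)
print(rmse_without, rmse_with)   # the occupancy term roughly halves the error here
```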

  3. Business-as-Unusual: Existing policies in energy model baselines

    International Nuclear Information System (INIS)

    Strachan, Neil

    2011-01-01

    Baselines are generally accepted as a key input assumption in long-term energy modelling, but energy models have traditionally been poor at identifying baseline assumptions. Notably, transparency on the current policy content of model baselines is now especially critical, as long-term climate mitigation policies have been underway for a number of years. This paper argues that the range of existing energy and emissions policies is an integral part of any long-term baseline, and hence already represents a 'with-policy' baseline, termed here Business-as-Unusual (BAuU). Crucially, existing energy policies are not a sunk effort: as the impacts of existing policy initiatives are targeted at future years, they may be revised through iterative policy making, and their quantitative effectiveness requires ex-post verification. To assess the long-term role of existing policies in energy modelling, currently identified UK policies are explicitly stripped out of the UK MARKAL Elastic Demand (MED) optimisation energy system model, to generate a BAuU (with-policy) and a REF (without-policy) baseline. In terms of long-term mitigation costs, policy-baseline assumptions are comparable to another key exogenous modelling assumption: global fossil fuel prices. Best practice in energy modelling would therefore be to have both a no-policy reference baseline and a current-policy reference baseline (BAuU). At a minimum, energy modelling studies should include a transparent assessment of the current policy contained within the baseline. Clearly identifying and comparing policy-baseline assumptions is required for cost-effective and objective policy making; otherwise, energy models will underestimate the true cost of long-term emissions reductions.

  4. Integrated Baseline System (IBS) Version 2.0: Models guide

    Energy Technology Data Exchange (ETDEWEB)

    1994-03-01

    The Integrated Baseline System (IBS) is an emergency management planning and analysis tool being developed under the direction of the US Army Nuclear and Chemical Agency. This Models Guide summarizes the IBS use of several computer models for predicting the results of emergency situations. These include models for predicting dispersion/doses of airborne contaminants, traffic evacuation, explosion effects, heat radiation from a fire, and siren sound transmission. The guide references additional technical documentation on the models when such documentation is available from other sources. The audience for this manual is chiefly emergency management planners and analysts, but also data managers and system managers.

  5. Analysis of baseline, average, and longitudinally measured blood pressure data using linear mixed models.

    Science.gov (United States)

    Hossain, Ahmed; Beyene, Joseph

    2014-01-01

    This article compares baseline, average, and longitudinal data analysis methods for identifying genetic variants in a genome-wide association study using the Genetic Analysis Workshop 18 data. We apply methods that include (a) linear mixed models with baseline measures, (b) random-intercept linear mixed models with the mean measure as outcome, and (c) random-intercept linear mixed models with longitudinal measurements. In the linear mixed models, covariates are included as fixed effects, whereas relatedness among individuals is incorporated as the variance-covariance structure of the random effect for the individuals. The overall strategy of applying linear mixed models to decorrelate the data is based on Aulchenko et al.'s GRAMMAR. By analyzing systolic and diastolic blood pressure, which are used separately as outcomes, we compare the 3 methods in identifying a known genetic variant on chromosome 3 that is associated with blood pressure, using simulated phenotype data. We also analyze the real phenotype data to illustrate the methods. We conclude that the linear mixed model with longitudinal measurements of diastolic blood pressure is the most accurate at identifying the known single-nucleotide polymorphism among the methods, but linear mixed models with baseline measures perform best with systolic blood pressure as the outcome.
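
    A minimal sketch of the three outcome constructions compared here, on simulated longitudinal blood pressure with a random subject intercept. All effect sizes are hypothetical, and the actual analyses fit full linear mixed models with a relatedness-based covariance, which is not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(1)
n_subj, n_visits = 500, 4
subject_effect = rng.normal(0.0, 5.0, n_subj)       # random intercept per subject
age = rng.uniform(40.0, 70.0, n_subj)
# Longitudinal systolic BP: fixed age effect + subject effect + visit-level noise
bp = (100.0 + 0.5 * age[:, None] + subject_effect[:, None]
      + rng.normal(0.0, 3.0, (n_subj, n_visits)))

y_baseline = bp[:, 0]        # (a) first (baseline) visit only
y_mean = bp.mean(axis=1)     # (b) per-subject average outcome
y_long = bp.ravel()          # (c) all visits stacked; needs the random intercept

# Averaging shrinks the visit-level noise, so the mean outcome is less variable
print(y_baseline.var(), y_mean.var())
```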

  6. A baseline-free procedure for transformation models under interval censorship.

    Science.gov (United States)

    Gu, Ming Gao; Sun, Liuquan; Zuo, Guoxin

    2005-12-01

    An important property of the Cox regression model is that the estimation of regression parameters using the partial likelihood procedure does not depend on its baseline survival function. We call such a procedure baseline-free. Using marginal likelihood, we show that a baseline-free procedure can be derived for a class of general transformation models under the interval censoring framework. The baseline-free procedure results in a simplified and stable computation algorithm for some complicated and important semiparametric models, such as frailty models and heteroscedastic hazard/rank regression models, where the estimation procedures available so far involve estimation of the infinite-dimensional baseline function. A detailed computational algorithm using Markov chain Monte Carlo stochastic approximation is presented. The proposed procedure is demonstrated through extensive simulation studies, showing the validity of asymptotic consistency and normality. We also illustrate the procedure with a real data set from a study of breast cancer. A heuristic argument showing that the score function is a mean-zero martingale is provided.

  7. Functional Testing Protocols for Commercial Building Efficiency Baseline Modeling Software

    Energy Technology Data Exchange (ETDEWEB)

    Jump, David; Price, Phillip N.; Granderson, Jessica; Sohn, Michael

    2013-09-06

    This document describes procedures for testing and validating the accuracy of proprietary baseline energy modeling software in predicting energy use over the period of interest, such as a month or a year. The procedures are designed according to the methodology used for public-domain baselining software in another LBNL report that was (like the present report) prepared for Pacific Gas and Electric Company: "Commercial Building Energy Baseline Modeling Software: Performance Metrics and Method Testing with Open Source Models and Implications for Proprietary Software Testing Protocols" (referred to here as the "Model Analysis Report"). The test procedure focuses on the quality of the software's predictions rather than on the specific algorithms used to predict energy use. In this way the software vendor is not required to divulge or share proprietary information about how their software works, while stakeholders can still assess its performance.

  8. The WITCH Model. Structure, Baseline, Solutions.

    Energy Technology Data Exchange (ETDEWEB)

    Bosetti, V.; Massetti, E.; Tavoni, M.

    2007-07-01

    WITCH - World Induced Technical Change Hybrid - is a regionally disaggregated hard-link hybrid global model with a neoclassical optimal growth structure (top-down) and energy input detail (bottom-up). The model endogenously accounts for technological change, both through learning curves affecting the prices of new vintages of capital and through R&D investments. The model features the main economic and environmental policies in each world region as the outcome of a dynamic game. WITCH belongs to the class of Integrated Assessment Models, as it possesses a climate module that feeds climate changes back into the economy. In this paper we provide a thorough discussion of the model structure and baseline projections. We report detailed information on the evolution of energy demand, technology, and CO2 emissions. Finally, we explicitly quantify the role of free riding in determining the emissions scenarios. (auth)

  9. Commercial Building Energy Baseline Modeling Software: Performance Metrics and Method Testing with Open Source Models and Implications for Proprietary Software Testing

    Energy Technology Data Exchange (ETDEWEB)

    Price, Phillip N.; Granderson, Jessica; Sohn, Michael; Addy, Nathan; Jump, David

    2013-09-01

    whose savings can be calculated with least error? 4. What is the state of public domain models, that is, how well do they perform, and what are the associated implications for whole-building measurement and verification (M&V)? Additional project objectives that were addressed as part of this study include: (1) clarification of the use cases and conditions for baseline modeling performance metrics, benchmarks and evaluation criteria, (2) providing guidance for determining customer suitability for baseline modeling, (3) describing the portfolio level effects of baseline model estimation errors, (4) informing PG&E’s development of EMIS technology product specifications, and (5) providing the analytical foundation for future studies about baseline modeling and saving effects of EMIS technologies. A final objective of this project was to demonstrate the application of the methodology, performance metrics, and test protocols with participating EMIS product vendors.

  10. Advanced three-dimensional thermal modeling of a baseline spent fuel repository

    International Nuclear Information System (INIS)

    Altenbach, T.J.; Lowry, W.E.

    1980-01-01

    A three-dimensional thermal analysis using finite difference techniques was performed to determine the near-field response of a baseline spent fuel repository in a deep geologic salt medium. A baseline design incorporates previous thermal modeling experience and OWI recommendations for areal thermal loading in specifying the waste form properties, package details, and emplacement configuration. The base case in this thermal analysis considers one 10-year old PWR spent fuel assembly emplaced to yield a 36 kW/acre (8.9 W/m²) loading. A unit cell model in an infinite array is used to simplify the problem and provide upper-bound temperatures. Boundary conditions are imposed which allow simulations to 1000 years. Variations studied include a comparison of ventilated and unventilated storage room conditions, emplacement packages with and without air gaps surrounding the canister, and room cool-down scenarios with ventilation following an unventilated state for retrieval purposes. It was found that at this low-power level, ventilating the emplacement room has an immediate cooling influence on the canister and effectively maintains the emplacement room floor near the temperature of the ventilating air.
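
    The finite-difference approach mentioned above can be illustrated with a one-dimensional explicit heat-conduction sketch. The geometry, diffusivity, and boundary temperatures below are toy assumptions, not the repository's actual salt properties or the 3-D unit-cell model:

```python
import numpy as np

# Toy 1-D explicit finite-difference conduction
alpha = 1.0e-6                  # assumed thermal diffusivity, m^2/s
dx, dt = 0.1, 1000.0            # grid spacing (m) and time step (s)
r = alpha * dt / dx**2          # explicit-scheme stability requires r <= 0.5
assert r <= 0.5

T = np.full(50, 30.0)           # initial rock temperature, deg C (invented)
T[0] = 80.0                     # canister surface held hot (fixed boundary)
for _ in range(5000):
    T[1:-1] = T[1:-1] + r * (T[2:] - 2.0 * T[1:-1] + T[:-2])
    T[-1] = 30.0                # far-field boundary, like the unit-cell edge
print(T[1], T[25])              # temperature falls off away from the canister
```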

  11. Recommendation to include fragrance mix 2 and hydroxyisohexyl 3-cyclohexene carboxaldehyde (Lyral) in the European baseline patch test series.

    Science.gov (United States)

    Bruze, Magnus; Andersen, Klaus Ejner; Goossens, An

    2008-03-01

    The currently used fragrance mix in the European baseline patch test series (baseline series) fails to detect a substantial number of clinically relevant fragrance allergies. To investigate whether it is justified to include hydroxyisohexyl 3-cyclohexene carboxaldehyde (Lyral) and fragrance mix 2, containing hydroxyisohexyl 3-cyclohexene carboxaldehyde, citral, farnesol, coumarin, citronellol, and alpha-hexyl cinnamal, in the European baseline patch test series, we surveyed the literature on reported frequencies of contact allergy and allergic contact dermatitis from fragrance mix 2 and hydroxyisohexyl 3-cyclohexene carboxaldehyde (Lyral), as well as reported results of experimental provocation tests. Fragrance mix 2 has been demonstrated to be a useful additional marker of fragrance allergy, with contact allergy rates of up to 5% when included in various national baseline patch test series. Of the fragrance substances present in fragrance mix 2, hydroxyisohexyl 3-cyclohexene carboxaldehyde is the most common sensitizer. Contact allergy rates between 1.5% and 3% have been reported for hydroxyisohexyl 3-cyclohexene carboxaldehyde in petrolatum (pet.) at 5% from various European centres when tested in consecutive dermatitis patients. From 2008, pet. preparations of fragrance mix 2 at 14% w/w (5.6 mg/cm²) and hydroxyisohexyl 3-cyclohexene carboxaldehyde at 5% w/w (2.0 mg/cm²) are recommended for inclusion in the baseline series. With the Finn Chamber technique, a dose of 20 mg of pet. preparation is recommended. Whenever there is a positive reaction to fragrance mix 2, additional patch testing with the 6 ingredients (5 if there are simultaneous positive reactions to hydroxyisohexyl 3-cyclohexene carboxaldehyde and fragrance mix 2) is recommended.

  12. Annual Technology Baseline (Including Supporting Data); NREL (National Renewable Energy Laboratory)

    Energy Technology Data Exchange (ETDEWEB)

    Blair, Nate; Cory, Karlynn; Hand, Maureen; Parkhill, Linda; Speer, Bethany; Stehly, Tyler; Feldman, David; Lantz, Eric; Augusting, Chad; Turchi, Craig; O' Connor, Patrick

    2015-07-08

    Consistent cost and performance data for various electricity generation technologies can be difficult to find and may change frequently for certain technologies. With the Annual Technology Baseline (ATB), the National Renewable Energy Laboratory provides an organized and centralized dataset that was reviewed by internal and external experts. It uses the best information from the Department of Energy laboratory's renewable energy analysts and Energy Information Administration information for conventional technologies. The ATB will be updated annually in order to provide an up-to-date repository of current and future cost and performance data. Going forward, we plan to revise and refine the values using the best available information. The ATB includes both a presentation with notes (PDF) and an associated Excel workbook. The ATB covers the following electricity generation technologies: land-based wind; offshore wind; utility-scale solar PV; concentrating solar power; geothermal power; hydropower plants (upgrades to existing facilities, powering non-powered dams, and new stream-reach development); conventional coal; coal with carbon capture and sequestration; integrated gasification combined-cycle coal; natural gas combustion turbines; natural gas combined cycle; conventional biopower; and nuclear.

  13. Project W-320 thermal hydraulic model benchmarking and baselining

    International Nuclear Information System (INIS)

    Sathyanarayana, K.

    1998-01-01

    Project W-320 will be retrieving waste from Tank 241-C-106 and transferring the waste to Tank 241-AY-102. Waste in both tanks must be maintained below applicable thermal limits during and following the waste transfer. Thermal hydraulic process control models will be used for process control of the thermal limits. This report documents the process control models and presents a benchmarking of the models with data from Tanks 241-C-106 and 241-AY-102. Revision 1 of this report will provide a baselining of the models in preparation for the initiation of sluicing.

  14. Improving Baseline Model Assumptions: Evaluating the Impacts of Typical Methodological Approaches in Watershed Models

    Science.gov (United States)

    Muenich, R. L.; Kalcic, M. M.; Teshager, A. D.; Long, C. M.; Wang, Y. C.; Scavia, D.

    2017-12-01

    Thanks to the availability of open-source software, online tutorials, and advanced software capabilities, watershed modeling has expanded its user base and applications significantly in the past thirty years. Even complicated models like the Soil and Water Assessment Tool (SWAT) are being used and documented in hundreds of peer-reviewed publications each year, and are likely applied even more widely in practice. These models can help improve our understanding of present, past, and future conditions, or analyze important "what-if" management scenarios. However, baseline data and methods are often adopted and applied without rigorous testing. In multiple collaborative projects, we have evaluated the influence of some of these common approaches on model results. Specifically, we examined the impacts of baseline data and assumptions involved in manure application, combined sewer overflows, and climate data incorporation across multiple watersheds in the Western Lake Erie Basin. In these efforts, we seek to understand the impact of using typical modeling data and assumptions, versus improved data and enhanced assumptions, on model outcomes and thus, ultimately, on study conclusions. We provide guidance for modelers as they adopt and apply data and models for their specific study region. While it is difficult to quantitatively assess the full uncertainty surrounding model input data and assumptions, recognizing the impacts of model input choices is important when considering actions at both the field and watershed scales.

  15. Effects of ignoring baseline on modeling transitions from intact cognition to dementia.

    Science.gov (United States)

    Yu, Lei; Tyas, Suzanne L; Snowdon, David A; Kryscio, Richard J

    2009-07-01

    This paper evaluates the effect of ignoring baseline when modeling transitions from intact cognition to dementia with mild cognitive impairment (MCI) and global impairment (GI) as intervening cognitive states. Transitions among states are modeled by a discrete-time Markov chain having three transient (intact cognition, MCI, and GI) and two competing absorbing states (death and dementia). Transition probabilities depend on two covariates, age and the presence/absence of an apolipoprotein E-epsilon4 allele, through a multinomial logistic model with shared random effects. Results are illustrated with an application to the Nun Study, a cohort of 678 participants 75+ years of age at baseline and followed longitudinally with up to ten cognitive assessments per nun.
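
    The absorbing Markov chain structure described here (three transient cognitive states and two competing absorbing states) can be sketched with an invented transition matrix. The probabilities below are illustrative only; in the actual model they depend on age and APOE-epsilon4 status through a multinomial logistic model:

```python
import numpy as np

# States: 0=intact, 1=MCI, 2=GI (transient); 3=dementia, 4=death (absorbing)
# Hypothetical one-step transition probabilities (rows sum to 1)
P = np.array([
    [0.80, 0.10, 0.05, 0.02, 0.03],
    [0.05, 0.70, 0.15, 0.06, 0.04],
    [0.00, 0.05, 0.70, 0.15, 0.10],
    [0.00, 0.00, 0.00, 1.00, 0.00],
    [0.00, 0.00, 0.00, 0.00, 1.00],
])
Q, R = P[:3, :3], P[:3, 3:]
# Fundamental matrix N = (I - Q)^-1; N @ R gives eventual absorption probabilities
N = np.linalg.inv(np.eye(3) - Q)
absorb = N @ R
print(absorb[0])   # P(dementia) and P(death) starting from intact cognition
```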

  16. Integrated Baseline System (IBS) Version 1.03: Models guide

    Energy Technology Data Exchange (ETDEWEB)

    1993-01-01

    The Integrated Baseline System (IBS), operated by the Federal Emergency Management Agency (FEMA), is a system of computerized tools for emergency planning and analysis. This document is the models guide for the IBS and explains how to use the emergency-related computer models. It provides information for the experienced system user and is the primary reference for the computer modeling software supplied with the system. It is designed for emergency managers and planners, and others familiar with the concepts of computer modeling. Although the IBS manual set covers basic and advanced operations, it is not a complete reference document set. Emergency situation modeling software in the IBS is supported by additional technical documents. Some of the other IBS software is commercial software for which more complete documentation is available. The IBS manuals reference such documentation where necessary.

  17. Validity of urinary monoamine assay sales under the "spot baseline urinary neurotransmitter testing marketing model".

    Science.gov (United States)

    Hinz, Marty; Stein, Alvin; Uncini, Thomas

    2011-01-01

    Spot baseline urinary monoamine assays have been used in medicine for over 50 years as a screening test for monoamine-secreting tumors, such as pheochromocytoma and carcinoid syndrome. In these disease states, when the result of a spot baseline monoamine assay is above the specific value set by the laboratory, it is an indication to obtain a 24-hour urine sample to make a definitive diagnosis. There are no defined applications where spot baseline urinary monoamine assays can be used to diagnose disease or other states directly. No peer-reviewed published original research exists which demonstrates that these assays are valid in the treatment of individual patients in the clinical setting. Since 2001, urinary monoamine assay sales have been promoted for numerous applications under the "spot baseline urinary neurotransmitter testing marketing model". There is no published peer-reviewed original research that defines the scientific foundation upon which the claims for these assays are made. On the contrary, several articles have been published that discredit various aspects of the model. To fill the void, this manuscript is a comprehensive review of the scientific foundation and claims put forth by laboratories selling urinary monoamine assays under the spot baseline urinary neurotransmitter testing marketing model.

  18. Program reference schedule baseline

    International Nuclear Information System (INIS)

    1986-07-01

    This Program Reference Schedule Baseline (PRSB) provides the baseline Program-level milestones and associated schedules for the Civilian Radioactive Waste Management Program. It integrates all Program-level schedule-related activities. This schedule baseline will be used by the Director, Office of Civilian Radioactive Waste Management (OCRWM), and his staff to monitor compliance with Program objectives. Chapter 1 includes brief discussions concerning the relationship of the PRSB to the Program Reference Cost Baseline (PRCB), the Mission Plan, the Project Decision Schedule, the Total System Life Cycle Cost report, the Program Management Information System report, the Program Milestone Review, annual budget preparation, and system element plans. Chapter 2 identifies all Level 0, or Program-level, milestones, while Chapter 3 presents and discusses the critical path schedules that correspond to those Level 0 milestones.

  19. Developing RESRAD-BASELINE for environmental baseline risk assessment

    International Nuclear Information System (INIS)

    Cheng, Jing-Jy.

    1995-01-01

    RESRAD-BASELINE is a computer code developed at Argonne National Laboratory for the US Department of Energy (DOE) to perform both radiological and chemical risk assessments. The code implements the baseline risk assessment guidance of the US Environmental Protection Agency (EPA 1989). It calculates (1) radiation doses and cancer risks from exposure to radioactive materials, and (2) hazard indexes and cancer risks from exposure to noncarcinogenic and carcinogenic chemicals, respectively. The user can enter measured or predicted environmental media concentrations through the graphic interface and can simulate different exposure scenarios by selecting the appropriate pathways and modifying the exposure parameters. The database used by RESRAD-BASELINE includes dose conversion factors and slope factors for radionuclides, and toxicity information and properties for chemicals; the user can modify the database for use in the calculation. Sensitivity analysis can be performed while running the code to examine the influence of the input parameters. Using RESRAD-BASELINE for risk analysis is easy, fast, and cost-saving. Furthermore, it ensures consistency in methodology for both radiological and chemical risk analyses.

  20. The use of Bayesian networks for nanoparticle risk forecasting: model formulation and baseline evaluation.

    Science.gov (United States)

    Money, Eric S; Reckhow, Kenneth H; Wiesner, Mark R

    2012-06-01

    We describe the use of Bayesian networks as a tool for nanomaterial risk forecasting and develop a baseline probabilistic model that incorporates nanoparticle specific characteristics and environmental parameters, along with elements of exposure potential, hazard, and risk related to nanomaterials. The baseline model, FINE (Forecasting the Impacts of Nanomaterials in the Environment), was developed using expert elicitation techniques. The Bayesian nature of FINE allows for updating as new data become available, a critical feature for forecasting risk in the context of nanomaterials. The specific case of silver nanoparticles (AgNPs) in aquatic environments is presented here (FINE(AgNP)). The results of this study show that Bayesian networks provide a robust method for formally incorporating expert judgments into a probabilistic measure of exposure and risk to nanoparticles, particularly when other knowledge bases may be lacking. The model is easily adapted and updated as additional experimental data and other information on nanoparticle behavior in the environment become available. The baseline model suggests that, within the bounds of uncertainty as currently quantified, nanosilver may pose the greatest potential risk as these particles accumulate in aquatic sediments. Copyright © 2012 Elsevier B.V. All rights reserved.
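
    The kind of Bayesian updating that FINE relies on can be illustrated with a hypothetical two-node network; the probabilities below are invented stand-ins for the expert-elicited values, not numbers from FINE:

```python
import numpy as np

# Hypothetical two-node network: Exposure -> Risk
p_exposure = np.array([0.7, 0.3])            # P(exposure = low, high), invented prior
p_risk_given_exp = np.array([[0.9, 0.1],     # P(risk = low, high | low exposure)
                             [0.4, 0.6]])    # P(risk = low, high | high exposure)

# Prior predictive distribution of risk
p_risk = p_exposure @ p_risk_given_exp
# Bayesian updating: observe high risk, infer the posterior over exposure
post_exposure = p_exposure * p_risk_given_exp[:, 1] / p_risk[1]
print(p_risk)          # [0.75 0.25]
print(post_exposure)   # [0.28 0.72]
```

    As new nanoparticle data arrive, the conditional tables would be revised and the same mechanics propagate the update through the network.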

  1. Framework for ensuring appropriate maintenance of baseline PSA and risk monitor models in a nuclear power plant

    International Nuclear Information System (INIS)

    Vrbanic, I.; Sorman, J.

    2005-01-01

    The necessity of observing both long-term and short-term risk changes often requires a nuclear power plant to have a baseline PSA model, to produce an estimate of long-term averaged risk, and a risk monitor, to produce a time-dependent risk curve and/or safety-function status at points in time or over a shorter time period of interest. By nature, a baseline PSA reflects plant systems and operation in terms of average conditions and provides time-invariant quantitative risk metrics. A risk monitor, on the other hand, requires condition-specific modeling to produce a quantitative and/or qualitative estimate of the plant's condition-specific risk metrics. While the risk monitor is used for computing condition-specific risk metrics over time, a baseline PSA model is needed for a variety of other risk-oriented applications, such as assessments of proposed design modifications or risk ranking of equipment. Given their importance and roles, it is essential that both models, i.e. the baseline PSA model and the risk monitor, are maintained in such a way that they represent, as accurately as practically achievable, the actual plant status (e.g. systems' design and the plant's procedures in effect) and its history (e.g. numbers of equipment failures and demands that influence relevant PSA parameters). The paper discusses the requirements for appropriate maintenance of the plant's baseline PSA model and risk monitor model and presents a framework for the plant's engineering and administrative procedures that would ensure these requirements are met. (author)

  2. Accounting for baseline differences and measurement error in the analysis of change over time.

    Science.gov (United States)

    Braun, Julia; Held, Leonhard; Ledergerber, Bruno

    2014-01-15

    If change over time is compared in several groups, it is important to take baseline values into account so that the comparison is carried out under the same preconditions. As the observed baseline measurements are distorted by measurement error, it may not be sufficient to include them as a covariate. By fitting a longitudinal mixed-effects model to all data, including the baseline observations, and subsequently calculating the expected change conditional on the underlying baseline value, a solution to this problem has been provided recently, so that groups with the same baseline characteristics can be compared. In this article, we present an extended approach in which a broader set of models can be used. Specifically, it is possible to include any desired set of interactions between the time variable and the other covariates, and time-dependent covariates can also be included. Additionally, we extend the method to adjust for baseline measurement error in other time-varying covariates. We apply the methodology to data from the Swiss HIV Cohort Study to address the question of whether a joint infection with HIV-1 and hepatitis C virus leads to a slower increase of CD4 lymphocyte counts over time after the start of antiretroviral therapy. Copyright © 2013 John Wiley & Sons, Ltd.

  3. Development and application of a statistical methodology to evaluate the predictive accuracy of building energy baseline models

    Energy Technology Data Exchange (ETDEWEB)

    Granderson, Jessica [Lawrence Berkeley National Laboratory (LBNL), Berkeley, CA (United States). Energy Technologies Area Div.; Price, Phillip N. [Lawrence Berkeley National Laboratory (LBNL), Berkeley, CA (United States). Energy Technologies Area Div.

    2014-03-01

    This paper documents the development and application of a general statistical methodology to assess the accuracy of baseline energy models, focusing on its application to Measurement and Verification (M&V) of whole-building energy savings. The methodology complements the principles addressed in resources such as ASHRAE Guideline 14 and the International Performance Measurement and Verification Protocol. It requires fitting a baseline model to data from a "training period" and using the model to predict total electricity consumption during a subsequent "prediction period." We illustrate the methodology by evaluating five baseline models using data from 29 buildings. The training period and prediction period were varied, and model predictions of daily, weekly, and monthly energy consumption were compared to meter data to determine model accuracy. Several metrics were used to characterize the accuracy of the predictions, and in some cases the best-performing model as judged by one metric was not the best performer when judged by another metric.
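
    Two accuracy metrics commonly used in this M&V context, the coefficient of variation of the RMSE and the normalized mean bias error as defined in ASHRAE Guideline 14, can be computed as follows (a minimal sketch; the function names are ours, and the paper's own metric set may differ):

```python
import numpy as np

def cvrmse(actual, predicted):
    """Coefficient of variation of the RMSE, in percent."""
    actual = np.asarray(actual, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    rmse = np.sqrt(np.mean((actual - predicted) ** 2))
    return 100.0 * rmse / actual.mean()

def nmbe(actual, predicted):
    """Normalized mean bias error, in percent (sign shows over/under-prediction)."""
    actual = np.asarray(actual, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    return 100.0 * np.sum(actual - predicted) / (actual.size * actual.mean())

# Example: daily meter readings vs. baseline-model predictions
meter = [120.0, 135.0, 128.0, 140.0]
model = [118.0, 139.0, 125.0, 142.0]
```

Comparing such metrics across choices of training and prediction period is the kind of evaluation the paper performs on its 29-building dataset.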

  4. Damages detection in cylindrical metallic specimens by means of statistical baseline models and updated daily temperature profiles

    Science.gov (United States)

    Villamizar-Mejia, Rodolfo; Mujica-Delgado, Luis-Eduardo; Ruiz-Ordóñez, Magda-Liliana; Camacho-Navarro, Jhonatan; Moreno-Beltrán, Gustavo

    2017-05-01

    In previous work, damage detection in metallic specimens exposed to temperature changes has been achieved by using a statistical baseline model based on Principal Component Analysis (PCA) and the piezodiagnostics principle, with the temperature effect taken into account by augmenting the baseline model or by using several baseline models according to the current temperature. In this paper a new approach is presented, in which damage detection is based on a new index that combines the Q and T2 statistical indices with current temperature measurements. Experimental tests were carried out on a carbon-steel pipe of 1 m length and 1.5 inch diameter, instrumented with piezodevices acting as actuators or sensors. A PCA baseline model was obtained at a temperature of 21 °C, and the T2 and Q statistical indices were then computed for a 24 h temperature profile. Mass added at different points of the pipe between sensor and actuator was used as damage. By using the combined index, the temperature contribution can be separated and a better graphical differentiation of damaged from undamaged cases can be obtained.
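
    For readers unfamiliar with the two indices, the Q (squared prediction error) and Hotelling T2 statistics of a PCA baseline model can be sketched as below (an illustrative implementation under our own conventions, not the authors' code; the combined temperature-compensated index itself is specific to the paper):

```python
import numpy as np

def pca_baseline(X, n_comp):
    """Fit a PCA baseline model to healthy-condition data X (rows = observations)."""
    mu = X.mean(axis=0)
    U, s, Vt = np.linalg.svd(X - mu, full_matrices=False)
    P = Vt[:n_comp].T                        # retained loading vectors
    lam = s[:n_comp] ** 2 / (len(X) - 1)     # variances of the retained scores
    return mu, P, lam

def q_and_t2(x, mu, P, lam):
    """Q (residual) and T2 (in-model) statistics for one new observation x."""
    xc = x - mu
    t = P.T @ xc                 # projection onto the baseline subspace
    resid = xc - P @ t           # part of x the baseline model cannot explain
    return float(resid @ resid), float(np.sum(t ** 2 / lam))

# Baseline data varying along one axis; an observation off that axis raises Q
X = np.array([[1.0, 0.0], [-1.0, 0.0], [2.0, 0.0], [-2.0, 0.0]])
mu, P, lam = pca_baseline(X, n_comp=1)
Q, T2 = q_and_t2(np.array([3.0, 4.0]), mu, P, lam)
```

High Q flags behavior the baseline subspace cannot explain (a structural change), while high T2 flags unusually large excursions within the subspace.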

  5. Reconsidering Cluster Bias in Multilevel Data: A Monte Carlo Comparison of Free and Constrained Baseline Approaches.

    Science.gov (United States)

    Guenole, Nigel

    2018-01-01

    The test for item-level cluster bias examines the improvement in model fit that results from freeing an item's between-level residual variance from a baseline model with equal within- and between-level factor loadings and between-level residual variances fixed at zero. A potential problem is that this approach may involve a misspecified unrestricted model if any non-invariance is present, whereas the log-likelihood difference test requires that the unrestricted model be correctly specified. A free baseline approach, in which the unrestricted model includes only the restrictions needed for model identification, should lead to better decision accuracy, but no studies have examined this yet. We ran a Monte Carlo study to investigate this issue. When the referent item is unbiased, the constrained baseline approach led to true positive (power) rates similar to those of the free baseline approach but much higher false positive (Type I error) rates. The free baseline approach should therefore be preferred when the referent indicator is unbiased. When the referent assumption is violated, the false positive rate was unacceptably high for both approaches, and the true positive rate was poor regardless of which baseline approach was used. Neither the free nor the constrained baseline approach can be recommended when the referent indicator is biased, so we recommend paying close attention to ensuring the referent indicator is unbiased in tests of cluster bias. All Mplus input and output files, and the R and short Python scripts used to execute this simulation study, are uploaded to an open-access repository.
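
    The model-fit comparison at the heart of this test is a standard log-likelihood difference (chi-square) test between nested multilevel models; a minimal sketch of that computation (with invented log-likelihood values, purely for illustration) could look like:

```python
from scipy.stats import chi2

def lr_test(loglik_restricted, loglik_free, df):
    """Likelihood-ratio test between nested models. Valid only when the
    less-restricted (baseline) model is correctly specified, which is
    exactly the assumption the Monte Carlo study probes."""
    stat = 2.0 * (loglik_free - loglik_restricted)
    return stat, chi2.sf(stat, df)

# Freeing one between-level residual variance costs one degree of freedom
stat, p = lr_test(loglik_restricted=-1052.3, loglik_free=-1049.1, df=1)
```

A small p suggests the freed between-level residual variance improves fit, i.e. evidence of cluster bias for that item, subject to the referent-indicator caveat above.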

  6. Baseline scenarios of global environmental change

    International Nuclear Information System (INIS)

    Alcamo, J.; Kreileman, G.J.J.; Bollen, J.C.; Born, G.J. van den; Krol, M.S.; Toet, A.M.C.; Vries, H.J.M. de; Gerlagh, R.

    1996-01-01

    This paper presents three baseline scenarios, assuming no policy action, computed by the IMAGE2 model. These scenarios cover a wide range of coupled global change indicators, including: energy demand and consumption; food demand, consumption, and production; changes in land cover, including changes in the extent of agricultural land and forest; emissions of greenhouse gases and ozone precursors; and climate change and its impacts on sea-level rise, crop productivity and natural vegetation. Scenario information is available for the entire world with regional and grid-scale detail, and covers the period from 1970 to 2100. (author)

  7. A comparison of baseline methodologies for 'Reducing Emissions from Deforestation and Degradation'

    Directory of Open Access Journals (Sweden)

    Kok Kasper

    2009-07-01

    Background: A mechanism for emission reductions from deforestation and degradation (REDD) is very likely to be included in a future climate agreement. The choice of REDD baseline methodologies will crucially influence the environmental and economic effectiveness of the climate regime. We compare three different historical baseline methods and one innovative dynamic model baseline approach, appraising their applicability under a future REDD policy framework using a weighted multi-criteria analysis. Results: The results show that each baseline method has its specific strengths and weaknesses. Although the dynamic model allows for the best environmental and comparatively good economic performance, its high demand for data and technical capacity limits its current applicability in many developing countries. Conclusion: The adoption of a multi-tier approach will allow countries to select the baseline method best suiting their specific capabilities and data availability while simultaneously ensuring scientific transparency, environmental effectiveness and broad political support.

  8. Oscillation Baselining and Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    2017-03-27

    PNNL developed a new tool for oscillation analysis and baselining. This tool was developed under a new DOE Grid Modernization Laboratory Consortium (GMLC) project (GM0072, "Suite of open-source applications and models for advanced synchrophasor analysis") and is based on the open platform for PMU analysis. The Oscillation Baselining and Analysis Tool (OBAT) performs oscillation analysis and identifies modes of oscillation (frequency, damping, energy, and shape). The tool also performs oscillation event baselining (finding correlations between oscillation characteristics and system operating conditions).

  9. Large-baseline InSAR for precise topographic mapping: a framework for TanDEM-X large-baseline data

    Directory of Open Access Journals (Sweden)

    M. Pinheiro

    2017-09-01

    The global Digital Elevation Model (DEM) resulting from the TanDEM-X mission provides information about the world topography with outstanding precision. In fact, performance analyses carried out with the already available data have shown that the global product is well within the requirements of 10 m absolute vertical accuracy and 2 m relative vertical accuracy for flat to moderate terrain. The mission's science phase took place from October 2014 to December 2015. During this phase, bistatic acquisitions with across-track separations between the two satellites of up to 3.6 km at the equator were commanded. Since the relative vertical accuracy of InSAR-derived elevation models is, in principle, inversely proportional to the system baseline, the TanDEM-X science phase opened the door to the generation of elevation models of improved quality with respect to the standard product. However, the interferometric processing of the large-baseline data is troublesome due to the increased volume decorrelation and the very high frequency of the phase variations. Hence, in order to fully profit from the increased baseline, sophisticated algorithms for the interferometric processing, and in particular for the phase unwrapping, have to be considered. This paper proposes a novel dual-baseline region-growing framework for the phase unwrapping of the large-baseline interferograms. Results from two experiments with data from the TanDEM-X science phase are discussed, corroborating the expected increased level of detail of the large-baseline DEMs.

  10. Survey of Models on Demand, Customer Base-Line and Demand Response and Their Relationships in the Power Market

    OpenAIRE

    Heshmati, Almas

    2012-01-01

    The increasing use of demand-side management as a tool to reliably meet electricity demand at peak times has stimulated interest among researchers, consumers, producer organizations, managers, regulators and policymakers. This research reviews the growing literature on models used to study demand, the consumer baseline (CBL) and demand response in the electricity market. After characterizing the general demand models, it reviews consumer baseline models, on the basis of which demand response is then studied...

  11. A Systematic Review and Meta-Analysis of Baseline OHIP-EDENT Scores.

    Science.gov (United States)

    Duale, J M J; Patel, Y A; Wu, J; Hyde, T P

    2018-03-01

    OHIP-EDENT is widely used in the literature to assess Oral-Health-Related Quality of Life (OHRQoL) for edentulous patients. However, the normal variance and mean of baseline OHIP-EDENT scores have not been reported, and knowledge of this normal variation would facilitate critical appraisal of studies. An established figure for baseline OHIP-EDENT, obtained from a meta-analysis, would simplify comparisons of studies and quantify variations in the initial OHRQoL of trial participants. The aim of this study is to quantify a normal baseline value for pre-operative OHIP-EDENT scores by a systematic review and meta-analysis of the available literature. A systematic literature review was carried out. 83 papers were identified that included OHIP-EDENT values. After screening and eligibility assessment, 7 papers were selected and included in the meta-analysis. A meta-analysis of the 7 papers using a random-effects model yielded a mean baseline OHIP-EDENT score of 28.63, with a 95% confidence interval from 21.93 to 35.34. A pre-operative baseline OHIP-EDENT score has thus been established by meta-analysis of published papers. This will facilitate comparison of the initial OHRQoL of one study population with that found elsewhere in the published literature. Copyright© 2018 Dennis Barber Ltd.
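
    The random-effects pooling step reported here can be sketched with a DerSimonian-Laird estimator (our illustration, with made-up study means and standard errors rather than the seven papers actually meta-analysed):

```python
import numpy as np

def dersimonian_laird(means, ses):
    """Random-effects pooled mean with a 95% confidence interval."""
    means = np.asarray(means, dtype=float)
    w = 1.0 / np.asarray(ses, dtype=float) ** 2        # fixed-effect weights
    mu_fe = np.sum(w * means) / np.sum(w)
    Q = np.sum(w * (means - mu_fe) ** 2)               # heterogeneity statistic
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (Q - (means.size - 1)) / c)        # between-study variance
    w_re = 1.0 / (1.0 / w + tau2)                      # random-effects weights
    mu = np.sum(w_re * means) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    return mu, mu - 1.96 * se, mu + 1.96 * se

# Hypothetical baseline OHIP-EDENT means and standard errors from three studies
mu, lo, hi = dersimonian_laird([24.1, 31.5, 29.0], [2.0, 3.1, 2.4])
```

The between-study variance tau2 widens the confidence interval when study means are more heterogeneous than their sampling error alone would explain.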

  12. Baseline requirements of the proposed action for the Transportation Management Division routing models

    International Nuclear Information System (INIS)

    Johnson, P.E.; Joy, D.S.

    1995-02-01

    The potential impacts associated with the transportation of hazardous materials are important to shippers, carriers, and the general public. This is particularly true for shipments of radioactive material. The shippers are primarily concerned with safety, security, efficiency, and equipment requirements. The carriers are concerned with the potential impact that radioactive shipments may have on their operations, particularly if such materials are involved in an accident. The general public has also expressed concerns regarding the safety of transporting radioactive and other hazardous materials through their communities. Because transportation routes are a central concern in hazardous material transport, the prediction of likely routes is the first step toward resolution of these issues. In response to these routing needs, several models have been developed over the past fifteen years at Oak Ridge National Laboratory (ORNL). The HIGHWAY routing model is used to predict routes for truck transportation, the INTERLINE routing model is used to predict both rail and barge routes, and the AIRPORT locator model is used to find airports meeting specified criteria near a given location. As part of the ongoing improvement of the US Department of Energy's (DOE) Environmental Management Transportation Management Division's (EM-261) computer systems and development efforts, a Baseline Requirements Assessment Session on the HIGHWAY, INTERLINE, and AIRPORT models was held at ORNL on April 27, 1994. The purpose of this meeting was to discuss the existing capabilities of the models and databases and to review enhancements that would expand their usefulness. The results of the Baseline Requirements Assessment Session are discussed in this report; the discussions pertaining to the different models are contained in separate sections

  13. CryoSat SAR/SARin Level1b products: assessment of BaselineC and improvements towards BaselineD

    Science.gov (United States)

    Scagliola, Michele; Fornari, Marco; Bouffard, Jerome; Parrinello, Tommaso

    2017-04-01

    CryoSat was launched on 8 April 2010 and is the first European ice mission dedicated to the monitoring of precise changes in the thickness of polar ice sheets and floating sea ice. CryoSat carries an innovative radar altimeter called the Synthetic Aperture Interferometric Radar Altimeter (SIRAL), which transmits pulses at a high pulse repetition frequency, making the received echoes phase coherent and suitable for azimuth processing. This makes it possible to reach a significantly improved along-track resolution with respect to traditional pulse-width-limited altimeters. CryoSat is the first altimetry mission operating in SAR mode, and continuous improvements in the Level1 Instrument Processing Facility (IPF1) are being identified, tested and validated in order to improve the quality of the Level1b products. The current IPF, Baseline C, was released into operation in April 2015, and the second CryoSat reprocessing campaign was jointly initiated, taking advantage of the upgrades implemented in the IPF1 processing chain as well as of some specific configurations for the calibration corrections. In particular, the CryoSat Level1b BaselineC products generated in the framework of the second reprocessing campaign include refined information concerning the mispointing angles and the calibration corrections. This poster will detail the evolutions currently planned for the CryoSat BaselineD SAR/SARin Level1b products and the corresponding quality improvements that are expected.

  14. A Baseline Patient Model to Support Testing of Medical Cyber-Physical Systems.

    Science.gov (United States)

    Silva, Lenardo C; Perkusich, Mirko; Almeida, Hyggo O; Perkusich, Angelo; Lima, Mateus A M; Gorgônio, Kyller C

    2015-01-01

    Medical Cyber-Physical Systems (MCPS) are currently a trending topic of research. The main challenges are related to the integration and interoperability of connected medical devices, patient safety, physiologic closed-loop control, and the verification and validation of these systems. In this paper, we focus on patient safety and MCPS validation. We present a formal patient model to be used in health care systems validation without jeopardizing the patient's health. To determine the basic patient conditions, our model considers the four main vital signs: heart rate, respiratory rate, blood pressure and body temperature. To generate the vital signs we used regression models based on statistical analysis of a clinical database. Our solution should be used as a starting point for a behavioral patient model and adapted to specific clinical scenarios. We present the modeling process of the baseline patient model and show its evaluation. The conception process may be used to build different patient models. The results show the feasibility of the proposed model as an alternative to the immediate need for clinical trials to test these medical systems.

  15. Precise baseline determination for the TanDEM-X mission

    Science.gov (United States)

    Koenig, Rolf; Moon, Yongjin; Neumayer, Hans; Wermuth, Martin; Montenbruck, Oliver; Jäggi, Adrian

    The TanDEM-X mission will strive to generate a global precise Digital Elevation Model (DEM) by way of bi-static SAR in a close formation of the TerraSAR-X satellite, launched on June 15, 2007, and the TanDEM-X satellite, to be launched in May 2010. Both satellites carry the Tracking, Occultation and Ranging (TOR) payload supplied by the GFZ German Research Centre for Geosciences. The TOR consists of a high-precision dual-frequency GPS receiver, called the Integrated GPS Occultation Receiver (IGOR), and a Laser retro-reflector (LRR) for precise orbit determination (POD) and atmospheric sounding. The IGOR is of vital importance for the TanDEM-X mission objectives, as millimeter-level determination of the baseline, or distance, between the two spacecraft is needed to derive meter-level accurate DEMs. Within the TanDEM-X ground segment, GFZ is responsible for the operational provision of precise baselines. For this GFZ uses two software chains: first, its Earth Parameter and Orbit System (EPOS) software, and second, the BERNESE software, for backup purposes and quality control. In a concerted effort, the German Aerospace Center (DLR) also generates precise baselines independently with a dedicated Kalman filter approach realized in its FRNS software. Using the example of GRACE, the generation of baselines with millimeter accuracy from on-board GPS data can be validated directly by comparing them to the inter-satellite K-band range measurements. The K-band ranges are accurate down to the micrometer level and may therefore be considered as truth. Both TanDEM-X baseline providers are able to generate GRACE baselines with sub-millimeter accuracy. By merging the independent baselines from GFZ and DLR, the accuracy can be increased even further. The K-band validation, however, covers solely the along-track component, as the K-band data measure just the distance between the two GRACE satellites. In addition, they exhibit an unknown bias which must be modelled in the comparison, so the
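
    The validation principle described, differencing the GPS-derived baseline against the K-band ranges while estimating the unknown K-band bias, can be sketched as follows (an illustrative reduction with made-up numbers; the operational EPOS/BERNESE/FRNS processing is far more elaborate):

```python
import numpy as np

def kband_validation(baseline_along_track, kband_range):
    """Estimate the unknown constant K-band bias as the mean difference,
    then report the RMS of the bias-corrected residuals."""
    d = np.asarray(baseline_along_track, float) - np.asarray(kband_range, float)
    bias = d.mean()
    rms = np.sqrt(np.mean((d - bias) ** 2))
    return bias, rms

# Along-track baseline (m) from GPS vs. K-band ranges carrying a 5 mm offset
gps = np.array([219999.9982, 220000.0014, 220000.0001])
kbr = gps - 0.005 + np.array([0.0004, -0.0003, -0.0001])
bias, rms = kband_validation(gps, kbr)
```

The residual RMS after bias removal is the quantity that supports statements such as "sub-millimeter accuracy" for the along-track component.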

  16. Accelerated Best Basis Inventory Baselining Task

    International Nuclear Information System (INIS)

    SASAKI, L.M.

    2001-01-01

    The baselining effort was recently proposed to bring the Best-Basis Inventory (BBI) and Question No. 8 of the Tank Interpretive Report (TIR) for all 177 tanks to the current standards and protocols, and to prepare a TIR Question No. 8 if one is not already available. This plan outlines the objectives and methodology of the accelerated BBI baselining task. BBI baselining meetings held during December 2000 resulted in a revised BBI methodology and an initial set of BBI creation rules to be used in the baselining effort. The objectives of the BBI baselining effort are to: (1) Provide inventories that are consistent with the revised BBI methodology and new BBI creation rules. (2) Split the total tank waste in each tank into six waste phases, as appropriate (supernatant, saltcake solids, saltcake liquid, sludge solids, sludge liquid, and retained gas). In some tanks, the solids and liquid portions of the sludge and/or saltcake may be combined into a single sludge or saltcake phase. (3) Identify sampling events that are to be used for calculating the BBIs. (4) Update waste volumes for subsequent reconciliation with the Hanlon (2001) waste tank summary. (5) Implement new waste type templates. (6) Include any sample data that might have been unintentionally omitted from the previous BBI and remove any sample data that should not have been included; sample data used in the BBI must be available on TWINS. (7) Ensure that an inventory value for each standard BBI analyte is provided for each waste component; sample-based inventories for supplemental BBI analytes will be included when available. (8) Provide new means and confidence interval reports if not already available, and include uncertainties in reported inventory values

  17. Renewable Diesel from Algal Lipids: An Integrated Baseline for Cost, Emissions, and Resource Potential from a Harmonized Model

    Energy Technology Data Exchange (ETDEWEB)

    Davis, R.; Fishman, D.; Frank, E. D.; Wigmosta, M. S.; Aden, A.; Coleman, A. M.; Pienkos, P. T.; Skaggs, R. J.; Venteris, E. R.; Wang, M. Q.

    2012-06-01

    The U.S. Department of Energy's Biomass Program has begun an initiative to obtain consistent quantitative metrics for algal biofuel production and to establish an 'integrated baseline' by harmonizing and combining the Program's national resource assessment (RA), techno-economic analysis (TEA), and life-cycle analysis (LCA) models. The baseline attempts to represent a plausible near-term production scenario with freshwater microalgae growth, extraction of lipids, and conversion via hydroprocessing to produce a renewable diesel (RD) blendstock. Differences between the prior TEA and LCA models were reconciled (harmonized), and the RA model was used to prioritize and select the most favorable consortium of sites supporting production of 5 billion gallons per year of RD. Aligning the TEA and LCA models produced slightly higher costs and emissions compared to the pre-harmonized results. However, after applying the productivities predicted by the RA model (13 g/m2/d on annual average vs. 25 g/m2/d in the original models), the integrated baseline resulted in markedly higher costs and emissions. The relationship between performance (cost and emissions) and either productivity or lipid fraction was found to be non-linear, and important implications for the TEA and LCA results were observed after introducing seasonal variability from the RA model. Increasing productivity and lipid fraction alone was insufficient to achieve cost and emission targets; however, when combined with lower-energy, less expensive alternative technology scenarios, emissions and costs were substantially reduced.

  18. 40 CFR 74.20 - Data for baseline and alternative baseline.

    Science.gov (United States)

    2010-07-01

    Section 74.20, Data for baseline and alternative baseline. Title 40, Protection of Environment; ENVIRONMENTAL PROTECTION AGENCY (CONTINUED); AIR... (a) Acceptable data. (1) The designated representative of a combustion...

  19. Baseline Predictors of Missed Visits in the Look AHEAD Study

    Science.gov (United States)

    Fitzpatrick, Stephanie L.; Jeffery, Robert; Johnson, Karen C.; Roche, Cathy C.; Van Dorsten, Brent; Gee, Molly; Johnson, Ruby Ann; Charleston, Jeanne; Dotson, Kathy; Walkup, Michael P.; Hill-Briggs, Felicia; Brancati, Frederick L.

    2013-01-01

    Objective To identify baseline attributes associated with consecutively missed data collection visits during the first 48 months of Look AHEAD, a randomized, controlled trial in 5145 overweight/obese adults with type 2 diabetes designed to determine the long-term health benefits of weight loss achieved by lifestyle change. Design and Methods The analyzed sample consisted of 5016 participants who were alive at month 48 and enrolled at Look AHEAD sites. Demographic characteristics, baseline behavior, psychosocial factors, and treatment randomization were included as predictors of missed consecutive visits in proportional hazards models. Results In multivariate Cox proportional hazards models, baseline attributes of participants who missed consecutive visits (n=222) included: younger age (Hazard Ratio [HR] 1.18 per 5 years younger; 95% Confidence Interval 1.05, 1.30), higher depression score (HR 1.04; 1.01, 1.06), non-married status (HR 1.37; 1.04, 1.82), never self-weighing prior to enrollment (HR 2.01; 1.25, 3.23), and randomization to minimal vs. intensive lifestyle intervention (HR 1.46; 1.11, 1.91). Conclusions Younger age, symptoms of depression, non-married status, never self-weighing, and randomization to the minimal intervention were associated with a higher likelihood of missing consecutive data collection visits, even in a high-retention trial like Look AHEAD. Whether modifications to screening, or retention efforts targeted to these attributes, might enhance long-term retention in behavioral trials requires further investigation. PMID:23996977

  20. Geochemical modelling baseline compositions of groundwater

    DEFF Research Database (Denmark)

    Postma, Diederik Jan; Kjøller, Claus; Andersen, Martin Søgaard

    2008-01-01

    Reactive transport models were developed to explore the evolution in groundwater chemistry along the flow path in three aquifers: the Triassic East Midlands aquifer (UK), the Miocene aquifer at Valreas (F) and the Cretaceous aquifer near Aveiro (P). All three aquifers contain very old groundwaters and show variations in water chemistry that are caused by large-scale geochemical processes taking place on the timescale of thousands of years. The most important geochemical processes are ion exchange (Valreas and Aveiro), where freshwater solutes displace marine ions from the sediment surface, and carbonate dissolution (East Midlands, Valreas and Aveiro). Reactive transport models employing the code PHREEQC, which included these geochemical processes and one-dimensional solute transport, were able to duplicate the observed patterns in water quality. These models may provide a quantitative understanding

  1. Markov Model Predicts Changes in STH Prevalence during Control Activities Even with a Reduced Amount of Baseline Information.

    Directory of Open Access Journals (Sweden)

    Antonio Montresor

    2016-04-01

    Estimating the reduction in levels of infection during implementation of soil-transmitted helminth (STH) control programmes is important to measure their performance and to plan interventions. Markov modelling techniques have been used with some success to predict changes in STH prevalence following treatment in Viet Nam. The model is stationary; to date, predictions have been obtained by calculating the transition probabilities between the different classes of infection intensity following the first year of drug distribution and assuming that these remain constant in subsequent years. However, running this model requires longitudinal parasitological data (including intensity of infection) for two consecutive years from at least 200 individuals. Since this amount of data is not often available from STH control programmes, the possible application of the model in control programmes is limited. The present study aimed to address this issue by adapting the existing Markov model to allow its application when a more limited amount of data is available, and by testing the predictive capacities of these simplified models. We analysed data from field studies conducted with different combinations of three parameters: (i) the frequency of drug administration; (ii) the drug distributed; and (iii) the target treatment population (entire population or school-aged children only). This analysis allowed us to define 10 sets of standard transition probabilities to be used to predict prevalence changes when only baseline data are available (simplified model 1). We also formulated three equations (one for each STH parasite) to calculate the predicted prevalence of the different classes of intensity from the total prevalence. These equations allowed us to design a second simplified model (SM2) to obtain predictions when the classes of intensity at baseline are not known. To evaluate the performance of the simplified models, we collected data from the scientific literature on changes in
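
    The core mechanics of the stationary Markov prediction, propagating the vector of intensity-class prevalences through a fixed transition matrix, can be sketched as follows (the matrix and baseline vector here are invented for illustration and are not one of the paper's 10 standard sets):

```python
import numpy as np

# Hypothetical one-round transition probabilities between infection-intensity
# classes (uninfected, light, moderate/heavy); each row sums to 1.
P = np.array([[0.90, 0.08, 0.02],
              [0.60, 0.30, 0.10],
              [0.30, 0.40, 0.30]])

def predict(prevalence, rounds):
    """Propagate class prevalences through `rounds` of drug distribution,
    assuming the transition probabilities stay constant (stationarity)."""
    state = np.asarray(prevalence, dtype=float)
    for _ in range(rounds):
        state = state @ P
    return state

baseline = np.array([0.50, 0.35, 0.15])   # observed class prevalences at baseline
after_two = predict(baseline, rounds=2)
```

The simplified models in the paper replace the programme-specific transition matrix with one of the 10 standard sets, so that only the baseline vector must be measured.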

  2. Technical baseline description for in situ vitrification laboratory test equipment

    International Nuclear Information System (INIS)

    Beard, K.V.; Bonnenberg, R.W.; Watson, L.R.

    1991-09-01

    In situ vitrification (ISV) has been identified as a possible waste treatment technology. ISV was developed by Pacific Northwest Laboratory (PNL), Richland, Washington, as a thermal treatment process to treat contaminated soils in place. The process, which electrically melts and dissolves soils and associated inorganic materials, simultaneously destroys and/or removes organic contaminants while incorporating inorganic contaminants into a stable, glass-like residual product. This Technical Baseline Description has been prepared to provide high-level descriptions of the design of the Laboratory Test model, including all design modifications and safety improvements made to date. Furthermore, the Technical Baseline Description provides a basic overview of the interface documents for configuration management, program management interfaces, safety, quality, and security requirements. 8 figs

  3. TAPIR--Finnish national geochemical baseline database.

    Science.gov (United States)

    Jarva, Jaana; Tarvainen, Timo; Reinikainen, Jussi; Eklund, Mikael

    2010-09-15

    In Finland, a Government Decree on the Assessment of Soil Contamination and Remediation Needs has generated a need for reliable and readily accessible data on geochemical baseline concentrations in Finnish soils. According to the Decree, baseline concentrations, referring both to the natural geological background concentrations and to the diffuse anthropogenic input of substances, shall be taken into account in the soil contamination assessment process. This baseline information is provided in a national geochemical baseline database, TAPIR, that is publicly available via the Internet. Geochemical provinces with elevated baseline concentrations were delineated to provide regional geochemical baseline values, using the nationwide geochemical datasets to divide Finland into provinces. Several metals (Co, Cr, Cu, Ni, V, and Zn) showed anomalous concentrations in seven regions that were defined as metal provinces. Arsenic did not follow a distribution similar to any other element, and four arsenic provinces were determined separately. Nationwide geochemical datasets were not available for some other important elements such as Cd and Pb; although these elements are included in the TAPIR system, their distribution does not necessarily follow the provinces pre-defined for metals and arsenic. Regional geochemical baseline values, presented as the upper limit of geochemical variation within the region, can be used as trigger values to assess potential soil contamination. Baseline values have also been used to determine upper and lower guideline values that must be taken into account as a tool in basic risk assessment. If regional geochemical baseline values are available, the national guideline values prescribed in the Decree based on ecological risks can be modified accordingly. The national geochemical baseline database provides scientifically sound, easily accessible and generally accepted information on baseline values, and it can be used in various

  4. Life Support Baseline Values and Assumptions Document

    Science.gov (United States)

    Anderson, Molly S.; Ewert, Michael K.; Keener, John F.

    2018-01-01

    The Baseline Values and Assumptions Document (BVAD) provides analysts, modelers, and other life support researchers with a common set of values and assumptions which can be used as a baseline in their studies. This baseline, in turn, provides a common point of origin from which many studies in the community may depart, making research results easier to compare and providing researchers with reasonable values to assume for areas outside their experience. This document identifies many specific physical quantities that define life support systems, serving as a general reference for spacecraft life support system technology developers.

  5. From groundwater baselines to numerical groundwater flow modelling for the Milan metropolitan area

    Science.gov (United States)

    Crosta, Giovanni B.; Frattini, Paolo; Peretti, Lidia; Villa, Federica; Gorla, Maurizio

    2015-04-01

    Contamination of major aquifers in densely populated areas is a major concern for stakeholders involved in the use and protection of groundwater resources. Sustainable groundwater withdrawal and management, and the identification of trends in groundwater contamination, require a careful hydrochemical baseline characterization. This characterization is fundamental to investigating the presence and evolutionary trend of contaminants. In fact, it allows recovering and understanding: the spatial-temporal trend of contamination; the relative age of the contamination episodes; the reasons for anomalous behavior of some compounds during migration to and in the groundwater; the associations in which some contaminants can be found; and the different behaviors in phreatic, semi-confined and confined aquifers. To attain such a characterization for the Milan metropolitan area (about 2,500 km2, ca. 4,000,000 inhabitants, Lombardy, Italy), we carried out three main activities. (1) Collection of complete and reliable datasets concerning the geological, hydrogeological and hydrochemical (over 60,000 chemical analyses from 2003 to 2013) characteristics of the area and of the involved aquifers. This activity was very demanding because the available data are provided by different authorities (Lombardy Region, Provinces, Lombardy Environmental Agency - ARPA Lombardia, publicly owned companies in charge of water system management) in raw format and with different database standards, which required a large effort of manual verification and harmonization. (2) Completion of a hydrochemical characterization of the metropolitan area aquifers by classical and multivariate statistical analyses, in order to define a baseline both for some major physical-chemical characteristics and for the most relevant contaminants. (3) Development of a three-dimensional hydrogeological model for the metropolitan area starting from the above-listed datasets and existing models. This model will

  6. A multistate model of cognitive dynamics in relation to resistance training: the contribution of baseline function.

    Science.gov (United States)

    Fallah, Nader; Hsu, Chun L; Bolandzadeh, Niousha; Davis, Jennifer; Beattie, B Lynn; Graf, Peter; Liu-Ambrose, Teresa

    2013-08-01

    We investigated: (1) the effect of different targeted exercise training on an individual's overall probability for cognitive improvement, maintenance, or decline; and (2) the simultaneous effect of targeted exercise training and baseline function on the dynamics of executive functions when a multistate transition model is used. Analyses are based on a 12-month randomized clinical trial including 155 community-dwelling women 65-75 years of age who were randomly allocated to once-weekly resistance training (1x RT; n = 54), twice-weekly resistance training (2x RT; n = 52), or twice-weekly balance and tone training (BAT; n = 49). The primary outcome measure was performance on the Stroop test, an executive cognitive test of selective attention and conflict resolution. Secondary outcomes of executive functions were set shifting and working memory. Individuals in the 1x RT or 2x RT group demonstrated a significantly greater probability for improved performance on the Stroop Test (0.49; 95% confidence interval, 0.41-0.57) compared with those in the BAT group (0.25; 95% confidence interval, 0.25-0.40). Resistance training had significant effects on transitions in selective attention and conflict resolution. Resistance training is efficacious in improving a measure of selective attention and conflict resolution in older women, probably more so among those with greater baseline cognitive function. Copyright © 2013 Elsevier Inc. All rights reserved.

  7. Extracting Baseline Electricity Usage Using Gradient Tree Boosting

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Taehoon [Ulsan Nat. Inst. of Sci. & Tech., Ulsan (South Korea); Lee, Dongeun [Ulsan Nat. Inst. of Sci. & Tech., Ulsan (South Korea); Choi, Jaesik [Ulsan Nat. Inst. of Sci. & Tech., Ulsan (South Korea); Spurlock, Anna [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Sim, Alex [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Todd, Annika [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Wu, Kesheng [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2016-05-05

    To understand how specific interventions affect a process observed over time, we need to control for the other factors that influence outcomes. A model that captures all factors other than the one of interest is generally known as a baseline. In our study of how different pricing schemes affect residential electricity consumption, the baseline needs to capture the impact of outdoor temperature along with many other factors. In this work, we examine a number of data mining techniques and demonstrate Gradient Tree Boosting (GTB) to be an effective method for building the baseline. We train GTB on data prior to the introduction of the new pricing schemes, and then apply the temperatures observed after their introduction to predict electricity usage with the expected temperature correction. Our experiments and analyses show that the baseline models generated by GTB capture the core consumption characteristics over the two years with the new pricing schemes. In contrast to the majority of regression-based techniques, which fail to capture the lag between the peak of daily temperature and the peak of electricity usage, the GTB-generated baselines correctly capture the delay between the temperature peak and the electricity peak. Furthermore, subtracting this temperature-adjusted baseline from the observed electricity usage yields values that are more amenable to interpretation, which demonstrates that the temperature-adjusted baseline is indeed effective.
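The baselining approach described above can be sketched in a few lines. This is a minimal illustration on synthetic data: the feature set (current temperature and hour of day), the lag structure, and the hyperparameters are invented assumptions, not the authors' exact setup.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
hours = np.arange(24 * 365)                # one year of hourly observations
hour_of_day = hours % 24
# Outdoor temperature with a daily cycle; usage follows it with a 3 h lag.
temp = 20.0 + 8.0 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 2, hours.size)
usage = 1.0 + 0.05 * np.roll(temp, 3) + rng.normal(0, 0.1, hours.size)

X = np.column_stack([temp, hour_of_day])
split = hours.size // 2                    # pretend a new pricing scheme starts here

model = GradientBoostingRegressor(n_estimators=200, max_depth=3, learning_rate=0.05)
model.fit(X[:split], usage[:split])        # train only on pre-intervention data

baseline = model.predict(X[split:])        # temperature-adjusted counterfactual
effect = usage[split:] - baseline          # residual attributed to the intervention
print(round(float(np.mean(effect)), 3))
```

Subtracting the predicted baseline from observed usage isolates the residual that can be attributed to the intervention; here no intervention was simulated, so the mean residual sits near zero.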

  8. A long baseline global stereo matching based upon short baseline estimation

    Science.gov (United States)

    Li, Jing; Zhao, Hong; Li, Zigang; Gu, Feifei; Zhao, Zixin; Ma, Yueyang; Fang, Meiqi

    2018-05-01

    In global stereo vision, matching efficiency and computing accuracy are difficult to balance because they contradict each other, and in the case of a long baseline this contradiction becomes more prominent. To address this problem, this paper proposes a novel idea for improving both the efficiency and the accuracy of global stereo matching over a long baseline. Reference images located between the long-baseline image pairs are first chosen to form new image pairs with short baselines. The relationship between the disparities of pixels in image pairs with different baselines is derived, taking quantization error into account, so that the disparity search range under the long baseline can be reduced under the guidance of the short baseline to gain matching efficiency. This idea is then integrated into graph cuts (GCs) to form a multi-step GC algorithm based on short-baseline estimation, by which the disparity map under the long baseline is calculated iteratively on the basis of the previous matching. Furthermore, image information from pixels that are non-occluded under the short baseline but occluded under the long baseline can be employed to improve matching accuracy. Although the time complexity of the proposed method depends on the locations of the chosen reference images, it is usually much lower for long-baseline stereo matching than that of the traditional GC algorithm. Finally, the validity of the proposed method is examined by experiments on benchmark datasets. The results show that the proposed method is superior to the traditional GC method in terms of efficiency and accuracy, and is thus suitable for long-baseline stereo matching.
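The search-range reduction rests on the standard rectified-stereo relation d = f·B/Z: disparity scales linearly with baseline length, so a short-baseline estimate bounds the long-baseline search window. The sketch below is an illustration of that relation only; the ±0.5 px quantization-error model and all numbers are assumptions, not the paper's derivation.

```python
# For rectified cameras, d = f * B / Z, so a disparity d_s estimated with a
# short baseline B_s predicts the long-baseline disparity as d_s * B_l / B_s.
# Allowing +/- 0.5 px quantization error on d_s gives a narrow search window
# instead of scanning the full long-baseline disparity range.
def long_baseline_search_range(d_short, B_short, B_long, quant_err=0.5):
    scale = B_long / B_short
    return (d_short - quant_err) * scale, (d_short + quant_err) * scale

f = 1000.0                    # focal length in pixels (assumed)
B_short, B_long = 0.1, 0.5    # baselines in metres (assumed)
Z = 5.0                       # true scene depth in metres

d_short = f * B_short / Z     # 20 px under the short baseline
lo, hi = long_baseline_search_range(d_short, B_short, B_long)
full_range = f * B_long / 1.0  # naive range if depth could be as close as 1 m
print(lo, hi, full_range)
```

The true long-baseline disparity (100 px here) falls inside the 5 px guided window, versus a 500 px exhaustive range.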

  9. Tank waste remediation system technical baseline summary description

    International Nuclear Information System (INIS)

    Raymond, R.E.

    1998-01-01

    This document is one of the tools used to develop and control the mission work as depicted in the included figure. This Technical Baseline Summary Description document is the top-level tool for management of the Technical Baseline for waste storage operations

  10. CryoSat Ice Processor: High-Level Overview of Baseline-C Data and Quality-Control

    Science.gov (United States)

    Mannan, R.; Webb, E.; Hall, A.; Bouffard, J.; Femenias, P.; Parrinello, T.; Bouffard, J.; Brockley, D.; Baker, S.; Scagliola, M.; Urien, S.

    2016-08-01

    Since April 2015, the CryoSat ice products have been generated with the new Baseline-C Instrument Processing Facilities (IPFs). This represents a major upgrade to the CryoSat ice IPFs and is the baseline for the second CryoSat Reprocessing Campaign. Baseline-C introduces major evolutions with respect to Baseline-B, most notably the release of freeboard data within the L2 SAR products, following optimisation of the SAR retracker. Additional L2 improvements include a new Arctic Mean Sea Surface (MSS) in SAR; a new tuneable land ice retracker in LRM; and a new Digital Elevation Model (DEM) in SARIn. At L1B new attitude fields have been introduced and existing datation and range biases reduced. This paper provides a high level overview of the changes and evolutions implemented at Baseline-C in order to improve CryoSat L1B and L2 data characteristics and exploitation over polar regions. An overview of the main Quality Control (QC) activities performed on operational Baseline-C products is also presented.

  11. Hanford Site technical baseline database. Revision 1

    International Nuclear Information System (INIS)

    Porter, P.E.

    1995-01-01

    This report lists the Hanford-specific files (Table 1) that make up the Hanford Site Technical Baseline Database. Table 2 includes the delta files that delineate the differences between this revision and revision 0 of the Hanford Site Technical Baseline Database. This information is being managed and maintained on the Hanford RDD-100 System, which uses the capabilities of RDD-100, a systems engineering software system of Ascent Logic Corporation (ALC). This revision of the Hanford Site Technical Baseline Database uses RDD-100 version 3.0.2.2 (see Table 3). Directories reflect those controlled by the Hanford RDD-100 System Administrator. Table 4 provides information regarding the platform. A cassette tape containing the Hanford Site Technical Baseline Database is available

  12. The TDAQ Baseline Architecture

    CERN Multimedia

    Wickens, F J

    The Trigger-DAQ community is currently busy preparing material for the DAQ, HLT and DCS TDR. Over the last few weeks a very important step has been a series of meetings to complete agreement on the baseline architecture. An overview of the architecture indicating some of the main parameters is shown in figure 1. As reported at the ATLAS Plenary during the February ATLAS week, the main area where the baseline had not yet been agreed was around the Read-Out System (ROS) and details in the DataFlow. The agreed architecture has: Read-Out Links (ROLs) from the RODs using S-Link; Read-Out Buffers (ROB) sited near the RODs, mounted in a chassis - today assumed to be a PC, using PCI bus at least for configuration, control and monitoring. The baseline assumes data aggregation, in the ROB and/or at the output (which could either be over a bus or in the network). Optimization of the data aggregation will be made in the coming months, but the current model has each ROB card receiving input from 4 ROLs, and 3 such c...

  13. Wide baseline stereo matching based on double topological relationship consistency

    Science.gov (United States)

    Zou, Xiaohong; Liu, Bin; Song, Xiaoxue; Liu, Yang

    2009-07-01

    Stereo matching is one of the most important branches of computer vision. In this paper, an algorithm is proposed for wide-baseline stereo vision matching. A novel scheme called double topological relationship consistency (DCTR) is presented. The combination of double topological configuration includes the consistency of the first topological relationship (CFTR) and the consistency of the second topological relationship (CSTR). It not only sets up a more advanced matching model, but also discards mismatches by iteratively computing the fitness of the feature matches, overcoming many problems of traditional methods that depend on powerful invariance to changes in scale, rotation or illumination across large view changes and even occlusions. Experimental examples are shown in which the two cameras are located in very different orientations. Epipolar geometry can also be recovered using RANSAC, possibly the most widely adopted method. With this method, correspondences with high precision can be obtained in wide-baseline matching problems. Finally, the effectiveness and reliability of the method are demonstrated in wide-baseline experiments on the image pairs.

  14. MOS modeling hierarchy including radiation effects

    International Nuclear Information System (INIS)

    Alexander, D.R.; Turfler, R.M.

    1975-01-01

    A hierarchy of modeling procedures has been developed for MOS transistors, circuit blocks, and integrated circuits which include the effects of total dose radiation and photocurrent response. The models were developed for use with the SCEPTRE circuit analysis program, but the techniques are suitable for other modern computer aided analysis programs. The modeling hierarchy permits the designer or analyst to select the level of modeling complexity consistent with circuit size, parametric information, and accuracy requirements. Improvements have been made in the implementation of important second order effects in the transistor MOS model, in the definition of MOS building block models, and in the development of composite terminal models for MOS integrated circuits

  15. Physics with a very long neutrino factory baseline

    International Nuclear Information System (INIS)

    Gandhi, Raj; Winter, Walter

    2007-01-01

    We discuss the neutrino oscillation physics of a very long neutrino factory baseline over a broad range of lengths (between 6000 km and 9000 km), centered on the 'magic baseline' (∼7500 km) where correlations with the leptonic CP phase are suppressed by matter effects. Since the magic baseline depends only on the density, we study the impact of matter density profile effects and density uncertainties over this range, and the impact of detector locations off the optimal baseline. We find that the optimal constant density describing the physics over this entire baseline range is about 5% higher than the average matter density. This implies that the magic baseline is significantly shorter than previously inferred. However, while a single detector optimization requires fine-tuning of the (very long) baseline length, its combination with a near detector at a shorter baseline is much less sensitive to the far detector location and to uncertainties in the matter density. In addition, we point out different applications of this baseline which go beyond its excellent correlation and degeneracy resolution potential. We demonstrate that such a long baseline assists in the improvement of the θ13 precision and in the resolution of the octant degeneracy. Moreover, we show that the neutrino data from such a baseline could be used to extract the matter density along the profile up to 0.24% at 1σ for large sin²2θ13, providing a useful discriminator between different geophysical models
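The magic baseline arises from the textbook condition √2·G_F·n_e·L = 2π, at which the CP-phase-dependent terms in the oscillation probability vanish. A back-of-envelope check (not from the paper; the matter potential constant is a standard value, and the density and electron fraction below are illustrative assumptions) reproduces the ∼7500 km scale and shows why a 5% higher effective density shortens it:

```python
import math

HBAR_C_M = 1.9732705e-7   # hbar*c in eV*m, to convert 1/eV to metres

def magic_baseline_km(rho_g_cm3, Y_e=0.494):
    # Matter potential V = sqrt(2)*G_F*n_e = 7.56e-14 eV * Y_e * rho[g/cm^3]
    V = 7.56e-14 * Y_e * rho_g_cm3
    L_inv_eV = 2 * math.pi / V            # baseline in natural units (1/eV)
    return L_inv_eV * HBAR_C_M / 1000.0   # convert to km

print(round(magic_baseline_km(4.3)))         # assumed average line density
print(round(magic_baseline_km(4.3 * 1.05)))  # 5% higher density -> shorter L
```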

  16. Updating the U.S. Life Cycle GHG Petroleum Baseline to 2014 with Projections to 2040 Using Open-Source Engineering-Based Models.

    Science.gov (United States)

    Cooney, Gregory; Jamieson, Matthew; Marriott, Joe; Bergerson, Joule; Brandt, Adam; Skone, Timothy J

    2017-01-17

    The National Energy Technology Laboratory produced a well-to-wheels (WTW) life cycle greenhouse gas analysis of petroleum-based fuels consumed in the U.S. in 2005, known as the NETL 2005 Petroleum Baseline. This study uses a set of engineering-based, open-source models combined with publicly available data to calculate baseline results for 2014. The increase between the 2005 baseline and the 2014 results presented here (e.g., 92.4 vs 96.2 g CO2e/MJ gasoline, +4.1%) is due to changes both in the modeling platform and in the U.S. petroleum sector. An updated result for 2005 was calculated to minimize the effect of the change in modeling platform, and emissions for gasoline in 2014 were about 2% lower than in 2005 (98.1 vs 96.2 g CO2e/MJ gasoline). The same methods were utilized to forecast emissions from fuels out to 2040, indicating maximum changes from the 2014 gasoline result of between +2.1% and -1.4%. The changing baseline values lead to potential compliance challenges with frameworks such as the Energy Independence and Security Act (EISA) Section 526, which states that Federal agencies should not purchase alternative fuels unless their life cycle GHG emissions are less than those of conventionally produced, petroleum-derived fuels.

  17. A Kalman filter-based short baseline RTK algorithm for single-frequency combination of GPS and BDS.

    Science.gov (United States)

    Zhao, Sihao; Cui, Xiaowei; Guan, Feng; Lu, Mingquan

    2014-08-20

    The emerging Global Navigation Satellite Systems (GNSS) including the BeiDou Navigation Satellite System (BDS) offer more visible satellites for positioning users. To employ those new satellites in a real-time kinematic (RTK) algorithm to enhance positioning precision and availability, a data processing model for the dual constellation of GPS and BDS is proposed and analyzed. A Kalman filter-based algorithm is developed to estimate the float ambiguities for short baseline scenarios. The entire work process of the high-precision algorithm based on the proposed model is investigated in detail. The model is validated with real GPS and BDS data recorded from one zero-baseline and two short-baseline experiments. Results show that the proposed algorithm can generate fixed baseline output with the same precision level as that of either a single GPS or BDS RTK algorithm. The significantly improved fix rate and time to first fix of the proposed method demonstrate better availability and effectiveness in processing multi-GNSS data.
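The float-ambiguity estimation step can be sketched with a toy Kalman filter. The real RTK filter in this setting carries baseline coordinates plus double-differenced GPS/BDS ambiguities; for illustration only, the sketch below assumes the geometry has already been removed, so the state is just a vector of constant float ambiguities observed with carrier-phase noise. All noise levels and values are invented.

```python
import numpy as np

rng = np.random.default_rng(1)
true_amb = np.array([3.0, -5.0, 12.0])   # unknown ambiguities, in cycles
n = true_amb.size

x = np.zeros(n)                 # state estimate (float ambiguities)
P = np.eye(n) * 100.0           # large initial uncertainty
R = np.eye(n) * 0.01            # measurement noise covariance (cycles^2)
H = np.eye(n)                   # ambiguities observed directly in this toy model

for _ in range(50):
    z = true_amb + rng.normal(0, 0.1, n)   # simulated phase-derived measurement
    # Kalman update (state is constant here, so no prediction step is needed)
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(n) - K @ H) @ P

fixed = np.round(x)   # naive integer fixing; real systems use e.g. LAMBDA
print(x, fixed)
```

After a few dozen epochs the float solution converges close enough to the integers that even naive rounding fixes them; production RTK uses a proper integer least-squares search instead.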

  18. A Kalman Filter-Based Short Baseline RTK Algorithm for Single-Frequency Combination of GPS and BDS

    Directory of Open Access Journals (Sweden)

    Sihao Zhao

    2014-08-01

    Full Text Available The emerging Global Navigation Satellite Systems (GNSS) including the BeiDou Navigation Satellite System (BDS) offer more visible satellites for positioning users. To employ those new satellites in a real-time kinematic (RTK) algorithm to enhance positioning precision and availability, a data processing model for the dual constellation of GPS and BDS is proposed and analyzed. A Kalman filter-based algorithm is developed to estimate the float ambiguities for short baseline scenarios. The entire work process of the high-precision algorithm based on the proposed model is investigated in detail. The model is validated with real GPS and BDS data recorded from one zero-baseline and two short-baseline experiments. Results show that the proposed algorithm can generate fixed baseline output with the same precision level as that of either a single GPS or BDS RTK algorithm. The significantly improved fix rate and time to first fix of the proposed method demonstrate better availability and effectiveness in processing multi-GNSS data.

  19. Baseline for the cumulants of net-proton distributions at STAR

    International Nuclear Information System (INIS)

    Luo, Xiaofeng; Mohanty, Bedangadas; Xu, Nu

    2014-01-01

    We present a systematic comparison between the recently measured cumulants of the net-proton distributions by STAR for 0–5% central Au + Au collisions at √sNN = 7.7–200 GeV and two kinds of possible baseline measures, the Poisson and Binomial baselines. These baseline measures assume that the proton and anti-proton distributions independently follow Poisson statistics or Binomial statistics. The higher-order cumulants of the net-proton data are observed to deviate from all the baseline measures studied at 19.6 and 27 GeV. We also compare the net-proton with net-baryon fluctuations in the UrQMD and AMPT models, and convert the net-proton fluctuations to net-baryon fluctuations in the AMPT model by using a set of formulae
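Under the Poisson baseline described above, the net-proton number Np − Np̄ follows a Skellam distribution, whose odd cumulants equal μp − μp̄ and even cumulants equal μp + μp̄. A quick numerical check of this closed form (the means below are invented, not STAR values):

```python
import numpy as np

mu_p, mu_pbar = 15.0, 4.0
rng = np.random.default_rng(2)
# Independent Poisson protons and antiprotons -> Skellam net-proton number
net = rng.poisson(mu_p, 1_000_000) - rng.poisson(mu_pbar, 1_000_000)

def cumulants(x):
    # First four cumulants from central moments: C4 = m4 - 3*m2^2
    c1 = x.mean()
    m = x - c1
    c2 = np.mean(m**2)
    c3 = np.mean(m**3)
    c4 = np.mean(m**4) - 3 * c2**2
    return [c1, c2, c3, c4]

skellam = [mu_p - mu_pbar, mu_p + mu_pbar, mu_p - mu_pbar, mu_p + mu_pbar]
for c, b in zip(cumulants(net), skellam):
    print(round(float(c), 2), b)
```

Measured cumulants are compared against this baseline; deviations (as seen in the data at 19.6 and 27 GeV) signal non-Poissonian correlations.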

  20. Office of Geologic Repositories program baseline procedures notebook (OGR/B-1)

    International Nuclear Information System (INIS)

    1986-06-01

    Baseline management is typically applied to aid in the internal control of a program by providing consistent programmatic direction, control, and surveillance to an evolving system development. This fundamental concept of internal program control involves the establishment of a baseline to serve as a point of departure for consistent technical program coordination and to control subsequent changes from that baseline. The existence of a program-authorized baseline ensures that all participants are working to the same ground rules. Baseline management also ensures that, once the baseline is defined, changes are assessed and approved by a process which ensures adequate consideration of overall program impact. Baseline management also includes the consideration of exemptions from the baseline. The process of baseline management continues through all the phases of an evolving system development program. As the Program proceeds, there will be a progressive increase in the data contained in the baseline documentation. Baseline management has been selected as a management technique to aid in the internal control of the Office of Geologic Repositories (OGR) program. Specifically, an OGR Program Baseline, including technical and programmatic requirements, is used for program control of the four Mined Geologic Disposal System field projects, i.e., Basalt Waste Isolation Project, Nevada Nuclear Waste Storage Investigation, Salt Repository Project and Crystalline Repository Project. This OGR Program Baseline Procedures Notebook provides a description of the baseline management concept, establishes the OGR Program baseline itself, and provides procedures to be followed for controlling changes to that baseline. The notebook has a controlled distribution and will be updated as required

  1. Multilevel models for multiple-baseline data: modeling across-participant variation in autocorrelation and residual variance.

    Science.gov (United States)

    Baek, Eun Kyeng; Ferron, John M

    2013-03-01

    Multilevel models (MLM) have been used as a method for analyzing multiple-baseline single-case data. However, some concerns can be raised because the models that have been used assume that the Level-1 error covariance matrix is the same for all participants. The purpose of this study was to extend the application of MLM of single-case data in order to accommodate across-participant variation in the Level-1 residual variance and autocorrelation. This more general model was then used in the analysis of single-case data sets to illustrate the method, to estimate the degree to which the autocorrelation and residual variances differed across participants, and to examine whether inferences about treatment effects were sensitive to whether or not the Level-1 error covariance matrix was allowed to vary across participants. The results from the analyses of five published studies showed that when the Level-1 error covariance matrix was allowed to vary across participants, some relatively large differences in autocorrelation estimates and error variance estimates emerged. The changes in modeling the variance structure did not change the conclusions about which fixed effects were statistically significant in most of the studies, but there was one exception. The fit indices did not consistently support selecting either the more complex covariance structure, which allowed the covariance parameters to vary across participants, or the simpler covariance structure. Given the uncertainty in model specification that may arise when modeling single-case data, researchers should consider conducting sensitivity analyses to examine the degree to which their conclusions are sensitive to modeling choices.
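The Level-1 heterogeneity at issue can be illustrated with a small simulation: each participant's residuals follow an AR(1) process, but with a participant-specific autocorrelation and residual SD, which a single pooled error covariance matrix would misstate. The values below are invented, and a real analysis would fit the multilevel model rather than merely inspect lag-1 correlations.

```python
import numpy as np

rng = np.random.default_rng(3)
phis = [0.1, 0.4, 0.7]     # participant-specific autocorrelations
sigmas = [1.0, 2.0, 0.5]   # participant-specific residual SDs
T = 2000                   # long series so the estimates are stable

def ar1_series(phi, sigma, T):
    e = np.empty(T)
    e[0] = rng.normal(0, sigma / np.sqrt(1 - phi**2))  # stationary start
    for t in range(1, T):
        e[t] = phi * e[t - 1] + rng.normal(0, sigma)
    return e

estimates = []
for phi, sigma in zip(phis, sigmas):
    e = ar1_series(phi, sigma, T)
    estimates.append(float(np.corrcoef(e[:-1], e[1:])[0, 1]))

print([round(v, 2) for v in estimates])
```

The per-participant lag-1 correlations recover the distinct true values, which is exactly the variation the extended MLM allows for.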

  2. Beyond total treatment effects in randomised controlled trials: Baseline measurement of intermediate outcomes needed to reduce confounding in mediation investigations.

    Science.gov (United States)

    Landau, Sabine; Emsley, Richard; Dunn, Graham

    2018-06-01

    Random allocation avoids confounding bias when estimating the average treatment effect. For continuous outcomes measured at post-treatment as well as prior to randomisation (baseline), analyses based on (A) post-treatment outcome alone, (B) change scores over the treatment phase or (C) conditioning on baseline values (analysis of covariance) provide unbiased estimators of the average treatment effect. The decision to include baseline values of the clinical outcome in the analysis is based on precision arguments, with analysis of covariance known to be most precise. Investigators increasingly carry out explanatory analyses to decompose total treatment effects into components that are mediated by an intermediate continuous outcome and a non-mediated part. Traditional mediation analysis might be performed based on (A) post-treatment values of the intermediate and clinical outcomes alone, (B) respective change scores or (C) conditioning on baseline measures of both intermediate and clinical outcomes. Using causal diagrams and Monte Carlo simulation, we investigated the performance of the three competing mediation approaches. We considered a data generating model that included three possible confounding processes involving baseline variables: The first two processes modelled baseline measures of the clinical variable or the intermediate variable as common causes of post-treatment measures of these two variables. The third process allowed the two baseline variables themselves to be correlated due to past common causes. We compared the analysis models implied by the competing mediation approaches with this data generating model to hypothesise likely biases in estimators, and tested these in a simulation study. We applied the methods to a randomised trial of pragmatic rehabilitation in patients with chronic fatigue syndrome, which examined the role of limiting activities as a mediator. 
Estimates of causal mediation effects derived by approach (A) will be biased if one of
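The precision argument summarized above, that estimators (A), (B) and (C) are all unbiased for the average treatment effect but analysis of covariance is the most precise, can be checked with a small simulation. The effect size, baseline correlation and sample size below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)
n, reps, tau, rho = 200, 2000, 0.5, 0.6
est = {"post": [], "change": [], "ancova": []}

for _ in range(reps):
    g = rng.integers(0, 2, n)                 # random allocation
    base = rng.normal(0, 1, n)                # baseline outcome
    post = rho * base + tau * g + rng.normal(0, np.sqrt(1 - rho**2), n)
    # (A) post-treatment outcome alone
    est["post"].append(post[g == 1].mean() - post[g == 0].mean())
    # (B) change score
    chg = post - base
    est["change"].append(chg[g == 1].mean() - chg[g == 0].mean())
    # (C) ANCOVA: regress post on intercept, group and baseline
    X = np.column_stack([np.ones(n), g, base])
    beta, *_ = np.linalg.lstsq(X, post, rcond=None)
    est["ancova"].append(float(beta[1]))

for name, vals in est.items():
    print(name, round(float(np.mean(vals)), 3), round(float(np.std(vals)), 3))
```

All three estimator means sit near the true effect of 0.5, while the empirical standard errors order as ANCOVA < change score < post-only, matching the precision argument.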

  3. A simplified baseline prediction model for joint damage progression in rheumatoid arthritis: a step toward personalized medicine.

    Science.gov (United States)

    de Punder, Yvonne M R; van Riel, Piet L C M; Fransen, Jaap

    2015-03-01

    To compare the performance of an extended model and a simplified prognostic model for joint damage in rheumatoid arthritis (RA) based on 3 baseline risk factors: anticyclic citrullinated peptide antibodies (anti-CCP), erosions, and acute-phase reaction. Data were used from the Nijmegen early RA cohort. An extended model and a simplified baseline prediction model were developed to predict joint damage progression between 0 and 3 years. Joint damage progression was assessed using the Ratingen score. In the extended model, prediction factors were positivity for anti-CCP and/or rheumatoid factor, the level of erythrocyte sedimentation rate, and the quantity of erosions. The prediction score was calculated as the sum of the regression coefficients. In the simplified model, the prediction factors were dichotomized and the number of risk factors was counted. The performances of both models were compared in terms of discrimination and calibration. The models were internally validated using bootstrapping. The extended model resulted in a prediction score between 0 and 5.6 with an area under the receiver-operating characteristic (ROC) curve of 0.77 (95% CI 0.72-0.81). The simplified model resulted in a prediction score between 0 and 3. This model had an area under the ROC curve of 0.75 (95% CI 0.70-0.80). In internal validation, the 2 models showed reasonably good agreement between observed and predicted probabilities of joint damage progression (Hosmer-Lemeshow test p > 0.05 and calibration slope near 1.0). A simple prediction model for joint damage progression in early RA, obtained by only counting the number of risk factors, has adequate performance. This facilitates the translation of theoretical prognostic models to daily clinical practice.
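The simplified scoring idea, dichotomize the three baseline predictors, count how many are present (0-3), and check discrimination via the area under the ROC curve, can be sketched as follows. The patients, prevalences and outcome model here are simulated assumptions, not the Nijmegen cohort.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 1000
anti_ccp = rng.random(n) < 0.5   # anti-CCP and/or RF positive (assumed prevalence)
erosions = rng.random(n) < 0.4   # erosions present at baseline
high_esr = rng.random(n) < 0.3   # elevated acute-phase reaction

score = anti_ccp.astype(int) + erosions + high_esr   # count of risk factors, 0..3
# Simulate progression risk increasing with the score (made-up logistic model).
p = 1.0 / (1.0 + np.exp(-(-2.0 + 1.0 * score)))
progressed = rng.random(n) < p

def auc(s, outcome):
    # Mann-Whitney formulation of the AUC: probability that a random
    # progressor outscores a random non-progressor, ties counting half.
    pos, neg = s[outcome], s[~outcome]
    wins = (pos[:, None] > neg[None, :]).mean()
    ties = (pos[:, None] == neg[None, :]).mean()
    return wins + 0.5 * ties

print(round(float(auc(score, progressed)), 2))
```

Even with only four possible score values, the count-based score discriminates progressors from non-progressors reasonably well, which is the practical appeal of the simplified model.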

  4. Baseline for Climate Change: Modeling Watershed Aquatic Biodiversity Relative to Environmental and Anthropogenic Factors

    Energy Technology Data Exchange (ETDEWEB)

    Maurakis, Eugene G

    2010-10-01

    Objectives of the two-year study were to (1) establish baselines for fish and macroinvertebrate community structures in two mid-Atlantic lower Piedmont watersheds (Quantico Creek, a pristine forest watershed; and Cameron Run, an urban watershed, Virginia) that can be used to monitor changes relative to the impacts of climate change in the future; (2) create mathematical expressions to model fish species richness and diversity, and macroinvertebrate taxa and macroinvertebrate functional feeding group taxa richness and diversity, that can serve as a baseline for future comparisons in these and other watersheds in the mid-Atlantic region; and (3) heighten people’s awareness, knowledge and understanding of climate change and its impacts on watersheds through a laboratory experience and interactive exhibits, internship opportunities for undergraduate and graduate students, a week-long teacher workshop, and a website about climate change and watersheds. Mathematical expressions modeled fish and macroinvertebrate richness and diversity accurately during most of the six thermal seasons where sample sizes were robust. Additionally, hydrologic models provide the basis for estimating flows under varying meteorological conditions and landscape changes. Continuations of long-term studies are requisite for accurately teasing apart local human influences (e.g. urbanization and watershed alteration) and global anthropogenic impacts (e.g. climate change) on watersheds. Effective and skillful translations (e.g. annual potential exposure of 750,000 people to our inquiry-based laboratory activities and interactive exhibits in Virginia) of the results of scientific investigations are valuable ways of communicating information to the general public to enhance their understanding of climate change and its effects in watersheds.

  5. Performance Analysis for Airborne Interferometric SAR Affected by Flexible Baseline Oscillation

    Directory of Open Access Journals (Sweden)

    Liu Zhong-sheng

    2014-04-01

    Full Text Available The airborne interferometric SAR platform suffers from instability factors, such as air turbulence and mechanical vibrations during flight. Such factors cause the oscillation of the flexible baseline, which leads to significant degradation of the performance of the interferometric SAR system. This study is concerned with the baseline oscillation. First, the error of the slant range model under baseline oscillation conditions is formulated. Then, the SAR complex image signal and dual-channel correlation coefficient are modeled based on the first-order, second-order, and generic slant range error. Subsequently, the impact of the baseline oscillation on the imaging and interferometric performance of the SAR system is analyzed. Finally, simulations of the echo data are used to validate the theoretical analysis of the baseline oscillation in the airborne interferometric SAR.

  6. The Benefits of Including Clinical Factors in Rectal Normal Tissue Complication Probability Modeling After Radiotherapy for Prostate Cancer

    International Nuclear Information System (INIS)

    Defraene, Gilles; Van den Bergh, Laura; Al-Mamgani, Abrahim; Haustermans, Karin; Heemsbergen, Wilma; Van den Heuvel, Frank; Lebesque, Joos V.

    2012-01-01

    Purpose: To study the impact of clinical predisposing factors on rectal normal tissue complication probability modeling using the updated results of the Dutch prostate dose-escalation trial. Methods and Materials: Toxicity data of 512 patients (conformally treated to 68 Gy [n = 284] and 78 Gy [n = 228]) with complete follow-up at 3 years after radiotherapy were studied. Scored end points were rectal bleeding, high stool frequency, and fecal incontinence. Two traditional dose-based models (Lyman-Kutcher-Burman [LKB] and Relative Seriality [RS]) and a logistic model were fitted using a maximum likelihood approach. Furthermore, these model fits were improved by including the most significant clinical factors. The area under the receiver operating characteristic curve (AUC) was used to compare the discriminating ability of all fits. Results: Including clinical factors significantly increased the predictive power of the models for all end points. In the optimal LKB, RS, and logistic models for rectal bleeding and fecal incontinence, the first significant (p = 0.011–0.013) clinical factor was “previous abdominal surgery.” As second significant (p = 0.012–0.016) factor, “cardiac history” was included in all three rectal bleeding fits, whereas including “diabetes” was significant (p = 0.039–0.048) in fecal incontinence modeling but only in the LKB and logistic models. High stool frequency fits only benefitted significantly (p = 0.003–0.006) from the inclusion of the baseline toxicity score. Rectal bleeding fits had the highest AUC (0.77) for all models, compared with 0.63 for high stool frequency and 0.68 for fecal incontinence. LKB and logistic model fits resulted in similar values for the volume parameter. The steepness parameter was somewhat higher in the logistic model, also resulting in a slightly lower D50. Anal wall DVHs were used for fecal incontinence, whereas anorectal wall dose best described the other two end points. Conclusions
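    The LKB model named above reduces a dose-volume histogram to a generalized equivalent uniform dose (gEUD) governed by the volume parameter n, then maps it through a probit function with position D50 and steepness m. A minimal sketch of the standard formulation, assuming illustrative parameter values and a toy DVH (these are not the fitted values from the trial):

```python
import math

def geud(doses_volumes, n):
    """Generalized EUD from (dose_Gy, fractional_volume) DVH bins.

    n is the LKB volume parameter; small n makes the organ behave serially
    (gEUD dominated by the hottest bins)."""
    a = 1.0 / n
    return sum(v * d ** a for d, v in doses_volumes) ** (1.0 / a)

def lkb_ntcp(doses_volumes, d50, m, n):
    """LKB NTCP: standard-normal CDF of t = (gEUD - D50) / (m * D50)."""
    t = (geud(doses_volumes, n) - d50) / (m * d50)
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

# Toy rectal-wall DVH: 40% of the wall at 70 Gy, 60% at 30 Gy (illustrative).
dvh = [(70.0, 0.4), (30.0, 0.6)]
p = lkb_ntcp(dvh, d50=80.0, m=0.15, n=0.1)   # complication probability
```

With n = 0.1 the gEUD sits near the hot 70 Gy region, and the probit map turns its distance from D50 into a probability; adding clinical factors, as the paper does, amounts to shifting these dose-response parameters per patient subgroup.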

  7. The Benefits of Including Clinical Factors in Rectal Normal Tissue Complication Probability Modeling After Radiotherapy for Prostate Cancer

    Energy Technology Data Exchange (ETDEWEB)

    Defraene, Gilles, E-mail: gilles.defraene@uzleuven.be [Radiation Oncology Department, University Hospitals Leuven, Leuven (Belgium); Van den Bergh, Laura [Radiation Oncology Department, University Hospitals Leuven, Leuven (Belgium); Al-Mamgani, Abrahim [Department of Radiation Oncology, Erasmus Medical Center - Daniel den Hoed Cancer Center, Rotterdam (Netherlands); Haustermans, Karin [Radiation Oncology Department, University Hospitals Leuven, Leuven (Belgium); Heemsbergen, Wilma [Netherlands Cancer Institute - Antoni van Leeuwenhoek Hospital, Amsterdam (Netherlands); Van den Heuvel, Frank [Radiation Oncology Department, University Hospitals Leuven, Leuven (Belgium); Lebesque, Joos V. [Netherlands Cancer Institute - Antoni van Leeuwenhoek Hospital, Amsterdam (Netherlands)

    2012-03-01

    Purpose: To study the impact of clinical predisposing factors on rectal normal tissue complication probability modeling using the updated results of the Dutch prostate dose-escalation trial. Methods and Materials: Toxicity data of 512 patients (conformally treated to 68 Gy [n = 284] and 78 Gy [n = 228]) with complete follow-up at 3 years after radiotherapy were studied. Scored end points were rectal bleeding, high stool frequency, and fecal incontinence. Two traditional dose-based models (Lyman-Kutcher-Burman [LKB] and Relative Seriality [RS]) and a logistic model were fitted using a maximum likelihood approach. Furthermore, these model fits were improved by including the most significant clinical factors. The area under the receiver operating characteristic curve (AUC) was used to compare the discriminating ability of all fits. Results: Including clinical factors significantly increased the predictive power of the models for all end points. In the optimal LKB, RS, and logistic models for rectal bleeding and fecal incontinence, the first significant (p = 0.011-0.013) clinical factor was 'previous abdominal surgery.' As second significant (p = 0.012-0.016) factor, 'cardiac history' was included in all three rectal bleeding fits, whereas including 'diabetes' was significant (p = 0.039-0.048) in fecal incontinence modeling but only in the LKB and logistic models. High stool frequency fits only benefitted significantly (p = 0.003-0.006) from the inclusion of the baseline toxicity score. Rectal bleeding fits had the highest AUC (0.77) for all models, compared with 0.63 for high stool frequency and 0.68 for fecal incontinence. LKB and logistic model fits resulted in similar values for the volume parameter. The steepness parameter was somewhat higher in the logistic model, also resulting in a slightly lower D50. Anal wall DVHs were used for fecal incontinence, whereas anorectal wall dose best described the other two end points

  8. Baseline Estimation and Outlier Identification for Halocarbons

    Science.gov (United States)

    Wang, D.; Schuck, T.; Engel, A.; Gallman, F.

    2017-12-01

    The aim of this paper is to build a baseline model for halocarbons and to statistically identify outliers under specific conditions. Time series of regional CFC-11 and chloromethane measurements taken over the last 4 years at two locations are discussed: a monitoring station northwest of Frankfurt am Main (Germany) and the Mace Head station (Ireland). In addition to analyzing the time series of CFC-11 and chloromethane, a statistical approach to outlier identification is introduced in order to better estimate the baseline. A second-order polynomial plus harmonics is fitted to the CFC-11 and chloromethane mixing-ratio data. Measurements far from the fitted curve are regarded as outliers and flagged. The routine is applied iteratively, excluding the flagged measurements, until no additional outliers are found. Both the model fitting and the proposed outlier identification are implemented in Python. Over this period, CFC-11 shows a gradual downward trend, while the mixing ratios of chloromethane show a slight upward trend. The concentration of chloromethane also has a strong seasonal variation, mostly due to the seasonal cycle of OH. The statistical method has a considerable effect on the results: it efficiently identifies outliers according to the standard-deviation requirements, and after removing them the fitting curves and trend estimates are more reliable.
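    The routine described above can be sketched directly: fit a second-order polynomial plus an annual harmonic by least squares, flag points whose residuals exceed a standard-deviation threshold, and refit without them until the flag set stabilizes. The data, the 3-sigma threshold, and the single annual harmonic below are illustrative assumptions, not the paper's exact configuration:

```python
import numpy as np

def design(t):
    """Second-order polynomial plus annual harmonic (t in years)."""
    w = 2 * np.pi
    return np.column_stack([np.ones_like(t), t, t**2,
                            np.sin(w * t), np.cos(w * t)])

def baseline_fit(t, y, nsigma=3.0, max_iter=20):
    """Iteratively fit the baseline curve, flagging residuals > nsigma*std."""
    keep = np.ones(len(t), dtype=bool)
    for _ in range(max_iter):
        coef, *_ = np.linalg.lstsq(design(t[keep]), y[keep], rcond=None)
        resid = y - design(t) @ coef
        new_keep = np.abs(resid) <= nsigma * resid[keep].std()
        if np.array_equal(new_keep, keep):
            break   # no new outliers: converged
        keep = new_keep
    return coef, keep

# Synthetic 4-year mixing-ratio series: downward trend, seasonal cycle,
# white noise, and two pollution spikes (all values illustrative, in ppt).
rng = np.random.default_rng(0)
t = np.linspace(0, 4, 200)
y = 230 - 2.0 * t + 1.5 * np.sin(2 * np.pi * t) + rng.normal(0, 0.3, t.size)
y[[20, 100]] += 8.0   # two pollution events

coef, keep = baseline_fit(t, y)   # keep == False marks flagged outliers
```

The two injected spikes are flagged on the first pass; the refit then recovers the seasonal amplitude and trend from the remaining baseline points, mirroring the paper's observation that removing outliers makes the trend estimates more reliable.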

  9. Damage Identification of Bridge Based on Chebyshev Polynomial Fitting and Fuzzy Logic without Considering Baseline Model Parameters

    Directory of Open Access Journals (Sweden)

    Yu-Bo Jiao

    2015-01-01

    Full Text Available The paper presents an effective approach for damage identification of bridges based on Chebyshev polynomial fitting and fuzzy logic systems, without requiring baseline model data. The modal curvature of the damaged bridge can be obtained through central difference approximation based on the displacement mode shape. From the modal curvature of the damaged structure, Chebyshev polynomial fitting is applied to approximate the curvature of the undamaged one without considering baseline parameters. The modal curvature difference can therefore be derived and used for damage localization. Subsequently, the normalized modal curvature difference is treated as the input variable of fuzzy logic systems for damage condition assessment. Numerical simulation on a simply supported bridge was carried out to demonstrate the feasibility of the proposed method.
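    The two ingredients named above can be sketched together: curvature from a displacement mode shape by central differences, and a low-order Chebyshev fit of the damaged curvature standing in for the unknown undamaged baseline. The first-mode shape of a simply supported beam and the small local stiffness loss below are illustrative assumptions, not the paper's numerical model:

```python
import numpy as np

x = np.linspace(0.0, 1.0, 101)   # normalized span of the bridge
h = x[1] - x[0]
mode = np.sin(np.pi * x)         # first mode of a simply supported beam
mode[48:53] *= 0.98              # illustrative local damage near mid-span

# Central-difference modal curvature: k_i = (y_{i-1} - 2 y_i + y_{i+1}) / h^2
curv = (mode[:-2] - 2 * mode[1:-1] + mode[2:]) / h**2

# A low-order Chebyshev fit smooths over the local damage notch, so it
# approximates the undamaged curvature without any baseline model data.
fit = np.polynomial.chebyshev.Chebyshev.fit(x[1:-1], curv, deg=4)
diff = np.abs(curv - fit(x[1:-1]))

# The curvature difference peaks at the damaged region.
peak = int(round(x[1:-1][np.argmax(diff)] * 100))
```

Here `diff` plays the role of the modal curvature difference: its peak localizes the damage (near grid point 50), and its normalized value is what the paper feeds into the fuzzy logic system for severity assessment.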

  10. Environmental Modeling, A goal of the Baseline Sampling and Analysis program is to determine baseline levels of select priority pollutants and petroleum markers in areas with high probability for oil spills., Published in 1999, 1:24000 (1in=2000ft) scale, Louisiana State University (LSU).

    Data.gov (United States)

    NSGIC Education | GIS Inventory — Environmental Modeling dataset current as of 1999. A goal of the Baseline Sampling and Analysis program is to determine baseline levels of select priority pollutants...

  11. National greenhouse gas emissions baseline scenarios. Learning from experiences in developing countries

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2013-04-15

    This report reviews national approaches to preparing baseline scenarios of greenhouse-gas (GHG) emissions. It does so by describing and comparing in non-technical language existing practices and choices made by ten developing countries - Brazil, China, Ethiopia, India, Indonesia, Kenya, Mexico, South Africa, Thailand and Vietnam. The review focuses on a number of key elements, including model choices, transparency considerations, choices about underlying assumptions and challenges associated with data management. The aim is to improve overall understanding of baseline scenarios and facilitate their use for policy-making in developing countries more broadly. The findings are based on the results of a collaborative project involving a number of activities undertaken by the Danish Energy Agency, the Organisation for Economic Co-operation and Development (OECD) and the UNEP Risoe Centre (URC), including a series of workshops on the subject. The ten contributing countries account for approximately 40% of current global GHG emissions - a share that is expected to increase in the future. The breakdown of emissions by sector varies widely among these countries. In some countries, the energy sector is the leading source of emissions; for others, the land-use sector and/or agricultural sector dominate emissions. The report underscores some common technical and financial capacity gaps faced by developing countries when preparing baseline scenarios. It does not endeavour to propose guidelines for preparing baseline scenarios. Rather, it is hoped that the report will inform any future attempts at preparing such kind of guidelines. (Author)

  12. Baseline Tumor Lipiodol Uptake after Transarterial Chemoembolization for Hepatocellular Carcinoma: Identification of a Threshold Value Predicting Tumor Recurrence.

    Science.gov (United States)

    Matsui, Yusuke; Horikawa, Masahiro; Jahangiri Noudeh, Younes; Kaufman, John A; Kolbeck, Kenneth J; Farsad, Khashayar

    2017-12-01

    The aim of the study was to evaluate the association between baseline Lipiodol uptake in hepatocellular carcinoma (HCC) after transarterial chemoembolization (TACE) and early tumor recurrence, and to identify a threshold baseline uptake value predicting tumor response. A single-institution retrospective database of HCC treated with Lipiodol-TACE was reviewed. Forty-six tumors in 30 patients treated with a Lipiodol-chemotherapy emulsion and no additional particle embolization were included. Baseline Lipiodol uptake was measured as the mean Hounsfield units (HU) on a CT within one week after TACE. Washout rate was calculated by dividing the difference in HU between the baseline CT and follow-up CT by time (HU/month). Cox proportional hazard models were used to correlate baseline Lipiodol uptake and other variables with tumor response. A receiver operating characteristic (ROC) curve was used to identify the optimal threshold for baseline Lipiodol uptake predicting tumor response. During the follow-up period (mean 5.6 months), 19 (41.3%) tumors recurred (mean time to recurrence = 3.6 months). In a multivariate model, low baseline Lipiodol uptake (P = 0.001) and higher washout rate were significant predictors of early tumor recurrence. Baseline Lipiodol uptake and washout rate on follow-up were independent predictors of early tumor recurrence. A threshold value of baseline Lipiodol uptake > 270.2 HU was highly sensitive and specific for tumor response. These findings may prove useful for determining subsequent treatment strategies after Lipiodol TACE.
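    A common way to pick an ROC threshold like the 270.2 HU cut-off above is to maximize Youden's J (sensitivity + specificity − 1) over candidate cut-offs. A minimal sketch with invented data (the abstract does not state which criterion the authors used, and the uptake values and outcomes below are purely illustrative):

```python
def youden_threshold(uptake_hu, responded):
    """Return the cut-off maximizing J = sensitivity + specificity - 1,
    classifying uptake > cut-off as a predicted responder."""
    best_j, best_t = -1.0, None
    for t in sorted(set(uptake_hu)):
        tp = sum(u > t and r for u, r in zip(uptake_hu, responded))
        fn = sum(u <= t and r for u, r in zip(uptake_hu, responded))
        tn = sum(u <= t and not r for u, r in zip(uptake_hu, responded))
        fp = sum(u > t and not r for u, r in zip(uptake_hu, responded))
        sens = tp / (tp + fn) if tp + fn else 0.0
        spec = tn / (tn + fp) if tn + fp else 0.0
        j = sens + spec - 1.0
        if j > best_j:
            best_j, best_t = j, t
    return best_t, best_j

# Illustrative cohort: baseline uptake (HU) and whether the tumor responded.
uptake = [310, 295, 280, 275, 240, 230, 220, 180]
responded = [True, True, True, True, False, True, False, False]
t, j = youden_threshold(uptake, responded)
```

On this toy cohort the sweep picks 240 HU (J = 0.8); on real data the same sweep over all observed uptake values yields the sensitivity/specificity trade-off reported in the abstract.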

  13. Baseline Evaluations to Support Control Room Modernization at Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald L.; Joe, Jeffrey C.

    2015-02-01

    For any major control room modernization activity at a commercial nuclear power plant (NPP) in the U.S., a utility should carefully follow the four phases prescribed by the U.S. Nuclear Regulatory Commission in NUREG-0711, Human Factors Engineering Program Review Model. These four phases include Planning and Analysis, Design, Verification and Validation, and Implementation and Operation. While NUREG-0711 is a useful guideline, it is written primarily from the perspective of regulatory review, and it therefore does not provide a nuanced account of many of the steps the utility might undertake as part of control room modernization. The guideline is largely summative—intended to catalog final products—rather than formative—intended to guide the overall modernization process. In this paper, we highlight two crucial formative sub-elements of the Planning and Analysis phase specific to control room modernization that are not covered in NUREG-0711. These two sub-elements are the usability and ergonomics baseline evaluations. A baseline evaluation entails evaluating the system as-built and currently in use. The usability baseline evaluation provides key insights into operator performance using the control system currently in place. The ergonomics baseline evaluation identifies possible deficiencies in the physical configuration of the control system. Both baseline evaluations feed into the design of the replacement system and subsequent summative benchmarking activities that help ensure that control room modernization represents a successful evolution of the control system.

  14. Baseline Evaluations to Support Control Room Modernization at Nuclear Power Plants

    International Nuclear Information System (INIS)

    Boring, Ronald L.; Joe, Jeffrey C.

    2015-01-01

    For any major control room modernization activity at a commercial nuclear power plant (NPP) in the U.S., a utility should carefully follow the four phases prescribed by the U.S. Nuclear Regulatory Commission in NUREG-0711, Human Factors Engineering Program Review Model. These four phases include Planning and Analysis, Design, Verification and Validation, and Implementation and Operation. While NUREG-0711 is a useful guideline, it is written primarily from the perspective of regulatory review, and it therefore does not provide a nuanced account of many of the steps the utility might undertake as part of control room modernization. The guideline is largely summative–intended to catalog final products–rather than formative–intended to guide the overall modernization process. In this paper, we highlight two crucial formative sub-elements of the Planning and Analysis phase specific to control room modernization that are not covered in NUREG-0711. These two sub-elements are the usability and ergonomics baseline evaluations. A baseline evaluation entails evaluating the system as-built and currently in use. The usability baseline evaluation provides key insights into operator performance using the control system currently in place. The ergonomics baseline evaluation identifies possible deficiencies in the physical configuration of the control system. Both baseline evaluations feed into the design of the replacement system and subsequent summative benchmarking activities that help ensure that control room modernization represents a successful evolution of the control system.

  15. Environmental baselines: preparing for shale gas in the UK

    Science.gov (United States)

    Bloomfield, John; Manamsa, Katya; Bell, Rachel; Darling, George; Dochartaigh, Brighid O.; Stuart, Marianne; Ward, Rob

    2014-05-01

    Groundwater is a vital source of freshwater in the UK. It provides almost 30% of public water supply on average, but locally, for example in south-east England, it constitutes nearly 90% of public supply. In addition to public supply, groundwater has a number of other uses including agriculture, industry, and food and drink production. It is also vital for maintaining river flows, especially during dry periods, and so is essential for maintaining ecosystem health. Recently, there have been concerns expressed about the potential impacts of shale gas development on groundwater. The UK has abundant shales and clays which are currently the focus of considerable interest, and there is active research into their characterisation, resource evaluation and exploitation risks. The British Geological Survey (BGS) is undertaking research to provide information to address some of the environmental concerns related to the potential impacts of shale gas development on groundwater resources and quality. The aim of much of this initial work is to establish environmental baselines, such as a baseline survey of methane occurrence in groundwater (National methane baseline study) and the spatial relationships between potential sources and groundwater receptors (iHydrogeology project), prior to any shale gas exploration and development. The poster describes these two baseline studies and presents preliminary findings. BGS are currently undertaking a national survey of baseline methane concentrations in groundwater across the UK. This work will enable any potential future changes in methane in groundwater associated with shale gas development to be assessed. Measurements of methane in potable water from the Cretaceous, Jurassic and Triassic carbonate and sandstone aquifers are variable and reveal methane concentrations of up to 500 micrograms per litre, but the mean value is relatively low at documented in the range 2km. The geological modelling process will be presented and discussed

  16. Analysis of Seasonal Signal in GPS Short-Baseline Time Series

    Science.gov (United States)

    Wang, Kaihua; Jiang, Weiping; Chen, Hua; An, Xiangdong; Zhou, Xiaohui; Yuan, Peng; Chen, Qusen

    2018-04-01

    Proper modeling of seasonal signals and their quantitative analysis are of interest in geoscience applications, which are based on position time series of permanent GPS stations. Seasonal signals in GPS short-baseline (paper, to better understand the seasonal signal in GPS short-baseline time series, we adopted and processed six different short-baselines with data span that varies from 2 to 14 years and baseline length that varies from 6 to 1100 m. To avoid seasonal signals that are overwhelmed by noise, each of the station pairs is chosen with significant differences in their height (> 5 m) or type of the monument. For comparison, we also processed an approximately zero baseline with a distance of pass-filtered (BP) noise is valid for approximately 40% of the baseline components, and another 20% of the components can be best modeled by a combination of the first-order Gauss-Markov (FOGM) process plus white noise (WN). The TEM displacements are then modeled by considering the monument height of the building structure beneath the GPS antenna. The median contributions of TEM to the annual amplitude in the vertical direction are 84% and 46% with and without additional parts of the monument, respectively. Obvious annual signals with amplitude > 0.4 mm in the horizontal direction are observed in five short-baselines, and the amplitudes exceed 1 mm in four of them. These horizontal seasonal signals are likely related to the propagation of daily/sub-daily TEM displacement or other signals related to the site environment. Mismodeling of the tropospheric delay may also introduce spurious seasonal signals with annual amplitudes of 5 and 2 mm, respectively, for two short-baselines with elevation differences greater than 100 m. The results suggest that the monument height of the additional part of a typical GPS station should be considered when estimating the TEM displacement and that the tropospheric delay should be modeled cautiously, especially with station pairs with

  17. Constraining proposed combinations of ice history and Earth rheology using VLBI determined baseline length rates in North America

    Science.gov (United States)

    Mitrovica, J. X.; Davis, J. L.; Shapiro, I. I.

    1993-01-01

    We predict the present-day rates of change of the lengths of 19 North American baselines due to the glacial isostatic adjustment process. Contrary to previously published research, we find that the three-dimensional motion of each of the sites defining a baseline, rather than only the radial motions of these sites, needs to be considered to obtain an accurate estimate of the rate of change of the baseline length. Predictions are generated using a suite of Earth models and late Pleistocene ice histories; these include specific combinations of the two that have been proposed in the literature as satisfying a variety of rebound-related geophysical observations from the North American region. A number of these published models are shown to predict rates which differ significantly from the VLBI observations.
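    The geometric point above follows from differentiating the baseline length L = |x2 − x1|: dL/dt = (x2 − x1)·(v2 − v1)/L, which involves the full 3-D velocity difference, not just the vertical components. A minimal sketch with invented site positions and velocities:

```python
import numpy as np

def baseline_length_rate(x1, v1, x2, v2):
    """Rate of change of baseline length: dL/dt = (x2-x1).(v2-v1) / L."""
    dx = np.asarray(x2, float) - np.asarray(x1, float)
    dv = np.asarray(v2, float) - np.asarray(v1, float)
    return dx @ dv / np.linalg.norm(dx)

# Two sites ~1000 km apart at the same height (toy Cartesian frame, metres);
# velocities in mm/yr, so the returned rate is in mm/yr.
x1, x2 = [0.0, 0.0, 6371e3], [1.0e6, 0.0, 6371e3]
v1 = [0.0, 0.0, 5.0]   # pure uplift
v2 = [3.0, 0.0, 5.0]   # uplift plus 3 mm/yr horizontal motion

rate = baseline_length_rate(x1, v1, x2, v2)   # 3.0 mm/yr
```

A radial-only treatment would predict zero length change here (both sites uplift equally), while the full 3-D treatment correctly attributes 3 mm/yr to the horizontal motion; this is exactly the discrepancy the paper highlights.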

  18. Corrective action baseline report for underground storage tank 2331-U Building 9201-1

    International Nuclear Information System (INIS)

    1994-01-01

    The purpose of this report is to provide baseline geochemical and hydrogeologic data relative to corrective action for underground storage tank (UST) 2331-U at the Building 9201-1 Site. Progress in support of the Building 9201-1 Site has included monitoring well installation and baseline groundwater sampling and analysis. This document represents the baseline report for corrective action at the Building 9201-1 site and is organized into three sections. Section 1 presents introductory information relative to the site, including the regulatory initiative, site description, and progress to date. Section 2 includes the summary of additional monitoring well installation activities and the results of baseline groundwater sampling. Section 3 presents the baseline hydrogeology and planned zone of influence for groundwater remediation

  19. Placental baseline conditions modulate the hyperoxic BOLD-MRI response.

    Science.gov (United States)

    Sinding, Marianne; Peters, David A; Poulsen, Sofie S; Frøkjær, Jens B; Christiansen, Ole B; Petersen, Astrid; Uldbjerg, Niels; Sørensen, Anne

    2018-01-01

    Human pregnancies complicated by placental dysfunction may be characterized by a high hyperoxic blood oxygen level-dependent (BOLD) MRI response. The pathophysiology behind this phenomenon remains to be established. The aim of this study was to evaluate whether it is associated with altered placental baseline conditions, including a lower oxygenation and altered tissue morphology, as estimated by the placental transverse relaxation time (T2*). We included 49 normal pregnancies (controls) and 13 pregnancies complicated by placental dysfunction (cases), defined by a low birth weight. We obtained the relative ΔBOLD response ((hyperoxic BOLD − baseline BOLD)/baseline BOLD) from a dynamic single-echo gradient-recalled echo (GRE) MRI sequence and the absolute ΔT2* (hyperoxic T2* − baseline T2*) from breath-hold multi-echo GRE sequences. In the control group, the relative ΔBOLD response increased during gestation from 5% in gestational week 20 to 20% in week 40. In the case group, the relative ΔBOLD response was significantly higher (mean Z-score 4.94; 95% CI 2.41, 7.47). The absolute ΔT2*, however, did not differ between controls and cases (p = 0.37), whereas the baseline T2* was lower among cases (mean Z-score -3.13; 95% CI -3.94, -2.32). Furthermore, we demonstrated a strong negative linear correlation between the Log10 ΔBOLD response and the baseline T2* (r = -0.88). These findings suggest that the high hyperoxic BOLD response in placental dysfunction reflects altered baseline conditions, as the absolute increase in placental oxygenation (ΔT2*) does not differ between groups. Copyright © 2017 Elsevier Ltd. All rights reserved.

  20. Testing a machine-learning algorithm to predict the persistence and severity of major depressive disorder from baseline self-reports.

    Science.gov (United States)

    Kessler, R C; van Loo, H M; Wardenaar, K J; Bossarte, R M; Brenner, L A; Cai, T; Ebert, D D; Hwang, I; Li, J; de Jonge, P; Nierenberg, A A; Petukhova, M V; Rosellini, A J; Sampson, N A; Schoevers, R A; Wilcox, M A; Zaslavsky, A M

    2016-10-01

    Heterogeneity of major depressive disorder (MDD) illness course complicates clinical decision-making. Although efforts to use symptom profiles or biomarkers to develop clinically useful prognostic subtypes have had limited success, a recent report showed that machine-learning (ML) models developed from self-reports about incident episode characteristics and comorbidities among respondents with lifetime MDD in the World Health Organization World Mental Health (WMH) Surveys predicted MDD persistence, chronicity and severity with good accuracy. We report results of model validation in an independent prospective national household sample of 1056 respondents with lifetime MDD at baseline. The WMH ML models were applied to these baseline data to generate predicted outcome scores that were compared with observed scores assessed 10-12 years after baseline. ML model prediction accuracy was also compared with that of conventional logistic regression models. Area under the receiver operating characteristic curve based on ML (0.63 for high chronicity and 0.71-0.76 for the other prospective outcomes) was consistently higher than for the logistic models (0.62-0.70) despite the latter models including more predictors. A total of 34.6-38.1% of respondents with subsequent high persistence chronicity and 40.8-55.8% with the severity indicators were in the top 20% of the baseline ML-predicted risk distribution, while only 0.9% of respondents with subsequent hospitalizations and 1.5% with suicide attempts were in the lowest 20% of the ML-predicted risk distribution. These results confirm that clinically useful MDD risk-stratification models can be generated from baseline patient self-reports and that ML methods improve on conventional methods in developing such models.
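    The headline comparison above rests on the area under the ROC curve, which can be computed via the rank-sum (Mann-Whitney) identity: AUC = P(score of a case > score of a control), with ties counted as half. A minimal sketch with invented risk scores (not the WMH data):

```python
def auc(scores, labels):
    """AUC via the Mann-Whitney identity: fraction of case/control pairs
    ranked correctly, with ties counted as 0.5."""
    pos = [s for s, y in zip(scores, labels) if y]
    neg = [s for s, y in zip(scores, labels) if not y]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Illustrative outcomes and predicted risks from two hypothetical models.
labels = [1, 1, 1, 0, 0, 0, 0, 1]
ml     = [0.9, 0.8, 0.4, 0.3, 0.2, 0.5, 0.1, 0.7]   # "ML" model scores
logit  = [0.8, 0.6, 0.3, 0.4, 0.3, 0.5, 0.2, 0.6]   # "logistic" scores

a_ml = auc(ml, labels)
a_logit = auc(logit, labels)
```

Because AUC depends only on the ranking of cases versus controls, it is the natural metric for comparing an ML risk score with a logistic one, as the paper does (0.63-0.76 versus 0.62-0.70).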

  1. MALDI-TOF Baseline Drift Removal Using Stochastic Bernstein Approximation

    Directory of Open Access Journals (Sweden)

    Howard Daniel

    2006-01-01

    Full Text Available Stochastic Bernstein (SB) approximation can tackle the problem of baseline drift correction of instrumentation data. This is demonstrated for spectral data: matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF) data. Two SB schemes for removing the baseline drift are presented: iterative and direct. Following an explanation of the origin of the MALDI-TOF baseline drift, which sheds light on the inherent difficulty of its removal by chemical means, SB baseline drift removal is illustrated for both proteomics and genomics MALDI-TOF data sets. SB is an elegant signal processing method that yields a numerically straightforward baseline shift removal, as it includes a free parameter that can be optimized for different baseline drift removal applications. Research that determines putative biomarkers from the spectral data might therefore benefit from a sensitivity analysis to the underlying spectral measurement, made possible by varying the SB free parameter, which can be tuned manually or with evolutionary computation.
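    The general idea can be sketched with the classical (deterministic) Bernstein operator rather than the authors' stochastic variant: a low-degree Bernstein polynomial of the spectrum tracks the slow drift but not the narrow peaks, so subtracting it flattens the baseline. The toy spectrum and degree below are illustrative, and the degree plays the role of a free smoothing parameter:

```python
import math

def bernstein_baseline(y, degree):
    """Classical Bernstein polynomial approximation of the sampled signal y
    on [0, 1], used here as a slowly varying baseline estimate."""
    n = len(y) - 1
    out = []
    for i in range(len(y)):
        x = i / n
        s = 0.0
        for k in range(degree + 1):
            b = math.comb(degree, k) * x**k * (1 - x)**(degree - k)
            s += b * y[round(k * n / degree)]   # sample at the Bernstein nodes
        out.append(s)
    return out

# Toy "spectrum": slow quadratic drift plus two narrow peaks.
y = [0.001 * i * (200 - i) / 10 for i in range(201)]
y[60] += 5.0
y[140] += 3.0

base = bernstein_baseline(y, degree=8)
corrected = [yi - bi for yi, bi in zip(y, base)]   # drift removed, peaks kept
```

A low degree passes under the peaks and follows only the drift; raising the degree makes the baseline hug the signal more tightly, which is the kind of sensitivity one would probe by varying the SB free parameter.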

  2. FED baseline engineering studies report

    Energy Technology Data Exchange (ETDEWEB)

    Sager, P.H.

    1983-04-01

    Studies were carried out on the FED Baseline to improve design definition, establish feasibility, and reduce cost. Emphasis was placed on cost reduction, but significant feasibility concerns existed in several areas, and better design definition was required to establish feasibility and provide a better basis for cost estimates. Design definition and feasibility studies included the development of a labyrinth shield ring concept to prevent radiation streaming between the torus spool and the TF coil cryostat. The labyrinth shield concept which was developed reduced radiation streaming sufficiently to permit contact maintenance of the inboard EF coils. Various concepts of preventing arcing between adjacent shield sectors were also explored. It was concluded that installation of copper straps with molybdenum thermal radiation shields would provide the most reliable means of preventing arcing. Other design studies included torus spool electrical/structural concepts, test module shielding, torus seismic response, poloidal conditions in the magnets, disruption characteristics, and eddy current effects. These additional studies had no significant impact on cost but did confirm the feasibility of the basic FED Baseline concept.

  3. FED baseline engineering studies report

    International Nuclear Information System (INIS)

    Sager, P.H.

    1983-04-01

    Studies were carried out on the FED Baseline to improve design definition, establish feasibility, and reduce cost. Emphasis was placed on cost reduction, but significant feasibility concerns existed in several areas, and better design definition was required to establish feasibility and provide a better basis for cost estimates. Design definition and feasibility studies included the development of a labyrinth shield ring concept to prevent radiation streaming between the torus spool and the TF coil cryostat. The labyrinth shield concept which was developed reduced radiation streaming sufficiently to permit contact maintenance of the inboard EF coils. Various concepts of preventing arcing between adjacent shield sectors were also explored. It was concluded that installation of copper straps with molybdenum thermal radiation shields would provide the most reliable means of preventing arcing. Other design studies included torus spool electrical/structural concepts, test module shielding, torus seismic response, poloidal conditions in the magnets, disruption characteristics, and eddy current effects. These additional studies had no significant impact on cost but did confirm the feasibility of the basic FED Baseline concept

  4. Baseline Vascular Cognitive Impairment Predicts the Course of Apathetic Symptoms After Stroke: The CASPER Study.

    Science.gov (United States)

    Douven, Elles; Köhler, Sebastian; Schievink, Syenna H J; van Oostenbrugge, Robert J; Staals, Julie; Verhey, Frans R J; Aalten, Pauline

    2018-03-01

    To examine the influence of vascular cognitive impairment (VCI) on the course of poststroke depression (PSD) and poststroke apathy (PSA). Included were 250 stroke patients who underwent neuropsychological and neuropsychiatric assessment 3 months after stroke (baseline) and at a 6- and 12-month follow-up after baseline. Linear mixed models tested the influence of VCI in at least one cognitive domain (any VCI) or multidomain VCI (VCI in multiple cognitive domains) at baseline and domain-specific VCI at baseline on levels of depression and apathy over time, with random effects for intercept and slope. Almost half of the patients showed any VCI at baseline, and any VCI was associated with increasing apathy levels from baseline to the 12-month follow-up. Patients with multidomain VCI had higher apathy scores at the 6- and 12-month follow-up compared with patients with VCI in a single cognitive domain. Domain-specific analyses showed that impaired executive function and slowed information processing speed went together with increasing apathy levels from baseline to 6- and 12-month follow-up. None of the cognitive variables predicted the course of depressive symptoms. Baseline VCI is associated with increasing apathy levels from baseline to the chronic stroke phase, whereas no association was found between baseline VCI and the course of depressive symptoms. Health professionals should be aware that apathy might be absent early after stroke but may evolve over time in patients with VCI. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  5. Carbon tetrachloride ERA soil-gas baseline monitoring

    International Nuclear Information System (INIS)

    Fancher, J.D.

    1994-01-01

    From December 1991 through December 1993, Westinghouse Hanford Company performed routine baseline monitoring of selected wells and soil-gas points twice weekly in the 200 West Area of the Hanford Site. This work supported the Carbon Tetrachloride Expedited Response Action (ERA) and provided a solid baseline of volatile organic compound (VOC) concentrations in wells and in the subsurface at the ERA site. As site remediation continues, comparisons to this baseline can be one means of measuring the success of carbon tetrachloride vapor extraction. This report contains observations of the patterns and trends associated with data obtained during soil-gas monitoring at the 200 West Area. Monitoring performed since late 1991 includes monitoring soil-gas probes and wellheads for VOCs. This report reflects monitoring data collected from December 1991 through December 1993

  6. Modeling and Simulation of Offshore Wind Power Platform for 5 MW Baseline NREL Turbine

    Science.gov (United States)

    Roni Sahroni, Taufik

    2015-01-01

    This paper presents the modeling and simulation of offshore wind power platform for oil and gas companies. Wind energy has become the fastest growing renewable energy in the world and major gains in terms of energy generation are achievable when turbines are moved offshore. The objective of this project is to propose new design of an offshore wind power platform. Offshore wind turbine (OWT) is composed of three main structures comprising the rotor/blades, the tower nacelle, and the supporting structure. The modeling analysis was focused on the nacelle and supporting structure. The completed final design was analyzed using finite element modeling tool ANSYS to obtain the structure's response towards loading conditions and to ensure it complies with guidelines laid out by classification authority Det Norske Veritas. As a result, a new model of the offshore wind power platform for 5 MW Baseline NREL turbine was proposed. PMID:26550605

  7. Satellite-Surface Perspectives of Air Quality and Aerosol-Cloud Effects on the Environment: An Overview of 7-SEAS BASELInE

    Science.gov (United States)

    Tsay, Si-Chee; Maring, Hal B.; Lin, Neng-Huei; Buntoung, Sumaman; Chantara, Somporn; Chuang, Hsiao-Chi; Gabriel, Philip M.; Goodloe, Colby S.; Holben, Brent N.; Hsiao, Ta-Chih; hide

    2016-01-01

    The objectives of the 7-SEAS BASELInE (Seven SouthEast Asian Studies Biomass-burning Aerosols and Stratocumulus Environment: Lifecycles and Interactions Experiment) campaigns in spring 2013-2015 were to synergize measurements from uniquely distributed ground-based networks (e.g., AERONET (AErosol RObotic NETwork), MPLNET (NASA Micro-Pulse Lidar Network)) and sophisticated platforms (e.g., SMARTLabs (Surface-based Mobile Atmospheric Research and Testbed Laboratories), regional contributing instruments), along with satellite observations/retrievals and regional atmospheric transport and chemical models, to establish a critically needed database and to advance our understanding of biomass-burning aerosols and trace gases in Southeast Asia (SEA). We present a satellite-surface perspective of 7-SEAS BASELInE and highlight scientific findings concerning: (1) regional meteorology of moisture fields conducive to the production and maintenance of low-level stratiform clouds over land; (2) atmospheric composition in a biomass-burning environment, particularly tracers/markers that serve as important indicators for assessing the state and evolution of atmospheric constituents; (3) applications of remote sensing to air quality and its impact on radiative energetics, examining the effect of diurnal variability of boundary-layer height on aerosol loading; (4) aerosol hygroscopicity and ground-based cloud radar measurements in aerosol-cloud processes by advanced cloud ensemble models; and (5) implications of air quality, in terms of toxicity of nanoparticles and trace gases, to human health. This volume is the third 7-SEAS special issue (after Atmospheric Research, vol. 122, 2013; and Atmospheric Environment, vol. 78, 2013) and includes 27 published papers, with emphasis on air quality and aerosol-cloud effects on the environment. BASELInE observations of stratiform clouds over SEA are unique: such clouds are embedded in a heavy aerosol-laden environment and feature characteristically greater

  8. Dynamic baseline detection method for power data network service

    Science.gov (United States)

    Chen, Wei

    2017-08-01

    This paper proposes a dynamic baseline traffic detection method for power data networks based on historical traffic data. The method uses Cisco's NetFlow acquisition tool to collect raw historical traffic data from network elements at fixed intervals, working with three dimensions of information: communication port, time, and traffic volume (number of bytes or number of packets). By filtering the data, removing outlier values, calculating a dynamic baseline value, and comparing the actual value with the baseline value, the method can detect whether current network traffic is abnormal.
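    The pipeline described above (collect per-slot history, trim outliers, compute a dynamic baseline, compare the current value) can be sketched as follows; the function names and thresholds are illustrative, not from the paper:

```python
from statistics import mean, stdev

def dynamic_baseline(history, k=2.0):
    """Trim values beyond k standard deviations of the raw mean, then
    return the trimmed mean (baseline) and trimmed stdev (spread)."""
    m, s = mean(history), stdev(history)
    trimmed = [x for x in history if abs(x - m) <= k * s] or history
    spread = stdev(trimmed) if len(trimmed) > 1 else s
    return mean(trimmed), spread

def is_abnormal(history, current, k=2.0, tol=3.0):
    """Flag traffic for one (port, time-slot) series as abnormal when it
    deviates more than tol spreads from the dynamic baseline."""
    base, spread = dynamic_baseline(history, k)
    return abs(current - base) > tol * max(spread, 1e-9)
```

For example, a history of byte counts `[100, 105, 98, 102, 500, 97, 103, 99]` yields a baseline near 100 once the 500-byte spike is trimmed, so a new reading of 104 is accepted while 900 is flagged.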

  9. U-10Mo Baseline Fuel Fabrication Process Description

    Energy Technology Data Exchange (ETDEWEB)

    Hubbard, Lance R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Arendt, Christina L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Dye, Daniel F. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Clayton, Christopher K. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Lerchen, Megan E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Lombardo, Nicholas J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Lavender, Curt A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Zacher, Alan H. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2017-09-27

    This document provides a description of the U.S. High Power Research Reactor (USHPRR) low-enriched uranium (LEU) fuel fabrication process. This document is intended to be used in conjunction with the baseline process flow diagram (PFD) presented in Appendix A. The baseline PFD is used to document the fabrication process, communicate gaps in technology or manufacturing capabilities, convey alternatives under consideration, and as the basis for a dynamic simulation model of the fabrication process. The simulation model allows for the assessment of production rates, costs, and manufacturing requirements (manpower, fabrication space, numbers and types of equipment, etc.) throughout the lifecycle of the USHPRR program. This document, along with the accompanying PFD, is updated regularly

  10. Rationing in the presence of baselines

    DEFF Research Database (Denmark)

    Hougaard, Jens Leth; Moreno-Ternero, Juan D.; Østerdal, Lars Peter

    2013-01-01

    We analyze a general model of rationing in which agents have baselines, in addition to claims against the (insufficient) endowment of the good to be allocated. Many real-life problems fit this general model (e.g., bankruptcy with prioritized claims, resource allocation in the public health care sector, water distribution in drought periods). We introduce (and characterize) a natural class of allocation methods for this model. Any method within the class is associated with a rule in the standard rationing model, and we show that if the latter obeys some focal properties, the former obeys them...

  11. THE 2014 ALMA LONG BASELINE CAMPAIGN: AN OVERVIEW

    Energy Technology Data Exchange (ETDEWEB)

    Partnership, ALMA [Astrophysics Research Institute, Liverpool John Moores University, IC2, Liverpool Science Park, 146 Brownlow Hill, Liverpool L3 5RF (United Kingdom); Fomalont, E. B.; Vlahakis, C.; Corder, S.; Remijan, A.; Barkats, D.; Dent, W. R. F.; Phillips, N.; Cox, P.; Hales, A. S. [Joint ALMA Observatory, Alonso de Córdova 3107, Vitacura, Santiago (Chile); Lucas, R. [Institut de Planétologie et d’Astrophysique de Grenoble (UMR 5274), BP 53, F-38041 Grenoble Cedex 9 (France); Hunter, T. R.; Brogan, C. L.; Amestica, R.; Cotton, W. [National Radio Astronomy Observatory, 520 Edgemont Road, Charlottesville, VA 22903 (United States); Asaki, Y. [National Astronomical Observatory of Japan, 2-21-1 Osawa, Mitaka, Tokyo 181-8588 (Japan); Matsushita, S. [Institute of Astronomy and Astrophysics, Academia Sinica, P.O. Box 23-141, Taipei 106, Taiwan (China); Hills, R. E. [Astrophysics Group, Cavendish Laboratory, JJ Thomson Avenue, Cambridge CB3 0HE (United Kingdom); Richards, A. M. S. [Jodrell Bank Centre for Astrophysics, School of Physics and Astronomy, University of Manchester, Oxford Road, Manchester M13 9PL (United Kingdom); Broguiere, D., E-mail: efomalon@nrao.edu [Institut de Radioastronomie Millime´trique (IRAM), 300 rue de la Piscine, Domaine Universitaire, F-38406 Saint Martin d’Hères (France); and others

    2015-07-20

    A major goal of the Atacama Large Millimeter/submillimeter Array (ALMA) is to make accurate images with resolutions of tens of milliarcseconds, which at submillimeter (submm) wavelengths requires baselines up to ∼15 km. To develop and test this capability, a Long Baseline Campaign (LBC) was carried out from 2014 September to late November, culminating in end-to-end observations, calibrations, and imaging of selected Science Verification (SV) targets. This paper presents an overview of the campaign and its main results, including an investigation of the short-term coherence properties and systematic phase errors over the long baselines at the ALMA site, a summary of the SV targets and observations, and recommendations for science observing strategies at long baselines. Deep ALMA images of the quasar 3C 138 at 97 and 241 GHz are also compared to VLA 43 GHz results, demonstrating an agreement at a level of a few percent. As a result of the extensive program of LBC testing, the highly successful SV imaging at long baselines achieved angular resolutions as fine as 19 mas at ∼350 GHz. Observing with ALMA on baselines of up to 15 km is now possible, and opens up new parameter space for submm astronomy.
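    As a rough consistency check (not taken from the paper), the achieved resolution follows from the familiar diffraction limit for the maximum baseline:

```latex
% At \nu \approx 350\,\mathrm{GHz}: \lambda = c/\nu \approx 0.86\,\mathrm{mm}
\theta \approx 1.22\,\frac{\lambda}{B_{\max}}
       = 1.22 \times \frac{0.86\times10^{-3}\,\mathrm{m}}{15\times10^{3}\,\mathrm{m}}
       \approx 7.0\times10^{-8}\,\mathrm{rad} \approx 14\,\mathrm{mas}
```

This is the same order as the reported 19 mas at ~350 GHz; the beam actually achieved also depends on uv coverage and imaging weights.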

  12. THE 2014 ALMA LONG BASELINE CAMPAIGN: AN OVERVIEW

    International Nuclear Information System (INIS)

    Partnership, ALMA; Fomalont, E. B.; Vlahakis, C.; Corder, S.; Remijan, A.; Barkats, D.; Dent, W. R. F.; Phillips, N.; Cox, P.; Hales, A. S.; Lucas, R.; Hunter, T. R.; Brogan, C. L.; Amestica, R.; Cotton, W.; Asaki, Y.; Matsushita, S.; Hills, R. E.; Richards, A. M. S.; Broguiere, D.

    2015-01-01

    A major goal of the Atacama Large Millimeter/submillimeter Array (ALMA) is to make accurate images with resolutions of tens of milliarcseconds, which at submillimeter (submm) wavelengths requires baselines up to ∼15 km. To develop and test this capability, a Long Baseline Campaign (LBC) was carried out from 2014 September to late November, culminating in end-to-end observations, calibrations, and imaging of selected Science Verification (SV) targets. This paper presents an overview of the campaign and its main results, including an investigation of the short-term coherence properties and systematic phase errors over the long baselines at the ALMA site, a summary of the SV targets and observations, and recommendations for science observing strategies at long baselines. Deep ALMA images of the quasar 3C 138 at 97 and 241 GHz are also compared to VLA 43 GHz results, demonstrating an agreement at a level of a few percent. As a result of the extensive program of LBC testing, the highly successful SV imaging at long baselines achieved angular resolutions as fine as 19 mas at ∼350 GHz. Observing with ALMA on baselines of up to 15 km is now possible, and opens up new parameter space for submm astronomy

  13. Physics Potential of Long-Baseline Experiments

    Directory of Open Access Journals (Sweden)

    Sanjib Kumar Agarwalla

    2014-01-01

    Full Text Available The discovery of neutrino mixing and oscillations over the past decade provides firm evidence for new physics beyond the Standard Model. Recently, θ13 has been determined to be moderately large, quite close to its previous upper bound. This represents a significant milestone in establishing the three-flavor oscillation picture of neutrinos. It has opened up exciting prospects for current and future long-baseline neutrino oscillation experiments towards addressing the remaining fundamental questions, in particular the type of the neutrino mass hierarchy and the possible presence of a CP-violating phase. Another recent and crucial development is the indication of non-maximal 2-3 mixing angle, causing the octant ambiguity of θ23. In this paper, I will review the phenomenology of long-baseline neutrino oscillations with a special emphasis on sub-leading three-flavor effects, which will play a crucial role in resolving these unknowns. First, I will give a brief description of neutrino oscillation phenomenon. Then, I will discuss our present global understanding of the neutrino mass-mixing parameters and will identify the major unknowns in this sector. After that, I will present the physics reach of current generation long-baseline experiments. Finally, I will conclude with a discussion on the physics capabilities of accelerator-driven possible future long-baseline precision oscillation facilities.
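    For orientation, the leading-order dependence of oscillations on baseline L and neutrino energy E is captured by the standard two-flavor vacuum oscillation probability (a textbook formula, not specific to this review):

```latex
P(\nu_\alpha \to \nu_\beta) \simeq \sin^2 2\theta \;
  \sin^2\!\left( 1.27\,\frac{\Delta m^2\,[\mathrm{eV}^2]\; L\,[\mathrm{km}]}{E\,[\mathrm{GeV}]} \right)
```

Long-baseline experiments choose L/E near the oscillation maximum of the atmospheric mass splitting, which is where the sub-leading three-flavor and matter effects discussed above become measurable.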

  14. Re-creating missing population baselines for Pacific reef sharks.

    Science.gov (United States)

    Nadon, Marc O; Baum, Julia K; Williams, Ivor D; McPherson, Jana M; Zgliczynski, Brian J; Richards, Benjamin L; Schroeder, Robert E; Brainard, Russell E

    2012-06-01

    Sharks and other large predators are scarce on most coral reefs, but studies of their historical ecology provide qualitative evidence that predators were once numerous in these ecosystems. Quantifying density of sharks in the absence of humans (baseline) is, however, hindered by a paucity of pertinent time-series data. Recently, researchers have used underwater visual surveys, primarily of limited spatial extent or nonstandard design, to infer negative associations between reef shark abundance and human populations. We analyzed data from 1607 towed-diver surveys (>1 ha transects surveyed by observers towed behind a boat) conducted at 46 reefs in the central-western Pacific Ocean, reefs that included some of the world's most pristine coral reefs. Estimates of shark density from towed-diver surveys were substantially lower than previously published estimates, and there was a negative association between the number of sharks observed in towed-diver surveys and human population in models that accounted for the influence of oceanic primary productivity, sea surface temperature, reef area, and reef physical complexity. We used these models to estimate the density of sharks in the absence of humans. Densities of gray reef sharks (Carcharhinus amblyrhynchos), whitetip reef sharks (Triaenodon obesus), and the group "all reef sharks" increased substantially as human population decreased and as primary productivity and minimum sea surface temperature (or reef area, which was highly correlated with temperature) increased. Simulated baseline densities of reef sharks in the absence of humans were 1.1-2.4/ha for the main Hawaiian Islands, 1.2-2.4/ha for inhabited islands of American Samoa, and 0.9-2.1/ha for inhabited islands in the Mariana Archipelago, which suggests that density of reef sharks has declined to 3-10% of baseline levels in these areas. ©2012 Society for Conservation Biology No claim to original US government works.

  15. Modeling and Simulation of Offshore Wind Power Platform for 5 MW Baseline NREL Turbine

    Directory of Open Access Journals (Sweden)

    Taufik Roni Sahroni

    2015-01-01

    Full Text Available This paper presents the modeling and simulation of offshore wind power platform for oil and gas companies. Wind energy has become the fastest growing renewable energy in the world and major gains in terms of energy generation are achievable when turbines are moved offshore. The objective of this project is to propose new design of an offshore wind power platform. Offshore wind turbine (OWT is composed of three main structures comprising the rotor/blades, the tower nacelle, and the supporting structure. The modeling analysis was focused on the nacelle and supporting structure. The completed final design was analyzed using finite element modeling tool ANSYS to obtain the structure’s response towards loading conditions and to ensure it complies with guidelines laid out by classification authority Det Norske Veritas. As a result, a new model of the offshore wind power platform for 5 MW Baseline NREL turbine was proposed.

  16. Linear MALDI-ToF simultaneous spectrum deconvolution and baseline removal.

    Science.gov (United States)

    Picaud, Vincent; Giovannelli, Jean-Francois; Truntzer, Caroline; Charrier, Jean-Philippe; Giremus, Audrey; Grangeat, Pierre; Mercier, Catherine

    2018-04-05

    Thanks to reasonable cost and a simple sample preparation procedure, linear MALDI-ToF spectrometry is a growing technology for clinical microbiology. With appropriate spectrum databases, this technology can be used for early identification of pathogens in body fluids. However, due to the low resolution of linear MALDI-ToF instruments, robust and accurate peak picking remains a challenging task. In this context we propose a new peak extraction algorithm that operates on the raw spectrum. With this method the spectrum baseline and spectrum peaks are processed jointly. The approach relies on an additive model consisting of a smooth baseline part plus a sparse peak list convolved with a known peak shape. The model is then fitted under a Gaussian noise assumption. The proposed method is well suited to processing low resolution spectra with an important baseline and unresolved peaks. We developed a new peak deconvolution procedure. The paper describes the method derivation and discusses some of its interpretations. The algorithm is then described in a pseudo-code form where the required optimization procedure is detailed. For synthetic data the method is compared to a more conventional approach. The new method reduces artifacts caused by the usual two-step procedure, baseline removal then peak extraction. Finally, some results on real linear MALDI-ToF spectra are provided. We introduced a new method for peak picking, where peak deconvolution and baseline computation are performed jointly. On simulated data we showed that this global approach performs better than a classical one where baseline and peaks are processed sequentially. A dedicated experiment has been conducted on real spectra. In this study a collection of spectra of spiked proteins were acquired and then analyzed. Better performance of the proposed method, in terms of accuracy and reproducibility, was observed and validated by an extended statistical analysis.
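    The additive model can be illustrated with a deliberately simplified sketch: a polynomial baseline plus Gaussian peaks at known positions, fitted jointly by one linear least-squares solve. The paper's actual algorithm treats peak positions as unknown and sparse; all positions, widths, and noise levels below are assumed for illustration only:

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 400)

def gauss(t, c, w=0.01):
    """Known peak shape: unit-height Gaussian centered at c."""
    return np.exp(-0.5 * ((t - c) / w) ** 2)

# Synthetic spectrum: quadratic baseline + two peaks + Gaussian noise
true_base = 2.0 + 1.5 * t - 1.0 * t ** 2
centers = [0.3, 0.7]
true_amps = [4.0, 2.5]
y = true_base + sum(a * gauss(t, c) for a, c in zip(true_amps, centers))
y += rng.normal(0, 0.05, t.size)

# Joint fit: design matrix = [polynomial baseline columns | peak columns],
# so baseline coefficients and peak amplitudes are estimated together.
A = np.column_stack([t ** 0, t, t ** 2] + [gauss(t, c) for c in centers])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
base_coef, amps = coef[:3], coef[3:]
```

A sequential procedure (estimate baseline first, subtract, then fit peaks) biases peak amplitudes wherever the baseline estimate absorbs part of a peak; the joint solve avoids that coupling, which is the point the paper makes.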

  17. Recommendation to include fragrance mix 2 and hydroxyisohexyl 3-cyclohexene carboxaldehyde (Lyral) in the European baseline patch test series

    DEFF Research Database (Denmark)

    Bruze, Magnus; Andersen, Klaus Ejner; Goossens, An

    2008-01-01

    various European centres when tested in consecutive dermatitis patients. CONCLUSIONS: From 2008, pet. preparations of fragrance mix 2 at 14% w/w (5.6 mg/cm(2)) and hydroxyisohexyl 3-cyclohexene carboxaldehyde at 5% w/w (2.0 mg/cm(2)) are recommended for inclusion in the baseline series. With the Finn...

  18. Baselines For Land-Use Change In The Tropics: Application ToAvoided Deforestation Projects

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Sandra; Hall, Myrna; Andrasko, Ken; Ruiz, Fernando; Marzoli, Walter; Guerrero, Gabriela; Masera, Omar; Dushku, Aaron; Dejong,Ben; Cornell, Joseph

    2007-06-01

    Although forest conservation activities, particularly in the tropics, offer significant potential for mitigating carbon emissions, these types of activities have faced obstacles in the policy arena caused by the difficulty in determining key elements of the project cycle, particularly the baseline. A baseline for forest conservation has two main components: the projected land-use change and the corresponding carbon stocks in the applicable pools such as vegetation, detritus, products and soil, with land-use change being the most difficult to address analytically. In this paper we focus on developing and comparing three models, ranging from relatively simple extrapolations of past trends in land use based on simple drivers such as population growth to more complex extrapolations of past trends using spatially explicit models of land-use change driven by biophysical and socioeconomic factors. The three models of the latter category used in the analysis at regional scale are the Forest Area Change (FAC) model, the Land Use and Carbon Sequestration (LUCS) model, and the Geographical Modeling (GEOMOD) model. The models were used to project deforestation in six tropical regions that featured different ecological and socioeconomic conditions, population dynamics, and uses of the land: (1) northern Belize; (2) Santa Cruz State, Bolivia; (3) Parana State in Brazil; (4) Campeche, Mexico; (5) Chiapas, Mexico; and (6) Michoacan, Mexico. A comparison of all model outputs across all six regions shows that each model produced quite different deforestation baselines. In general, the simplest FAC model, applied at the national administrative-unit scale, projected the highest amount of forest loss (four out of six) and the LUCS model the least amount of loss (four out of five). Based on simulations of GEOMOD, we found that readily observable physical and biological factors as well as distance to areas of past disturbance were each about twice as important as either sociological/demographic or economic

  19. Baseline characteristics of patients with heart failure and preserved ejection fraction included in the Karolinska Rennes (KaRen) study.

    Science.gov (United States)

    Donal, Erwan; Lund, Lars H; Oger, Emmanuel; Hage, Camilla; Persson, Hans; Reynaud, Amélie; Ennezat, Pierre-Vladimir; Bauer, Fabrice; Sportouch-Dukhan, Catherine; Drouet, Elodie; Daubert, Jean-Claude; Linde, Cecilia

    2014-02-01

    Karolinska Rennes (KaRen) is a prospective observational study to characterize heart failure patients with preserved ejection fraction (HFpEF) and to identify prognostic factors for long-term mortality and morbidity. To report characteristics and echocardiography at entry and after 4-8 weeks of follow-up. Patients were included following an acute heart failure presentation with B-type natriuretic peptide (BNP)>100 ng/L or N-terminal pro-BNP (NT-proBNP)>300 ng/L and left ventricular ejection fraction (LVEF)>45%. The mean ± SD age of 539 included patients was 77 ± 9 years and 56% were women. Patient history included hypertension (78%), atrial tachyarrhythmia (44%), prior heart failure (40%) and anemia (37%), but left bundle branch block was rare (3.8%). Median NT-proBNP was 2448 ng/L (n=438), and median BNP 429 ng/L (n=101). Overall, 101 patients did not return for the follow-up visit, including 13 patients who died (2.4%). Apart from older age (80 ± 9 vs. 76 ± 9 years; P=0.006), there were no significant differences in baseline characteristics between patients who did and did not return for follow-up. Mean LVEF was lower at entry than follow-up (56% vs. 62%; P<0.001). At follow-up, mean E/e' was 12.9 ± 6.1, and left atrial volume index was 49.4 ± 17.8 mL/m(2). Mean global left ventricular longitudinal strain was -14.6 ± 3.9%; LV mass index was 126.6 ± 36.2 g/m(2). Patients in KaRen were elderly, with a slight female predominance, and hypertension was the most prevalent etiological factor. LVEF was preserved, but with increased LV mass and depressed LV diastolic and longitudinal systolic functions. Few patients had signs of electrical dyssynchrony (ClinicalTrials.gov NCT00774709). Copyright © 2014 Elsevier Masson SAS. All rights reserved.

  20. Progressive IRP Models for Power Resources Including EPP

    Directory of Open Access Journals (Sweden)

    Yiping Zhu

    2017-01-01

    Full Text Available With a view to optimizing regional power supply and demand, this paper develops effective planning and scheduling of supply- and demand-side resources, including energy efficiency power plants (EPPs), to achieve benefit, cost, and environmental targets. To highlight the characteristics of different supply and demand resources under economic, environmental, and carbon constraints, three planning models with progressive constraints are constructed. Applying the three models to the same example shows that the best solution differs across models. The planning model including EPP has clear advantages when pollutant and carbon emission constraints are considered, which confirms the low cost and low emissions of EPP. The construction of progressive IRP models for power resources considering EPP has reference value for guiding the planning and layout of EPP among other power resources and for achieving cost and environmental objectives.

  1. Propensity score to detect baseline imbalance in cluster randomized trials: the role of the c-statistic.

    Science.gov (United States)

    Leyrat, Clémence; Caille, Agnès; Foucher, Yohann; Giraudeau, Bruno

    2016-01-22

    Despite randomization, baseline imbalance and confounding bias may occur in cluster randomized trials (CRTs). Covariate imbalance may jeopardize the validity of statistical inferences if it occurs on prognostic factors. Thus, diagnosing such imbalance is essential to adjust the statistical analysis if required. We developed a tool based on the c-statistic of the propensity score (PS) model to detect global baseline covariate imbalance in CRTs and assess the risk of confounding bias. We performed a simulation study to assess the performance of the proposed tool and applied this method to analyze the data from 2 published CRTs. The proposed method had good performance for large sample sizes (n = 500 per arm) and when the number of unbalanced covariates was not too small compared with the total number of baseline covariates (≥40% of unbalanced covariates). We also provide a strategy for preselecting the covariates to include in the PS model to enhance imbalance detection. The proposed tool could be useful in deciding whether covariate adjustment is required before performing statistical analyses of CRTs.
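    A minimal sketch of the idea on a simulated two-arm trial: fit a logistic PS model predicting arm membership from baseline covariates and take its AUC as the c-statistic, where values near 0.5 indicate balance. The simulation settings and the plain gradient-descent fit are illustrative, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(42)

def fit_logistic(X, y, lr=0.5, iters=2000):
    """Plain gradient-descent logistic regression; returns fitted probabilities."""
    Xb = np.column_stack([np.ones(len(X)), X])
    w = np.zeros(Xb.shape[1])
    for _ in range(iters):
        p = 1 / (1 + np.exp(-Xb @ w))
        w -= lr * Xb.T @ (p - y) / len(y)
    return 1 / (1 + np.exp(-Xb @ w))

def c_statistic(scores, y):
    """AUC via the rank-sum (Mann-Whitney) identity."""
    ranks = scores.argsort().argsort() + 1
    n1, n0 = y.sum(), (1 - y).sum()
    return (ranks[y == 1].sum() - n1 * (n1 + 1) / 2) / (n1 * n0)

n = 500  # cluster-averaged units per arm, mirroring the paper's large-sample case
y = np.r_[np.zeros(n), np.ones(n)]
X_bal = rng.normal(0, 1, (2 * n, 3))   # covariates balanced across arms
X_imb = X_bal.copy()
X_imb[n:] += 0.4                       # shift all 3 covariates in one arm

auc_bal = c_statistic(fit_logistic(X_bal, y), y)   # near 0.5: balance
auc_imb = c_statistic(fit_logistic(X_imb, y), y)   # well above 0.5: imbalance
```

The contrast between the two AUCs is the diagnostic: the balanced design yields a c-statistic close to 0.5, while the systematically shifted covariates push it clearly higher.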

  2. Relationship between visual field progression and baseline refraction in primary open-angle glaucoma.

    Science.gov (United States)

    Naito, Tomoko; Yoshikawa, Keiji; Mizoue, Shiro; Nanno, Mami; Kimura, Tairo; Suzumura, Hirotaka; Umeda, Yuzo; Shiraga, Fumio

    2016-01-01

    To analyze the relationship between visual field (VF) progression and baseline refraction in Japanese patients with primary open-angle glaucoma (POAG) including normal-tension glaucoma. In this retrospective study, the subjects were patients with POAG who had undergone VF tests at least ten times with a Humphrey Field Analyzer (Swedish interactive thresholding algorithm standard, Central 30-2 program). VF progression was defined as a significantly negative value of mean deviation (MD) slope at the final VF test. Multivariate logistic regression models were applied to detect an association between MD slope deterioration and baseline refraction. A total of 156 eyes of 156 patients were included in this analysis. Significant deterioration of MD slope was observed in 70 eyes of 70 patients (44.9%), whereas no significant deterioration was evident in 86 eyes of 86 patients (55.1%). The eyes with VF progression had significantly higher baseline refraction compared to those without apparent VF progression (-1.9±3.8 diopter [D] vs -3.5±3.4 D, P=0.0048) (mean ± standard deviation). When subject eyes were classified into four groups by the level of baseline refraction applying spherical equivalent (SE): no myopia (SE > -1D), mild myopia (-1D ≥ SE > -3D), moderate myopia (-3D ≥ SE > -6D), and severe myopia (-6D ≥ SE), the Cochran-Armitage trend analysis showed a decreasing trend in the proportion of MD slope deterioration with increasing severity of myopia (P=0.0002). The multivariate analysis revealed that baseline refraction (P=0.0108, odds ratio [OR]: 1.13, 95% confidence interval [CI]: 1.03-1.25) and intraocular pressure reduction rate (P=0.0150, OR: 0.97, 95% CI: 0.94-0.99) had a significant association with MD slope deterioration. In the current analysis of Japanese patients with POAG, baseline refraction was a factor significantly associated with MD slope deterioration as well as intraocular pressure reduction rate. When baseline refraction was classified into

  3. Parametric estimation of time varying baselines in airborne interferometric SAR

    DEFF Research Database (Denmark)

    Mohr, Johan Jacob; Madsen, Søren Nørvang

    1996-01-01

    A method for estimation of time-varying spatial baselines in airborne interferometric synthetic aperture radar (SAR) is described. The range and azimuth distortions between two images acquired with a non-linear baseline are derived. A parametric model of the baseline is then estimated, in a least-squares sense, from image shifts obtained by cross correlation of numerous small patches throughout the image. The method has been applied to airborne EMISAR imagery from the 1995 campaign over the Storstrommen Glacier in North East Greenland conducted by the Danish Center for Remote Sensing. This has reduced the baseline uncertainties from several meters to the centimeter level in a 36 km scene. Though developed for airborne SAR, the method can easily be adapted to satellite data...
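    The least-squares step can be sketched as a polynomial fit of patch-wise offsets against the along-track coordinate; the mapping from image shifts to physical baseline components is far more involved in practice, and all numbers below are assumed for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Along-track coordinate (km) of patch centers across a 36 km scene
s = np.linspace(0, 36, 60)

# Hypothetical true baseline drift (m): constant + linear + quadratic terms
true = np.array([2.0, -0.05, 0.001])
offsets = true[0] + true[1] * s + true[2] * s**2
offsets += rng.normal(0, 0.02, s.size)  # patch cross-correlation noise (m)

# Least-squares fit of the parametric (quadratic) baseline model;
# coefficients are returned lowest order first, matching `true`.
est = np.polynomial.polynomial.polyfit(s, offsets, 2)
```

With centimeter-level measurement noise on many patches, the fitted coefficients recover the meter-scale drift to well below the per-patch noise, which is the mechanism behind the meters-to-centimeters improvement reported in the abstract.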

  4. Parkinson’s Disease Severity at 3 Years Can Be Predicted from Non-Motor Symptoms at Baseline

    Directory of Open Access Journals (Sweden)

    Alba Ayala

    2017-10-01

    Full Text Available Objective: The aim of this study is to present a predictive model of Parkinson's disease (PD) global severity, measured with the Clinical Impression of Severity Index for Parkinson's Disease (CISI-PD). Methods: This is an observational, longitudinal study with annual follow-up assessments over 3 years (four time points). A multilevel analysis and multiple imputation techniques were performed to generate a predictive model that estimates changes in the CISI-PD at 1, 2, and 3 years. Results: The clinical state of patients (CISI-PD) significantly worsened over the 3-year follow-up. However, this change was of small magnitude (effect size: 0.44). The following baseline variables were significant predictors of the global severity change: baseline global severity of disease, levodopa equivalent dose, depression and anxiety symptoms, autonomic dysfunction, and cognitive state. The goodness-of-fit of the model was adequate, and the sensitivity analysis showed that the data imputation method applied was suitable. Conclusion: Disease progression depends more on the individual's baseline characteristics than on the 3-year time period. Results may contribute to a better understanding of the evolution of PD, including the non-motor manifestations of the disease.

  5. Consideration of the baseline environment in examples of voluntary SEAs from Scotland

    International Nuclear Information System (INIS)

    Wright, Fiona

    2007-01-01

    Evidence from analysing and evaluating examples of three voluntary SEAs prepared in Scotland in the mid-to-late 1990s showed that different spatial and temporal scales were used when providing a baseline environment description. The SEAs analysed were prepared for: a wind farm siting programme that looked at national and short-term impacts; a land use plan that looked at regional and short-term impacts; and a transport plan that examined local and medium-term impacts. It was found that the two SEAs prepared by local government only considered impacts on the baseline environment within their jurisdictional boundaries, whilst the SEA prepared by the private business considered impacts on the national baseline. A mixture of baseline data about planning, economic, environmental and social issues was included in the SEAs; however, evidence suggested that each SEA focussed only on those baseline features that might be significantly affected by the proposal. Each SEA also made extensive use of existing baseline information available from a variety of sources, including local and central government records and information from statutory bodies. All of the SEAs acknowledged that baseline data deficiencies existed, and in certain cases steps were taken to obtain primary field data to help address these; however, it was also acknowledged that resource restrictions and decision-making deadlines limited the amount of primary baseline data that could be collected

  6. Baseline atmospheric program Australia 1993

    International Nuclear Information System (INIS)

    Francey, R.J.; Dick, A.L.; Derek, N.

    1996-01-01

    This publication reports activities, program summaries and data from the Cape Grim Baseline Air Pollution Station in Tasmania during the calendar year 1993. These activities represent Australia's main contribution to the Background Air Pollution Monitoring Network (BAPMoN), part of the World Meteorological Organization's Global Atmosphere Watch (GAW). The report includes five research reports covering trace gas sampling, ozone and radon interdependence, analysis of atmospheric dimethylsulfide and carbon disulfide, sampling of the trace gas composition of the troposphere, and the sulfur aerosol/CCN relationship in marine air. Summaries of program reports for the calendar year 1993 are also included. Tabs., figs., refs

  7. Baseline and Target Values for PV Forecasts: Toward Improved Solar Power Forecasting: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Jie; Hodge, Bri-Mathias; Lu, Siyuan; Hamann, Hendrik F.; Lehman, Brad; Simmons, Joseph; Campos, Edwin; Banunarayanan, Venkat

    2015-08-05

    Accurate solar power forecasting allows utilities to get the most out of the solar resources on their systems. To truly measure the improvements that any new solar forecasting methods can provide, it is important to first develop (or determine) baseline and target solar forecasting metrics at different spatial and temporal scales. This paper aims to develop baseline and target values for solar forecasting metrics, informed by close collaboration with utility and independent system operator partners. The baseline values are established based on state-of-the-art numerical weather prediction models and persistence models. The target values are determined based on the reduction in the amount of reserves that must be held to accommodate the uncertainty of solar power output.
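The persistence model used for the baseline values above is simple enough to sketch directly. The following is a minimal illustration with hypothetical PV output numbers, not code or data from the paper:

```python
from typing import List

def persistence_forecast(observed: List[float], horizon: int) -> List[float]:
    """Persistence baseline: every forecast step repeats the last observation."""
    if not observed:
        raise ValueError("need at least one observation")
    return [observed[-1]] * horizon

def mean_absolute_error(actual: List[float], predicted: List[float]) -> float:
    """A simple metric against which an improved forecast would be judged."""
    return sum(abs(a - p) for a, p in zip(actual, predicted)) / len(actual)

# Hypothetical PV output history in MW; any new method should beat the
# error of this trivial baseline to demonstrate real forecasting skill.
history = [10.0, 12.0, 11.5]
baseline = persistence_forecast(history, horizon=2)  # [11.5, 11.5]
```

Target values could then be framed as the error reduction, relative to this baseline, needed to shrink the operating reserves described in the abstract.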

  8. Including investment risk in large-scale power market models

    DEFF Research Database (Denmark)

    Lemming, Jørgen Kjærgaard; Meibom, P.

    2003-01-01

    Long-term energy market models can be used to examine investments in production technologies; however, with market liberalisation it is crucial that such models include investment risks and investor behaviour. This paper analyses how the effect of investment risk on production technology selection can be included in large-scale partial equilibrium models of the power market. The analyses are divided into a part about risk measures appropriate for power market investors and a more technical part about the combination of a risk-adjustment model and a partial-equilibrium model. To illustrate the analyses quantitatively, a framework based on an iterative interaction between the equilibrium model and a separate risk-adjustment module was constructed. To illustrate the features of the proposed modelling approach we examined how uncertainty in demand and variable costs affects the optimal choice...

  9. Study on the calibration and optimization of double theodolites baseline

    Science.gov (United States)

    Ma, Jing-yi; Ni, Jin-ping; Wu, Zhi-chao

    2018-01-01

    Because the baseline of a double-theodolite measurement system serves as the benchmark for the scale of the system and affects its accuracy, this paper puts forward a method for calibrating and optimizing the double-theodolite baseline. The double theodolites are used to measure a reference ruler of known length, and the baseline is then obtained by inverting the baseline formula. Based on the law of error propagation, the analyses show that the baseline error function is an important index of the accuracy of the system, and that the position, posture and other attributes of the reference ruler have an impact on the baseline error. An optimization model is established with the baseline error function as the objective function, and the position and posture of the reference ruler are optimized. The simulation results show that the height of the reference ruler has no effect on the baseline error, that the effect of posture is not uniform, and that the baseline error is smallest when the reference ruler is placed at x=500mm and y=1000mm in the measurement space. The experimental results are consistent with the theoretical analyses, so this study of the placement of the reference ruler provides a reference for improving the accuracy of double-theodolite measurement systems.

  10. Effects of baseline conditions on the simulated hydrologic response to projected climate change

    Science.gov (United States)

    Koczot, Kathryn M.; Markstrom, Steven L.; Hay, Lauren E.

    2011-01-01

    Changes in temperature and precipitation projected from five general circulation models, using one late-twentieth-century and three twenty-first-century emission scenarios, were downscaled to three different baseline conditions. Baseline conditions are periods of measured temperature and precipitation data selected to represent twentieth-century climate. The hydrologic effects of the climate projections are evaluated using the Precipitation-Runoff Modeling System (PRMS), which is a watershed hydrology simulation model. The Almanor Catchment in the North Fork of the Feather River basin, California, is used as a case study. Differences and similarities between PRMS simulations of hydrologic components (i.e., snowpack formation and melt, evapotranspiration, and streamflow) are examined, and results indicate that the selection of a specific time period used for baseline conditions has a substantial effect on some, but not all, hydrologic variables. This effect seems to be amplified in hydrologic variables that accumulate over time, such as soil-moisture content. Results also indicate that uncertainty related to the selection of baseline conditions should be evaluated using a range of different baseline conditions. This is particularly important for studies in basins with highly variable climate, such as the Almanor Catchment.

  11. Wind power projects in the CDM: Methodologies and tools for baselines, carbon financing and sustainability analysis

    International Nuclear Information System (INIS)

    Ringius, L.; Grohnheit, P.E.; Nielsen, L.H.; Olivier, A.L.; Painuly, J.; Villavicencio, A.

    2002-12-01

    The report is intended to be a guidance document for project developers, investors, lenders, and CDM host countries involved in wind power projects in the CDM. The report explores in particular those issues that are important in CDM project assessment and development - that is, baseline development, carbon financing, and environmental sustainability. It does not deal in detail with those issues that are routinely covered in a standard wind power project assessment. The report tests, compares, and recommends methodologies for and approaches to baseline development. To present the application and implications of the various methodologies and approaches in a concrete context, Africa's largest wind farm - namely, the 60 MW wind farm located in Zafarana, Egypt - is examined as a hypothetical CDM wind power project. The report shows that for the present case example there is a difference of about 25% between the lowest (0.5496 tCO2/MWh) and the highest (0.6868 tCO2/MWh) emission rate estimated in accordance with these three standardized approaches to baseline development according to the Marrakesh Accord. This difference in emission factors arises partly from including hydroelectric power in the baseline scenario. Hydroelectric resources constitute around 21% of the generation capacity in Egypt, and, if hydropower is excluded, the difference between the lowest and the highest baseline is reduced to 18%. Furthermore, since the two variations of the 'historical' baseline option examined result in the highest and the lowest baselines, disregarding this baseline option altogether reduces the difference between the lowest and the highest to 16%. The ES3 model, which the Systems Analysis Department at Risoe National Laboratory has developed, makes it possible for this report to explore the project-specific approach to baseline development in some detail. Based on quite disaggregated data on the Egyptian electricity system, including the wind power production
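The sensitivity of the baseline emission rate to including or excluding hydropower can be illustrated with a generation-weighted emission factor. The generation mix and emission numbers below are hypothetical, chosen only to mirror a roughly 21% hydro share like Egypt's; they are not figures from the report:

```python
def baseline_emission_factor(generation_mwh, emissions_tco2):
    """Generation-weighted average grid emission factor in tCO2/MWh."""
    total_generation = sum(generation_mwh.values())
    total_emissions = sum(emissions_tco2[k] for k in generation_mwh)
    return total_emissions / total_generation

# Hypothetical annual generation (MWh) and emissions (tCO2) by source.
mix = {"gas": 70.0, "oil": 9.0, "hydro": 21.0}
emissions = {"gas": 35.0, "oil": 7.0, "hydro": 0.0}

with_hydro = baseline_emission_factor(mix, emissions)  # 0.42 tCO2/MWh
no_hydro_mix = {k: v for k, v in mix.items() if k != "hydro"}
without_hydro = baseline_emission_factor(no_hydro_mix, emissions)
# Excluding zero-emission hydro raises the baseline emission factor,
# which is why the spread between baselines narrows when hydro is dropped.
```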

  12. Program Baseline Change Control Procedure

    International Nuclear Information System (INIS)

    1993-02-01

    This procedure establishes the responsibilities and process for approving initial issues of, and changes to, the technical, cost, and schedule baselines, and selected management documents developed by the Office of Civilian Radioactive Waste Management (OCRWM) for the Civilian Radioactive Waste Management System. This procedure implements the OCRWM Baseline Management Plan and DOE Order 4700.1, Chg 1. It streamlines the change control process to enhance integration, accountability, and traceability of Level 0 and Level 1 decisions through standardized Baseline Change Proposal (BCP) forms to be used by the Level 0, 1, 2, and 3 Baseline Change Control Boards (BCCBs) and to be tracked in the OCRWM-wide Configuration Information System (CIS) Database. This procedure applies to all technical, cost, and schedule baselines controlled by the Energy System Acquisition Advisory Board (ESAAB) BCCB (Level 0) and the OCRWM Program Baseline Control Board (PBCCB) (Level 1). All baseline BCPs initiated by Level 2 or lower BCCBs that require approval from the ESAAB or PBCCB shall be processed in accordance with this procedure. This procedure also applies to all Program-level management documents controlled by the OCRWM PBCCB

  13. Nonlinear Dynamic Inversion Baseline Control Law: Architecture and Performance Predictions

    Science.gov (United States)

    Miller, Christopher J.

    2011-01-01

    A model reference dynamic inversion control law has been developed to provide a baseline control law for research into adaptive elements and other advanced flight control law components. This controller has been implemented and tested in a hardware-in-the-loop simulation; the simulation results show excellent handling qualities throughout the limited flight envelope. A simple angular momentum formulation was chosen because it can be included in the stability proofs for many basic adaptive theories, such as model reference adaptive control. Many design choices and implementation details reflect the requirements placed on the system by the nonlinear flight environment and the desire to keep the system as basic as possible to simplify the addition of the adaptive elements. Those design choices are explained, along with their predicted impact on the handling qualities.

  14. Leveraging probabilistic peak detection to estimate baseline drift in complex chromatographic samples

    NARCIS (Netherlands)

    Lopatka, M.; Barcaru, A.; Sjerps, M.J.; Vivó-Truyols, G.

    2016-01-01

    Accurate analysis of chromatographic data often requires the removal of baseline drift. A frequently employed strategy strives to determine asymmetric weights in order to fit a baseline model by regression. Unfortunately, chromatograms characterized by a very high peak saturation pose a significant
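The asymmetric-weighting strategy the abstract refers to can be sketched as an iteratively reweighted regression. This minimal version uses a polynomial as the baseline model, which is an assumption made here for illustration rather than the method evaluated in the paper:

```python
import numpy as np

def asymmetric_baseline(y, degree=3, p=0.01, n_iter=10):
    """Fit a polynomial baseline with asymmetric weights.

    Points above the current fit (candidate peaks) get small weight p,
    points below get weight 1 - p, so the fit hugs the lower envelope
    of the signal and ignores chromatographic peaks.
    """
    x = np.arange(len(y), dtype=float)
    w = np.ones_like(x)
    baseline = np.zeros_like(x)
    for _ in range(n_iter):
        # np.polyfit weights the unsquared residuals, so sqrt(w) gives
        # a least-squares fit with per-point weights w.
        coeffs = np.polyfit(x, y, degree, w=np.sqrt(w))
        baseline = np.polyval(coeffs, x)
        w = np.where(y > baseline, p, 1.0 - p)
    return baseline

# Synthetic chromatogram: slow linear drift plus one narrow peak.
t = np.linspace(0.0, 1.0, 200)
drift = 2.0 + 0.5 * t
signal = drift + 10.0 * np.exp(-((t - 0.5) ** 2) / 0.001)
corrected = signal - asymmetric_baseline(signal)
```

With heavy peak saturation, as the abstract notes, such asymmetric schemes struggle because too few points remain that belong to the true baseline.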

  15. Baseline conditions at Olkiluoto

    International Nuclear Information System (INIS)

    2003-09-01

    The main purpose of this report is to establish a reference point - defined as the data collected up until the end of year 2002 - for the coming phases of the Finnish spent nuclear fuel disposal programme. The focus is: to define the current surface and underground conditions at the site, both as regards the properties for which a change is expected and the properties which are of particular interest for long-term safety or environmental impact; to establish, as far as possible, the natural fluctuation of properties that are potentially affected by construction of the underground laboratory, the ONKALO; and to provide references to data on parameters for use in model development and testing, and to use models to assist in understanding and interpreting the data. The emphasis of the baseline description is on bedrock characteristics that are relevant to the long-term safety of a spent fuel repository and hence includes the hydrogeological, hydrogeochemical, rock mechanical, tectonic and seismic conditions of the site. The construction of the ONKALO will also affect some conditions on the surface, and, therefore, a description of the main characteristics of the natural environment and the man-made constructions at Olkiluoto is also given. This report is primarily a road map to the available information on the prevailing conditions at the Olkiluoto site and a framework for understanding the data collected. Hence, it refers to numerous available background reports and other archived information produced over the past 20 years or more, and forms a recapitulation and revaluation of the characterisation data of the Olkiluoto site. (orig.)

  16. The Proximal Medial Sural Nerve Biopsy Model: A Standardised and Reproducible Baseline Clinical Model for the Translational Evaluation of Bioengineered Nerve Guides

    Directory of Open Access Journals (Sweden)

    Ahmet Bozkurt

    2014-01-01

    Full Text Available Autologous nerve transplantation (ANT) is the clinical gold standard for the reconstruction of peripheral nerve defects. A large number of bioengineered nerve guides have been tested under laboratory conditions as an alternative to the ANT. The step from experimental studies to the implementation of the device in the clinical setting is often substantial and the outcome is unpredictable. This is mainly linked to the heterogeneity of clinical peripheral nerve injuries, which is very different from standardized animal studies. In search of a reproducible human model for the implantation of bioengineered nerve guides, we propose the reconstruction of sural nerve defects after routine nerve biopsy as a first or baseline study. Our concept uses the medial sural nerve of patients undergoing diagnostic nerve biopsy (≥2 cm). The biopsy-induced nerve gap was immediately reconstructed by implantation of the novel microstructured nerve guide, Neuromaix, as part of an ongoing first-in-human study. Here we present (i) a detailed list of inclusion and exclusion criteria, (ii) a detailed description of the surgical procedure, and (iii) a follow-up concept with multimodal sensory evaluation techniques. The proximal medial sural nerve biopsy model can serve as a preliminary or baseline nerve lesion model. In a subsequent step, newly developed nerve guides could be tested in more unpredictable and challenging clinical peripheral nerve lesions (e.g., following trauma), which have reduced comparability due to the different nature of the injuries (e.g., site of injury and length of nerve gap).

  17. A Fully Customized Baseline Removal Framework for Spectroscopic Applications.

    Science.gov (United States)

    Giguere, Stephen; Boucher, Thomas; Carey, C J; Mahadevan, Sridhar; Dyar, M Darby

    2017-07-01

    The task of proper baseline or continuum removal is common to nearly all types of spectroscopy. Its goal is to remove any portion of a signal that is irrelevant to features of interest while preserving any predictive information. Despite the importance of baseline removal, median or guessed default parameters are commonly employed, often using commercially available software supplied with instruments. Several published baseline removal algorithms have been shown to be useful for particular spectroscopic applications but their generalizability is ambiguous. The new Custom Baseline Removal (Custom BLR) method presented here generalizes the problem of baseline removal by combining operations from previously proposed methods to synthesize new correction algorithms. It creates novel methods for each technique, application, and training set, discovering new algorithms that maximize the predictive accuracy of the resulting spectroscopic models. In most cases, these learned methods either match or improve on the performance of the best alternative. Examples of these advantages are shown for three different scenarios: quantification of components in near-infrared spectra of corn and laser-induced breakdown spectroscopy data of rocks, and classification/matching of minerals using Raman spectroscopy. Software to implement this optimization is available from the authors. By removing subjectivity from this commonly encountered task, Custom BLR is a significant step toward completely automatic and general baseline removal in spectroscopic and other applications.

  18. Exploring non standard physics in long-baseline neutrino oscillation experiments

    International Nuclear Information System (INIS)

    Chatterjee, Sabya Sachi

    2015-01-01

    After the recent discovery of large θ13, the focus has shifted to the remaining fundamental issues: the neutrino mass ordering and CP violation in the leptonic sector. Future proposed long-baseline facilities like DUNE (1300 km baseline from FNAL to Homestake) and LBNO (2290 km baseline from CERN to Pyhasalmi) are well suited to address these issues at high confidence level. Beyond the standard framework, these experiments are also highly capable of searching for new physics beyond the Standard Model. In this work, we explore whether these high-precision future facilities are sensitive to new U(1) global symmetries, and to what confidence level. (author)

  19. Poor Baseline Pulmonary Function May Not Increase the Risk of Radiation-Induced Lung Toxicity

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Jingbo [Department of Radiation Oncology, University of Michigan/Ann Arbor Veterans Health System, Ann Arbor, Michigan (United States); Department of Radiation Oncology, Cancer Hospital, Chinese Academic Medical Sciences and Peking Union Medical College, Beijing (China); Cao, Jianzhong [Department of Radiation Oncology, Cancer Hospital, Chinese Academic Medical Sciences and Peking Union Medical College, Beijing (China); Yuan, Shuanghu [Department of Radiation Oncology, University of Michigan/Ann Arbor Veterans Health System, Ann Arbor, Michigan (United States); Ji, Wei [Department of Radiation Oncology, Cancer Hospital, Chinese Academic Medical Sciences and Peking Union Medical College, Beijing (China); Arenberg, Douglas [Department of Internal Medicine, University of Michigan/Ann Arbor Veterans Health System, Ann Arbor, Michigan (United States); Dai, Jianrong [Department of Radiation Oncology, Cancer Hospital, Chinese Academic Medical Sciences and Peking Union Medical College, Beijing (China); Stanton, Paul; Tatro, Daniel; Ten Haken, Randall K. [Department of Radiation Oncology, University of Michigan/Ann Arbor Veterans Health System, Ann Arbor, Michigan (United States); Wang, Luhua, E-mail: wlhwq@yahoo.com [Department of Radiation Oncology, Cancer Hospital, Chinese Academic Medical Sciences and Peking Union Medical College, Beijing (China); Kong, Feng-Ming, E-mail: fengkong@med.umich.edu [Department of Radiation Oncology, University of Michigan/Ann Arbor Veterans Health System, Ann Arbor, Michigan (United States)

    2013-03-01

    Purpose: Poor pulmonary function (PF) is often considered a contraindication to definitive radiation therapy for lung cancer. This study investigated whether baseline PF was associated with radiation-induced lung toxicity (RILT) in patients with non-small cell lung cancer (NSCLC) receiving conformal radiation therapy (CRT). Methods and Materials: NSCLC patients treated with CRT and tested for PF at baseline were eligible. Baseline predicted values of forced expiratory volume in 1 sec (FEV1), forced vital capacity (FVC), and diffusion capacity of lung for carbon monoxide (DLCO) were analyzed. Additional factors, including age, gender, smoking status, Karnofsky performance status, coexisting chronic obstructive pulmonary disease (COPD), tumor location, histology, concurrent chemotherapy, radiation dose, and mean lung dose (MLD), were evaluated for RILT. The primary endpoint was symptomatic RILT (SRILT), including grade ≥2 radiation pneumonitis and fibrosis. Results: A total of 260 patients were included, and SRILT occurred in 58 (22.3%) of them. Mean FEV1 values for SRILT and non-SRILT patients were 71.7% and 65.9% (P=.077). Under univariate analysis, risk of SRILT increased with MLD (P=.008), the absence of COPD (P=.047), and FEV1 (P=.077). Age (split at 65 years) and MLD were significantly associated with SRILT in multivariate analysis. The addition of FEV1 and age to the MLD-based model slightly improved the predictability of SRILT (area under the curve from 0.63 to 0.70, P=.088). Conclusions: Poor baseline PF does not increase the risk of SRILT, and combining FEV1, age, and MLD may improve the predictive ability.
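The area-under-the-curve values quoted in the abstract (0.63 to 0.70) can be computed from predicted risks via the Mann-Whitney formulation of the ROC AUC. A minimal sketch with made-up risk scores, not the study's data:

```python
def roc_auc(scores_pos, scores_neg):
    """AUC as the Mann-Whitney statistic: the probability that a randomly
    chosen positive case scores higher than a randomly chosen negative
    case, with ties counting one half."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical predicted SRILT risks for toxic vs. non-toxic patients.
auc = roc_auc([0.9, 0.8, 0.6], [0.1, 0.2, 0.7])  # 8 of 9 pairs ranked correctly
```

An AUC of 0.5 corresponds to a useless predictor; the abstract's improvement from 0.63 to 0.70 reflects more correctly ranked patient pairs after adding FEV1 and age to the MLD-based model.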

  20. Network meta-analysis of disconnected networks: How dangerous are random baseline treatment effects?

    Science.gov (United States)

    Béliveau, Audrey; Goring, Sarah; Platt, Robert W; Gustafson, Paul

    2017-12-01

    In network meta-analysis, the use of fixed baseline treatment effects (a priori independent) in a contrast-based approach is regularly preferred to the use of random baseline treatment effects (a priori dependent), because often there is no need to model baseline treatment effects, which carry the risk of model misspecification. However, in disconnected networks, fixed baseline treatment effects do not work (unless extra assumptions are made), as there is not enough information in the data to update the prior distribution on the contrasts between disconnected treatments. In this paper, we investigate to what extent the use of random baseline treatment effects is dangerous in disconnected networks. We take 2 publicly available datasets of connected networks and disconnect them in multiple ways. We then compare the results of treatment comparisons obtained from a Bayesian contrast-based analysis of each disconnected network using random normally distributed and exchangeable baseline treatment effects to those obtained from a Bayesian contrast-based analysis of their initial connected network using fixed baseline treatment effects. For the 2 datasets considered, we found that the use of random baseline treatment effects in disconnected networks was appropriate. Because those datasets were not cherry-picked, there should be other disconnected networks that would benefit from being analyzed using random baseline treatment effects. However, there is also a risk that the normality and exchangeability assumptions are inappropriate in other datasets, even though we did not observe this situation in our case study. We provide code, so other datasets can be investigated. Copyright © 2017 John Wiley & Sons, Ltd.

  1. An Energy Efficiency Evaluation Method Based on Energy Baseline for Chemical Industry

    Directory of Open Access Journals (Sweden)

    Dong-mei Yao

    2016-01-01

    Full Text Available According to the requirements and structure of the ISO 50001 energy management system, this study proposes an energy efficiency evaluation method based on an energy baseline for the chemical industry. Using this method, the effect of energy plan implementation in chemical production processes can be evaluated quantitatively, and evidence for system fault diagnosis can be provided. The method establishes energy baseline models that can meet the demands of different kinds of production processes and gives a general solution method for each kind of model based on production data. The effect of energy plan implementation can then be evaluated, and whether the system is running normally can be determined, through the baseline model. Finally, the method is applied to the cracked gas compressor unit of an ethylene plant in a petrochemical enterprise, demonstrating that the method is correct and practical.
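A single-driver linear baseline is the simplest instance of the kind of energy baseline model such a method builds per production process. The driver variable, model form, and monthly figures below are hypothetical illustrations, not taken from the paper:

```python
import numpy as np

def fit_energy_baseline(driver, energy):
    """Least-squares energy baseline E = a * driver + b, fitted on
    pre-intervention data (ISO 50001-style baseline period)."""
    a, b = np.polyfit(driver, energy, 1)
    return a, b

def savings(driver, energy, a, b):
    """Baseline-expected minus actual energy; positive means savings."""
    expected = a * np.asarray(driver, float) + b
    return float(np.sum(expected - np.asarray(energy, float)))

# Hypothetical baseline-period data: throughput (t) vs energy (MWh).
throughput = [100.0, 120.0, 140.0, 160.0]
energy = [210.0, 250.0, 290.0, 330.0]  # exactly 2*t + 10, for illustration
a, b = fit_energy_baseline(throughput, energy)

# Post-implementation months evaluated against the baseline.
post_saving = savings([110.0, 130.0], [220.0, 260.0], a, b)
```

Large deviations in the opposite direction (actual far above baseline) would serve as the fault-diagnosis evidence the abstract mentions.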

  2. Retrieving CO concentrations from FT-IR spectra with nonmodeled interferences and fluctuating baselines using PCR model parameters

    DEFF Research Database (Denmark)

    Bak, J.

    2001-01-01

    It is demonstrated that good predictions of gas concentrations based on measured spectra can be made even if these spectra contain totally overlapping spectral features from nonidentified and non-modeled interfering compounds and fluctuating baselines. The prediction program (CONTOUR) is based solely on principal component regression (PCR) model parameters. CONTOUR consists of two smaller algorithms. The first of these is used to calculate pure component spectra based on the PCR model parameters at different concentrations. In the second algorithm, the calculated pure component spectra... remains. The assumptions are that the background and analytical signals must be additive and that no accidental match between these signals takes place. The best results are obtained with the use of spectra with a high selectivity. The use of the program is demonstrated by applying simple single...
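The PCR model parameters such a program builds on can be illustrated with a bare-bones principal component regression. This sketch is a generic PCR fit on synthetic data, not the CONTOUR algorithm itself:

```python
import numpy as np

def pcr_fit(X, y, n_components):
    """Principal component regression: regress y on the leading PCA scores
    of X, then map the coefficients back to the original variable space."""
    X = np.asarray(X, float)
    y = np.asarray(y, float)
    x_mean = X.mean(axis=0)
    y_mean = y.mean()
    Xc = X - x_mean
    # Rows of Vt are the principal directions of the centered data.
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    V = Vt[:n_components].T           # loadings, shape (n_vars, n_components)
    scores = Xc @ V                   # projections onto the components
    gamma, *_ = np.linalg.lstsq(scores, y - y_mean, rcond=None)
    beta = V @ gamma                  # coefficients in original space
    intercept = y_mean - x_mean @ beta
    return beta, intercept

# Tiny synthetic "spectra": two channels, with y linear in channel 0.
X = np.array([[1.0, 0.5], [2.0, 1.1], [3.0, 1.4], [4.0, 2.1]])
y = np.array([2.0, 4.0, 6.0, 8.0])
beta, b0 = pcr_fit(X, y, n_components=2)
pred = X @ beta + b0
```

With all components retained, PCR reduces to ordinary least squares; using fewer components discards directions dominated by baseline drift and noise, which is what makes PCR parameters useful for spectra with interferences.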

  3. Malignancy risk estimation of screen-detected nodules at baseline CT: comparison of the PanCan model, Lung-RADS and NCCN guidelines

    Energy Technology Data Exchange (ETDEWEB)

    Riel, Sarah J. van; Ciompi, Francesco; Jacobs, Colin; Scholten, Ernst T.; Prokop, Mathias; Ginneken, Bram van [Radboud University Nijmegen Medical Center, Department of Radiology and Nuclear Medicine, Nijmegen (Netherlands); Winkler Wille, Mathilde M.; Naqibullah, Matiullah [University of Copenhagen, Department of Pulmonology Gentofte Hospital, Hellerup (Denmark); Lam, Stephen [British Columbia Cancer Agency, Department of Integrative Oncology, Vancouver, British Columbia (Canada); Schaefer-Prokop, Cornelia [Radboud University Nijmegen Medical Center, Department of Radiology and Nuclear Medicine, Nijmegen (Netherlands); Meander Medical Center, Department of Radiology, Amersfoort (Netherlands)

    2017-10-15

    To compare the PanCan model, Lung-RADS and the 1.2016 National Comprehensive Cancer Network (NCCN) guidelines for discriminating malignant from benign pulmonary nodules on baseline screening CT scans, and the impact diameter measurement methods have on performances. From the Danish Lung Cancer Screening Trial database, 64 CTs with malignant nodules and 549 baseline CTs with benign nodules were included. Performance of the systems was evaluated applying the systems' original diameter definitions: D^longest-C (PanCan) and D^meanAxial (NCCN), both obtained from axial sections, and D^mean3D (Lung-RADS). Subsequently all diameter definitions were applied uniformly to all systems. Areas under the ROC curves (AUC) were used to evaluate risk discrimination. PanCan performed superiorly to Lung-RADS and NCCN (AUC 0.874 vs. 0.813, p = 0.003; 0.874 vs. 0.836, p = 0.010), using the original diameter specifications. When uniformly applying D^longest-C, D^mean3D and D^meanAxial, PanCan remained superior to Lung-RADS (p < 0.001 - p = 0.001) and NCCN (p < 0.001 - p = 0.016). Diameter definition significantly influenced NCCN's performance, with D^longest-C being the worst (D^longest-C vs. D^mean3D, p = 0.005; D^longest-C vs. D^meanAxial, p = 0.016). Without follow-up information, the PanCan model performs significantly superiorly to Lung-RADS and the 1.2016 NCCN guidelines for discriminating benign from malignant nodules. The NCCN guidelines are most sensitive to nodule size definition. (orig.)

  4. Choice of baseline climate data impacts projected species' responses to climate change.

    Science.gov (United States)

    Baker, David J; Hartley, Andrew J; Butchart, Stuart H M; Willis, Stephen G

    2016-07-01

    Climate data created from historic climate observations are integral to most assessments of potential climate change impacts, and frequently comprise the baseline period used to infer species-climate relationships. They are often also central to downscaling coarse resolution climate simulations from General Circulation Models (GCMs) to project future climate scenarios at ecologically relevant spatial scales. Uncertainty in these baseline data can be large, particularly where weather observations are sparse and climate dynamics are complex (e.g. over mountainous or coastal regions). Yet, importantly, this uncertainty is almost universally overlooked when assessing potential responses of species to climate change. Here, we assessed the importance of historic baseline climate uncertainty for projections of species' responses to future climate change. We built species distribution models (SDMs) for 895 African bird species of conservation concern, using six different climate baselines. We projected these models to two future periods (2040-2069, 2070-2099), using downscaled climate projections, and calculated species turnover and changes in species-specific climate suitability. We found that the choice of baseline climate data constituted an important source of uncertainty in projections of both species turnover and species-specific climate suitability, often comparable with, or more important than, uncertainty arising from the choice of GCM. Importantly, the relative contribution of these factors to projection uncertainty varied spatially. Moreover, when projecting SDMs to sites of biodiversity importance (Important Bird and Biodiversity Areas), these uncertainties altered site-level impacts, which could affect conservation prioritization. Our results highlight that projections of species' responses to climate change are sensitive to uncertainty in the baseline climatology. We recommend that this should be considered routinely in such analyses. © 2016 John Wiley

  5. The Baselines Project: Establishing Reference Environmental Conditions for Marine Habitats in the Gulf of Mexico using Forecast Models and Satellite Data

    Science.gov (United States)

    Jolliff, J. K.; Gould, R. W.; deRada, S.; Teague, W. J.; Wijesekera, H. W.

    2012-12-01

    We provide an overview of the NASA-funded project, "High-Resolution Subsurface Physical and Optical Property Fields in the Gulf of Mexico: Establishing Baselines and Assessment Tools for Resource Managers." Data-assimilative models, analysis fields, and multiple satellite data streams were used to construct temperature and photon flux climatologies for the Flower Garden Banks National Marine Sanctuary (FGBNMS) and similar habitats in the northwestern Gulf of Mexico, where geologic features provide a platform for unique coral reef ecosystems. Metrics comparing the products to in situ data collected during complementary projects are also examined. Similarly, high-resolution satellite data streams and advanced processing techniques were used to establish baseline suspended sediment load and turbidity conditions in selected northern Gulf of Mexico estuaries. The results demonstrate the feasibility of blending models and data into accessible web-based analysis products for resource managers, policy makers, and the public.

  6. Unsteady panel method for complex configurations including wake modeling

    CSIR Research Space (South Africa)

    Van Zyl, Lourens H

    2008-01-01

    Full Text Available Implementations of the DLM are, however, not very versatile in terms of the geometries that can be modeled. The ZONA6 code offers a versatile surface panel body model including a separated wake model, but uses a pressure panel method for lifting surfaces. This paper...

  7. Perceived Family Functioning Predicts Baseline Psychosocial Characteristics in U.S. Participants of a Family Focused Grief Therapy Trial.

    Science.gov (United States)

    Schuler, Tammy A; Zaider, Talia I; Li, Yuelin; Masterson, Melissa; McDonnell, Glynnis A; Hichenberg, Shira; Loeb, Rebecca; Kissane, David W

    2017-07-01

Screening and baseline data on 170 American families (620 individuals), selected by screening from a palliative care population for inclusion in a randomized controlled trial of family-focused grief therapy, were examined to determine whether family dysfunction conferred higher levels of psychosocial morbidity. We hypothesized that greater family dysfunction would, indeed, be associated with poorer psychosocial outcomes among palliative care patients and their family members. Screened families were classified according to their functioning on the Family Relationships Index (FRI), and consented families completed baseline assessments. Mixed-effects modeling with post hoc tests compared individuals' baseline psychosocial outcomes (psychological distress, social functioning, and family functioning on a different measure) according to the classification of their family on the FRI. Covariates were included in all models as appropriate. Of those who completed baseline measures, 191 (30.0%) individuals were in low-communicating families, 313 (50.5%) in uninvolved families, and 116 (18.7%) in conflictual families. Family class was significantly associated (at p ≤ 0.05) with increased psychological distress (Beck Depression Inventory and Brief Symptom Inventory) and poorer social adjustment (Social Adjustment Scale) for individual family members. The Family Assessment Device supported the concurrent accuracy of the FRI. As predicted, significantly greater levels of individual psychosocial morbidity were present in American families whose functioning as a group was poorer. Support was generated for a clinical approach that screens families to identify those at high risk. Overall, these baseline data point to the importance of a family-centered model of care. Copyright © 2017 American Academy of Hospice and Palliative Medicine. Published by Elsevier Inc. All rights reserved.
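
The mixed-effects modeling described above can be sketched in Python. This is an illustrative toy example on simulated data (the variable names, effect sizes, and three-class structure are assumptions for illustration, not the study's data), using statsmodels' `mixedlm` with a random intercept per family:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_fam, per_fam = 60, 3
fam = np.repeat(np.arange(n_fam), per_fam)
fam_class = np.repeat(rng.choice(["low_comm", "uninvolved", "conflictual"], n_fam), per_fam)
# Shared random intercept per family, plus an assumed class-level shift in distress
fam_effect = np.repeat(rng.normal(0.0, 1.0, n_fam), per_fam)
shift = {"low_comm": 0.0, "uninvolved": 0.5, "conflictual": 1.5}
distress = np.array([shift[c] for c in fam_class]) + fam_effect + rng.normal(0.0, 1.0, n_fam * per_fam)
df = pd.DataFrame({"family": fam, "fam_class": fam_class, "distress": distress})

# Random-intercept model: distress explained by family class, grouped by family
model = smf.mixedlm("distress ~ C(fam_class)", df, groups=df["family"]).fit()
print(model.params)
```

In the actual study, covariates and post hoc contrasts would be added on top of this basic structure.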

  8. Baseline Optimization for the Measurement of CP Violation, Mass Hierarchy, and $\\theta_{23}$ Octant in a Long-Baseline Neutrino Oscillation Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Bass, M. [Colorado State U.; Bishai, M. [Brookhaven; Cherdack, D. [Colorado State U.; Diwan, M. [Brookhaven; Djurcic, Z. [Argonne; Hernandez, J. [Houston U.; Lundberg, B. [Fermilab; Paolone, V. [Pittsburgh U.; Qian, X. [Brookhaven; Rameika, R. [Fermilab; Whitehead, L. [Houston U.; Wilson, R. J. [Colorado State U.; Worcester, E. [Brookhaven; Zeller, G. [Fermilab

    2015-03-19

    Next-generation long-baseline electron neutrino appearance experiments will seek to discover CP violation, determine the mass hierarchy and resolve the θ23 octant. In light of the recent precision measurements of θ13, we consider the sensitivity of these measurements in a study to determine the optimal baseline, including practical considerations regarding beam and detector performance. We conclude that a detector at a baseline of at least 1000 km in a wide-band muon neutrino beam is the optimal configuration.

  9. DairyBISS Baseline report

    NARCIS (Netherlands)

    Buizer, N.N.; Berhanu, Tinsae; Murutse, Girmay; Vugt, van S.M.

    2015-01-01

This baseline report of the Dairy Business Information Service and Support (DairyBISS) project presents the findings of a baseline survey among 103 commercial farms and 31 firms and advisors working in the dairy value chain. Additional results from the survey among commercial dairy farms are also presented.

  10. Validity and Reliability of Baseline Testing in a Standardized Environment.

    Science.gov (United States)

    Higgins, Kathryn L; Caze, Todd; Maerlender, Arthur

    2017-08-11

The Immediate Postconcussion Assessment and Cognitive Testing (ImPACT) is a computerized neuropsychological test battery commonly used to determine cognitive recovery from concussion based on comparing post-injury scores to baseline scores. This model is based on the premise that ImPACT baseline test scores are a valid and reliable measure of optimal cognitive function at baseline. Growing evidence suggests that this premise may not be accurate, and that a large contributor to invalid and unreliable baseline test scores may be the protocol and environment in which baseline tests are administered. This study examined the effects of a standardized environment and administration protocol on the reliability and performance validity of athletes' baseline test scores on ImPACT by comparing scores obtained in two different group-testing settings. Three hundred sixty-one Division 1 cohort-matched collegiate athletes' baseline data were assessed using a variety of indicators of potential performance invalidity; internal reliability was also examined. Thirty-one to thirty-nine percent of the baseline cases had at least one indicator of low performance validity, but there were no significant differences in validity indicators based on the environment in which the testing was conducted. Internal consistency reliability scores were in the acceptable to good range, with no significant differences between administration conditions. These results suggest that athletes may be reliably performing at levels lower than their best effort would produce. © The Author 2017. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  11. Mechanical Thrombectomy in Elderly Stroke Patients with Mild-to-Moderate Baseline Disability.

    Science.gov (United States)

    Slawski, Diana E; Salahuddin, Hisham; Shawver, Julie; Kenmuir, Cynthia L; Tietjen, Gretchen E; Korsnack, Andrea; Zaidi, Syed F; Jumaa, Mouhammad A

    2018-04-01

The number of elderly patients suffering from ischemic stroke is rising. Randomized trials of mechanical thrombectomy (MT) generally exclude patients over the age of 80 years with baseline disability. The aim of this study was to understand the efficacy and safety of MT in elderly patients, many of whom may have baseline impairment. Between January 2015 and April 2017, 96 patients ≥80 years old who underwent MT for stroke were selected for a chart review. The data included baseline characteristics, time to treatment, the rate of revascularization, procedural complications, mortality, and 90-day good outcome defined as a modified Rankin Scale (mRS) score of 0-2 or return to baseline. Of the 96 patients, 50 had mild baseline disability (mRS score 0-1) and 46 had moderate disability (mRS score 2-4). Recanalization was achieved in 84% of the patients, and the rate of symptomatic hemorrhage was 6%. At 90 days, 34% of the patients had a good outcome. There were no significant differences in good outcome between those with mild and those with moderate baseline disability (43 vs. 24%, p = 0.08), between those aged ≤85 and those aged >85 years (40.8 vs. 26.1%, p = 0.19), and between those treated within and those treated beyond 8 h (39 vs. 20%, p = 0.1). The mortality rate was 38.5% at 90 days. The Alberta Stroke Program Early CT Score (ASPECTS) and the National Institutes of Health Stroke Scale (NIHSS) predicted good outcome regardless of baseline disability. Greater baseline disability and delayed treatment were associated with sub-optimal outcomes after MT. However, redefining good outcome to include return to baseline functioning demonstrates that one-third of this patient population benefits from MT, suggesting the real-life utility of this treatment.

  12. Baseline rationing

    DEFF Research Database (Denmark)

    Hougaard, Jens Leth; Moreno-Ternero, Juan D.; Østerdal, Lars Peter Raahave

The standard problem of adjudicating conflicting claims describes a situation in which a given amount of a divisible good has to be allocated among agents who hold claims against it exceeding the available amount. This paper considers more general rationing problems in which, in addition to claims, agents are assigned baselines that the allocation should take into account. Applications range from the allocation of emission permits under international protocols for the reduction of greenhouse emissions to water distribution in drought periods. We define a family of allocation methods for such general rationing problems - called baseline rationing rules - and provide an axiomatic characterization for it. Any baseline rationing rule within the family is associated with a standard rule, and we show that if the latter obeys some properties reflecting principles of impartiality, priority and solidarity, the former obeys them too.
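
One member of such a family can be sketched as follows: cover each agent's baseline first (truncated by the claim), then apply a standard rule (here the proportional rule) to the residual claims. This is a minimal illustration of the baseline-first idea under assumed conventions, not the paper's exact axiomatized family:

```python
def proportional(claims, amount):
    """Standard proportional rule: split `amount` in proportion to claims."""
    total = sum(claims)
    if total == 0:
        return [0.0] * len(claims)
    return [amount * c / total for c in claims]

def baseline_rationing(claims, baselines, endowment, rule=proportional):
    """Baseline-first rationing sketch: honor truncated baselines, then
    ration the residual claims with a standard rule."""
    b = [min(bi, ci) for bi, ci in zip(baselines, claims)]  # truncate baselines by claims
    if sum(b) >= endowment:
        # Endowment cannot even cover the baselines: ration the baselines themselves
        return rule(b, endowment)
    residual = [ci - bi for ci, bi in zip(claims, b)]
    extra = rule(residual, endowment - sum(b))
    return [bi + ei for bi, ei in zip(b, extra)]

# Claims [10, 6], baselines [4, 2], endowment 9: baselines absorb 6, residual 3 split 6:4
print(baseline_rationing([10, 6], [4, 2], 9))
```

The allocation always exhausts the endowment and never exceeds any agent's claim, which is the basic feasibility requirement of a rationing rule.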

  13. Reactive flow modeling of small scale detonation failure experiments for a baseline non-ideal explosive

    Energy Technology Data Exchange (ETDEWEB)

    Kittell, David E.; Cummock, Nick R.; Son, Steven F. [School of Mechanical Engineering, Purdue University, West Lafayette, Indiana 47907 (United States)

    2016-08-14

    Small scale characterization experiments using only 1–5 g of a baseline ammonium nitrate plus fuel oil (ANFO) explosive are discussed and simulated using an ignition and growth reactive flow model. There exists a strong need for the small scale characterization of non-ideal explosives in order to adequately survey the wide parameter space in sample composition, density, and microstructure of these materials. However, it is largely unknown in the scientific community whether any useful or meaningful result may be obtained from detonation failure, and whether a minimum sample size or level of confinement exists for the experiments. In this work, it is shown that the parameters of an ignition and growth rate law may be calibrated using the small scale data, which is obtained from a 35 GHz microwave interferometer. Calibration is feasible when the samples are heavily confined and overdriven; this conclusion is supported with detailed simulation output, including pressure and reaction contours inside the ANFO samples. The resulting shock wave velocity is most likely a combined chemical-mechanical response, and simulations of these experiments require an accurate unreacted equation of state (EOS) in addition to the calibrated reaction rate. Other experiments are proposed to gain further insight into the detonation failure data, as well as to help discriminate between the role of the EOS and reaction rate in predicting the measured outcome.

  14. The California Baseline Methane Survey

    Science.gov (United States)

    Duren, R. M.; Thorpe, A. K.; Hopkins, F. M.; Rafiq, T.; Bue, B. D.; Prasad, K.; Mccubbin, I.; Miller, C. E.

    2017-12-01

The California Baseline Methane Survey is the first systematic, statewide assessment of methane point source emissions. The objectives are to reduce uncertainty in the state's methane budget and to identify emission mitigation priorities for state and local agencies, utilities and facility owners. The project combines remote sensing of large areas with airborne imaging spectroscopy and spatially resolved bottom-up data sets to detect, quantify and attribute emissions from diverse sectors including agriculture, waste management, oil and gas production and the natural gas supply chain. Phase 1 of the project surveyed nearly 180,000 individual facilities and infrastructure components across California in 2016 - achieving completeness rates ranging from 20% to 100% per emission sector at < 5 meters spatial resolution. Additionally, intensive studies of key areas and sectors were performed to assess source persistence and variability at time scales ranging from minutes to months. Phase 2 of the project continues with additional data collection in Spring and Fall 2017. We describe the survey design and measurement, modeling and analysis methods. We present initial findings regarding the spatial, temporal and sectoral distribution of methane point source emissions in California and their estimated contribution to the state's total methane budget. We provide case studies and lessons learned about key sectors, including examples where super-emitters were identified and mitigated. We summarize challenges and recommendations for future methane research, inventories and mitigation guidance within and beyond California.

  15. Baseline restoration using current conveyors

    International Nuclear Information System (INIS)

    Morgado, A.M.L.S.; Simoes, J.B.; Correia, C.M.

    1996-01-01

Good performance of high resolution nuclear spectrometry systems at high pulse rates demands restoration of the baseline between pulses, in order to remove rate-dependent baseline shifts. This restoration is performed by circuits named baseline restorers (BLRs), which also remove low-frequency noise such as power supply hum and detector microphonics. This paper presents simple circuits for baseline restoration based on a commercial current conveyor (CCII01). Tests were performed on two circuits with periodic trapezoidal shaped pulses in order to measure the baseline restoration for several pulse rates and restorer duty cycles. For the current conveyor based Robinson restorer, the peak shift was less than 10 mV for duty cycles up to 60% at high pulse rates. Duty cycles up to 80% were also tested, with a maximum peak shift of 21 mV. The peak shift for the current conveyor based Grubic restorer was also measured; the maximum value found was 30 mV at 82% duty cycle. Keeping the duty cycle below 60% greatly improves the restorer's performance. The ability of both baseline restorer architectures to reject low-frequency modulation was also measured, with good results for both circuits.

  16. A publication database for optical long baseline interferometry

    Science.gov (United States)

    Malbet, Fabien; Mella, Guillaume; Lawson, Peter; Taillifet, Esther; Lafrasse, Sylvain

    2010-07-01

Optical long baseline interferometry is a technique that has generated almost 850 refereed papers to date. The targets span a large variety of objects from planetary systems to extragalactic studies and all branches of stellar physics. We have created a database hosted by the JMMC and connected to the Optical Long Baseline Interferometry Newsletter (OLBIN) web site, using MySQL and a collection of XML or PHP scripts in order to store and classify these publications. Each entry is defined by its ADS bibcode and includes basic ADS information and metadata. The metadata are specified by tags sorted into categories: interferometric facilities, instrumentation, wavelength of operation, spectral resolution, type of measurement, target type, and paper category, for example. The whole OLBIN publication list has been processed, and we present how the database is organized and can be accessed. We use this tool to generate statistical plots of interest for the community in optical long baseline interferometry.
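
A relational layout of this kind (papers keyed by ADS bibcode, tags sorted into categories) might be sketched as follows. The schema below is illustrative SQLite, not the actual JMMC MySQL schema, and the example bibcode and tag values are made up:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE paper (
    bibcode TEXT PRIMARY KEY,   -- ADS bibcode identifies each entry
    title   TEXT,
    year    INTEGER
);
CREATE TABLE tag (
    id       INTEGER PRIMARY KEY,
    category TEXT,              -- e.g. facility, instrumentation, target type
    label    TEXT,
    UNIQUE (category, label)
);
CREATE TABLE paper_tag (
    bibcode TEXT REFERENCES paper(bibcode),
    tag_id  INTEGER REFERENCES tag(id),
    PRIMARY KEY (bibcode, tag_id)
);
""")
cur.execute("INSERT INTO paper VALUES ('2010ApJ...000..001X', 'Example interferometry paper', 2010)")
cur.execute("INSERT INTO tag (category, label) VALUES ('facility', 'VLTI')")
cur.execute("INSERT INTO paper_tag SELECT '2010ApJ...000..001X', id FROM tag WHERE label = 'VLTI'")

# Query all papers tagged with a given facility
rows = cur.execute("""
    SELECT p.title FROM paper p
    JOIN paper_tag pt ON pt.bibcode = p.bibcode
    JOIN tag t ON t.id = pt.tag_id
    WHERE t.category = 'facility' AND t.label = 'VLTI'
""").fetchall()
print(rows)
```

The many-to-many `paper_tag` table is what makes per-category statistical plots (papers per facility, per target type, per year) straightforward `GROUP BY` queries.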

  17. Large short-baseline νμ disappearance

    International Nuclear Information System (INIS)

    Giunti, Carlo; Laveder, Marco

    2011-01-01

We analyze the LSND, KARMEN, and MiniBooNE data on short-baseline νμ → νe oscillations and the data on short-baseline νe disappearance obtained in the Bugey-3 and CHOOZ reactor experiments in the framework of 3+1 antineutrino mixing, taking into account the MINOS observation of long-baseline νμ disappearance and the KamLAND observation of very-long-baseline νe disappearance. We show that the fit of the data implies that the short-baseline disappearance of νμ is relatively large. We obtain a prediction of an effective amplitude sin²2θμμ ≳ 0.1 for short-baseline νμ disappearance generated by 0.2 eV² ≲ Δm² ≲ 2 eV², which could be measured in future experiments.
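
For context, the effective short-baseline disappearance probability in a two-flavor approximation can be computed as follows. The parameter values below are merely indicative of the quoted ranges, not fit results:

```python
import numpy as np

def survival_prob(sin2_2theta, dm2_ev2, L_km, E_GeV):
    """Two-flavor nu_mu survival probability.
    The factor 1.27 converts dm^2 [eV^2] * L [km] / E [GeV] to a phase in radians."""
    return 1.0 - sin2_2theta * np.sin(1.27 * dm2_ev2 * L_km / E_GeV) ** 2

# Short-baseline regime suggested by a 3+1 fit: sin^2(2theta_mumu) ~ 0.1, dm^2 ~ 1 eV^2
p = survival_prob(0.1, 1.0, L_km=0.5, E_GeV=1.0)
print(p)
```

At a 500 m baseline and 1 GeV, an amplitude of 0.1 produces a percent-level deficit, which sets the sensitivity scale for the future experiments mentioned.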

  18. Unintended cultivation, shifting baselines, and conflict between objectives for fisheries and conservation.

    Science.gov (United States)

    Brown, Christopher J; Trebilco, Rowan

    2014-06-01

    The effects of fisheries on marine ecosystems, and their capacity to drive shifts in ecosystem states, have been widely documented. Less well appreciated is that some commercially valuable species respond positively to fishing-induced ecosystem change and can become important fisheries resources in modified ecosystems. Thus, the ecological effects of one fishery can unintentionally increase the abundance and productivity of other fished species (i.e., cultivate). We reviewed examples of this effect in the peer-reviewed literature. We found 2 underlying ecosystem drivers of the effect: trophic release of prey species when predators are overfished and habitat change. Key ecological, social, and economic conditions required for one fishery to unintentionally cultivate another include strong top-down control of prey by predators, the value of the new fishery, and the capacity of fishers to adapt to a new fishery. These unintended cultivation effects imply strong trade-offs between short-term fishery success and conservation efforts to restore ecosystems toward baseline conditions because goals for fisheries and conservation may be incompatible. Conflicts are likely to be exacerbated if fisheries baselines shift relative to conservation baselines and there is investment in the new fishery. However, in the long-term, restoration toward ecosystem baselines may often benefit both fishery and conservation goals. Unintended cultivation can be identified and predicted using a combination of time-series data, dietary studies, models of food webs, and socioeconomic data. Identifying unintended cultivation is necessary for management to set compatible goals for fisheries and conservation. © 2014 Society for Conservation Biology.

  19. Stochastic modelling of two-phase flows including phase change

    International Nuclear Information System (INIS)

    Hurisse, O.; Minier, J.P.

    2011-01-01

    Stochastic modelling has already been developed and applied for single-phase flows and incompressible two-phase flows. In this article, we propose an extension of this modelling approach to two-phase flows including phase change (e.g. for steam-water flows). Two aspects are emphasised: a stochastic model accounting for phase transition and a modelling constraint which arises from volume conservation. To illustrate the whole approach, some remarks are eventually proposed for two-fluid models. (authors)

  20. Leveraging probabilistic peak detection to estimate baseline drift in complex chromatographic samples.

    Science.gov (United States)

    Lopatka, Martin; Barcaru, Andrei; Sjerps, Marjan J; Vivó-Truyols, Gabriel

    2016-01-29

Accurate analysis of chromatographic data often requires the removal of baseline drift. A frequently employed strategy strives to determine asymmetric weights in order to fit a baseline model by regression. Unfortunately, chromatograms characterized by a very high peak saturation pose a significant challenge to such algorithms, and a low signal-to-noise ratio makes reliable baseline estimation harder still. We address this by leveraging a probabilistic peak detection algorithm. A posterior probability of being affected by a peak is computed for each point in the chromatogram, leading to a set of weights that allow non-iterative calculation of a baseline estimate. For extremely saturated chromatograms, the peak weighted (PW) method demonstrates notable improvement compared to the other methods examined. However, in chromatograms characterized by low noise and well-resolved peaks, the asymmetric least squares (ALS) and the more sophisticated Mixture Model (MM) approaches achieve superior results in significantly less time. We evaluate the performance of these three baseline correction methods over a range of chromatographic conditions to demonstrate the cases in which each method is most appropriate. Copyright © 2016 Elsevier B.V. All rights reserved.
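
As a point of comparison, the ALS approach mentioned above iteratively down-weights points lying above the current fit so that peaks do not pull the baseline upward. A minimal sketch (following the commonly cited Eilers and Boelens formulation, with assumed smoothness and asymmetry parameters) is:

```python
import numpy as np
from scipy import sparse
from scipy.sparse.linalg import spsolve

def als_baseline(y, lam=1e5, p=0.01, n_iter=10):
    """Asymmetric least squares baseline: minimize w-weighted misfit plus
    lam * (second-difference roughness), re-deriving asymmetric weights each pass."""
    n = len(y)
    D = sparse.diags([1, -2, 1], [0, 1, 2], shape=(n - 2, n))  # second-difference operator
    w = np.ones(n)
    for _ in range(n_iter):
        W = sparse.diags(w)
        z = spsolve((W + lam * D.T @ D).tocsc(), w * y)
        w = np.where(y > z, p, 1 - p)  # down-weight points above the fit (likely peaks)
    return z

# Synthetic chromatogram: linear drift plus one narrow Gaussian peak
x = np.linspace(0, 1, 500)
drift = 2.0 + 1.5 * x
peak = np.exp(-0.5 * ((x - 0.5) / 0.02) ** 2)
y = drift + peak
base = als_baseline(y)
print(np.abs(base - drift).max())
```

As the abstract notes, this works well for well-resolved peaks but degrades as peak saturation grows, which is the regime where the probabilistic PW weighting pays off.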

  1. Baseline Screening Mammography: Performance of Full-Field Digital Mammography Versus Digital Breast Tomosynthesis.

    Science.gov (United States)

    McDonald, Elizabeth S; McCarthy, Anne Marie; Akhtar, Amana L; Synnestvedt, Marie B; Schnall, Mitchell; Conant, Emily F

    2015-11-01

Baseline mammography studies have significantly higher recall rates than mammography studies with available comparison examinations. Digital breast tomosynthesis reduces recalls when compared with digital mammographic screening alone, but many sites operate in a hybrid environment. To maximize the effect of screening digital breast tomosynthesis with limited resources, choosing which patient populations will benefit most is critical. This study evaluates digital breast tomosynthesis in the baseline screening population. Outcomes were compared for 10,728 women who underwent digital mammography screening, including 1204 (11.2%) baseline studies, and 15,571 women who underwent digital breast tomosynthesis screening, including 1859 (11.9%) baseline studies. Recall rates, cancer detection rates, and positive predictive values were calculated. Logistic regression estimated the odds ratios of recall for digital mammography versus digital breast tomosynthesis for patients undergoing baseline screening and previously screened patients, adjusted for age, race, and breast density. In the baseline subgroup, recall rates for digital mammography and digital breast tomosynthesis screening were 20.5% and 16.0%, respectively (p = 0.002); digital breast tomosynthesis screening in the baseline subgroup resulted in a 22% reduction in recall compared with digital mammography, or 45 fewer patients recalled per 1000 patients screened. Digital breast tomosynthesis screening in the previously screened patients resulted in a recall reduction of 14.3%. Women undergoing baseline screening may therefore benefit even more from digital breast tomosynthesis than from digital mammography alone.
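
From the baseline-subgroup rates reported above, an unadjusted odds ratio of recall (digital mammography vs. digital breast tomosynthesis) can be approximated. Note that the study itself used logistic regression adjusted for age, race, and breast density, so this back-of-the-envelope sketch will not match the published adjusted estimates:

```python
def odds_ratio(events_a, n_a, events_b, n_b):
    """Unadjusted odds ratio of an event in group A relative to group B."""
    odds_a = events_a / (n_a - events_a)
    odds_b = events_b / (n_b - events_b)
    return odds_a / odds_b

# Approximate recall counts reconstructed from the reported subgroup rates
dm_recalls = round(0.205 * 1204)   # digital mammography, baseline subgroup
dbt_recalls = round(0.160 * 1859)  # digital breast tomosynthesis, baseline subgroup
or_dm_vs_dbt = odds_ratio(dm_recalls, 1204, dbt_recalls, 1859)
print(or_dm_vs_dbt)
```

An odds ratio above 1 here simply restates that digital mammography recalled baseline patients more often; adjustment for covariates is what the logistic regression adds.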

  2. Very long baseline interferometry applied to polar motion, relativity, and geodesy. Ph.D. thesis

    International Nuclear Information System (INIS)

    Ma, C.

    1978-01-01

    The causes and effects of diurnal polar motion are described. An algorithm was developed for modeling the effects on very long baseline interferometry observables. A selection was made between two three-station networks for monitoring polar motion. The effects of scheduling and the number of sources observed on estimated baseline errors are discussed. New hardware and software techniques in very long baseline interferometry are described

  3. Environmental Baseline File National Transportation

    International Nuclear Information System (INIS)

    Harris, M.

    1999-01-01

    This Environmental Baseline File summarizes and consolidates information related to the national-level transportation of commercial spent nuclear fuel. Topics addressed include: shipments of commercial spent nuclear fuel based on mostly truck and mostly rail shipping scenarios; transportation routing for commercial spent nuclear fuel sites and DOE sites; radionuclide inventories for various shipping container capacities; transportation routing; populations along transportation routes; urbanized area population densities; the impacts of historical, reasonably foreseeable, and general transportation; state-level food transfer factors; Federal Guidance Report No. 11 and 12 radionuclide dose conversion factors; and national average atmospheric conditions

  4. A funding model for health visiting: baseline requirements--part 1.

    Science.gov (United States)

    Cowley, Sarah

    2007-11-01

    A funding model proposed in two papers will outline the health visiting resource, including team skill mix, required to deliver the recommended approach of 'progressive universalism,' taking account of health inequalities, best evidence and impact on outcomes that might be anticipated. The model has been discussed as far as possible across the professional networks of both the Community Practitioners' and Health Visitors' Association (CPHVA) and United Kingdom Public Health Association (UKPHA), and is a consensus statement agreed by all who have participated.

  5. Baseline Neurocognitive Test Results In Non-concussed Athletes: Does Sleep Matter?

    OpenAIRE

    McClure, D. Jake; Zuckerman, Scott L.; Kutscher, Scott J.; Gregory, Andrew; Solomon, Gary S.

    2013-01-01

Objectives: When managing sport-related concussions (SRC), sports medicine physicians utilize serial neurocognitive assessments and self-reported symptom inventories when evaluating athlete recovery and safety for returning to play (RTP). Since post-concussive RTP goals include symptom resolution and return to neurocognitive baseline, clinical decisions rest on an understanding of modifiers of baseline performance. Several studies have reported the influence of age, gender and sport on baseline performance.

  6. Modeling Electric Double-Layers Including Chemical Reaction Effects

    DEFF Research Database (Denmark)

    Paz-Garcia, Juan Manuel; Johannesson, Björn; Ottosen, Lisbeth M.

    2014-01-01

A physicochemical and numerical model for the transient formation of an electric double-layer between an electrolyte and a chemically-active flat surface is presented, based on a finite elements integration of the nonlinear Nernst-Planck-Poisson model including chemical reactions. The model works for symmetric and asymmetric multi-species electrolytes and is not limited to a range of surface potentials. Numerical simulations are presented for the case of a CaCO3 electrolyte solution in contact with a surface with rate-controlled protonation/deprotonation reactions. The surface charge and potential are determined by the surface reactions, and therefore they depend on the bulk solution composition and concentration.
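
The Nernst-Planck-Poisson system referred to above can be written, in a standard textbook form (the paper's exact boundary conditions and reaction kinetics may differ), with ci the species concentrations, zi their charge numbers, φ the electric potential, and ri the chemical reaction source terms:

```latex
\frac{\partial c_i}{\partial t}
  = \nabla \cdot \left[ D_i \left( \nabla c_i + \frac{z_i F}{R T}\, c_i \nabla \phi \right) \right] + r_i,
\qquad
\nabla \cdot \left( \varepsilon \nabla \phi \right) = -F \sum_i z_i c_i
```

The coupling is nonlinear because the migration term in each Nernst-Planck equation depends on the potential, which in turn depends on all the concentrations through the Poisson equation.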

  7. Why are models unable to reproduce multi-decadal trends in lower tropospheric baseline ozone levels?

    Science.gov (United States)

    Hu, L.; Liu, J.; Mickley, L. J.; Strahan, S. E.; Steenrod, S.

    2017-12-01

    Assessments of tropospheric ozone radiative forcing rely on accurate model simulations. Parrish et al (2014) found that three chemistry-climate models (CCMs) overestimate present-day O3 mixing ratios and capture only 50% of the observed O3 increase over the last five decades at 12 baseline sites in the northern mid-latitudes, indicating large uncertainties in our understanding of the ozone trends and their implications for radiative forcing. Here we present comparisons of outputs from two chemical transport models (CTMs) - GEOS-Chem and the Global Modeling Initiative model - with O3 observations from the same sites and from the global ozonesonde network. Both CTMs are driven by reanalysis meteorological data (MERRA or MERRA2) and thus are expected to be different in atmospheric transport processes relative to those freely running CCMs. We test whether recent model developments leading to more active ozone chemistry affect the computed ozone sensitivity to perturbations in emissions. Preliminary results suggest these CTMs can reproduce present-day ozone levels but fail to capture the multi-decadal trend since 1980. Both models yield widespread overpredictions of free tropospheric ozone in the 1980s. Sensitivity studies in GEOS-Chem suggest that the model estimate of natural background ozone is too high. We discuss factors that contribute to the variability and trends of tropospheric ozone over the last 30 years, with a focus on intermodel differences in spatial resolution and in the representation of stratospheric chemistry, stratosphere-troposphere exchange, halogen chemistry, and biogenic VOC emissions and chemistry. We also discuss uncertainty in the historical emission inventories used in models, and how these affect the simulated ozone trends.

  8. Addendum to the 2015 Eastern Interconnect Baselining and Analysis Report

    Energy Technology Data Exchange (ETDEWEB)

    Amidan, Brett G. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Follum, James D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-06-30

This report serves as an addendum to the 2015 Eastern Interconnect Baselining and Analysis Report (Amidan, Follum, and Freeman, 2015). This addendum investigates: the impact of shorter record lengths and of adding a daily regularization term to the date/time models for angle pair measurements; further development of a method to monitor trends in phase angle pairs; the effect of changing the length of time used to determine a baseline when calculating atypical events; and a comparison between quantitatively discovered atypical events and actual events.
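
The effect of baseline window length on atypical-event detection can be illustrated with a simple trailing-window z-score detector. This is a generic sketch, not PNNL's actual method; the 4-sigma threshold, window lengths, and injected excursion are arbitrary choices for illustration:

```python
import numpy as np

def atypical_events(x, window, k=4.0):
    """Flag indices deviating more than k sigma from a trailing baseline window."""
    flags = []
    for i in range(window, len(x)):
        base = x[i - window:i]
        mu, sigma = base.mean(), base.std()
        if sigma > 0 and abs(x[i] - mu) > k * sigma:
            flags.append(i)
    return flags

rng = np.random.default_rng(1)
angle = rng.normal(0.0, 0.1, 2000)   # synthetic phase-angle-pair residual
angle[1500] += 2.0                   # injected atypical excursion
short = atypical_events(angle, window=60)
long_ = atypical_events(angle, window=600)
print(short, long_)
```

A longer baseline gives more stable mean/sigma estimates but adapts more slowly to legitimate operating-point changes, which is exactly the trade-off the addendum examines.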

  9. 40 CFR 1042.825 - Baseline determination.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false Baseline determination. 1042.825... Provisions for Remanufactured Marine Engines § 1042.825 Baseline determination. (a) For the purpose of this... not valid. (f) Use good engineering judgment for all aspects of the baseline determination. We may...

  10. Ecological baseline studies in Los Alamos and Guaje Canyons County of Los Alamos, New Mexico. A two-year study

    Energy Technology Data Exchange (ETDEWEB)

Foxx, T.S. [comp.]

    1995-11-01

During the summers of 1993 and 1994, the Biological Resource Evaluations Team (BRET) of the Environmental Protection Group (ESH-8) conducted baseline studies within two canyon systems, Los Alamos and Guaje Canyons. Biological data were collected within each canyon to provide background and baseline information for Ecological Risk models. Baseline studies included the establishment of permanent vegetation plots within each canyon along the elevational gradient. Then, in association with the various vegetation types, surveys were conducted for ground-dwelling insects, birds, and small mammals. The stream channels associated with the permanent vegetation plots were characterized, and aquatic macroinvertebrates were collected within the stream monthly throughout a six-month period. The Global Positioning System (GPS) in combination with ARC INFO was used to map the study areas. Considerable data were collected during these surveys and are summarized in individual chapters.

  11. MODEL OF THE TOKAMAK EDGE DENSITY PEDESTAL INCLUDING DIFFUSIVE NEUTRALS

    International Nuclear Information System (INIS)

    BURRELL, K.H.

    2003-01-01

    OAK-B135 Several previous analytic models of the tokamak edge density pedestal have been based on diffusive transport of plasma plus free-streaming of neutrals. This latter neutral model includes only the effect of ionization and neglects charge exchange. The present work models the edge density pedestal using diffusive transport for both the plasma and the neutrals. In contrast to the free-streaming model, a diffusion model for the neutrals includes the effect of both charge exchange and ionization and is valid when charge exchange is the dominant interaction. Surprisingly, the functional forms for the electron and neutral density profiles from the present calculation are identical to the results of the previous analytic models. There are some differences in the detailed definition of various parameters in the solution. For experimentally relevant cases where ionization and charge exchange rate are comparable, both models predict approximately the same width for the edge density pedestal
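
In such a diffusion model, the neutral density n0 obeys a diffusion-ionization balance. Schematically (this is a generic sketch of the scalings involved, with assumed notation, not the paper's exact equations):

```latex
D_N \frac{d^2 n_0}{dx^2} = \nu_{\mathrm{ion}}\, n_0,
\qquad
D_N \sim \frac{v_{\mathrm{th}}^2}{\nu_{\mathrm{cx}} + \nu_{\mathrm{ion}}},
\qquad
n_0(x) \propto e^{-x/\lambda},\quad \lambda = \sqrt{D_N / \nu_{\mathrm{ion}}}
```

Because the neutral diffusivity DN contains the charge-exchange frequency νcx, this form retains the charge-exchange physics that the earlier free-streaming models (ionization only) omit, while still yielding the same exponential functional form for the edge profiles.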

  12. BASELINE DESIGN/ECONOMICS FOR ADVANCED FISCHER-TROPSCH TECHNOLOGY; FINAL

    International Nuclear Information System (INIS)

    None

    1998-01-01

    Bechtel, along with Amoco as the main subcontractor, developed a Baseline design, two alternative designs, and computer process simulation models for indirect coal liquefaction based on advanced Fischer-Tropsch (F-T) technology for the U. S. Department of Energy's (DOE's) Federal Energy Technology Center (FETC)

  13. The Dutch CAFE baseline: In or out of line?

    NARCIS (Netherlands)

    Jimmink BA; Folkert RJM; Thomas R; Beck JP; Eerdt MM van; Elzenga HE; Hoek KW van der; Hoen A; Peek CJ; LED; KMD; NMD; LVM; RIM; LDL

    2004-01-01

    The European Commission is constructing a strategy on air pollution within the Clean Air For Europe (CAFE) programme. This strategy will be based on assessments using the RAINS model for different policy ambitions where the CAFE baseline scenario and control strategies are employed. The Netherlands

  14. Base-line studies for DAE establishments

    International Nuclear Information System (INIS)

    Puranik, V.D.

    2012-01-01

    The Department of Atomic Energy has establishments located in various regions of the country, including front-end fuel cycle facilities, nuclear power stations, back-end fuel cycle facilities and facilities for research and societal applications. These facilities handle naturally occurring radionuclides such as uranium and thorium and a variety of man-made radionuclides. These radionuclides are handled with utmost care so that they do not adversely affect the occupational workers or the members of the public residing nearby. A safety culture of the highest standard exists in all DAE establishments, matching international standards. In addition, a perpetual environmental monitoring program is carried out by the Environmental Survey Laboratories (ESLs) located at all DAE establishments. The environmental data generated by this program are studied regularly by experts to ensure compliance with the regulatory requirements. The regulatory requirements in the country are of international standard and ensure adequate protection of workers and members of the public. In addition to such continued monitoring programs and the studies carried out for ongoing projects, base-line studies are carried out for all new DAE projects. The purpose of the base-line studies is to establish a detailed base-line data set for a new DAE location well before the foundation stone is laid, so that the data collected when there is no departmental activity can be compared with the data generated later by the ESL. The data so generated are site-specific and vary from place to place depending upon the location of the site (e.g., inland or coastal), the presence of water bodies and the pattern of irrigation, the geological characteristics of the location, the local culture and habits of the people, the population density, and the urban or rural background. The data to be recorded as base-line data are generated over a period of at least one year, covering all the seasons.

  15. Predicting Baseline for Analysis of Electricity Pricing

    Energy Technology Data Exchange (ETDEWEB)

    Kim, T. [Ulsan National Inst. of Science and Technology (Korea, Republic of); Lee, D. [Ulsan National Inst. of Science and Technology (Korea, Republic of); Choi, J. [Ulsan National Inst. of Science and Technology (Korea, Republic of); Spurlock, A. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Sim, A. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Todd, A. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Wu, K. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2016-05-03

    To understand the impact of a new pricing structure on residential electricity demand, we need a baseline model that captures every factor other than the new price. The standard baseline is a randomized control group; however, a good control group is hard to design. This motivates us to develop data-driven approaches. We explored many techniques and designed a strategy, named LTAP, that can predict hourly usage years ahead. The key challenge is that the daily cycle of electricity demand peaks a few hours after the temperature reaches its peak. Existing methods rely on lagged variables of recent past usage to enforce this daily cycle, and therefore have trouble making predictions years ahead. LTAP avoids this trouble by assuming that the daily usage profile is determined by temperature and other factors. In a comparison against a well-designed control group, LTAP is found to produce accurate predictions.
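The LTAP idea described above, regressing usage on temperature and the hour-of-day profile rather than on lagged usage, can be illustrated with a toy model. Everything below is simulated, and the feature set (24 hour-of-day dummies plus temperature) is our own simplifying assumption for illustration, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(2)
hod = np.arange(24 * 365) % 24  # hour of day for one year of hourly data
# synthetic temperature with a daily cycle plus weather noise
temp = 15 + 10 * np.sin(2 * np.pi * (hod - 9) / 24) + rng.normal(0, 1, hod.size)
profile = 1.0 + 0.5 * np.sin(2 * np.pi * (hod - 18) / 24)  # fixed daily usage shape
usage = profile + 0.05 * temp + rng.normal(0, 0.1, hod.size)

# Baseline model: hour-of-day dummies + temperature, no lagged-usage terms
X = np.column_stack([np.eye(24)[hod], temp])
beta = np.linalg.lstsq(X, usage, rcond=None)[0]

# A "future" year with fresh weather but the same underlying behaviour
temp2 = 15 + 10 * np.sin(2 * np.pi * (hod - 9) / 24) + rng.normal(0, 1, hod.size)
usage2 = profile + 0.05 * temp2 + rng.normal(0, 0.1, hod.size)
pred = np.column_stack([np.eye(24)[hod], temp2]) @ beta

rmse = np.sqrt(np.mean((pred - usage2) ** 2))
print(rmse)  # close to the 0.1 noise floor
```

Because no lagged-usage terms appear, the fitted model can be evaluated on any future period for which temperature data are available, which is what permits predictions years ahead.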

  16. MO-FG-CAMPUS-TeP3-01: A Model of Baseline Shift to Improve Robustness of Proton Therapy Treatments of Moving Tumors

    Energy Technology Data Exchange (ETDEWEB)

    Souris, K; Barragan Montero, A; Di Perri, D; Geets, X; Lee, J [Universite catholique de Louvain, Brussels (Belgium); Sterpin, E [Universite catholique de Louvain, Brussels (Belgium); KU Leuven, Leuven (Belgium)

    2016-06-15

    Purpose: The shift in the mean position of a moving tumor, also known as “baseline shift”, has been modeled in order to automatically generate uncertainty scenarios for the assessment and robust optimization of proton therapy treatments in lung cancer. Methods: An average CT scan and a Mid-Position CT scan (MidPCT) of the patient at planning time are first generated from 4D-CT data. The mean position of the tumor along the breathing cycle is represented by the GTV contour in the MidPCT. Several studies have reported both systematic and random variations of the mean tumor position from fraction to fraction. Our model can simulate this baseline shift by generating a local deformation field that moves the tumor on all phases of the 4D-CT without creating any non-physical artifact. The deformation field comprises normal and tangential components with respect to the lung wall in order to allow the tumor to slip within the lung instead of deforming the lung surface. The deformation field is eventually smoothed in order to enforce its continuity. Two 4D-CT series acquired one week apart were used to validate the model. Results: Based on the first 4D-CT set, the model was able to generate a third 4D-CT that reproduced the 5.8 mm baseline shift measured in the second 4D-CT. The water equivalent thickness (WET) of the voxels was computed for the three average CTs. The root mean square deviation of the WET in the GTV is 0.34 mm between week 1 and week 2, and 0.08 mm between the simulated data and week 2. Conclusion: Our model can be used to automatically generate uncertainty scenarios for robustness analysis of a proton therapy plan. The generated scenarios can also feed a TPS equipped with a robust optimizer. Kevin Souris, Ana Barragan, and Dario Di Perri are financially supported by Televie Grants from F.R.S.-FNRS.
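The normal/tangential split of the deformation field with respect to the lung wall is, at its core, a vector projection onto the local surface normal. A hypothetical sketch of that decomposition (not the authors' code; the shift and normal values are invented):

```python
import numpy as np

def split_shift(shift, surface_normal):
    """Split a baseline-shift vector into components normal and tangential
    to the lung wall, so the tangential part can slide the tumor along the
    wall without deforming the lung surface."""
    n = surface_normal / np.linalg.norm(surface_normal)
    normal_part = np.dot(shift, n) * n
    tangential_part = shift - normal_part
    return normal_part, tangential_part

shift = np.array([3.0, 4.0, 1.0])    # hypothetical baseline shift, mm
normal = np.array([0.0, 0.0, 1.0])   # hypothetical local lung-wall normal
n_part, t_part = split_shift(shift, normal)
print(n_part)  # [0. 0. 1.]
print(t_part)  # [3. 4. 0.]
```

The two parts sum back to the original shift, and the tangential part is orthogonal to the wall normal by construction.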

  17. Longitudinal predictive ability of mapping models: examining post-intervention EQ-5D utilities derived from baseline MHAQ data in rheumatoid arthritis patients.

    Science.gov (United States)

    Kontodimopoulos, Nick; Bozios, Panagiotis; Yfantopoulos, John; Niakas, Dimitris

    2013-04-01

    The purpose of this methodological study was to provide insight into the under-addressed issue of the longitudinal predictive ability of mapping models. Post-intervention predicted and reported utilities were compared, and the effect of disease severity on the observed differences was examined. A cohort of 120 rheumatoid arthritis (RA) patients (60.0% female, mean age 59.0) embarking on therapy with biological agents completed the Modified Health Assessment Questionnaire (MHAQ) and the EQ-5D at baseline, and at 3, 6 and 12 months post-intervention. OLS regression produced a mapping equation to estimate post-intervention EQ-5D utilities from baseline MHAQ data. Predicted and reported utilities were compared with a t test, and the prediction error was modeled, using fixed effects, in terms of covariates such as age, gender, time, disease duration, treatment, RF, DAS28 score, and predicted and reported EQ-5D. The OLS model (RMSE = 0.207, R2 = 45.2%) consistently underestimated future utilities, with a mean prediction error of 6.5%. Mean absolute differences between reported and predicted EQ-5D utilities at 3, 6 and 12 months exceeded the typically reported MID of the EQ-5D (0.03). According to the fixed-effects model, time, lower predicted EQ-5D and higher DAS28 scores had a significant impact on prediction errors, which appeared increasingly negative for lower reported EQ-5D scores, i.e., predicted utilities tended to be lower than reported ones in more severe health states. This study builds upon existing research having demonstrated the potential usefulness of mapping disease-specific instruments onto utility measures. The specific issue of longitudinal validity is addressed, as mapping models derived from baseline patients need to be validated on post-therapy samples. 
The underestimation of post-treatment utilities in the present study, at least in more severe patients, warrants further research before it is prudent to conduct cost-utility analyses in the context

  18. Baseline cerebral oximetry values depend on non-modifiable patient characteristics.

    Science.gov (United States)

    Valencia, Lucía; Rodríguez-Pérez, Aurelio; Ojeda, Nazario; Santana, Romen Yone; Morales, Laura; Padrón, Oto

    2015-12-01

    The aim of the present study was to evaluate baseline regional cerebral oxygen saturation (rSO2) values and identify factors influencing preoperative rSO2 in elective minor surgery. Observational post-hoc analysis of data for the patient sample (n=50) of a previously conducted clinical trial in patients undergoing tumourectomy for breast cancer or inguinal hernia repair. Exclusion criteria included pre-existing cerebrovascular disease and anaemia. Baseline rSO2 values were recorded while the patient breathed room air, using the INVOS 5100C monitor™ (Covidien, Dublin, Ireland). Thirty-seven women (72%) and 13 men (28%), 48 ± 13 years of age, were enrolled in this study. Baseline rSO2 was 62.01 ± 10.38%. Baseline rSO2 differed significantly between men (67.6 ± 11.2%) and women (60 ± 9.4%) (P=0.023). There were also differences in baseline rSO2 across ASA physical status classes (ASA I: 67.6 ± 10.7%, ASA II: 61.6 ± 8.4%, ASA III: 55.8 ± 13.9%; P=0.045). Baseline rSO2 had a positive correlation with body weight (r=0.347, P=0.014) and height (r=0.345, P=0.014). We also found significant differences in baseline rSO2 between patients with and without chronic renal failure (P=0.005). No differences were found in any other studied variables. Non-modifiable patient characteristics (ASA physical status, sex, chronic renal failure, body weight and height) influence baseline rSO2. Copyright © 2015 Société française d’anesthésie et de réanimation (Sfar). Published by Elsevier Masson SAS. All rights reserved.

  19. 33 CFR 2.20 - Territorial sea baseline.

    Science.gov (United States)

    2010-07-01

    33 CFR 2.20 - Territorial sea baseline (Navigation and Navigable Waters, Jurisdictional Terms). Territorial sea baseline means the line ... Normally, the territorial sea baseline is the mean low water line along the coast of the United States ...

  20. BALANCED SCORECARDS EVALUATION MODEL THAT INCLUDES ELEMENTS OF ENVIRONMENTAL MANAGEMENT SYSTEM USING AHP MODEL

    Directory of Open Access Journals (Sweden)

    Jelena Jovanović

    2010-03-01

    Full Text Available The research is oriented towards improvement of an environmental management system (EMS) using the BSC (Balanced Scorecard) model, which presents a strategic model for measurement and improvement of organisational performance. The research presents an approach for involving environmental management objectives and metrics (proposed by the literature review) in a conventional BSC in the AD "Barska plovidba" organisation. Further, we test the creation of an ECO-BSC model based on the business activities of non-profit organisations in order to improve the environmental management system in parallel with other management systems. Using this approach we may obtain four models of BSC that include elements of an environmental management system for AD "Barska plovidba". Taking into account that implementation and evaluation need a long period of time in AD "Barska plovidba", the final choice will be based on ISO/IEC 14598 (Information technology - Software product evaluation) and ISO 9126 (Software engineering - Product quality) using the AHP method. Those standards are usually used for evaluation of the quality of software products and computer programs that serve in an organisation as support and factors for development. So, the AHP model will be based on evaluation criteria based on the suggestions of the ISO 9126 standard and types of evaluation from two evaluation teams. Members of team 1 will be experts in BSC and environmental management systems who are not employed in the AD "Barska plovidba" organisation. The members of team 2 will be managers of the AD "Barska plovidba" organisation (including managers from the environmental department). Merging the results of the two previously created AHP models, one can obtain the most appropriate BSC that includes elements of an environmental management system. The chosen model will at the same time present a suggested approach for including ecological metrics in a conventional BSC model for a firm that has at least one ECO strategic orientation.

  1. Baseline LAW Glass Formulation Testing

    International Nuclear Information System (INIS)

    Kruger, Albert A.; Mooers, Cavin; Bazemore, Gina; Pegg, Ian L.; Hight, Kenneth; Lai, Shan Tao; Buechele, Andrew; Rielley, Elizabeth; Gan, Hao; Muller, Isabelle S.; Cecil, Richard

    2013-01-01

    The major objective of the baseline glass formulation work was to develop and select glass formulations that are compliant with contractual and processing requirements for each of the LAW waste streams. Other objectives of the work included preparation and characterization of glasses with respect to the properties of interest, optimization of sulfate loading in the glasses, evaluation of the ability to achieve waste loading limits, testing to demonstrate compatibility of glass melts with melter materials of construction, development of glass formulations to support ILAW qualification activities, and identification of glass formulation issues with respect to contract specifications and processing requirements.

  2. Double-gate junctionless transistor model including short-channel effects

    International Nuclear Information System (INIS)

    Paz, B C; Pavanello, M A; Ávila-Herrera, F; Cerdeira, A

    2015-01-01

    This work presents a physically based model for double-gate junctionless transistors (JLTs), continuous in all operation regimes. To describe short-channel transistors, short-channel effects (SCEs), such as the increase of the channel potential due to drain bias, carrier velocity saturation, and mobility degradation due to vertical and longitudinal electric fields, are included in a previous model developed for long-channel double-gate JLTs. To validate the model, an analysis is made using three-dimensional numerical simulations performed in the Sentaurus Device Simulator from Synopsys. Different doping concentrations, channel widths and channel lengths are considered in this work. Besides that, the influence of series resistance is numerically included and validated for a wide range of source and drain extensions. In order to check whether the SCEs are appropriately described, besides the drain current, transconductance and output conductance characteristics, the following parameters are analyzed to demonstrate the good agreement between model and simulation and the occurrence of SCEs in this technology: threshold voltage (VTH), subthreshold slope (S) and drain-induced barrier lowering (DIBL).

  3. Model for safety reports including descriptive examples

    International Nuclear Information System (INIS)

    1995-12-01

    Several safety reports will be produced in the process of planning and constructing the system for disposal of high-level radioactive waste in Sweden. The present report gives a model, with detailed examples, of how these reports should be organized and what steps they should include. In the near future safety reports will deal with the encapsulation plant and the repository. Later reports will treat operation of the handling systems and the repository

  4. A proposal to create an extension to the European baseline series.

    Science.gov (United States)

    Wilkinson, Mark; Gallo, Rosella; Goossens, An; Johansen, Jeanne D; Rustemeyer, Thomas; Sánchez-Pérez, Javier; Schuttelaar, Marie L; Uter, Wolfgang

    2018-02-01

    The current European baseline series consists of 30 allergens, and was last updated in 2015. To use data from the European Surveillance System on Contact Allergies (ESSCA) to propose an extension to the European baseline series in response to changes in environmental exposures. Data from departmental and national extensions to the baseline series, together with some temporary additions from departments contributing to the ESSCA, were collated during 2013-2014. In total, 31689 patients were patch tested in 46 European departments. Many departments and national groups already consider the current European baseline series to be a suboptimal screen, and use their own extensions to it. The haptens tested are heterogeneous, although there are some consistent themes. Potential haptens to include in an extension to the European baseline series comprise sodium metabisulfite, formaldehyde-releasing preservatives, additional markers of fragrance allergy, propolis, Compositae mix, and 2-hydroxyethyl methacrylate. In combination with other published work from the ESSCA, changes to the current European baseline series are proposed for discussion. As well as addition of the allergens listed above, it is suggested that primin and clioquinol should be deleted from the series, owing to reduced environmental exposure. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  5. New light Higgs boson and short-baseline neutrino anomalies

    Science.gov (United States)

    Asaadi, J.; Church, E.; Guenette, R.; Jones, B. J. P.; Szelc, A. M.

    2018-04-01

    The low-energy excesses observed by the MiniBooNE experiment have, to date, defied a convincing explanation under the standard model, even with accommodation for nonzero neutrino mass. In this paper we explore a new oscillation mechanism to explain these anomalies, invoking a light neutrinophilic Higgs boson, conceived to induce a low Dirac neutrino mass in accord with experimental limits. Beam neutrinos forward-scattering off of a locally overdense relic neutrino background give rise to a novel matter effect with an energy-specific resonance. An enhanced oscillation around this resonance peak produces flavor transitions which are highly consistent with the MiniBooNE neutrino- and antineutrino-mode data sets. The model provides substantially improved χ2 values beyond either the no-oscillation hypothesis or the more commonly explored 3+1 sterile neutrino hypothesis. This mechanism would introduce distinctive signatures at each baseline in the upcoming short-baseline neutrino program at Fermilab, presenting opportunities for further exploration.

  6. Program Baseline Change Control Board charter

    International Nuclear Information System (INIS)

    1993-02-01

    The purpose of this Charter is to establish the Program Baseline Change Control Board (PBCCB) for the Office of Civilian Radioactive Waste Management (OCRWM) Program, and to describe its organization, responsibilities, and basic methods of operation. Guidance for implementing this Charter is provided by the OCRWM Baseline Management Plan (BMP) and OCRWM Program Baseline Change Control Procedure

  7. Baseline Plasma C-Reactive Protein Concentrations and Motor Prognosis in Parkinson Disease.

    Directory of Open Access Journals (Sweden)

    Atsushi Umemura

    Full Text Available C-reactive protein (CRP), a blood inflammatory biomarker, is associated with the development of Alzheimer disease. In animal models of Parkinson disease (PD), systemic inflammatory stimuli can promote neuroinflammation and accelerate dopaminergic neurodegeneration. However, the association between long-term systemic inflammation and neurodegeneration has not been assessed in PD patients. We aimed to investigate the longitudinal effects of baseline CRP concentrations on motor prognosis in PD. We performed a retrospective analysis of 375 patients (mean age, 69.3 years; mean PD duration, 6.6 years). Plasma concentrations of high-sensitivity CRP were measured in the absence of infections, and Unified Parkinson's Disease Rating Scale Part III (UPDRS-III) scores were measured at five follow-up intervals (Days 1-90, 91-270, 271-450, 451-630, and 631-900). The outcome was the change in UPDRS-III score from baseline to each of the five follow-up periods. The change in UPDRS-III scores was significantly greater in PD patients with CRP concentrations ≥0.7 mg/L than in those with CRP concentrations <0.7 mg/L, as determined by a generalized estimating equation model (P = 0.021) for the entire follow-up period and by a generalized regression model (P = 0.030) for the last follow-up interval (Days 631-900). The regression coefficients of baseline CRP for the two periods were 1.41 (95% confidence interval [CI] 0.21-2.61) and 2.62 (95% CI 0.25-4.98), respectively, after adjusting for sex, age, baseline UPDRS-III score, dementia, and incremental L-dopa equivalent dose. Baseline plasma CRP levels were associated with motor deterioration and predicted motor prognosis in patients with PD. These associations were independent of sex, age, PD severity, dementia, and anti-Parkinsonian agents, suggesting that subclinical systemic inflammation could accelerate neurodegeneration in PD.

  8. Modelling a linear PM motor including magnetic saturation

    NARCIS (Netherlands)

    Polinder, H.; Slootweg, J.G.; Compter, J.C.; Hoeijmakers, M.J.

    2002-01-01

    The use of linear permanent-magnet (PM) actuators increases in a wide variety of applications because of the high force density, robustness and accuracy. The paper describes the modelling of a linear PM motor applied in, for example, wafer steppers, including magnetic saturation. This is important

  9. Conditional analysis of mixed Poisson processes with baseline counts: implications for trial design and analysis.

    Science.gov (United States)

    Cook, Richard J; Wei, Wei

    2003-07-01

    The design of clinical trials is typically based on marginal comparisons of a primary response under two or more treatments. The considerable gains in efficiency afforded by models conditional on one or more baseline responses has been extensively studied for Gaussian models. The purpose of this article is to present methods for the design and analysis of clinical trials in which the response is a count or a point process, and a corresponding baseline count is available prior to randomization. The methods are based on a conditional negative binomial model for the response given the baseline count and can be used to examine the effect of introducing selection criteria on power and sample size requirements. We show that designs based on this approach are more efficient than those proposed by McMahon et al. (1994).
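The core mechanics of conditioning a follow-up count on a baseline count can be sketched with a log-link count regression in which the log baseline enters as an offset (its coefficient fixed at 1). The sketch below fits a plain Poisson GLM by Fisher scoring on simulated falls data; a negative binomial model such as Cook and Wei's would add a dispersion parameter, but the offset mechanics are identical. All numbers are simulated, not taken from any trial:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
baseline = rng.poisson(3.0, n) + 1            # baseline fall counts (shifted off zero)
treat = rng.integers(0, 2, n).astype(float)   # 1 = intervention arm
rate = baseline * np.exp(-0.7 * treat)        # true follow-up rate proportional to baseline
y = rng.poisson(rate)                         # follow-up fall counts

X = np.column_stack([np.ones(n), treat])      # intercept + treatment indicator
offset = np.log(baseline)                     # log-baseline with coefficient fixed at 1

# Fisher scoring (IRLS) for a Poisson GLM with log link and offset
beta = np.zeros(2)
for _ in range(25):
    eta = X @ beta + offset
    mu = np.exp(eta)
    z = eta + (y - mu) / mu                   # working response
    W = mu                                    # working weights
    beta = np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (W * (z - offset)))

print(beta)  # beta[1] estimates the log rate ratio, about -0.7
```

To estimate the baseline coefficient freely instead of fixing it at 1 (the NB-logged rather than NB-offset approach), log(baseline) would simply be appended as a column of X.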

  10. Baseline factors that influence ASAS 20 response in patients with ankylosing spondylitis treated with etanercept.

    Science.gov (United States)

    Davis, John C; Van der Heijde, Désirée M F M; Dougados, Maxime; Braun, Jurgen; Cush, John J; Clegg, Daniel O; Inman, Robert D; de Vries, Todd; Tsuji, Wayne H

    2005-09-01

    To examine the baseline demographic and disease characteristics that might influence improvement as measured by the Assessment in Ankylosing Spondylitis Response Criteria (ASAS 20) in patients with ankylosing spondylitis (AS). A multicenter Phase 3 study was performed to compare the safety and efficacy of 24 weeks of etanercept 25 mg subcutaneous injection twice weekly (n = 138) and placebo (n = 139) in patients with AS. The ASAS 20 was measured at multiple time points. Using a significance level of 0.05, a repeated measures logistic regression model was used to determine which baseline factors influenced response in the etanercept-treated patients during the 24-week double blind portion of the trial. The following baseline factors were used in the model: demographic and disease severity variables, concomitant medications, extra-articular manifestations, and HLA-B27 status. The predictive capability of the model was then tested on the patients receiving placebo after they had received open-label etanercept treatment. Baseline factors that were significant predictors of an ASAS 20 response in etanercept-treated patients were C-reactive protein (CRP), back pain score, and Bath Ankylosing Spondylitis Functional Index (BASFI) score. Although clinical response to etanercept was seen at all levels of baseline disease activity, responses were consistently more likely with higher CRP levels or back pain scores and less likely with increased BASFI scores at baseline. Higher CRP values and back pain scores and lower BASFI scores at baseline were significant predictors of a higher ASAS 20 response in patients with AS receiving etanercept but predictive value was of insufficient magnitude to determine treatment in individual patients.

  11. Regional geochemical baselines for Portuguese shelf sediments

    International Nuclear Information System (INIS)

    Mil-Homens, M.; Stevens, R.L.; Cato, I.; Abrantes, F.

    2007-01-01

    Metal concentrations (Al, Cr, Cu, Ni, Pb and Zn) from the DGM-INETI archive data set have been examined for sediments collected during the 1970s from 267 sites on the Portuguese shelf. Due to the differences in the oceanographic and sedimentological settings between the western and Algarve coasts, the archive data set is split into two segments. For both shelf segments, regional geochemical baselines (RGB) are defined using aluminium as a reference element. Seabed samples recovered in 2002 from four distinct areas of the Portuguese shelf are superimposed on these models to identify and compare possible metal enrichments relative to the natural distribution. Metal enrichments associated with anthropogenic influences are identified in three samples collected near the Tejo River and are characterised by the highest enrichment factors (EF; EF for Pb and Zn < 4). EF values close to 1 suggest a largely natural origin for metal distributions in sediments from the other areas included in the study. - Background metal concentrations and their natural variability must be established before assessing anthropogenic impacts
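An enrichment factor against an Al-normalised regional baseline reduces to dividing the measured metal concentration by the concentration the baseline regression predicts for that sample's Al content. A toy sketch with invented concentrations (not the DGM-INETI data):

```python
import numpy as np

# invented baseline data: Pb (mg/kg) vs Al (%) at uncontaminated shelf sites
al_ref = np.array([3.1, 4.5, 5.2, 6.0, 6.8, 7.5, 8.1])
pb_ref = np.array([9.0, 13.2, 15.5, 17.8, 20.1, 22.4, 24.0])

# regional geochemical baseline: linear regression of the metal on Al
slope, intercept = np.polyfit(al_ref, pb_ref, 1)

def enrichment_factor(metal, al):
    """EF = measured concentration / baseline concentration predicted from Al."""
    return metal / (slope * al + intercept)

print(enrichment_factor(60.0, 5.0))  # ~4: suggests anthropogenic input
print(enrichment_factor(16.0, 5.0))  # ~1: consistent with a natural origin
```

Normalising to Al corrects for grain-size and mineralogical dilution before any enrichment is attributed to human activity.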

  12. Establishing a store baseline during interim storage of waste packages and a review of potential technologies for base-lining

    Energy Technology Data Exchange (ETDEWEB)

    McTeer, Jennifer; Morris, Jenny; Wickham, Stephen [Galson Sciences Ltd. Oakham, Rutland (United Kingdom); Bolton, Gary [National Nuclear Laboratory Risley, Warrington (United Kingdom); McKinney, James; Morris, Darrell [Nuclear Decommissioning Authority Moor Row, Cumbria (United Kingdom); Angus, Mike [National Nuclear Laboratory Risley, Warrington (United Kingdom); Cann, Gavin; Binks, Tracey [National Nuclear Laboratory Sellafield (United Kingdom)

    2013-07-01

    Interim storage is an essential component of the waste management lifecycle, providing a safe, secure environment for waste packages awaiting final disposal. In order to be able to monitor and detect change or degradation of the waste packages, storage building or equipment, it is necessary to know the original condition of these components (the 'waste storage system'). This paper presents an approach to establishing the baseline for a waste-storage system, and provides guidance on the selection and implementation of potential base-lining technologies. The approach is made up of two sections: assessment of base-lining needs and definition of the base-lining approach. During the assessment of base-lining needs, a review of available monitoring data and store/package records should be undertaken (if the store is operational). Evolutionary processes (affecting safety functions), and their corresponding indicators that can be measured to provide a baseline for the waste-storage system, should then be identified so that the most suitable indicators can be selected for base-lining. In defining the approach, opportunities to collect data, and constraints, are identified before selecting the techniques for base-lining and developing a base-lining plan. Base-lining data may be used to establish that the state of the packages is consistent with the waste acceptance criteria for the storage facility, and to support the interpretation of monitoring and inspection data collected during store operations. Opportunities and constraints are identified for different store and package types. Technologies that could potentially be used to measure baseline indicators are also reviewed. (authors)

  13. Should Studies of Diabetes Treatment Stratification Correct for Baseline HbA1c?

    Science.gov (United States)

    Jones, Angus G.; Lonergan, Mike; Henley, William E.; Pearson, Ewan R.; Hattersley, Andrew T.; Shields, Beverley M.

    2016-01-01

    Aims Baseline HbA1c is a major predictor of response to glucose lowering therapy and therefore a potential confounder in studies aiming to identify other predictors. However, baseline adjustment may introduce error if the association between baseline HbA1c and response is substantially due to measurement error and regression to the mean. We aimed to determine whether studies of predictors of response should adjust for baseline HbA1c. Methods We assessed the relationship between baseline HbA1c and glycaemic response in 257 participants treated with GLP-1R agonists and assessed whether it reflected measurement error and regression to the mean using duplicate ‘pre-baseline’ HbA1c measurements not included in the response variable. In this cohort and an additional 2659 participants treated with sulfonylureas we assessed the relationship between covariates associated with baseline HbA1c and treatment response with and without baseline adjustment, and with a bias correction using pre-baseline HbA1c to adjust for the effects of error in baseline HbA1c. Results Baseline HbA1c was a major predictor of response (R2 = 0.19, β = -0.44, p < 0.001). Covariates associated with baseline HbA1c were associated with response; however, these associations were weak or absent after adjustment for baseline HbA1c. Bias correction did not substantially alter associations. Conclusions Adjustment for the baseline HbA1c measurement is a simple and effective way to reduce bias in studies of predictors of response to glucose lowering therapy. PMID:27050911
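Why baseline adjustment matters here can be demonstrated with a small simulation: a covariate that merely correlates with baseline HbA1c looks predictive of response until the measured baseline is added to the model. The data and effect sizes below are entirely synthetic and chosen only to make the confounding visible:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5000
true_hba1c = rng.normal(8.5, 1.0, n)                 # underlying glycaemia
covariate = true_hba1c + rng.normal(0, 1.0, n)       # correlates with HbA1c, no direct effect
baseline = true_hba1c + rng.normal(0, 0.5, n)        # measured baseline HbA1c (with error)
response = 0.5 * true_hba1c - 3.0 + rng.normal(0, 0.3, n)  # fall driven by glycaemia only

# unadjusted: the covariate looks like a predictor of response
X1 = np.column_stack([np.ones(n), covariate])
b_unadj = np.linalg.lstsq(X1, response, rcond=None)[0]

# adjusted for measured baseline HbA1c: the apparent effect largely disappears
X2 = np.column_stack([np.ones(n), covariate, baseline])
b_adj = np.linalg.lstsq(X2, response, rcond=None)[0]

print(b_unadj[1], b_adj[1])  # the adjusted coefficient is well under half the unadjusted one
```

The adjusted coefficient is not exactly zero because the measured baseline is an error-prone proxy for the confounder; this residual confounding is the measurement-error caveat the study examines with its pre-baseline duplicate measurements.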

  14. NETWORK DESIGN IN CLOSE-RANGE PHOTOGRAMMETRY WITH SHORT BASELINE IMAGES

    Directory of Open Access Journals (Sweden)

    L. Barazzetti

    2017-08-01

    Full Text Available The availability of automated software for image-based 3D modelling has changed the way people acquire images for photogrammetric applications. Short baseline images are required to match image points with SIFT-like algorithms, resulting in more images than are necessary for “old-fashioned” photogrammetric projects based on manual measurements. This paper describes some considerations on network design for short baseline image sequences, especially regarding the precision and reliability of bundle adjustment. Simulation results reveal that the large number of 3D points used for image orientation has very limited impact on network precision.

  15. Baseline process description for simulating plutonium oxide production for precalc project

    Energy Technology Data Exchange (ETDEWEB)

    Pike, J. A. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2017-10-26

    Savannah River National Laboratory (SRNL) started a multi-year project, the PreCalc Project, to develop a computational simulation of a plutonium oxide (PuO2) production facility with the objective to study the fundamental relationships between morphological and physicochemical properties. This report provides a detailed baseline process description to be used by SRNL personnel and collaborators to facilitate the initial design and construction of the simulation. The PreCalc Project team selected the HB-Line Plutonium Finishing Facility as the basis for a nominal baseline process since the facility is operational and significant model validation data can be obtained. The process boundary as well as process and facility design details necessary for multi-scale, multi-physics models are provided.

  16. Emergency Response Capability Baseline Needs Assessment - Requirements Document

    Energy Technology Data Exchange (ETDEWEB)

    Sharry, J A

    2016-10-04

    This document was prepared by John A. Sharry, LLNL Fire Marshal and LLNL Division Leader for Fire Protection and reviewed by LLNL Emergency Management Department Head James Colson. The document follows and expands upon the format and contents of the DOE Model Fire Protection Baseline Capabilities Assessment document contained on the DOE Fire Protection Web Site, but only addresses emergency response.

  17. An Energy Efficiency Evaluation Method Based on Energy Baseline for Chemical Industry

    OpenAIRE

    Yao, Dong-mei; Zhang, Xin; Wang, Ke-feng; Zou, Tao; Wang, Dong; Qian, Xin-hua

    2016-01-01

    According to the requirements and structure of ISO 50001 energy management system, this study proposes an energy efficiency evaluation method based on energy baseline for chemical industry. Using this method, the energy plan implementation effect in the processes of chemical production can be evaluated quantitatively, and evidences for system fault diagnosis can be provided. This method establishes the energy baseline models which can meet the demand of the different kinds of production proce...

  18. Olkiluoto surface hydrological modelling: Update 2012 including salt transport modelling

    International Nuclear Information System (INIS)

    Karvonen, T.

    2013-11-01

    Posiva Oy is responsible for implementing a final disposal program for spent nuclear fuel of its owners Teollisuuden Voima Oyj and Fortum Power and Heat Oy. The spent nuclear fuel is planned to be disposed at a depth of about 400-450 meters in the crystalline bedrock at the Olkiluoto site. Leakages located at or close to the spent fuel repository may give rise to the upconing of deep highly saline groundwater, and this is a concern with regard to the performance of the tunnel backfill material after the closure of the tunnels. Therefore, a salt transport sub-model was added to the Olkiluoto surface hydrological model (SHYD). The other improvements include an update of the particle tracking algorithm and the possibility to estimate the influence of open drillholes in a case where overpressure in inflatable packers decreases, causing a hydraulic short-circuit between hydrogeological zones HZ19 and HZ20 along the drillhole. Four new hydrogeological zones HZ056, HZ146, BFZ100 and HZ039 were added to the model. In addition, zones HZ20A and HZ20B intersect with each other in the new structure model, which influences salinity upconing caused by leakages in shafts. The aim of the modelling of the long-term influence of ONKALO, the shafts and the repository tunnels is to provide computational results that can be used to suggest limits for allowed leakages. The model input data included all the existing leakages into ONKALO (35-38 l/min) and shafts in the present-day conditions. The influence of shafts was computed using eight different values for total shaft leakage: 5, 11, 20, 30, 40, 50, 60 and 70 l/min. The selection of the leakage criteria for shafts was influenced by the fact that upconing of saline water increases TDS-values close to the repository areas although HZ20B does not intersect any deposition tunnels. The total limit for all leakages was suggested to be 120 l/min. The limit for HZ20 zones was proposed to be 40 l/min: about 5 l/min the present day leakages to access tunnel, 25 l/min from

  19. Olkiluoto surface hydrological modelling: Update 2012 including salt transport modelling

    Energy Technology Data Exchange (ETDEWEB)

    Karvonen, T. [WaterHope, Helsinki (Finland)

    2013-11-15

    Posiva Oy is responsible for implementing a final disposal program for spent nuclear fuel of its owners Teollisuuden Voima Oyj and Fortum Power and Heat Oy. The spent nuclear fuel is planned to be disposed at a depth of about 400-450 meters in the crystalline bedrock at the Olkiluoto site. Leakages located at or close to the spent fuel repository may give rise to the upconing of deep highly saline groundwater, and this is a concern with regard to the performance of the tunnel backfill material after the closure of the tunnels. Therefore, a salt transport sub-model was added to the Olkiluoto surface hydrological model (SHYD). The other improvements include an update of the particle tracking algorithm and the possibility to estimate the influence of open drillholes in a case where overpressure in inflatable packers decreases, causing a hydraulic short-circuit between hydrogeological zones HZ19 and HZ20 along the drillhole. Four new hydrogeological zones HZ056, HZ146, BFZ100 and HZ039 were added to the model. In addition, zones HZ20A and HZ20B intersect with each other in the new structure model, which influences salinity upconing caused by leakages in shafts. The aim of the modelling of the long-term influence of ONKALO, the shafts and the repository tunnels is to provide computational results that can be used to suggest limits for allowed leakages. The model input data included all the existing leakages into ONKALO (35-38 l/min) and shafts in the present-day conditions. The influence of shafts was computed using eight different values for total shaft leakage: 5, 11, 20, 30, 40, 50, 60 and 70 l/min. The selection of the leakage criteria for shafts was influenced by the fact that upconing of saline water increases TDS-values close to the repository areas although HZ20B does not intersect any deposition tunnels. The total limit for all leakages was suggested to be 120 l/min. The limit for HZ20 zones was proposed to be 40 l/min: about 5 l/min the present day leakages to access tunnel, 25 l/min from

  20. A method to establish seismic noise baselines for automated station assessment

    Science.gov (United States)

    McNamara, D.E.; Hutt, C.R.; Gee, L.S.; Benz, H.M.; Buland, R.P.

    2009-01-01

    We present a method for quantifying station noise baselines and characterizing the spectral shape of out-of-nominal noise sources. Our intent is to automate this method in order to ensure that only the highest-quality data are used in rapid earthquake products at NEIC. In addition, the station noise baselines provide a valuable tool to support the quality control of GSN and ANSS backbone data and metadata. The procedures addressed here are currently in development at the NEIC, and work is underway to understand how quickly changes from nominal can be observed and used within the NEIC processing framework. The spectral methods and software used to compute station baselines and described herein (PQLX) can be useful to both permanent and portable seismic stations operators. Applications include: general seismic station and data quality control (QC), evaluation of instrument responses, assessment of near real-time communication system performance, characterization of site cultural noise conditions, and evaluation of sensor vault design, as well as assessment of gross network capabilities (McNamara et al. 2005). Future PQLX development plans include incorporating station baselines for automated QC methods and automating station status report generation and notification based on user-defined QC parameters. The PQLX software is available through the USGS (http://earthquake. usgs.gov/research/software/pqlx.php) and IRIS (http://www.iris.edu/software/ pqlx/).
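A minimal stand-in for the percentile-based noise baselines described here. The actual PQLX software builds probability density functions of power spectral densities; this sketch only takes percentiles of Welch-style segment periodograms over synthetic data:

```python
import numpy as np

def noise_baseline(trace, fs, nseg=64, pcts=(10, 50, 90)):
    """Percentile noise baselines from segment periodograms (Welch-style),
    a simplified stand-in for the PDF-based PQLX approach."""
    seglen = len(trace) // nseg
    segs = trace[:nseg * seglen].reshape(nseg, seglen)
    segs = segs - segs.mean(axis=1, keepdims=True)   # demean each segment
    win = np.hanning(seglen)
    scale = fs * (win ** 2).sum()                    # PSD normalization
    psd = np.abs(np.fft.rfft(segs * win, axis=1)) ** 2 / scale
    freqs = np.fft.rfftfreq(seglen, d=1.0 / fs)
    return freqs, {p: np.percentile(psd, p, axis=0) for p in pcts}

rng = np.random.default_rng(1)
trace = rng.normal(size=2 ** 16)        # synthetic white "ground noise"
freqs, base = noise_baseline(trace, fs=100.0)
# An out-of-nominal segment could then be flagged wherever its PSD
# exceeds the station's 90th-percentile baseline.
print(np.all(base[10] <= base[50]), np.all(base[50] <= base[90]))
```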

  1. Wind power projects in the CDM: Methodologies and tools for baselines, carbon financing and sustainability analysis[CDM=Clean Development Mechanism

    Energy Technology Data Exchange (ETDEWEB)

    Ringius, L.; Grohnheit, P.E.; Nielsen, L.H.; Olivier, A.L.; Painuly, J.; Villavicencio, A.

    2002-12-01

    The report is intended to be a guidance document for project developers, investors, lenders, and CDM host countries involved in wind power projects in the CDM. The report explores in particular those issues that are important in CDM project assessment and development - that is, baseline development, carbon financing, and environmental sustainability. It does not deal in detail with those issues that are routinely covered in a standard wind power project assessment. The report tests, compares, and recommends methodologies for and approaches to baseline development. To present the application and implications of the various methodologies and approaches in a concrete context, Africa's largest wind farm, namely the 60 MW wind farm located in Zafarana, Egypt, is examined as a hypothetical CDM wind power project. The report shows that for the present case example there is a difference of about 25% between the lowest (0.5496 tCO2/MWh) and the highest emission rate (0.6868 tCO2/MWh) estimated in accordance with these three standardized approaches to baseline development according to the Marrakesh Accord. This difference in emission factors comes about partly as a result of including hydroelectric power in the baseline scenario. Hydroelectric resources constitute around 21% of the generation capacity in Egypt, and, if hydropower is excluded, the difference between the lowest and the highest baseline is reduced to 18%. Furthermore, since the two variations of the 'historical' baseline option examined result in the highest and the lowest baselines, by disregarding this baseline option altogether the difference between the lowest and the highest is reduced to 16%. The ES3-model, which the Systems Analysis Department at Risoe National Laboratory has developed, makes it possible for this report to explore the project-specific approach to baseline development in some detail. Based on quite disaggregated data on the Egyptian electricity system, including the wind
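The effect of including or excluding hydropower in a generation-weighted baseline emission factor can be sketched as follows. The mix shares and per-fuel emission factors below are illustrative assumptions, not the Egyptian figures from the report:

```python
# Hypothetical generation mix (shares of MWh) and emission factors (tCO2/MWh).
mix = {"gas": 0.55, "oil": 0.24, "hydro": 0.21}
ef = {"gas": 0.55, "oil": 0.85, "hydro": 0.0}

def baseline_ef(mix, ef, exclude=()):
    """Generation-weighted average emission factor over the retained sources."""
    keep = {k: v for k, v in mix.items() if k not in exclude}
    total = sum(keep.values())
    return sum(share / total * ef[k] for k, share in keep.items())

with_hydro = baseline_ef(mix, ef)
without_hydro = baseline_ef(mix, ef, exclude=("hydro",))
# Excluding zero-emission hydro raises the baseline emission rate, which is
# why the report's spread between candidate baselines narrows without hydro.
print(with_hydro, without_hydro)
```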

  2. The impact of GPS receiver modifications and ionospheric activity on Swarm baseline determination

    Science.gov (United States)

    Mao, X.; Visser, P. N. A. M.; van den IJssel, J.

    2018-05-01

    The European Space Agency (ESA) Swarm mission is a satellite constellation launched on 22 November 2013 aiming at observing the Earth's geomagnetic field and its temporal variations. The three identical satellites are equipped with high-precision dual-frequency Global Positioning System (GPS) receivers, which make the constellation an ideal test bed for baseline determination. From October 2014 to August 2016, a number of GPS receiver modifications and a new GPS Receiver Independent Exchange Format (RINEX) converter were implemented. Moreover, the on-board GPS receiver performance has been influenced by ionospheric scintillations. The impact of these factors is assessed for baseline determination of the pendulum formation flying Swarm-A and -C satellites. In total, 30 months of data, from 15 July 2014 to the end of 2016, are analyzed. The assessment includes analysis of observation residuals, the success rate of GPS carrier phase ambiguity fixing, a consistency check between the so-called kinematic and reduced-dynamic baseline solutions, and validation of orbits by comparison with Satellite Laser Ranging (SLR) observations. External baseline solutions from the German Space Operations Center (GSOC) and Astronomisches Institut - Universität Bern (AIUB) are also included in the comparison. Results indicate that the GPS receiver modifications and RINEX converter changes are effective in improving the baseline determination. This research eventually shows a consistency level of 9.3/4.9/3.0 mm between kinematic and reduced-dynamic baselines in the radial/along-track/cross-track directions. On average 98.3% of the epochs have kinematic solutions. Consistency between the TU Delft and external reduced-dynamic baseline solutions is at the 1 mm level in all directions.
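The kinematic versus reduced-dynamic consistency check reported above amounts to a per-axis RMS of the differences between the two solutions. A sketch on synthetic baselines, with injected noise chosen to match the reported 9.3/4.9/3.0 mm levels:

```python
import numpy as np

def consistency_mm(kin, red):
    """RMS of kinematic-minus-reduced-dynamic baseline differences per axis, in mm."""
    d = np.asarray(kin) - np.asarray(red)
    return np.sqrt((d ** 2).mean(axis=0)) * 1000.0   # m -> mm

rng = np.random.default_rng(2)
red = rng.normal(0.0, 1.0, size=(10000, 3))          # synthetic baseline epochs (m)
noise = rng.normal(0.0, [0.0093, 0.0049, 0.0030], size=(10000, 3))
kin = red + noise                                    # kinematic = dynamic + noise
rms = consistency_mm(kin, red)
print(rms)   # recovers roughly the injected radial/along-track/cross-track levels
```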

  3. Item response theory analysis of the mechanics baseline test

    Science.gov (United States)

    Cardamone, Caroline N.; Abbott, Jonathan E.; Rayyan, Saif; Seaton, Daniel T.; Pawl, Andrew; Pritchard, David E.

    2012-02-01

    Item response theory is useful in both the development and evaluation of assessments and in computing standardized measures of student performance. In item response theory, individual parameters (difficulty, discrimination) for each item or question are fit by item response models. These parameters provide a means for evaluating a test and offer a better measure of student skill than a raw test score, because each skill calculation considers not only the number of questions answered correctly, but the individual properties of all questions answered. Here, we present the results from an analysis of the Mechanics Baseline Test given at MIT during 2005-2010. Using the item parameters, we identify questions on the Mechanics Baseline Test that are not effective in discriminating between MIT students of different abilities. We show that a limited subset of the highest quality questions on the Mechanics Baseline Test returns accurate measures of student skill. We compare student skills as determined by item response theory to the more traditional measurement of the raw score and show that a comparable measure of learning gain can be computed.
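The item response theory machinery this abstract relies on can be sketched with a two-parameter logistic (2PL) model and a Newton-Raphson ability estimate. The item parameters below are hypothetical, not fitted Mechanics Baseline Test values:

```python
import numpy as np

def p_correct(theta, a, b):
    """2PL item response function: P(correct | ability theta, discrimination a, difficulty b)."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

def ability_mle(responses, a, b, iters=50):
    """Newton-Raphson MLE of one examinee's ability, item parameters held fixed."""
    theta = 0.0
    for _ in range(iters):
        p = p_correct(theta, a, b)
        grad = np.sum(a * (responses - p))          # d log-likelihood / d theta
        hess = -np.sum(a ** 2 * p * (1 - p))        # always negative (concave)
        theta -= grad / hess
    return theta

a = np.array([1.2, 0.8, 1.5, 1.0, 0.9])    # discriminations (hypothetical)
b = np.array([-1.0, -0.5, 0.0, 0.5, 1.0])  # difficulties (hypothetical)
strong = ability_mle(np.array([1, 1, 1, 1, 0]), a, b)  # missed only the hardest item
weak = ability_mle(np.array([1, 0, 0, 0, 0]), a, b)    # answered only the easiest
print(strong, weak)
```

Unlike a raw score, this estimate weights each response by the item's discrimination and difficulty, which is the property the abstract exploits.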

  4. Hazard Baseline Downgrade Effluent Treatment Facility

    International Nuclear Information System (INIS)

    Blanchard, A.

    1998-01-01

    This Hazard Baseline Downgrade reviews the Effluent Treatment Facility in accordance with Department of Energy Order 5480.23, the WSRC11Q Facility Safety Document Manual, DOE-STD-1027-92, and DOE-EM-STD-5502-94. It provides a baseline grouping based on the chemical and radiological hazards associated with the facility. The determination of the baseline grouping for ETF will aid in establishing the appropriate set of standards for the facility

  5. Study of a diffusion flamelet model, with preferential diffusion effects included

    NARCIS (Netherlands)

    Delhaye, S.; Somers, L.M.T.; Bongers, H.; Oijen, van J.A.; Goey, de L.P.H.; Dias, V.

    2005-01-01

    The non-premixed flamelet model of Peters [1] (model1), which does not include preferential diffusion effects, is investigated. Two similar models are presented, but without the assumption of unity Lewis numbers. One of these models was derived by Peters & Pitsch [2] (model2), while the other one was

  6. Classifying vulnerability to sleep deprivation using baseline measures of psychomotor vigilance.

    Science.gov (United States)

    Patanaik, Amiya; Kwoh, Chee Keong; Chua, Eric C P; Gooley, Joshua J; Chee, Michael W L

    2015-05-01

    To identify measures derived from baseline psychomotor vigilance task (PVT) performance that can reliably predict vulnerability to sleep deprivation. Subjects underwent total sleep deprivation and completed a 10-min PVT every 1-2 h in a controlled laboratory setting. Participants were categorized as vulnerable or resistant to sleep deprivation, based on a median split of lapses that occurred following sleep deprivation. Standard reaction time, drift diffusion model (DDM), and wavelet metrics were derived from PVT response times collected at baseline. A support vector machine model that incorporated maximum relevance and minimum redundancy feature selection and wrapper-based heuristics was used to classify subjects as vulnerable or resistant using rested data. Two academic sleep laboratories. Independent samples of 135 (69 women, age 18 to 25 y) and 45 (3 women, age 22 to 32 y) healthy adults. In both datasets, DDM measures, the number of consecutive reaction times that differ by more than 250 ms, and two wavelet features were selected by the model as features predictive of vulnerability to sleep deprivation. Using the best set of features selected in each dataset, classification accuracy was 77% and 82%, respectively, using fivefold stratified cross-validation. Despite differences in experimental conditions across studies, drift diffusion model parameters associated reliably with individual differences in performance during total sleep deprivation. These results demonstrate the utility of drift diffusion modeling of baseline performance in estimating vulnerability to psychomotor vigilance decline.
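The study's median-split labeling and cross-validated classification can be sketched in miniature. This toy version uses one synthetic feature and a simple threshold rule rather than the paper's SVM with mRMR feature selection; all numbers are invented:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 200
feat = rng.normal(size=n)                 # stand-in for a baseline DDM/wavelet metric
lapses = 2.0 * feat + rng.normal(size=n)  # post-deprivation lapses correlate with it
# Median split of lapses defines vulnerable (1) vs resistant (0), as in the study.
vulnerable = (lapses > np.median(lapses)).astype(int)

# Threshold "classifier" on the baseline feature, 5-fold cross-validated.
folds = np.array_split(rng.permutation(n), 5)
accs = []
for fold in folds:
    train = np.setdiff1d(np.arange(n), fold)
    thr = np.median(feat[train])               # fit: threshold at training median
    pred = (feat[fold] > thr).astype(int)
    accs.append((pred == vulnerable[fold]).mean())
print(np.mean(accs))   # well above chance when baseline predicts vulnerability
```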

  7. SEEPAGE MODEL FOR PA INCLUDING DRIFT COLLAPSE

    International Nuclear Information System (INIS)

    C. Tsang

    2004-01-01

    The purpose of this report is to document the predictions and analyses performed using the seepage model for performance assessment (SMPA) for both the Topopah Spring middle nonlithophysal (Tptpmn) and lower lithophysal (Tptpll) lithostratigraphic units at Yucca Mountain, Nevada. Look-up tables of seepage flow rates into a drift (and their uncertainty) are generated by performing numerical simulations with the seepage model for many combinations of the three most important seepage-relevant parameters: the fracture permeability, the capillary-strength parameter 1/a, and the percolation flux. The percolation flux values chosen take into account flow focusing effects, which are evaluated based on a flow-focusing model. Moreover, multiple realizations of the underlying stochastic permeability field are conducted. Selected sensitivity studies are performed, including the effects of an alternative drift geometry representing a partially collapsed drift from an independent drift-degradation analysis (BSC 2004 [DIRS 166107]). The intended purpose of the seepage model is to provide results of drift-scale seepage rates under a series of parameters and scenarios in support of the Total System Performance Assessment for License Application (TSPA-LA). The SMPA is intended for the evaluation of drift-scale seepage rates under the full range of parameter values for three parameters found to be key (fracture permeability, the van Genuchten 1/a parameter, and percolation flux) and drift degradation shape scenarios in support of the TSPA-LA during the period of compliance for postclosure performance [Technical Work Plan for: Performance Assessment Unsaturated Zone (BSC 2002 [DIRS 160819], Section I-4-2-1)]. The flow-focusing model in the Topopah Spring welded (TSw) unit is intended to provide an estimate of flow focusing factors (FFFs) that (1) bridge the gap between the mountain-scale and drift-scale models, and (2) account for variability in local percolation flux due to

  8. 2017 Annual Technology Baseline

    Energy Technology Data Exchange (ETDEWEB)

    Cole, Wesley J [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Hand, M. M [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Eberle, Annika [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Beiter, Philipp C [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Kurup, Parthiv [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Turchi, Craig S [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Feldman, David J [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Margolis, Robert M [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Augustine, Chad R [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Maness, Michael [Formerly NREL; O' Connor, Patrick [Oak Ridge National Laboratory

    2018-03-26

    Consistent cost and performance data for various electricity generation technologies can be difficult to find and may change frequently for certain technologies. With the Annual Technology Baseline (ATB), the National Renewable Energy Laboratory annually provides an organized and centralized set of such cost and performance data. The ATB uses the best information from the Department of Energy national laboratories' renewable energy analysts as well as information from the Energy Information Administration for fuel-based technologies. The ATB has been reviewed by experts and it includes the following electricity generation technologies: land-based wind, offshore wind, utility-scale solar photovoltaics (PV), commercial-scale solar PV, residential-scale solar PV, concentrating solar power, geothermal power, hydropower, coal, natural gas, nuclear, and conventional biopower. This webinar presentation introduces the 2017 ATB.

  9. Long-Baseline Neutrino Facility (LBNF) and Deep Underground Neutrino Experiment (DUNE): Conceptual Design Report. Volume 3: Long-Baseline Neutrino Facility for DUNE

    Energy Technology Data Exchange (ETDEWEB)

    Strait, James [Fermi National Accelerator Laboratory (FNAL), Batavia, IL (United States); McCluskey, Elaine [Fermi National Accelerator Laboratory (FNAL), Batavia, IL (United States); Lundin, Tracy [Fermi National Accelerator Laboratory (FNAL), Batavia, IL (United States); Willhite, Joshua [Fermi National Accelerator Laboratory (FNAL), Batavia, IL (United States); Hamernik, Thomas [Fermi National Accelerator Laboratory (FNAL), Batavia, IL (United States); Papadimitriou, Vaia [Fermi National Accelerator Laboratory (FNAL), Batavia, IL (United States); Marchionni, Alberto [Fermi National Accelerator Laboratory (FNAL), Batavia, IL (United States); Kim, Min Jeong [National Inst. of Nuclear Physics (INFN), Frascati (Italy). National Lab. of Frascati (INFN-LNF); Nessi, Marzio [Univ. of Geneva (Switzerland); Montanari, David [Fermi National Accelerator Laboratory (FNAL), Batavia, IL (United States); Heavey, Anne [Fermi National Accelerator Laboratory (FNAL), Batavia, IL (United States)

    2016-01-21

    This volume of the LBNF/DUNE Conceptual Design Report covers the Long-Baseline Neutrino Facility for DUNE and describes the LBNF Project, which includes design and construction of the beamline at Fermilab, the conventional facilities at both Fermilab and SURF, and the cryostat and cryogenics infrastructure required for the DUNE far detector.

  10. Environmental Baseline File for National Transportation

    International Nuclear Information System (INIS)

    1999-01-01

    This Environmental Baseline File summarizes and consolidates information related to the national-level transportation of commercial spent nuclear fuel. Topics addressed include: shipments of commercial spent nuclear fuel based on mostly truck and mostly rail shipping scenarios; transportation routing for commercial spent nuclear fuel sites and DOE sites; radionuclide inventories for various shipping container capacities; transportation routing; populations along transportation routes; urbanized area population densities; the impacts of historical, reasonably foreseeable, and general transportation; state-level food transfer factors; Federal Guidance Report No. 11 and 12 radionuclide dose conversion factors; and national average atmospheric conditions

  11. Atmosphere-soil-vegetation model including CO2 exchange processes: SOLVEG2

    International Nuclear Information System (INIS)

    Nagai, Haruyasu

    2004-11-01

    A new atmosphere-soil-vegetation model named SOLVEG2 (SOLVEG version 2) was developed to study the heat, water, and CO2 exchanges between the atmosphere and land surface. The model consists of one-dimensional multilayer sub-models for the atmosphere, soil, and vegetation. It also includes sophisticated processes for solar and long-wave radiation transmission in the vegetation canopy and CO2 exchanges among the atmosphere, soil, and vegetation. Although the model usually simulates only the vertical variation of variables in the surface-layer atmosphere, soil, and vegetation canopy by using meteorological data as top boundary conditions, it can also be used by coupling with a three-dimensional atmosphere model. In this paper, details of SOLVEG2, which includes the function of coupling with the atmosphere model MM5, are described. (author)

  12. Wide-Baseline Stereo-Based Obstacle Mapping for Unmanned Surface Vehicles

    Science.gov (United States)

    Mou, Xiaozheng; Wang, Han

    2018-01-01

    This paper proposes a wide-baseline stereo-based static obstacle mapping approach for unmanned surface vehicles (USVs). The proposed approach eliminates the complicated calibration work and the bulky rig in our previous binocular stereo system, and raises the ranging ability from 500 to 1000 m with an even larger baseline obtained from the motion of USVs. Integrating a monocular camera with GPS and compass information in this proposed system, the world locations of the detected static obstacles are reconstructed while the USV is traveling, and an obstacle map is then built. To achieve more accurate and robust performance, multiple pairs of frames are leveraged to synthesize the final reconstruction results in a weighting model. Experimental results based on our own dataset demonstrate the high efficiency of our system. To the best of our knowledge, we are the first to address the task of wide-baseline stereo-based obstacle mapping in a maritime environment. PMID:29617293
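The claim that a larger baseline raises ranging ability follows from the stereo depth-error relation: with depth Z = f·B/d, a disparity-matching error of sigma_d pixels gives a depth error of roughly Z²·sigma_d/(f·B). A sketch with a hypothetical focal length:

```python
def depth_error(Z, f_px, B, sigma_d=0.5):
    """Approximate stereo depth error (same units as Z) for depth Z,
    focal length f_px in pixels, baseline B, and disparity error sigma_d px."""
    return Z ** 2 * sigma_d / (f_px * B)

f_px = 3000.0  # focal length in pixels (hypothetical camera)
# A fixed rig with a 2 m baseline at 500 m range, versus a motion-derived
# 8 m baseline at 1000 m range: quadrupling B offsets doubling Z.
err_short = depth_error(500.0, f_px, B=2.0)
err_wide = depth_error(1000.0, f_px, B=8.0)
print(err_short, err_wide)
```

Because the error grows with Z² but shrinks only linearly in B, doubling the usable range at constant error requires quadrupling the baseline, which motion of the USV can supply where a physical rig cannot.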

  13. Atmospheric pressure loading parameters from very long baseline interferometry observations

    Science.gov (United States)

    Macmillan, D. S.; Gipson, John M.

    1994-01-01

    Atmospheric mass loading produces a primarily vertical displacement of the Earth's crust. This displacement is correlated with surface pressure and is large enough to be detected by very long baseline interferometry (VLBI) measurements. Using the measured surface pressure at VLBI stations, we have estimated the atmospheric loading term for each station location directly from VLBI data acquired from 1979 to 1992. Our estimates of the vertical sensitivity to changes in pressure range from 0 to -0.6 mm/mbar depending on the station. These estimates agree with inverted barometer model calculations (Manabe et al., 1991; vanDam and Herring, 1994) of the vertical displacement sensitivity computed by convolving actual pressure distributions with loading Green's functions. The pressure sensitivity tends to be smaller for stations near the coast, which is consistent with the inverted barometer hypothesis. Applying this estimated pressure loading correction in standard VLBI geodetic analysis improves the repeatability of estimated lengths of 25 out of 37 baselines that were measured at least 50 times. In a root-sum-square (rss) sense, the improvement generally increases with baseline length at a rate of about 0.3 to 0.6 ppb depending on whether the baseline stations are close to the coast. For the 5998-km baseline from Westford, Massachusetts, to Wettzell, Germany, the rss improvement is about 3.6 mm out of 11.0 mm. The average rss reduction of the vertical scatter for inland stations ranges from 2.7 to 5.4 mm.
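Estimating a station's vertical pressure sensitivity, as described above, is at heart a linear regression of vertical displacement on surface pressure. A sketch on synthetic data with an injected admittance of -0.4 mm/mbar (all values invented, within the paper's 0 to -0.6 mm/mbar range):

```python
import numpy as np

rng = np.random.default_rng(3)
dp = rng.normal(0.0, 8.0, 500)               # surface pressure anomaly, mbar
up = -0.4 * dp + rng.normal(0.0, 3.0, 500)   # vertical residual, mm (synthetic)

# Least-squares fit of up = intercept + sens * dp; sens is the admittance.
A = np.column_stack([np.ones_like(dp), dp])
(intercept, sens), *_ = np.linalg.lstsq(A, up, rcond=None)
print(sens)   # near the injected -0.4 mm/mbar
```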

  14. Baseline disability in activities of daily living predicts dementia risk even after controlling for baseline global cognitive ability and depressive symptoms.

    Science.gov (United States)

    Fauth, Elizabeth B; Schwartz, Sarah; Tschanz, Joann T; Østbye, Truls; Corcoran, Christopher; Norton, Maria C

    2013-06-01

    Late-life disability in activities of daily living (ADL) is theorized to be driven by underlying cognitive and/or physical impairment, interacting with psychological and environmental factors. Although we expect that cognitive deficits would explain associations between ADL disability and dementia risk, the current study examined ADL as a predictor of future dementia after controlling for global cognitive status. The population-based Cache County Memory Study (N = 3547) assessed individuals in four triennial waves (average age 74.9 years, average 13.36 years of education; 57.9% were women). Cox proportional hazards regression models assessed whether baseline ADL disability (presence of 2+ Instrumental ADL and/or 1+ Personal ADL) predicted incident dementia after controlling for APOE status, gender, age, baseline cognitive ability (Modified Mini-Mental State Exam, 3MS-R; adjusted for education level), and baseline depressive symptoms (Diagnostic Interview Schedule). Over the course of the study, 571 cases of incident dementia were identified through in-depth cognitive assessment, ending in expert consensus diagnosis. Results from Cox models suggest that ADL disability is a statistically significant predictor of incident dementia (adjusted hazard ratio = 1.83) after controlling for covariates. Findings suggest that ADL disability offers unique contributions to risk for incident dementia, even after controlling for global cognitive status. We discuss how physical impairment and executive function may play important roles in this relationship, and how ADL disability is useful not just as a diagnostic tool at or after dementia onset, but also as a risk factor for future dementia, even in individuals not impaired on global cognitive tests. Copyright © 2012 John Wiley & Sons, Ltd.
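The Cox proportional hazards analysis used here can be sketched for a single binary covariate via Newton-Raphson on the partial likelihood. This is a from-scratch illustration (no tie correction, simulated data); the hazard ratio below is an invented simulation target, not the study's 1.83:

```python
import numpy as np

def cox_hr_binary(time, event, x, iters=25):
    """One-covariate Cox model fit by Newton-Raphson on the partial
    likelihood (no tie handling); returns the hazard ratio exp(beta)."""
    order = np.argsort(time)
    time, event, x = time[order], event[order], x[order]
    beta = 0.0
    for _ in range(iters):
        grad = hess = 0.0
        for i in np.flatnonzero(event):
            risk = np.exp(beta * x[i:])          # risk set: times >= time[i]
            s0, s1 = risk.sum(), (risk * x[i:]).sum()
            s2 = (risk * x[i:] ** 2).sum()
            mean = s1 / s0
            grad += x[i] - mean                  # score contribution
            hess -= s2 / s0 - mean ** 2          # (negative) information
        beta -= grad / hess                      # Newton step
    return np.exp(beta)

rng = np.random.default_rng(4)
adl = rng.integers(0, 2, 400)                    # 1 = baseline ADL disability
scale = 1.0 / np.where(adl == 1, 1.8, 1.0)       # disabled group: 1.8x hazard
t = rng.exponential(scale)
hr = cox_hr_binary(t, np.ones(400, dtype=int), adl)
print(hr)   # roughly recovers the simulated hazard ratio of 1.8
```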

  15. Exclusive queueing model including the choice of service windows

    Science.gov (United States)

    Tanaka, Masahiro; Yanagisawa, Daichi; Nishinari, Katsuhiro

    2018-01-01

    In a queueing system involving multiple service windows, choice behavior is a significant concern. This paper incorporates the choice of service windows into a queueing model with a floor represented by discrete cells. We devised a logit-based choice algorithm for agents that considers the numbers of agents at, and the distances to, all service windows. Simulations were conducted with various parameters of agent choice preference for these two elements and for different floor configurations, including the floor length and the number of service windows. We investigated the model from the viewpoint of transit times and entrance block rates. The influences of the parameters on these factors were surveyed in detail, and we determined that there are optimum floor lengths that minimize the transit times. In addition, we observed that the transit times were determined almost entirely by the entrance block rates. The results of the presented model are relevant to understanding queueing systems including the choice of service windows and can be employed to optimize facility design and floor management.
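The logit-based window choice described in the abstract can be sketched as a multinomial logit over queue length and distance. The preference weights and window values below are hypothetical stand-ins for the model's parameters:

```python
import numpy as np

def choice_probs(queue_len, dist, w_q=1.0, w_d=0.5):
    """Multinomial-logit choice over service windows: utility decreases
    with queue length and distance; w_q, w_d play the role of the agents'
    preference parameters for the two elements."""
    u = -w_q * np.asarray(queue_len, float) - w_d * np.asarray(dist, float)
    e = np.exp(u - u.max())          # subtract max for numerical stability
    return e / e.sum()

# Three windows: queue lengths 3, 1, 4 agents; distances 2, 5, 1 cells.
p = choice_probs(queue_len=[3, 1, 4], dist=[2.0, 5.0, 1.0])
print(p)   # the short-queue window wins despite being farther away
```

An agent would then be assigned a window by sampling from `p`, so busier or more distant windows still attract some traffic rather than none, which is the behavioral point of using a logit rather than a greedy rule.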

  16. Baseline Retinal Examinations among SLE Patients Newly Initiating Hydroxychloroquine in a U.S. Medicaid SLE Population, 2000-2010.

    Science.gov (United States)

    Lin, Tzu-Chieh; Marmor, Michael F; Barbhaiya, Medha; Guan, Hongshu; Chen, Sarah K; Feldman, Candace H; Costenbader, Karen H

    2018-02-06

    Baseline retinal examination has long been recommended at hydroxychloroquine (HCQ) initiation, but it is unknown how well this guideline is followed. We investigated baseline eye examinations among U.S. Medicaid SLE patients initiating HCQ. Using billing codes, we identified SLE patients aged 18-65 enrolled in Medicaid, residing in the 29 most populated U.S. states, from 2000-2010. New HCQ users were identified by the filling of a prescription, with none in the preceding 12 months. Baseline retinal exams were identified within 30 days before to one year after this index prescription. We examined the proportions of patients receiving retinal exams over the study years and compared the characteristics of those who did and did not receive exams using bivariable and multivariable logistic regression models. Of 12,755 SLE patients newly starting HCQ, 32.5% received baseline dilated eye exams. The proportion of individuals receiving baseline eye exams did not significantly change during these years (31.0% to 34.4%, p for trend = 0.12). Factors associated with an increased likelihood of examination included female sex, Asian versus White race, and receiving a higher number of laboratory tests during the preceding year. Lower proportions of Black and Native American versus White SLE patients had baseline retinal exams. Only one third of Medicaid SLE patients newly initiating HCQ received the recommended baseline retinal examinations, and this proportion did not significantly increase during these years. The sociodemographic variation in this indicated care has been observed for other recommended medical care for SLE and requires both further investigation and interventions to address it. This article is protected by copyright. All rights reserved.

  17. Causal Mediation Analysis for the Cox Proportional Hazards Model with a Smooth Baseline Hazard Estimator.

    Science.gov (United States)

    Wang, Wei; Albert, Jeffrey M

    2017-08-01

    An important problem within the social, behavioral, and health sciences is how to partition an exposure effect (e.g. treatment or risk factor) among specific pathway effects and to quantify the importance of each pathway. Mediation analysis based on the potential outcomes framework is an important tool to address this problem and we consider the estimation of mediation effects for the proportional hazards model in this paper. We give precise definitions of the total effect, natural indirect effect, and natural direct effect in terms of the survival probability, hazard function, and restricted mean survival time within the standard two-stage mediation framework. To estimate the mediation effects on different scales, we propose a mediation formula approach in which simple parametric models (fractional polynomials or restricted cubic splines) are utilized to approximate the baseline log cumulative hazard function. Simulation study results demonstrate low bias of the mediation effect estimators and close-to-nominal coverage probability of the confidence intervals for a wide range of complex hazard shapes. We apply this method to the Jackson Heart Study data and conduct sensitivity analysis to assess the impact on the mediation effects inference when the no unmeasured mediator-outcome confounding assumption is violated.
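The two-stage mediation decomposition the abstract builds on can be illustrated with a deliberately simple toy: a binary mediator and plain outcome means instead of the paper's survival-model estimator. All probabilities and means below are hypothetical; the identity shown (total effect = natural direct + natural indirect) is the generic potential-outcomes decomposition, not the authors' code.

```python
# Toy illustration of the two-stage mediation formula with a binary
# mediator M and binary exposure A.  Hypothetical numbers, not the
# paper's proportional hazards estimator.

def mediation_effects(p_m_given_a, mean_y_given_am):
    """Return (total, natural direct, natural indirect) effects.

    p_m_given_a[a]        = P(M = 1 | A = a)
    mean_y_given_am[a][m] = E[Y | A = a, M = m]
    """
    def e_y(a_outcome, a_mediator):
        # Mediation formula: average the outcome model over the
        # mediator distribution induced by exposure level a_mediator.
        pm1 = p_m_given_a[a_mediator]
        return (mean_y_given_am[a_outcome][1] * pm1
                + mean_y_given_am[a_outcome][0] * (1 - pm1))

    total = e_y(1, 1) - e_y(0, 0)
    nde = e_y(1, 0) - e_y(0, 0)   # change exposure, hold mediator distribution
    nie = e_y(1, 1) - e_y(1, 0)   # change mediator distribution, hold exposure
    return total, nde, nie

# Hypothetical scenario: exposure raises P(M=1) from 0.3 to 0.6,
# and both A and M raise the mean outcome.
te, nde, nie = mediation_effects(
    p_m_given_a={0: 0.3, 1: 0.6},
    mean_y_given_am={0: {0: 1.0, 1: 2.0}, 1: {0: 1.5, 1: 2.5}},
)
```

By construction the two pathway effects add up to the total effect, which is the property the paper exploits when partitioning an exposure effect among pathways.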

  18. 10 CFR 850.20 - Baseline beryllium inventory.

    Science.gov (United States)

    2010-01-01

    ... 10 Energy 4 2010-01-01 2010-01-01 false Baseline beryllium inventory. 850.20 Section 850.20 Energy... Baseline beryllium inventory. (a) The responsible employer must develop a baseline inventory of the... inventory, the responsible employer must: (1) Review current and historical records; (2) Interview workers...

  19. 40 CFR 80.92 - Baseline auditor requirements.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 16 2010-07-01 2010-07-01 false Baseline auditor requirements. 80.92... (CONTINUED) REGULATION OF FUELS AND FUEL ADDITIVES Anti-Dumping § 80.92 Baseline auditor requirements. (a... determination methodology, resulting baseline fuel parameter, volume and emissions values verified by an auditor...

  20. Baseline Hemodynamics and Response to Contrast Media During Diagnostic Cardiac Catheterization Predict Adverse Events in Heart Failure Patients.

    Science.gov (United States)

    Denardo, Scott J; Vock, David M; Schmalfuss, Carsten M; Young, Gregory D; Tcheng, James E; O'Connor, Christopher M

    2016-07-01

Contrast media administered during cardiac catheterization can affect hemodynamic variables. However, little is documented about the effects of contrast on hemodynamics in heart failure patients or the prognostic value of baseline and changes in hemodynamics for predicting subsequent adverse events. In this prospective study of 150 heart failure patients, we measured hemodynamics at baseline and after administration of iodixanol or iopamidol contrast. One-year Kaplan-Meier estimates of adverse event-free survival (death, heart failure hospitalization, and rehospitalization) were generated, grouping patients by baseline measures of pulmonary capillary wedge pressure (PCWP) and cardiac index (CI), and by changes in those measures after contrast administration. We used Cox proportional hazards modeling to assess sequentially adding baseline PCWP and change in CI to 5 validated risk models (Seattle Heart Failure Score, ESCAPE [Evaluation Study of Congestive Heart Failure and Pulmonary Artery Catheterization Effectiveness], CHARM [Candesartan in Heart Failure: Assessment of Reduction in Mortality and Morbidity], CORONA [Controlled Rosuvastatin Multinational Trial in Heart Failure], and MAGGIC [Meta-Analysis Global Group in Chronic Heart Failure]). Median contrast volume was 109 mL. Both contrast media caused similarly small but statistically significant changes in most hemodynamic variables. There were 39 adverse events (26.0%). Adverse event rates increased with the composite metric of baseline PCWP and change in CI; patients with elevated baseline PCWP and a decrease in CI after contrast had the poorest prognosis. Adding both baseline PCWP and change in CI to the 5 risk models universally improved their predictive value (P≤0.02). In heart failure patients, the administration of contrast causes small but significant changes in hemodynamics. Calculating baseline PCWP with change in CI after contrast predicts adverse events and increases the predictive value of existing models.
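The one-year event-free survival curves described above come from the Kaplan-Meier estimator. A minimal pure-Python sketch, with made-up follow-up data rather than the study's:

```python
# Minimal Kaplan-Meier estimator of the kind used to compare
# event-free survival between hemodynamic groups.  Toy data only.

def kaplan_meier(times, events):
    """Return [(t, S(t))] at each distinct event time.

    times  : follow-up time for each patient
    events : 1 if the adverse event occurred, 0 if censored
    """
    pairs = sorted(zip(times, events))
    n_at_risk = len(pairs)
    surv, out, i = 1.0, [], 0
    while i < len(pairs):
        t = pairs[i][0]
        d = at_t = 0
        # group all observations sharing this time
        while i < len(pairs) and pairs[i][0] == t:
            at_t += 1
            d += pairs[i][1]
            i += 1
        if d:  # survival drops only at event times
            surv *= 1.0 - d / n_at_risk
            out.append((t, surv))
        n_at_risk -= at_t
    return out

# 6 hypothetical patients: events at months 3, 6, 8; censored at 4, 9, 12
curve = kaplan_meier([3, 4, 6, 8, 9, 12], [1, 0, 1, 1, 0, 0])
```

Censored patients leave the risk set without lowering the curve, which is why the estimator, rather than a raw event fraction, is used for follow-up data like these.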

  1. Integrated planning: A baseline development perspective

    International Nuclear Information System (INIS)

    Clauss, L.; Chang, D.

    1994-01-01

The FEMP Baseline establishes the basis for integrating environmental activity technical requirements with their cost and schedule elements. The result is a path forward to successfully achieving the FERMCO mission. Specific to cost management, the FEMP Baseline has been incorporated into the FERMCO Project Control System (PCS) to provide a time-phased budget plan against which contractor performance is measured with an earned value management system. The result is the Performance Measurement Baseline (PMB), an important tool for keeping costs under control

  2. Time of exposure to night work and carotid atherosclerosis: a structural equation modeling approach using baseline data from ELSA-Brasil.

    Science.gov (United States)

    Silva-Costa, Aline; Guimarães, Joanna; Chor, Dora; de Jesus Mendes da Fonseca, Maria; Bensenor, Isabela; Santos, Itamar; Barreto, Sandhi; Griep, Rosane Härter

    2018-04-02

The study of cardiovascular diseases (CVD) associated with night work is difficult due to the long period required for conditions to manifest and the healthy-worker effect. Analyzing asymptomatic pre-clinical changes in the atherosclerotic process is a way to assess the pathways between exposure to night work and CVD. Our aim was to evaluate the associations between night work and subclinical atherosclerosis measured by carotid intima-media thickness (CIMT) using baseline data from the Brazilian Longitudinal Study of Adult Health (ELSA-Brasil). We conducted cross-sectional analyses using baseline data (2008-2010) from 9785 civil servants, aged 35-74 years. The associations between time of exposure to night work and mean CIMT were examined using a structural equation model. The sample included 4259 men and 5526 women, mean age of 51.6 years. A total of 1778 (18.2%) individuals were exposed to night work (594 current and 1184 former night workers), and the mean duration of night-work exposure was 11.47 (SD = 9.45) years. On average, mean CIMT was 0.606 (SD = 0.130) mm. Among men, the increase in exposure to night work was significantly associated with an increase in BMI and CIMT. Among women, night work was not associated with increased CIMT. Regarding indirect associations, the results suggest possible mediation by BMI, diabetes, and hypertension of the association between years of night work and mean CIMT, again only among men. Night work was associated with increased CIMT only among men. These findings add to the knowledge of the possible pathways that link night work and carotid atherosclerosis. Additionally, these results contribute to the recognition of work schedules as a public health problem that should be addressed by the medical community and policy makers.

  3. The prognostic utility of baseline alpha-fetoprotein for hepatocellular carcinoma patients.

    Science.gov (United States)

    Silva, Jack P; Gorman, Richard A; Berger, Nicholas G; Tsai, Susan; Christians, Kathleen K; Clarke, Callisia N; Mogal, Harveshp; Gamblin, T Clark

    2017-12-01

Alpha-fetoprotein (AFP) has a valuable role in postoperative surveillance for hepatocellular carcinoma (HCC) recurrence. The utility of pretreatment or baseline AFP remains controversial. The present study hypothesized that elevated baseline AFP levels are associated with worse overall survival in HCC patients. Adult HCC patients were identified using the National Cancer Database (2004-2013). Patients were stratified according to baseline AFP measurements into four groups: Negative, Borderline, Elevated, and Highly Elevated. The primary outcome was overall survival (OS), which was analyzed by log-rank test and graphed using the Kaplan-Meier method. Multivariate regression modeling was used to determine hazard ratios (HR) for OS. Of 41 107 patients identified, 15 809 (33.6%) were Negative. Median overall survival was highest in the Negative group, followed by Borderline, Elevated, and Highly Elevated (28.7 vs 18.9 vs 8.8 vs 3.2 months; P < 0.001). On multivariate analysis, overall survival hazard ratios for the Borderline, Elevated, and Highly Elevated groups were 1.18 (P = 0.267), 1.94 (P < 0.001), and 1.77 (P = 0.007), respectively (reference Negative). Baseline AFP independently predicted overall survival in HCC patients regardless of treatment plan. A baseline AFP value is a simple and effective method to assist in expected survival for HCC patients. © 2017 Wiley Periodicals, Inc.
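The log-rank test used above to compare overall survival across AFP strata can be sketched for the two-group case as follows. The toy samples are hypothetical and, for brevity, every observation is treated as an event (no censoring):

```python
# Sketch of the two-group log-rank statistic used to compare survival
# curves.  Toy event times, not the NCDB cohort; censoring omitted.

def logrank_chi2(times1, times2):
    """Chi-square statistic comparing two samples of event times."""
    event_times = sorted(set(times1) | set(times2))
    o_minus_e = 0.0
    var = 0.0
    for t in event_times:
        n1 = sum(1 for x in times1 if x >= t)   # at risk in group 1
        n2 = sum(1 for x in times2 if x >= t)   # at risk in group 2
        n = n1 + n2
        d = times1.count(t) + times2.count(t)   # deaths at time t
        # observed minus expected deaths in group 1 under H0
        o_minus_e += times1.count(t) - d * n1 / n
        if n > 1:  # hypergeometric variance contribution
            var += d * (n1 / n) * (n2 / n) * (n - d) / (n - 1)
    return o_minus_e ** 2 / var

chi2 = logrank_chi2([1, 2], [3, 4])
```

The statistic accumulates observed-minus-expected deaths over the distinct event times and is referred to a chi-square distribution with one degree of freedom.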

  4. The London low emission zone baseline study.

    Science.gov (United States)

    Kelly, Frank; Armstrong, Ben; Atkinson, Richard; Anderson, H Ross; Barratt, Ben; Beevers, Sean; Cook, Derek; Green, Dave; Derwent, Dick; Mudway, Ian; Wilkinson, Paul

    2011-11-01

    On February 4, 2008, the world's largest low emission zone (LEZ) was established. At 2644 km2, the zone encompasses most of Greater London. It restricts the entry of the oldest and most polluting diesel vehicles, including heavy-goods vehicles (haulage trucks), buses and coaches, larger vans, and minibuses. It does not apply to cars or motorcycles. The LEZ scheme will introduce increasingly stringent Euro emissions standards over time. The creation of this zone presented a unique opportunity to estimate the effects of a stepwise reduction in vehicle emissions on air quality and health. Before undertaking such an investigation, robust baseline data were gathered on air quality and the oxidative activity and metal content of particulate matter (PM) from air pollution monitors located in Greater London. In addition, methods were developed for using databases of electronic primary-care records in order to evaluate the zone's health effects. Our study began in 2007, using information about the planned restrictions in an agreed-upon LEZ scenario and year-on-year changes in the vehicle fleet in models to predict air pollution concentrations in London for the years 2005, 2008, and 2010. Based on this detailed emissions and air pollution modeling, the areas in London were then identified that were expected to show the greatest changes in air pollution concentrations and population exposures after the implementation of the LEZ. Using these predictions, the best placement of a pollution monitoring network was determined and the feasibility of evaluating the health effects using electronic primary-care records was assessed. To measure baseline pollutant concentrations before the implementation of the LEZ, a comprehensive monitoring network was established close to major roadways and intersections. 
Output-difference plots from statistical modeling for 2010 indicated seven key areas likely to experience the greatest change in concentrations of nitrogen dioxide (NO2).

  5. Analysis of baseline gene expression levels from ...

    Science.gov (United States)

The use of gene expression profiling to predict chemical mode of action would be enhanced by better characterization of variance due to individual, environmental, and technical factors. Meta-analysis of microarray data from untreated or vehicle-treated animals within the control arm of toxicogenomics studies has yielded useful information on baseline fluctuations in gene expression. A dataset of control animal microarray expression data was assembled by a working group of the Health and Environmental Sciences Institute's Technical Committee on the Application of Genomics in Mechanism Based Risk Assessment in order to provide a public resource for assessments of variability in baseline gene expression. Data from over 500 Affymetrix microarrays from control rat liver and kidney were collected from 16 different institutions. Thirty-five biological and technical factors were obtained for each animal, describing a wide range of study characteristics, and a subset was evaluated in detail for their contribution to total variability using multivariate statistical and graphical techniques. The study factors that emerged as key sources of variability included gender, organ section, strain, and fasting state. These and other study factors were identified as key descriptors that should be included in the minimal information about a toxicogenomics study needed for interpretation of results by an independent source. Genes that are the most and least variable, including gender-selective genes, were also identified.

  6. EML Chester - 1982. Annual report of the Regional Baseline Station at Chester, New Jersey

    International Nuclear Information System (INIS)

    Volchok, H.L.

    1982-11-01

    The Environmental Measurements Laboratory (EML) has maintained a regional baseline station at Chester, New Jersey since 1976. The site provides EML with a remote, rural facility for carrying out regional baseline research and for testing field equipment. This report updates the various programs underway at the Chester site. Separate abstracts have been prepared for the included papers

  7. Neutrino oscillations on the way to long-baseline experiments

    CERN Document Server

    Ryabov, V A

    2003-01-01

The motivations and physical objectives of experiments searching for νμ → νe and νμ → ντ oscillations in long-baseline accelerator neutrino beams are reviewed. Neutrino beams, detectors, and methods for detecting oscillations (detection of the disappearance of νμ and the appearance of νe and ντ) in the current K2K (KEK to Super-Kamiokande) experiment and in the near-future MINOS (FNAL to Soudan) and OPERA (CERN to Gran Sasso) experiments are discussed. Possibilities of measuring the oscillation parameters in these experiments are considered in connection with new data obtained in the CHOOZ and Palo Verde reactor experiments, the solar neutrino deficit and the νμ/νe anomaly of atmospheric neutrinos observed in large-scale underground detectors, and the excess of νe events in the LSND experiment. Neutrino-oscillation scenarios used in models with three and four (including sterile) types of neutrino, as well as the possibility...

  8. Mixed Waste Focus Area integrated technical baseline report, Phase 1: Volume 1

    International Nuclear Information System (INIS)

    1996-01-01

    The Department of Energy (DOE) established the Mixed Waste Characterization, Treatment, and Disposal Focus Area (MWFA) to develop and facilitate implementation of technologies required to meet the Department's commitments for treatment of mixed low-level and transuranic wastes. The mission of the MWFA is to provide acceptable treatment systems, developed in partnership with users and with participation of stakeholders, tribal governments, and regulators, that are capable of treating DOE's mixed waste. These treatment systems include all necessary steps such as characterization, pretreatment, and disposal. To accomplish this mission, a technical baseline is being established that forms the basis for determining which technology development activities will be supported by the MWFA. The technical baseline is the prioritized list of deficiencies, and the resulting technology development activities needed to overcome these deficiencies. This document presents Phase I of the technical baseline development process, which resulted in the prioritized list of deficiencies that the MWFA will address. A summary of the data and the assumptions upon which this work was based is included, as well as information concerning the DOE Office of Environmental Management (EM) mixed waste technology development needs. The next phase in the technical baseline development process, Phase II, will result in the identification of technology development activities that will be conducted through the MWFA to resolve the identified deficiencies

  9. Three-dimensional thermal analysis of a baseline spent fuel repository

    International Nuclear Information System (INIS)

    Altenbach, T.J.; Lowry, W.E.

    1980-01-01

A three-dimensional thermal analysis has been performed using finite difference techniques to determine the near-field response of a baseline spent fuel repository in a deep geologic salt medium. A baseline design incorporates previous thermal modeling experience and OWI recommendations for areal thermal loading in specifying the waste form properties, package details, and emplacement configuration. The base case in this thermal analysis considers one 10-year old PWR spent fuel assembly emplaced to yield a 36 kW/acre (8.9 W/m²) loading. A unit cell model in an infinite array is used to simplify the problem and provide upper-bound temperatures. Boundary conditions are imposed which allow simulations to 1000 years. Variations studied include a comparison of ventilated and unventilated storage room conditions, emplacement packages with and without air gaps surrounding the canister, and room cool-down scenarios with ventilation following an unventilated state for retrieval purposes. At this low power level, ventilating the emplacement room has an immediate cooling influence on the canister and effectively maintains the emplacement room floor near the temperature of the ventilating air. The annular gap separating the canister and sleeve causes the peak temperature of the canister surface to rise by 10°F (5.6°C) over that of a no-gap case assuming perfect thermal contact. It was also shown that the time required for the emplacement room to cool down to 100°F (38°C) from an unventilated state ranged from 2 weeks to 6 months when ventilation was initiated after times of 5 years to 50 years, respectively. As the work was performed for the Nuclear Regulatory Commission, these results provide a significant addition to the regulatory data base for spent fuel performance in a geologic repository
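The finite-difference approach described above can be illustrated in one dimension. The sketch below uses invented material properties and temperatures, and an explicit forward-Euler scheme; it is far simpler than the report's 3-D salt-repository model:

```python
# 1-D explicit finite-difference conduction sketch, in the spirit of
# the near-field repository analysis.  Illustrative values only.

def step_temperatures(T, alpha, dx, dt):
    """One forward-Euler step of dT/dt = alpha * d2T/dx2 with fixed
    (Dirichlet) boundary temperatures at both ends."""
    r = alpha * dt / dx ** 2
    assert r <= 0.5, "explicit scheme unstable for r > 1/2"
    new = T[:]  # boundary nodes keep their prescribed values
    for i in range(1, len(T) - 1):
        new[i] = T[i] + r * (T[i - 1] - 2 * T[i] + T[i + 1])
    return new

# Hot wall at 100 on the left, cooler host medium at 20 on the right
T = [100.0] + [20.0] * 9
for _ in range(2000):
    T = step_temperatures(T, alpha=1e-6, dx=0.1, dt=4000.0)
```

After many steps the profile relaxes to the linear steady state between the two fixed boundary temperatures; the stability bound r ≤ 1/2 is what forces the small time step that makes explicit schemes expensive for 1000-year simulations.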

  10. SRP Baseline Hydrogeologic Investigation, Phase 3

    Energy Technology Data Exchange (ETDEWEB)

    Bledsoe, H.W.

    1988-08-01

    The SRP Baseline Hydrogeologic Investigation was implemented for the purpose of updating and improving the knowledge and understanding of the hydrogeologic systems underlying the SRP site. Phase III, which is discussed in this report, includes the drilling of 7 deep coreholes (sites P-24 through P-30) and the installation of 53 observation wells ranging in depth from approximately 50 ft to more than 970 ft below the ground surface. In addition to the collection of geologic cores for lithologic and stratigraphic study, samples were also collected for the determination of physical characteristics of the sediments and for the identification of microorganisms.

  11. Idiopathic Pulmonary Fibrosis: Data-driven Textural Analysis of Extent of Fibrosis at Baseline and 15-Month Follow-up.

    Science.gov (United States)

    Humphries, Stephen M; Yagihashi, Kunihiro; Huckleberry, Jason; Rho, Byung-Hak; Schroeder, Joyce D; Strand, Matthew; Schwarz, Marvin I; Flaherty, Kevin R; Kazerooni, Ella A; van Beek, Edwin J R; Lynch, David A

    2017-10-01

Purpose To evaluate associations between pulmonary function and both quantitative analysis and visual assessment of thin-section computed tomography (CT) images at baseline and at 15-month follow-up in subjects with idiopathic pulmonary fibrosis (IPF). Materials and Methods This retrospective analysis of preexisting anonymized data, collected prospectively between 2007 and 2013 in a HIPAA-compliant study, was exempt from additional institutional review board approval. The extent of lung fibrosis at baseline inspiratory chest CT in 280 subjects enrolled in the IPF Network was evaluated. Visual analysis was performed by using a semiquantitative scoring system. Computer-based quantitative analysis included CT histogram-based measurements and a data-driven textural analysis (DTA). Follow-up CT images in 72 of these subjects were also analyzed. Univariate comparisons were performed by using Spearman rank correlation. Multivariate and longitudinal analyses were performed by using a linear mixed model approach, in which models were compared by using asymptotic χ² tests. Results At baseline, all CT-derived measures showed moderate, statistically significant correlation with pulmonary function. At follow-up CT, changes in DTA scores showed significant correlation with changes in forced vital capacity percentage predicted (ρ = -0.41) and with changes in pulmonary function. Conclusion Data-driven textural analysis of fibrosis at CT yields an index of severity that correlates with visual assessment and functional change in subjects with IPF. © RSNA, 2017.
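The univariate comparisons above use Spearman rank correlation. A pure-Python sketch with mean ranks for ties and toy data (not the IPF Network measurements):

```python
# Spearman rank correlation: Pearson correlation of the mean ranks.
# Toy data, not the study's CT scores.

def ranks(xs):
    """1-based ranks, with tied values sharing their mean rank."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1                     # extend across the tie group
        mean_rank = (i + j) / 2 + 1
        for k in range(i, j + 1):
            r[order[k]] = mean_rank
        i = j + 1
    return r

def spearman_rho(x, y):
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5

rho = spearman_rho([1, 2, 3, 4, 5], [5, 6, 7, 8, 7])
```

Because it operates on ranks, the coefficient captures any monotone association, which suits severity scores that need not relate linearly to lung function.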

  12. Key Characteristics of Combined Accident including TLOFW accident for PSA Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Bo Gyung; Kang, Hyun Gook [KAIST, Daejeon (Korea, Republic of); Yoon, Ho Joon [Khalifa University of Science, Technology and Research, Abu Dhabi (United Arab Emirates)

    2015-05-15

Conventional PSA techniques cannot adequately evaluate all events. Conventional PSA models usually focus on single internal events such as DBAs, or on external hazards such as fire and seismic events. However, the Fukushima accident in Japan in 2011 revealed that very rare events must also be considered in the PSA model, both to prevent a radioactive release to the environment caused by inadequate accident management under a lack of information, and to improve the emergency operating procedures. In particular, the results from PSA can be used for regulatory decision making. Moreover, designers can consider weaknesses in plant safety based on the quantified results and understand accident sequences in terms of human actions and system availability. This study develops PSA modeling of combined accidents including the total loss of feedwater (TLOFW) accident. The TLOFW accident is a representative accident involving the failure of cooling through the secondary side. If the amount of heat transfer is not enough due to the failure of the secondary side, heat will accumulate on the primary side from continuing core decay heat. Transients with loss of feedwater include the total loss of feedwater accident, loss of condenser vacuum accident, and closure of all MSIVs. When residual heat removal by the secondary side is terminated, safety injection into the RCS with direct primary depressurization provides alternative heat removal. This operation is called feed and bleed (F and B) operation. Combined accidents including the TLOFW accident are very rare events and are only partially considered in conventional PSA models. Since the necessity of F and B operation depends on plant conditions, PSA modeling of combined accidents including the TLOFW accident is necessary to identify design and operational vulnerabilities. Even though a combined accident is a very rare event, its consequences can be severe, and PSA remains essential for assessing the risk of NPPs and identifying design and operational vulnerabilities.

  13. Highly active antiretroviral therapy including protease inhibitors does not confer a unique CD4 cell benefit. The AVANTI and INCAS Study Groups.

    Science.gov (United States)

    2000-07-07

To determine if triple combination therapy, particularly including HIV protease inhibitors (PI), confers a unique immunological benefit that is independent of reductions of plasma viral load (pVL). The correlation between changes from baseline in CD4 cell count and pVL was examined at all time points up to 52 weeks in three randomized clinical trials (AVANTI-2, AVANTI-3 and INCAS) that compared dual nucleoside therapy with triple combination therapy. Individual pVL and CD4 cell count changes from baseline were entered into multivariate linear regression models for patients receiving double therapy and for those receiving triple therapy including a PI and/or a non-nucleoside reverse transcriptase inhibitor (NNRTI), and the null hypothesis was tested. After 52 weeks of therapy, the relationship between changes from baseline CD4 cell count and pVL was independent of whether patients were assigned double or triple therapy (P = 0.23 and 0.69 for intercept and slope, respectively), or whether patients were assigned triple therapy including a PI or triple therapy including an NNRTI (P = 0.92 and 0.95, respectively). Less than 5% of patients ever had 'discordant' increases in both CD4 cell count and pVL compared with baseline, and this proportion was unrelated to the class of therapy used. 'Discordant' decreases from baseline in both parameters were observed in up to 35% of individuals. The correlation between pVL and CD4 cell count changes from baseline improved over time on therapy, regardless of the therapeutic regimen involved. The data provide no evidence for a CD4 cell count benefit of highly active antiretroviral therapy (HAART) unique to triple therapy or PI-containing regimens.

  14. Modeling the worldwide spread of pandemic influenza: baseline case and containment interventions.

    Directory of Open Access Journals (Sweden)

    Vittoria Colizza

    2007-01-01

Full Text Available BACKGROUND: The highly pathogenic H5N1 avian influenza virus, which is now widespread in Southeast Asia and which diffused recently in some areas of the Balkans region and Western Europe, has raised a public alert toward the potential occurrence of a new severe influenza pandemic. Here we study the worldwide spread of a pandemic and its possible containment at a global level taking into account all available information on air travel. METHODS AND FINDINGS: We studied a metapopulation stochastic epidemic model on a global scale that considers airline travel flow data among urban areas. We provided a temporal and spatial evolution of the pandemic with a sensitivity analysis of different levels of infectiousness of the virus and initial outbreak conditions (both geographical and seasonal). For each spreading scenario we provided the timeline and the geographical impact of the pandemic in 3,100 urban areas, located in 220 different countries. We compared the baseline cases with different containment strategies, including travel restrictions and the therapeutic use of antiviral (AV) drugs. We investigated the effect of the use of AV drugs in the event that therapeutic protocols can be carried out with maximal coverage for the populations in all countries. In view of the wide diversity of AV stockpiles in different regions of the world, we also studied scenarios in which only a limited number of countries are prepared (i.e., have considerable AV supplies). In particular, we compared different plans in which, on the one hand, only prepared and wealthy countries benefit from large AV resources, with, on the other hand, cooperative containment scenarios in which countries with large AV stockpiles make a small portion of their supplies available worldwide. CONCLUSIONS: We show that the inclusion of air transportation is crucial in the assessment of the occurrence probability of global outbreaks. The large-scale therapeutic usage of AV drugs in all hit countries would mitigate the impact of the pandemic.
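The metapopulation idea above can be caricatured deterministically: each city runs a discrete-time SIR model and a small symmetric travel flow couples them. All parameters and populations below are invented, and the real model in the record is stochastic and spans 3,100 urban areas:

```python
# Two-city deterministic toy of a metapopulation epidemic model.
# Invented parameters; the paper's model is stochastic and global.

def simulate(days, beta=0.3, gamma=0.1, travel=0.001):
    """Daily SIR updates in each city plus diffusive travel coupling."""
    S = [999_900.0, 1_000_000.0]   # city 0 seeded with 100 infectious
    I = [100.0, 0.0]
    R = [0.0, 0.0]
    for _ in range(days):
        for c in (0, 1):
            N = S[c] + I[c] + R[c]
            new_inf = beta * S[c] * I[c] / N
            new_rec = gamma * I[c]
            S[c] -= new_inf
            I[c] += new_inf - new_rec
            R[c] += new_rec
        # air travel: exchange a small fraction of each compartment
        for comp in (S, I, R):
            flow = travel * (comp[0] - comp[1])
            comp[0] -= flow
            comp[1] += flow
    return S, I, R

S, I, R = simulate(300)
```

Even this caricature reproduces the qualitative point of the record: with any nonzero travel coupling the epidemic seeded in city 0 invades city 1, which is why air-transportation data is central to estimating global outbreak probability.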

  15. Effect of Baseline Nutritional Status on Long-term Multivitamin Use and Cardiovascular Disease Risk

    Science.gov (United States)

    Rautiainen, Susanne; Gaziano, J. Michael; Christen, William G.; Bubes, Vadim; Kotler, Gregory; Glynn, Robert J.; Manson, JoAnn E.; Buring, Julie E.

    2017-01-01

Importance Long-term multivitamin use had no effect on risk of cardiovascular disease (CVD) in the Physicians’ Health Study II. Baseline nutritional status may have modified the lack of effect. Objective To investigate effect modification by various baseline dietary factors on CVD risk in the Physicians’ Health Study II. Design, Setting, and Participants The Physicians’ Health Study II was a randomized, double-blind, placebo-controlled trial testing multivitamin use (multivitamin [Centrum Silver] or placebo daily) among US male physicians. The Physicians’ Health Study II included 14 641 male physicians 50 years or older, 13 316 of whom (91.0%) completed a baseline 116-item semiquantitative food frequency questionnaire and were included in the analyses. This study examined effect modification by baseline intake of key foods, individual nutrients, dietary patterns (Alternate Healthy Eating Index and Alternate Mediterranean Diet Score), and dietary supplement use. The study began in 1997, with continued treatment and follow-up through June 1, 2011. Interventions Multivitamin or placebo daily. Main Outcomes and Measures Major cardiovascular events, including nonfatal myocardial infarction, nonfatal stroke, and CVD mortality. Secondary outcomes included myocardial infarction, total stroke, CVD mortality, and total mortality individually. Results In total, 13 316 male physicians (mean [SD] age at randomization, 64.0 [9.0] years in those receiving the active multivitamin and 64.0 [9.1] years in those receiving the placebo) were observed for a mean (SD) follow-up of 11.4 (2.3) years. There was no consistent evidence of effect modification by various foods, nutrients, dietary patterns, or baseline supplement use on the effect of multivitamin use on CVD end points. Statistically significant interaction effects were observed between multivitamin use and vitamin B6 intake on myocardial infarction and between multivitamin use and vitamin D intake on CVD mortality.

  16. THE US LONG BASELINE NEUTRINO EXPERIMENT STUDY.

    Energy Technology Data Exchange (ETDEWEB)

    BISHAI,M.

    2007-08-06

The US Long Baseline Neutrino Experiment Study was commissioned jointly by Brookhaven National Laboratory (BNL) and Fermi National Accelerator Laboratory (FNAL) to investigate the potential for future U.S. based long baseline neutrino oscillation experiments using MW class conventional neutrino beams that can be produced at FNAL. The experimental baselines are based on two possible detector locations: (1) off-axis to the existing FNAL NuMI beamline at baselines of 700 to 810 km and (2) NSF's proposed future Deep Underground Science and Engineering Laboratory (DUSEL) at baselines greater than 1000 km. Two detector technologies are considered: a megaton class Water Cherenkov detector deployed deep underground at a DUSEL site, or a 100 kT Liquid Argon Time-Projection Chamber (TPC) deployed on the surface at any of the proposed sites. The physics sensitivities of the proposed experiments are summarized. We find that conventional horn focused wide-band neutrino beam options from FNAL aimed at a massive detector with a baseline of > 1000 km have the best sensitivity to CP violation and the neutrino mass hierarchy for values of the mixing angle θ13 down to 2°.
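The preference for baselines beyond 1000 km can be checked against the standard two-flavor oscillation probability, P = sin²(2θ)·sin²(1.27·Δm²·L/E) with Δm² in eV², L in km, and E in GeV. The numbers below are round illustrative values, not the study's fits:

```python
# Two-flavor appearance probability: the back-of-envelope formula
# behind long-baseline choices.  Round illustrative parameters.

import math

def p_oscillation(theta_rad, dm2_ev2, L_km, E_gev):
    """P(nu_a -> nu_b) = sin^2(2*theta) * sin^2(1.27 * dm2 * L / E)."""
    return (math.sin(2 * theta_rad) ** 2
            * math.sin(1.27 * dm2_ev2 * L_km / E_gev) ** 2)

# First oscillation maximum occurs where 1.27 * dm2 * L / E = pi/2
dm2 = 2.5e-3                              # atmospheric splitting, eV^2
E = 2.0                                   # typical beam energy, GeV
L_max = math.pi / 2 * E / (1.27 * dm2)    # baseline of the first maximum, km
```

For a ~2 GeV beam the first oscillation maximum lands near 1000 km, which is why the wide-band FNAL-to-DUSEL options quoted above sit at baselines of that scale.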

  17. 200-UP-2 Operable Unit technical baseline report

    International Nuclear Information System (INIS)

    Deford, D.H.

    1991-02-01

This report is prepared in support of the development of a Remedial Investigation/Feasibility Study (RI/FS) Work Plan for the 200-UP-2 Operable Unit by EBASCO Environmental, Incorporated. It provides a technical baseline of the 200-UP-2 Operable Unit and results from an environmental investigation undertaken by the Technical Baseline Section of the Environmental Engineering Group, Westinghouse Hanford Company (Westinghouse Hanford). The 200-UP-2 Operable Unit Technical Baseline Report is based on review and evaluation of numerous Hanford Site current and historical reports, Hanford Site drawings and photographs and is supplemented with Hanford Site inspections and employee interviews. No field investigations or sampling were conducted. Each waste site in the 200-UP-2 Operable Unit is described separately. Close relationships between waste units, such as overflow from one to another, are also discussed. The 200-UP-2 Operable Unit consists of liquid-waste disposal sites in the vicinity of, and related to, U Plant operations in the 200 West Area of the Hanford Site. The "U Plant" refers to the 221-U Process Canyon Building, a chemical separations facility constructed during World War II. It also includes the Uranium Oxide (UO3) Plant, which was constructed at the same time and, like the 221-U Process Canyon Building, was later converted for other missions. Waste sites in the 200-UP-2 Operable Unit are associated with the U Plant Uranium Metal Recovery Program mission that occurred between 1952 and 1958 and the UO3 Plant's ongoing uranium oxide mission and include one or more cribs, reverse wells, french drains, septic tanks and drain fields, trenches, catch tanks, settling tanks, diversion boxes, waste vaults, and the lines and encasements that connect them. 11 refs., 1 tab

  18. S Plant Aggregate Area Management study technical baseline report

    International Nuclear Information System (INIS)

    DeFord, D.H.; Carpenter, R.W.

    1995-05-01

    This document is prepared in support of an Aggregate Area Management Study of S Plant, 200 West Area, at the US Department of Energy's (DOE) Hanford Site near Richland, Washington. It provides a technical baseline of the aggregate area and the results from an environmental investigation undertaken by the Technical Baseline Section of the Environmental Engineering Group, Westinghouse Hanford Company (WHC). This document is based on review and evaluation of numerous Hanford Site current and historical reports, drawings and photographs, supplemented with site inspections and employee interviews. This report describes the REDOX facility and its waste sites, including cribs, french drains, septic tanks and drain fields, trenches, catch tanks, settling tanks, diversion boxes, underground tank farms designed for high-level liquid wastes, and the lines and encasements that connect them

  19. Evaluation of alternative school feeding models on nutrition, education, agriculture and other social outcomes in Ghana: rationale, randomised design and baseline data.

    Science.gov (United States)

    Gelli, Aulo; Masset, Edoardo; Folson, Gloria; Kusi, Anthoni; Arhinful, Daniel K; Asante, Felix; Ayi, Irene; Bosompem, Kwabena M; Watkins, Kristie; Abdul-Rahman, Lutuf; Agble, Rosanna; Ananse-Baden, Getrude; Mumuni, Daniel; Aurino, Elisabetta; Fernandes, Meena; Drake, Lesley

    2016-01-20

    'Home-grown' school feeding programmes are complex interventions with the potential to link the increased demand for school feeding goods and services to community-based stakeholders, including smallholder farmers and women's groups. There is limited rigorous evidence, however, that this is the case in practice. This evaluation will examine explicitly, and from a holistic perspective, the simultaneous impact of a national school meals programme on micronutrient status, alongside outcomes in nutrition, education and agriculture domains. The 3-year study involves a cluster-randomised controlled trial designed around the scale-up of the national school feeding programme, including 116 primary schools in 58 districts in Ghana. The randomly assigned interventions are: 1) a school feeding programme group, including schools and communities where the standard government programme is implemented; 2) 'home-grown' school feeding, including schools and communities where the standard programme is implemented alongside an innovative pilot project aimed at enhancing nutrition and agriculture; and 3) a control group, including schools and households from communities where the intervention will be delayed by at least 3 years, preferably without informing schools and households. Primary outcomes include child health and nutritional status, school participation and learning, and smallholder farmer income. Intermediate outcomes along the agriculture and nutrition pathways will also be measured. The evaluation will follow a mixed-method approach, including child-, household-, school- and community-level surveys as well as focus group discussions with project stakeholders. The baseline survey was completed in August 2013 and the endline survey is planned for November 2015. The tests of balance show significant differences in the means of a number of outcome and control variables across the intervention groups. Important differences across groups include marketed surplus and livestock income.

  20. Effect of Enamel Caries Lesion Baseline Severity on Fluoride Dose-Response

    Directory of Open Access Journals (Sweden)

    Frank Lippert

    2017-01-01

    This study aimed to investigate the effect of enamel caries lesion baseline severity on fluoride dose-response under pH cycling conditions. Early caries lesions were created in human enamel specimens at four different severities (8, 16, 24, and 36 h). Lesions were allocated to treatment groups (0, 83, and 367 ppm fluoride as sodium fluoride) based on Vickers surface microhardness (VHN) and pH cycled for 5 d. The cycling model comprised 3 × 1 min fluoride treatments sandwiched between 2 × 60 min demineralization challenges, with specimens stored in artificial saliva in between. VHN was measured again and changes versus lesion baseline were calculated (ΔVHN). Data were analyzed using two-way ANOVA (p < 0.05). Increased demineralization times led to increased surface softening. The lesion severity × fluoride concentration interaction was significant (p < 0.001). Fluoride dose-response was observed in all groups. Lesions initially demineralized for 16 and 8 h showed similar overall rehardening (ΔVHN), more than the 24 and 36 h lesions, which were similar to each other. The 8 h lesions showed the greatest fluoride response differential (367 versus 0 ppm F), which diminished with increasing lesion baseline severity. The extent of rehardening as a result of the 0 ppm F treatment increased with increasing lesion baseline severity, whereas it decreased for the fluoride treatments. In conclusion, lesion baseline severity impacts the extent of the fluoride dose-response.

  1. RELAP5-3D Code Includes ATHENA Features and Models

    International Nuclear Information System (INIS)

    Riemke, Richard A.; Davis, Cliff B.; Schultz, Richard R.

    2006-01-01

    Version 2.3 of the RELAP5-3D computer program includes all features and models previously available only in the ATHENA version of the code. These include the addition of new working fluids (i.e., ammonia, blood, carbon dioxide, glycerol, helium, hydrogen, lead-bismuth, lithium, lithium-lead, nitrogen, potassium, sodium, and sodium-potassium) and a magnetohydrodynamic model that expands the capability of the code to model many more thermal-hydraulic systems. In addition to the new working fluids along with the standard working fluid water, one or more noncondensable gases (e.g., air, argon, carbon dioxide, carbon monoxide, helium, hydrogen, krypton, nitrogen, oxygen, SF6, xenon) can be specified as part of the vapor/gas phase of the working fluid. These noncondensable gases were in previous versions of RELAP5-3D. Recently four molten salts have been added as working fluids to RELAP5-3D Version 2.4, which has had limited release. These molten salts will be in RELAP5-3D Version 2.5, which will have a general release like RELAP5-3D Version 2.3. Applications that use these new features and models are discussed in this paper. (authors)

  2. Baseline Design and Performance Analysis of Laser Altimeter for Korean Lunar Orbiter

    Directory of Open Access Journals (Sweden)

    Hyung-Chul Lim

    2016-09-01

    Korea's lunar exploration project includes the launching of an orbiter, a lander (including a rover), and an experimental orbiter (referred to as a lunar pathfinder). Laser altimeters have played an important scientific role in lunar, planetary, and asteroid exploration missions since their first use in 1971 onboard the Apollo 15 mission to the Moon. In this study, a laser altimeter was proposed as a scientific instrument for the Korean lunar orbiter, which will be launched by 2020, to study the global topography of the surface of the Moon and its gravitational field and to support other payloads such as a terrain mapping camera or spectral imager. This study presents the baseline design and performance model for the proposed laser altimeter. Additionally, the study discusses the expected performance based on numerical simulation results. The simulation results indicate that the design of system parameters satisfies performance requirements with respect to detection probability and range error even under unfavorable conditions.
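The detection-probability and range-error requirements above rest on the basic time-of-flight relations of laser altimetry. A minimal sketch of those relations, assuming a round-trip measurement; this is not the study's performance model, and all numbers are illustrative:

```python
# Illustrative time-of-flight altimetry relations (not the study's model).
C = 299_792_458.0  # speed of light, m/s

def range_from_tof(t_round_trip_s: float) -> float:
    """Range from round-trip pulse travel time: R = c * t / 2."""
    return C * t_round_trip_s / 2.0

def range_error_from_jitter(sigma_t_s: float) -> float:
    """1-sigma range error contributed by receiver timing jitter: c * sigma_t / 2."""
    return C * sigma_t_s / 2.0

# 1 ns of timing jitter maps to roughly 15 cm of range error.
err_m = range_error_from_jitter(1e-9)
```

In practice the range error budget also includes pulse-width, pointing, and orbit-determination terms; the timing-jitter term shown here is only one contributor.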

  3. Descriptive Analysis of a Baseline Concussion Battery Among U.S. Service Academy Members: Results from the Concussion Assessment, Research, and Education (CARE) Consortium.

    Science.gov (United States)

    O'Connor, Kathryn L; Dain Allred, C; Cameron, Kenneth L; Campbell, Darren E; D'Lauro, Christopher J; Houston, Megan N; Johnson, Brian R; Kelly, Tim F; McGinty, Gerald; O'Donnell, Patrick G; Peck, Karen Y; Svoboda, Steven J; Pasquina, Paul; McAllister, Thomas; McCrea, Michael; Broglio, Steven P

    2018-03-28

    The prevalence and possible long-term consequences of concussion remain an increasing concern to the U.S. military, particularly as it pertains to maintaining a medically ready force. Baseline testing is being used both in the civilian and military domains to assess concussion injury and recovery. Accurate interpretation of these baseline assessments requires one to consider other influencing factors not related to concussion. To date, there is limited understanding, especially within the military, of what factors influence normative test performance. Given the significant physical and mental demands placed on service academy members (SAMs), and their relatively high risk for concussion, it is important to describe the demographics and normative profile of SAMs. Furthermore, the absence of available baseline normative data on female and non-varsity SAMs makes interpretation of post-injury assessments challenging. Understanding how individuals perform at baseline, given their unique individual characteristics (e.g., concussion history, sex, competition level), will inform post-concussion assessment and management. Thus, the primary aim of this manuscript is to characterize the SAM population and determine normative values on a concussion baseline testing battery. All data were collected as part of the Concussion Assessment, Research and Education (CARE) Consortium. The baseline test battery included a post-concussion symptom checklist (Sport Concussion Assessment Tool, SCAT), a psychological health screening inventory (Brief Symptom Inventory, BSI-18), a neurocognitive evaluation (ImPACT), the Balance Error Scoring System (BESS), and the Standardized Assessment of Concussion (SAC). Linear regression models were used to examine differences across sexes, competition levels, and varsity contact levels while controlling for academy, freshman status, race, and previous concussion. Zero-inflated negative binomial models were used to estimate symptom scores due to the high frequency of zero scores.
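A zero-inflated negative binomial model, as used here for the symptom scores, mixes a point mass at zero (structural zeros) with an ordinary negative binomial count distribution. A minimal sketch of the pmf and log-likelihood under a standard NB2 parameterization; this is a generic illustration, not the consortium's fitted model:

```python
import math

def nb_pmf(k: int, mu: float, r: float) -> float:
    """Negative binomial pmf (NB2 parameterization): mean mu, size r,
    variance mu + mu**2 / r.  Computed in log space to avoid overflow."""
    log_coef = math.lgamma(k + r) - math.lgamma(r) - math.lgamma(k + 1)
    return math.exp(log_coef + r * math.log(r / (r + mu)) + k * math.log(mu / (r + mu)))

def zinb_pmf(k: int, pi: float, mu: float, r: float) -> float:
    """Zero-inflated NB: with probability pi the count is a structural zero,
    otherwise it is drawn from the NB distribution."""
    base = (1.0 - pi) * nb_pmf(k, mu, r)
    return pi + base if k == 0 else base

def zinb_loglik(counts, pi, mu, r) -> float:
    """Log-likelihood of observed counts under the ZINB model."""
    return sum(math.log(zinb_pmf(k, pi, mu, r)) for k in counts)

# The inflation term raises P(0) well above the plain NB value.
p0_plain = nb_pmf(0, 2.0, 1.5)
p0_zinb = zinb_pmf(0, 0.3, 2.0, 1.5)
```

In applied work such models are usually fitted with a library routine (e.g. statsmodels' zero-inflated count models) rather than a hand-rolled likelihood; the sketch above only shows what the likelihood being maximized looks like.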

  4. Long-term changes in lower tropospheric baseline ozone concentrations: Comparing chemistry-climate models and observations at northern midlatitudes

    Science.gov (United States)

    Parrish, D. D.; Lamarque, J.-F.; Naik, V.; Horowitz, L.; Shindell, D. T.; Staehelin, J.; Derwent, R.; Cooper, O. R.; Tanimoto, H.; Volz-Thomas, A.; Gilge, S.; Scheel, H.-E.; Steinbacher, M.; Fröhlich, M.

    2014-05-01

    Two recent papers have quantified long-term ozone (O3) changes observed at northern midlatitude sites that are believed to represent baseline (here understood as representative of continental to hemispheric scales) conditions. Three chemistry-climate models (NCAR CAM-chem, GFDL-CM3, and GISS-E2-R) have calculated retrospective tropospheric O3 concentrations as part of the Atmospheric Chemistry and Climate Model Intercomparison Project and Coupled Model Intercomparison Project Phase 5 model intercomparisons. We present an approach for quantitative comparisons of model results with measurements for seasonally averaged O3 concentrations. There is considerable qualitative agreement between the measurements and the models, but there are also substantial and consistent quantitative disagreements. Most notably, models (1) overestimate absolute O3 mixing ratios, on average by 5 to 17 ppbv in the year 2000, (2) capture only about 50% of the O3 changes observed over the past five to six decades, and little of the observed seasonal differences, and (3) capture only 25 to 45% of the observed rate of long-term change. These disagreements are significant enough to indicate that only limited confidence can be placed on estimates of present-day radiative forcing of tropospheric O3 derived from modeled historic concentration changes and on predicted future O3 concentrations. Evidently our understanding of tropospheric O3, or the incorporation of chemistry and transport processes into current chemistry-climate models, is incomplete. Modeled O3 trends approximately parallel estimated trends in anthropogenic emissions of NOx, an important O3 precursor, while measured O3 changes increase more rapidly than these emission estimates.

  5. The optimized baseline project: Reinventing environmental restoration at Hanford

    International Nuclear Information System (INIS)

    Goodenough, J.D.; Janaskie, M.T.; Kleinen, P.J.

    1994-01-01

    The U.S. Department of Energy Richland Operations Office (DOE-RL) is using a strategic planning effort (termed the Optimized Baseline Project) to develop a new approach to the Hanford Environmental Restoration program. This effort seeks to achieve a quantum-leap improvement in performance through results-oriented prioritization of activities. This effort was conducted in parallel with the renegotiation of the Tri-Party Agreement and provided DOE with an opportunity to propose innovative initiatives to promote cost effectiveness, accelerate progress in the Hanford Environmental Restoration Program, and involve stakeholders in the decision-making process. The Optimized Baseline Project is an innovative approach to program planning and decision-making in several respects. First, the process is a top-down, value-driven effort that responds to values held by DOE, the regulatory community, and the public. Second, planning is conducted in a way that reinforces the technical management process at Richland, involves the regulatory community in substantive decisions, and includes the public. Third, the Optimized Baseline Project is being conducted as part of a sitewide Hanford initiative to reinvent Government. The planning process used for the Optimized Baseline Project has many potential applications at other sites and in other programs where there is a need to build consensus among diverse, independent groups of stakeholders and decision makers. The project has successfully developed and demonstrated an innovative approach to program planning that accelerates the pace of cleanup, involves the regulators as partners with DOE in priority setting, and builds public understanding and support for the program through meaningful opportunities for involvement.

  6. Long Baseline Observatory (LBO)

    Data.gov (United States)

    Federal Laboratory Consortium — The Long Baseline Observatory (LBO) comprises ten radio telescopes spanning 5,351 miles. It's the world's largest, sharpest, dedicated telescope array. With an eye...

  7. Over Target Baseline: Lessons Learned from the NASA SLS Booster Element

    Science.gov (United States)

    Carroll, Truman J.

    2016-01-01

    Goal of the presentation is to teach, and then model, the steps necessary to implement an Over Target Baseline (OTB). More than a policy and procedure session, participants will learn from recent first hand experience the challenges and benefits that come from successfully executing an OTB.

  8. Improving weather predictability by including land-surface model parameter uncertainty

    Science.gov (United States)

    Orth, Rene; Dutra, Emanuel; Pappenberger, Florian

    2016-04-01

    The land surface forms an important component of Earth system models and interacts nonlinearly with other parts such as the ocean and atmosphere. To capture the complex and heterogeneous hydrology of the land surface, land surface models include a large number of parameters impacting the coupling to other components of the Earth system model. Focusing on ECMWF's land-surface model HTESSEL, we present in this study a comprehensive parameter sensitivity evaluation using multiple observational datasets in Europe. We select 6 poorly constrained effective parameters (surface runoff effective depth, skin conductivity, minimum stomatal resistance, maximum interception, soil moisture stress function shape, total soil depth) and explore the sensitivity of model outputs such as soil moisture, evapotranspiration and runoff to these parameters using uncoupled simulations and coupled seasonal forecasts. Additionally we investigate the possibility to construct ensembles from the multiple land surface parameters. In the uncoupled runs we find that minimum stomatal resistance and total soil depth have the most influence on model performance. Forecast skill scores are moreover sensitive to the same parameters as HTESSEL performance in the uncoupled analysis. We demonstrate the robustness of our findings by comparing multiple best-performing parameter sets and multiple randomly chosen parameter sets. We find better temperature and precipitation forecast skill with the best-performing parameter perturbations, demonstrating representativeness of model performance across uncoupled (and hence less computationally demanding) and coupled settings. Finally, we construct ensemble forecasts from ensemble members derived with different best-performing parameterizations of HTESSEL. This incorporation of parameter uncertainty in the ensemble generation yields an increase in forecast skill, even beyond the skill of the default system.
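Building an ensemble from perturbed land-surface parameters can be sketched as drawing each poorly constrained parameter from a plausible range, one draw per ensemble member. The parameter names below come from the abstract, but the numeric bounds are hypothetical placeholders, not values from the study:

```python
import random

# Hypothetical plausible ranges for the six poorly constrained HTESSEL
# parameters named in the abstract; bounds are illustrative only.
PARAM_RANGES = {
    "surface_runoff_effective_depth": (0.1, 2.0),     # m
    "skin_conductivity":              (5.0, 30.0),    # W m-2 K-1
    "min_stomatal_resistance":        (50.0, 500.0),  # s m-1
    "max_interception":               (1e-4, 1e-3),   # m
    "soil_moisture_stress_shape":     (0.5, 2.0),     # dimensionless
    "total_soil_depth":               (1.0, 8.0),     # m
}

def sample_parameter_set(rng: random.Random) -> dict:
    """Draw one perturbed parameter set, uniformly within each range."""
    return {name: rng.uniform(lo, hi) for name, (lo, hi) in PARAM_RANGES.items()}

def build_ensemble(n_members: int, seed: int = 0) -> list:
    """An n-member ensemble: one perturbed parameter set per member."""
    rng = random.Random(seed)
    return [sample_parameter_set(rng) for _ in range(n_members)]
```

The study itself selects best-performing parameter sets against observations rather than sampling uniformly; the sketch only illustrates the mechanics of parameter-perturbation ensemble generation.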

  9. GRI baseline projection of U.S. energy supply and demand to 2010. 1992 edition

    International Nuclear Information System (INIS)

    Holtberg, P.D.; Woods, T.J.; Lihn, M.L.; Koklauner, A.B.

    1992-04-01

    The annual GRI baseline projection is the result of a complex modeling effort that seeks to achieve an internally consistent energy supply and demand outlook across all energy sources and end-use demand sectors. This year's projection includes the adoption of a new petroleum refinery methodology, the incorporation of a new approach to determining electric utility generating capacity heat rates, the extensive update of both the residential and commercial databases and methodologies, and the continued update of the GRI Hydrocarbon Model. The report presents a series of summary tables, sectoral breakdowns of energy demand, and the natural gas supply and price trends. The appendices include a discussion of the methodology and assumptions used to prepare the 1992 edition of the projection, an analysis of the potential for higher levels of gas demand, a description of industrial and commercial cogeneration, a description of the independent power producer projection, a comparison of the 1992 edition of the projection with previous GRI projections, and a discussion of additional data used in developing the projection.

  10. Collisional-radiative model including recombination processes for W27+ ion★

    Science.gov (United States)

    Murakami, Izumi; Sasaki, Akira; Kato, Daiji; Koike, Fumihiro

    2017-10-01

    We have constructed a collisional-radiative (CR) model for W27+ ions including 226 configurations with n ≤ 9 and l ≤ 5 for spectroscopic diagnostics. We newly include recombination processes in the model, and this is the first result of an extreme ultraviolet spectrum calculated for the recombining plasma component. Calculated spectra in the 40-70 Å range for the ionizing and recombining plasma components show three similar strong lines and one line at 45-50 Å that is weak in the recombining plasma component, plus many weak lines at 50-65 Å in both components. Recombination processes do not contribute much to the spectrum at around 60 Å for the W27+ ion. Dielectronic satellite lines are also a minor contribution to the spectrum of the recombining plasma component. The dielectronic recombination (DR) rate coefficient from W28+ to W27+ ions is also calculated with the same atomic data as the CR model. We found that a larger set of energy levels including many autoionizing states gave larger DR rate coefficients, but our rates agree within a factor of 6 with other works at electron temperatures around 1 keV, at which W27+ and W28+ ions are usually observed in plasmas. Contribution to the Topical Issue "Atomic and Molecular Data and their Applications", edited by Gordon W.F. Drake, Jung-Sik Yoon, Daiji Kato, and Grzegorz Karwasz.
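A collisional-radiative model balances collisional and radiative rates among the levels of the ion. The 226-configuration model above solves a large coupled system, but the underlying balance can be sketched with a two-level steady state; the rate coefficients below are arbitrary illustrative numbers, not tungsten data:

```python
def excited_fraction_ratio(ne: float, c12: float, c21: float, a21: float) -> float:
    """Steady-state population ratio n2/n1 for a two-level ion:
    collisional excitation (ne * C12) balanced against radiative decay (A21)
    plus collisional de-excitation (ne * C21)."""
    return ne * c12 / (a21 + ne * c21)

# Coronal limit (low density): radiative decay dominates, ratio ~ ne*C12/A21.
low_density = excited_fraction_ratio(1e10, 1e-9, 1e-10, 1e8)
# High density: collisions dominate, ratio -> C12/C21 (thermal-like balance).
high_density = excited_fraction_ratio(1e20, 1e-9, 1e-10, 1e8)
```

Recombination enters a full CR model as additional source terms feeding levels of the recombined ion (radiative and dielectronic capture), which is what produces the recombining-plasma spectral component discussed in the abstract.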

  11. A new mouse model of mild ornithine transcarbamylase deficiency (spf-j) displays cerebral amino acid perturbations at baseline and upon systemic immune activation.

    Directory of Open Access Journals (Sweden)

    Tatyana N Tarasenko

    Ornithine transcarbamylase deficiency (OTCD, OMIM #311250) is an inherited X-linked urea cycle disorder that is characterized by hyperammonemia and orotic aciduria. In this report, we describe a new animal model of OTCD caused by a spontaneous mutation in the mouse Otc gene (c.240T>A, p.K80N). This transversion in exon 3 of ornithine transcarbamylase leads to normal levels of mRNA with low levels of mature protein and is homologous to a mutation that has also been described in a single patient affected with late-onset OTCD. With higher residual enzyme activity, spf-J were found to have normal plasma ammonia and orotate. Baseline plasma amino acid profiles were consistent with mild OTCD: elevated glutamine, and lower citrulline and arginine. In contrast to WT, spf-J displayed baseline elevations in cerebral amino acids with depletion following immune challenge with polyinosinic:polycytidylic acid. Our results indicate that the mild spf-J mutation constitutes a new mouse model that is suitable for mechanistic studies of mild OTCD and the exploration of cerebral pathophysiology during acute decompensation that characterizes proximal urea cycle dysfunction in humans.

  12. Baseline energy forecasts and analysis of alternative strategies for airline fuel conservation

    Energy Technology Data Exchange (ETDEWEB)

    1976-07-01

    To evaluate the impact of fuel conservation strategies, baseline forecasts of airline activity and energy consumption to 1990 were developed. Alternative policy options to reduce fuel consumption were identified and analyzed for three baseline levels of aviation activity within the framework of an aviation activity/energy consumption model. By combining the identified policy options, a strategy was developed to provide incentives for airline fuel conservation. Strategies and policy options were evaluated in terms of their impact on airline fuel conservation and the functioning of the airline industry as well as the associated social, environmental, and economic costs. (GRA)

  13. Contribution of BeiDou satellite system for long baseline GNSS measurement in Indonesia

    Science.gov (United States)

    Gumilar, I.; Bramanto, B.; Kuntjoro, W.; Abidin, H. Z.; Trihantoro, N. F.

    2018-05-01

    The demand for more precise positioning methods using GNSS (Global Navigation Satellite System) in Indonesia continues to rise. The accuracy of GNSS positioning depends on the length of the baseline and the distribution of observed satellites. The BeiDou Navigation Satellite System (BDS) is a positioning system operated by China in the Asia-Pacific region, including Indonesia. This research aims to find out the contribution of BDS in increasing the accuracy of long baseline static positioning in Indonesia. The contributions are assessed by comparing the accuracy of measurement using only GPS (Global Positioning System) with measurement using the combination of GPS and BDS. The data used are 5 days of GPS and BDS measurement data for a baseline 120 km in length. The software used is the open-source RTKLIB and the commercial software Compass Solution. This research explains in detail the contribution of BDS to the positional accuracy of long baseline static GNSS measurement.
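Two quantities behind such a comparison are easy to state: the baseline length itself (the distance between the two stations' Earth-centred, Earth-fixed coordinates) and the repeatability of the daily coordinate solutions, a common yardstick for precision. A minimal sketch of both, independent of any particular processing software:

```python
import math

def ecef_distance(p1, p2) -> float:
    """Baseline length: chord distance between two ECEF positions (metres)."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p1, p2)))

def repeatability(solutions) -> float:
    """RMS scatter of repeated daily (x, y, z) coordinate solutions about
    their mean; smaller scatter indicates a more precise solution."""
    n = len(solutions)
    mean = [sum(s[i] for s in solutions) / n for i in range(3)]
    return math.sqrt(
        sum(sum((s[i] - mean[i]) ** 2 for i in range(3)) for s in solutions) / n
    )
```

Comparing the repeatability of GPS-only solutions against GPS+BDS solutions over the same 5 days would express the kind of accuracy contribution the study assesses; the helpers here are generic, not RTKLIB's internals.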

  14. Can baseline ultrasound results help to predict failure to achieve DAS28 remission after 1 year of tight control treatment in early RA patients?

    Science.gov (United States)

    Ten Cate, D F; Jacobs, J W G; Swen, W A A; Hazes, J M W; de Jager, M H; Basoski, N M; Haagsma, C J; Luime, J J; Gerards, A H

    2018-01-30

    At present, there are no prognostic parameters unequivocally predicting treatment failure in early rheumatoid arthritis (RA) patients. We investigated whether baseline ultrasonography (US) findings of joints, when added to baseline clinical, laboratory, and radiographical data, could improve prediction of failure to achieve Disease Activity Score assessing 28 joints (DAS28) remission (DAS28 < 2.6) at 12 months; US was performed at baseline. Clinical, laboratory, and radiographical parameters were recorded. Primary analysis was the prediction by logistic regression of the absence of DAS28 remission 12 months after diagnosis and start of therapy. Of 194 patients included, 174 were used for the analysis, with complete data available for 159. In a multivariate model with baseline DAS28 (odds ratio (OR) 1.6, 95% confidence interval (CI) 1.2-2.2), the presence of rheumatoid factor (OR 2.3, 95% CI 1.1-5.1), and type of monitoring strategy (OR 0.2, 95% CI 0.05-0.85), the addition of baseline US results for joints (OR 0.96, 95% CI 0.89-1.04) did not significantly improve the prediction of failure to achieve DAS28 remission (likelihood ratio test, 1.04; p = 0.31). In an early RA population, adding baseline ultrasonography of the hands, wrists, and feet to commonly available baseline characteristics did not improve prediction of failure to achieve DAS28 remission at 12 months. ClinicalTrials.gov, NCT01752309. Registered on 19 December 2012.
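The odds ratios and the likelihood-ratio test reported above relate to logistic-regression output in a standard way: an OR is the exponential of a fitted coefficient, its Wald interval is the exponential of the coefficient interval, and nested models are compared via twice the log-likelihood difference. A sketch of these generic formulas (the standard error used in the example is assumed, not from the study):

```python
import math

def odds_ratio(beta: float) -> float:
    """A logistic-regression coefficient beta corresponds to OR = exp(beta)."""
    return math.exp(beta)

def wald_ci_or(beta: float, se: float, z: float = 1.96) -> tuple:
    """95% Wald confidence interval for the odds ratio."""
    return math.exp(beta - z * se), math.exp(beta + z * se)

def lr_test_stat(llf_reduced: float, llf_full: float) -> float:
    """Likelihood-ratio statistic for nested models, 2 * (llf_full - llf_reduced);
    referred to a chi-squared distribution with df = extra parameters."""
    return 2.0 * (llf_full - llf_reduced)

# E.g. the reported rheumatoid-factor effect, OR 2.3 (SE here is assumed):
beta_rf = math.log(2.3)
lo_rf, hi_rf = wald_ci_or(beta_rf, 0.4)
```

A CI that includes 1.0 (as for the US result, 0.89-1.04) indicates the added predictor does not significantly shift the odds, consistent with the non-significant likelihood-ratio test.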

  15. U Plant Aggregate Area Management study technical baseline report

    International Nuclear Information System (INIS)

    DeFord, D.H.; Carpenter, R.W.

    1995-05-01

    This document was prepared in support of an Aggregate Area Management Study of U Plant. It provides a technical baseline of the aggregate area and results from an environmental investigation that was undertaken by the Technical Baseline Section of the Environmental Engineering Group, Westinghouse Hanford Company (WHC), which is currently the Waste Site and Facility Research Office, Natural Resources, Bechtel Hanford, Inc. (BHI). It is based upon review and evaluation of numerous Hanford Site current and historical reports, drawings and photographs, supplemented with site inspections and employee interviews. U Plant refers to the 221-U Process Canyon Building, a chemical separation facility constructed during World War II. It also includes the Uranium Oxide (UO3) Plant constructed at the same time as 221-U as an adjunct to the original plutonium separation process but which, like 221-U, was converted for other missions. Waste sites are associated primarily with U Plant's 1952 through 1958 Uranium Metal Recovery Program mission and the UO3 Plant's ongoing UO3 mission. Waste sites include cribs, reverse wells, french drains, septic tanks and drain fields, trenches, catch tanks, settling tanks, diversion boxes, a waste vault, and the lines and encasements that connect them. It also includes the U Pond and its feed ditches and an underground tank farm designed for high-level liquid wastes

  16. Assessment of early treatment response to neoadjuvant chemotherapy in breast cancer using non-mono-exponential diffusion models: a feasibility study comparing the baseline and mid-treatment MRI examinations

    Energy Technology Data Exchange (ETDEWEB)

    Bedair, Reem; Manavaki, Roido; Gill, Andrew B.; Abeyakoon, Oshaani; Gilbert, Fiona J. [University of Cambridge, Department of Radiology, School of Clinical Medicine, Cambridge (United Kingdom); Priest, Andrew N.; Patterson, Andrew J. [Cambridge University Hospitals NHS Foundation Trust, Department of Radiology, Addenbrookes Hospital, Cambridge (United Kingdom); McLean, Mary A. [Cambridge University Hospitals NHS Foundation Trust, Department of Radiology, Addenbrookes Hospital, Cambridge (United Kingdom); University of Cambridge, Li Ka Shing Centre, Cancer Research UK Cambridge Institute, Cambridge (United Kingdom); Graves, Martin J. [University of Cambridge, Department of Radiology, School of Clinical Medicine, Cambridge (United Kingdom); Cambridge University Hospitals NHS Foundation Trust, Department of Radiology, Addenbrookes Hospital, Cambridge (United Kingdom); Griffiths, John R. [University of Cambridge, Li Ka Shing Centre, Cancer Research UK Cambridge Institute, Cambridge (United Kingdom)

    2017-07-15

    To assess the feasibility of the mono-exponential, bi-exponential and stretched-exponential models in evaluating response of breast tumours to neoadjuvant chemotherapy (NACT) at 3 T. Thirty-six female patients (median age 53, range 32-75 years) with invasive breast cancer undergoing NACT were enrolled for diffusion-weighted MRI (DW-MRI) prior to the start of treatment. For assessment of early response, changes in parameters were evaluated on mid-treatment MRI in 22 patients. DW-MRI was performed using eight b values (0, 30, 60, 90, 120, 300, 600, 900 s/mm²). Apparent diffusion coefficient (ADC), tissue diffusion coefficient (Dt), vascular fraction (Florin), distributed diffusion coefficient (DDC) and alpha (α) parameters were derived. Then t tests compared the baseline and changes in parameters between response groups. Repeatability was assessed at inter- and intraobserver levels. All patients underwent baseline MRI whereas 22 lesions were available at mid-treatment. At pretreatment, mean diffusion coefficients demonstrated significant differences between groups (p < 0.05). At mid-treatment, percentage increase in ADC and DDC showed significant differences between responders (49 % and 43 %) and non-responders (21 % and 32 %) (p = 0.03, p = 0.04). Overall, stretched-exponential parameters showed excellent repeatability. DW-MRI is sensitive to baseline and early treatment changes in breast cancer using non-mono-exponential models, and the stretched-exponential model can potentially monitor such changes. (orig.)
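The three diffusion models named in the abstract have simple closed-form signal equations in b (the diffusion weighting). A sketch of the standard forms, assuming the usual IVIM-style bi-exponential with a pseudo-diffusion coefficient (here called d_star, an assumed symbol name); parameter fitting in the study would have used nonlinear least squares, which is omitted here:

```python
import math

def mono_exp(b: float, s0: float, adc: float) -> float:
    """Mono-exponential: S(b) = S0 * exp(-b * ADC)."""
    return s0 * math.exp(-b * adc)

def bi_exp(b: float, s0: float, f: float, d_star: float, d_t: float) -> float:
    """Bi-exponential (IVIM-type): vascular fraction f decaying with d_star,
    tissue fraction (1 - f) decaying with the tissue coefficient d_t."""
    return s0 * (f * math.exp(-b * d_star) + (1.0 - f) * math.exp(-b * d_t))

def stretched_exp(b: float, s0: float, ddc: float, alpha: float) -> float:
    """Stretched-exponential: S(b) = S0 * exp(-(b * DDC)**alpha)."""
    return s0 * math.exp(-((b * ddc) ** alpha))

def adc_two_point(s1: float, s2: float, b1: float, b2: float) -> float:
    """ADC recovered from signals at two b-values: ln(S1/S2) / (b2 - b1)."""
    return math.log(s1 / s2) / (b2 - b1)
```

With alpha = 1 the stretched-exponential reduces to the mono-exponential, which is why alpha is read as a measure of intravoxel diffusion heterogeneity.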

  17. Assessment of early treatment response to neoadjuvant chemotherapy in breast cancer using non-mono-exponential diffusion models: a feasibility study comparing the baseline and mid-treatment MRI examinations

    International Nuclear Information System (INIS)

    Bedair, Reem; Manavaki, Roido; Gill, Andrew B.; Abeyakoon, Oshaani; Gilbert, Fiona J.; Priest, Andrew N.; Patterson, Andrew J.; McLean, Mary A.; Graves, Martin J.; Griffiths, John R.

    2017-01-01

    To assess the feasibility of the mono-exponential, bi-exponential and stretched-exponential models in evaluating response of breast tumours to neoadjuvant chemotherapy (NACT) at 3 T. Thirty-six female patients (median age 53, range 32-75 years) with invasive breast cancer undergoing NACT were enrolled for diffusion-weighted MRI (DW-MRI) prior to the start of treatment. For assessment of early response, changes in parameters were evaluated on mid-treatment MRI in 22 patients. DW-MRI was performed using eight b values (0, 30, 60, 90, 120, 300, 600, 900 s/mm²). Apparent diffusion coefficient (ADC), tissue diffusion coefficient (D_t), vascular fraction (Florin), distributed diffusion coefficient (DDC) and alpha (α) parameters were derived. Then t tests compared the baseline and changes in parameters between response groups. Repeatability was assessed at inter- and intraobserver levels. All patients underwent baseline MRI whereas 22 lesions were available at mid-treatment. At pretreatment, mean diffusion coefficients demonstrated significant differences between groups (p < 0.05). At mid-treatment, percentage increase in ADC and DDC showed significant differences between responders (49 % and 43 %) and non-responders (21 % and 32 %) (p = 0.03, p = 0.04). Overall, stretched-exponential parameters showed excellent repeatability. DW-MRI is sensitive to baseline and early treatment changes in breast cancer using non-mono-exponential models, and the stretched-exponential model can potentially monitor such changes. (orig.)

  18. 75 FR 66748 - Notice of Baseline Filings

    Science.gov (United States)

    2010-10-29

    ...- 000] Notice of Baseline Filings October 22, 2010. ONEOK Gas Transportation, L.L.C Docket No. PR11-68... above submitted a revised baseline filing of their Statement of Operating Conditions for services...

  19. Identifying misbehaving models using baseline climate variance

    Science.gov (United States)

    Schultz, Colin

    2011-06-01

The majority of projections made using general circulation models (GCMs) are conducted to help tease out the effects on a region, or on the climate system as a whole, of changing climate dynamics. Sun et al., however, used model runs from 20 different coupled atmosphere-ocean GCMs to try to understand a different aspect of climate projections: how bias correction, model selection, and other statistical techniques might affect the estimated outcomes. As a case study, the authors focused on predicting the potential change in precipitation for the Murray-Darling Basin (MDB), a 1-million-square-kilometer area in southeastern Australia that suffered a recent decade of drought that left many wondering about the potential impacts of climate change on this important agricultural region. The authors first compared the precipitation predictions made by the models with 107 years of observations, and they then made bias corrections to adjust the model projections to have the same statistical properties as the observations. They found that while the spread of the projected values was reduced, the average precipitation projection for the end of the 21st century barely changed. Further, the authors determined that interannual variations in precipitation for the MDB could be explained by random chance, where the precipitation in a given year was independent of that in previous years.
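The abstract does not specify which bias-correction method Sun et al. used; one of the simplest ways to give model output "the same statistical properties as the observations" is mean-variance scaling, sketched below with invented precipitation numbers.

```python
import numpy as np

def bias_correct(model, obs):
    """Mean-variance scaling: rescale model output so its mean and standard
    deviation match the observations. This is one simple bias-correction
    choice; quantile mapping is a common, more flexible alternative."""
    return obs.mean() + (model - model.mean()) * (obs.std() / model.std())

rng = np.random.default_rng(0)
obs = rng.normal(500.0, 90.0, 107)     # 107 years of observed annual precipitation (mm, invented)
model = rng.normal(430.0, 140.0, 107)  # hypothetical biased GCM output for the same period

corrected = bias_correct(model, obs)
```

After correction the model series has the observed mean and spread by construction, which is exactly why (as the authors found) the ensemble spread narrows while the central projection can remain nearly unchanged.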

  20. 2016 Annual Technology Baseline (ATB) - Webinar Presentation

    Energy Technology Data Exchange (ETDEWEB)

    Cole, Wesley; Kurup, Parthiv; Hand, Maureen; Feldman, David; Sigrin, Benjamin; Lantz, Eric; Stehly, Tyler; Augustine, Chad; Turchi, Craig; Porro, Gian; O' Connor, Patrick; Waldoch, Connor

    2016-09-13

    This deck was presented for the 2016 Annual Technology Baseline Webinar. The presentation describes the Annual Technology Baseline, which is a compilation of current and future cost and performance data for electricity generation technologies.

  1. Utilities and offsites design baseline. Outside Battery Limits Facility 6000 tpd SRC-I Demonstration Plant. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    None

    1984-05-25

    As part of the overall Solvent Refined Coal (SRC-1) project baseline being prepared by International Coal Refining Company (ICRC), the RUST Engineering Company is providing necessary input for the Outside Battery Limits (OSBL) Facilities. The project baseline is comprised of: design baseline - technical definition of work; schedule baseline - detailed and management level 1 schedules; and cost baseline - estimates and cost/manpower plan. The design baseline (technical definition) for the OSBL Facilities has been completed and is presented in Volumes I, II, III, IV, V and VI. The OSBL technical definition is based on, and compatible with, the ICRC defined statement of work, design basis memorandum, master project procedures, process and mechanical design criteria, and baseline guidance documents. The design basis memorandum is included in Paragraph 1.3 of Volume I. The baseline design data is presented in 6 volumes. Volume I contains the introduction section and utility systems data through steam and feedwater. Volume II continues with utility systems data through fuel system, and contains the interconnecting systems and utility system integration information. Volume III contains the offsites data through water and waste treatment. Volume IV continues with offsites data, including site development and buildings, and contains raw materials and product handling and storage information. Volume V contains wastewater treatment and solid wastes landfill systems developed by Catalytic, Inc. to supplement the information contained in Volume III. Volume VI contains proprietary information of Resources Conservation Company related to the evaporator/crystallizer system of the wastewater treatment area.

  2. Associations between baseline allergens and polysensitization

    DEFF Research Database (Denmark)

    Carlsen, B.C.; Menne, T.; Johansen, J.D.

    2008-01-01

Background: Identification of patients at risk of developing polysensitization is not possible at present. An association between weak sensitizers and polysensitization has been hypothesized. Objectives: To examine associations of 21 allergens in the European baseline series to polysensitization. Patients/Methods: From a database-based study with 14 998 patients patch tested with the European baseline series between 1985 and 2005, a group of 759 (5.1%) patients were polysensitized. Odds ratios were calculated to determine the relative contribution of each allergen to polysensitization. Results: No common denominator for the association between the allergens and the polysensitization was apparent, and any association, whether positive or negative, was relatively low. Based on these results, sensitization to specific baseline allergens cannot be used as risk indicators for polysensitization.
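The per-allergen odds ratios in this study come from standard 2×2 contingency tables. A minimal sketch of the calculation, with counts that are entirely hypothetical (only the totals of 759 polysensitized and 14 998 tested patients come from the abstract):

```python
def odds_ratio(a, b, c, d):
    """Odds ratio from a 2x2 table:
         a = polysensitized AND positive to the allergen
         b = polysensitized AND negative
         c = not polysensitized AND positive
         d = not polysensitized AND negative
    """
    return (a * d) / (b * c)

# Hypothetical counts for one allergen (illustration only)
print(odds_ratio(40, 719, 500, 13739))  # ~1.53
```

An OR near 1 for every allergen is what the authors describe as "no common denominator": no single baseline allergen flags a markedly elevated polysensitization risk.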

  3. Associations between baseline allergens and polysensitization

    DEFF Research Database (Denmark)

    Carlsen, Berit Christina; Menné, Torkil; Johansen, Jeanne Duus

    2008-01-01

BACKGROUND: Identification of patients at risk of developing polysensitization is not possible at present. An association between weak sensitizers and polysensitization has been hypothesized. OBJECTIVES: To examine associations of 21 allergens in the European baseline series to polysensitization. PATIENTS/METHODS: From a database-based study with 14 998 patients patch tested with the European baseline series between 1985 and 2005, a group of 759 (5.1%) patients were polysensitized. Odds ratios were calculated to determine the relative contribution of each allergen to polysensitization. RESULTS: No common denominator for the association between the allergens and the polysensitization was apparent, and any association, whether positive or negative, was relatively low. Based on these results, sensitization to specific baseline allergens cannot be used as risk indicators for polysensitization.

  4. Estimating representative background PM2.5 concentration in heavily polluted areas using baseline separation technique and chemical mass balance model

    Science.gov (United States)

    Gao, Shuang; Yang, Wen; Zhang, Hui; Sun, Yanling; Mao, Jian; Ma, Zhenxing; Cong, Zhiyuan; Zhang, Xian; Tian, Shasha; Azzi, Merched; Chen, Li; Bai, Zhipeng

    2018-02-01

The determination of the background concentration of PM2.5 is important to understand the contribution of local emission sources to total PM2.5 concentration. The purpose of this study was to examine the performance of baseline separation techniques in estimating PM2.5 background concentration. Five separation methods, which included recursive digital filters (Lyne-Hollick, one-parameter algorithm, and Boughton two-parameter algorithm), sliding interval and smoothed minima, were applied to one-year PM2.5 time-series data in two heavily polluted cities, Tianjin and Jinan. To obtain the proper filter parameters and recession constants for the separation techniques, we conducted regression analysis at a background site during the emission reduction period enforced by the Government for the 2014 Asia-Pacific Economic Cooperation (APEC) meeting in Beijing. Background concentrations in Tianjin and Jinan were then estimated by applying the determined filter parameters and recession constants. The chemical mass balance (CMB) model was also applied to ascertain the effectiveness of the new approach. Our results showed that the contribution of background PM concentration to ambient pollution was at a comparable level to the contribution obtained from the previous study. The best performance was achieved using the Boughton two-parameter algorithm. The background concentrations were estimated at (27 ± 2) μg/m3 for the whole year, (34 ± 4) μg/m3 for the heating period (winter), (21 ± 2) μg/m3 for the non-heating period (summer), and (25 ± 2) μg/m3 for the sandstorm period in Tianjin. The corresponding values in Jinan were (30 ± 3) μg/m3, (40 ± 4) μg/m3, (24 ± 5) μg/m3, and (26 ± 2) μg/m3, respectively. The study revealed that these baseline separation techniques are valid for estimating levels of PM2.5 air pollution, and that our proposed method has great potential for estimating the background level of other air pollutants.
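The Lyne-Hollick filter mentioned first among the recursive digital filters is a one-pass hydrological baseflow separator; applied to a pollution time series it splits the signal into a fast "local emission" component and a slow background. A sketch under invented data, with an illustrative filter parameter (the study calibrated its parameters against a background site):

```python
import numpy as np

def lyne_hollick_baseline(y, alpha=0.925):
    """One-pass Lyne-Hollick recursive digital filter, adapted from hydrograph
    baseflow separation to a PM2.5 series. The high-frequency component f is
    filtered out; the remainder is taken as the background (baseline).
      f[k] = alpha*f[k-1] + (1+alpha)/2 * (y[k] - y[k-1]),  constrained f >= 0
    The alpha value here is illustrative, not the study's calibrated value."""
    y = np.asarray(y, dtype=float)
    f = np.zeros_like(y)          # high-frequency ("local emission") component
    baseline = np.zeros_like(y)
    baseline[0] = y[0]
    for k in range(1, len(y)):
        f[k] = alpha * f[k - 1] + (1 + alpha) / 2 * (y[k] - y[k - 1])
        f[k] = max(f[k], 0.0)     # baseline may never exceed the measured signal
        baseline[k] = y[k] - f[k]
    return baseline

# Hypothetical hourly PM2.5 series: smooth background plus emission spikes
t = np.arange(200)
background = 30 + 5 * np.sin(2 * np.pi * t / 200)
spikes = np.where(t % 37 == 0, 60.0, 0.0)
est = lyne_hollick_baseline(background + spikes)
```

In practice the filter is often run in multiple forward/backward passes; the single pass above is the minimal form.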

  5. TWRS technical baseline database manager definition document

    International Nuclear Information System (INIS)

    Acree, C.D.

    1997-01-01

    This document serves as a guide for using the TWRS Technical Baseline Database Management Systems Engineering (SE) support tool in performing SE activities for the Tank Waste Remediation System (TWRS). This document will provide a consistent interpretation of the relationships between the TWRS Technical Baseline Database Management software and the present TWRS SE practices. The Database Manager currently utilized is the RDD-1000 System manufactured by the Ascent Logic Corporation. In other documents, the term RDD-1000 may be used interchangeably with TWRS Technical Baseline Database Manager

  6. Pakistan, Sindh Province - Baseline Indicators System : Baseline Procurement Performance Assessment Report

    OpenAIRE

    World Bank

    2009-01-01

This document provides an assessment of the public procurement system in Sindh province using the baseline indicators system developed by the Development Assistance Committee of the Organization for Economic Cooperation and Development (OECD-DAC). For this assessment, interviews and discussions were held with stakeholders from the public and private sectors as well as civil society. Developing...

  7. The steroids for corneal ulcers trial: study design and baseline characteristics.

    Science.gov (United States)

    Srinivasan, Muthiah; Mascarenhas, Jeena; Rajaraman, Revathi; Ravindran, Meenakshi; Lalitha, Prajna; Glidden, David V; Ray, Kathryn J; Hong, Kevin C; Oldenburg, Catherine E; Lee, Salena M; Zegans, Michael E; McLeod, Stephen D; Lietman, Thomas M; Acharya, Nisha R

    2012-02-01

    To provide comprehensive trial methods and baseline data for the Steroids for Corneal Ulcers Trial and to present epidemiological characteristics such as risk factors, causative organisms, and ulcer severity. Baseline data from a 1:1 randomized, placebo-controlled, double-masked clinical trial comparing prednisolone phosphate, 1%, with placebo as adjunctive therapy for the treatment of bacterial corneal ulcers. Eligible patients had a culture-positive bacterial corneal ulcer and had been taking moxifloxacin for 48 hours. The primary outcome for the trial is best spectacle-corrected visual acuity at 3 months from enrollment. This report provides comprehensive baseline data, including best spectacle-corrected visual acuity, infiltrate size, microbiological results, and patient demographics, for patients enrolled in the trial. Of 500 patients enrolled, 97% were in India. Two hundred twenty patients (44%) were agricultural workers. Median baseline visual acuity was 0.84 logMAR (Snellen, 20/125) (interquartile range, 0.36-1.7; Snellen, 20/50 to counting fingers). Baseline visual acuity was not significantly different between the United States and India. Ulcers in India had larger infiltrate/scar sizes (P = .04) and deeper infiltrates (P = .04) and were more likely to be localized centrally (P = .002) than ulcers enrolled in the United States. Gram-positive bacteria were the most common organisms isolated from the ulcers (n = 366, 72%). The Steroids for Corneal Ulcers Trial will compare the use of a topical corticosteroid with placebo as adjunctive therapy for bacterial corneal ulcers. Patients enrolled in this trial had diverse ulcer severity and on average significantly reduced visual acuity at presentation. clinicaltrials.gov Identifier: NCT00324168.
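The visual acuity figures quoted above relate Snellen fractions to logMAR by simple arithmetic (logMAR = log10 of the minimum angle of resolution, i.e. of the inverted Snellen fraction). Note the paper's quoted pairs are rounded to the nearest chart line, so the exact conversion differs slightly:

```python
import math

def snellen_to_logmar(numerator, denominator):
    """Convert a Snellen fraction (e.g. 20/125) to logMAR:
    logMAR = log10(MAR), where MAR = denominator / numerator."""
    return math.log10(denominator / numerator)

print(round(snellen_to_logmar(20, 125), 2))  # 0.8
```

The exact value for 20/125 is about 0.80 logMAR, while the trial reports a median of 0.84 logMAR alongside the nearest Snellen equivalent of 20/125.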

  8. Extending Primitive Spatial Data Models to Include Semantics

    Science.gov (United States)

    Reitsma, F.; Batcheller, J.

    2009-04-01

Our traditional geospatial data model involves associating some measurable quality, such as temperature, or observable feature, such as a tree, with a point or region in space and time. When capturing data we implicitly subscribe to some kind of conceptualisation. If we can make this explicit in an ontology and associate it with the captured data, we can leverage formal semantics to reason with the concepts represented in our spatial data sets. To do so, we extend our fundamental representation of geospatial data in a data model by including a URI in our basic data model that links it to the ontology defining our conceptualisation. We thus extend Goodchild et al.'s geo-atom [1] with the addition of a URI: (x, Z, z(x), URI). This provides us with pixel- or feature-level knowledge and the ability to create layers of data from a set of pixels or features that might be drawn from a database based on their semantics. Using open source tools, we present a prototype that involves simple reasoning as a proof of concept. References [1] M.F. Goodchild, M. Yuan, and T.J. Cova. Towards a general theory of geographic representation in GIS. International Journal of Geographical Information Science, 21(3):239-260, 2007.
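The extended geo-atom (x, Z, z(x), URI) is just a 4-tuple, which can be sketched as a small record type. The field names, coordinate layout, and ontology URI below are all illustrative assumptions, not part of the abstract:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class GeoAtom:
    """Goodchild et al.'s geo-atom (x, Z, z(x)) extended with a URI linking
    the observation to an ontology concept. Field names are illustrative."""
    x: tuple          # location in space-time, e.g. (lon, lat, timestamp)
    z_name: str       # the property Z being measured
    z_value: float    # the value z(x) at that location
    uri: str          # concept definition in the ontology (hypothetical URI)

atom = GeoAtom(x=(174.78, -36.85, "2009-04-01"),
               z_name="temperature",
               z_value=18.2,
               uri="http://example.org/ontology#AirTemperature")
```

A layer can then be assembled by filtering a collection of such atoms on the `uri` field, i.e. selecting pixels or features by their semantics rather than by attribute name alone.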

  9. Association of Baseline Depressive Symptoms with Prevalent and Incident Pre-Hypertension and Hypertension in Postmenopausal Hispanic Women: Results from the Women's Health Initiative.

    Directory of Open Access Journals (Sweden)

    Ruth E Zambrana

Depression and depressive symptoms are risk factors for hypertension (HTN) and cardiovascular disease (CVD). Hispanic women have higher rates of depressive symptoms compared to other racial/ethnic groups, yet few studies have investigated their association with incident prehypertension and hypertension among postmenopausal Hispanic women. This study aims to assess whether an association exists between baseline depression and incident hypertension at 3 years' follow-up among postmenopausal Hispanic women. This prospective cohort study within the Women's Health Initiative (WHI) included 4,680 Hispanic women who participated in the observational and clinical trial studies at baseline and at third-year follow-up. Baseline current depressive symptoms and past depression history were measured, as well as important correlates of depression: social support, optimism, life events and caregiving. Multinomial logistic regression was used to estimate prevalent and incident prehypertension and hypertension in relation to depressive symptoms. Prevalence of current baseline depression ranged from 26% to 28% by hypertension category, and education moderated these rates. In age-adjusted models, women with depression were more likely to be hypertensive (OR = 1.25; 95% CI 1.04-1.51), although results were attenuated when adjusting for covariates. Depression at baseline in normotensive Hispanic women was associated with incident hypertension at year 3 follow-up (OR = 1.74; 95% CI 1.10-2.74) after adjustment for insurance and behavioral factors. However, further adjustment for clinical covariates attenuated the association. Analyses of psychosocial variables correlated with depression but did not alter findings. Low rates of antidepressant medication usage were also reported. In the largest longitudinal study to date of older Hispanic women which included physiologic, behavioral and psychosocial moderators of depression, there was no association between baseline depressive symptoms and prevalent nor

  10. Elevated baseline serum glutamate as a pharmacometabolomic biomarker for acamprosate treatment outcome in alcohol-dependent subjects

    Science.gov (United States)

    Nam, H W; Karpyak, V M; Hinton, D J; Geske, J R; Ho, A M C; Prieto, M L; Biernacka, J M; Frye, M A; Weinshilboum, R M; Choi, D-S

    2015-01-01

    Acamprosate has been widely used since the Food and Drug Administration approved the medication for treatment of alcohol use disorders (AUDs) in 2004. Although the detailed molecular mechanism of acamprosate remains unclear, it has been largely known that acamprosate inhibits glutamate action in the brain. However, AUD is a complex and heterogeneous disorder. Thus, biomarkers are required to prescribe this medication to patients who will have the highest likelihood of responding positively. To identify pharmacometabolomic biomarkers of acamprosate response, we utilized serum samples from 120 alcohol-dependent subjects, including 71 responders (maintained continuous abstinence) and 49 non-responders (any alcohol use) during 12 weeks of acamprosate treatment. Notably, baseline serum glutamate levels were significantly higher in responders compared with non-responders. Importantly, serum glutamate levels of responders are normalized after acamprosate treatment, whereas there was no significant glutamate change in non-responders. Subsequent functional studies in animal models revealed that, in the absence of alcohol, acamprosate activates glutamine synthetase, which synthesizes glutamine from glutamate and ammonia. These results suggest that acamprosate reduces serum glutamate levels for those who have elevated baseline serum glutamate levels among responders. Taken together, our findings demonstrate that elevated baseline serum glutamate levels are a potential biomarker associated with positive acamprosate response, which is an important step towards development of a personalized approach to treatment for AUD. PMID:26285131

  11. Baseline simple and complex reaction times in female compared to male boxers.

    Science.gov (United States)

    Bianco, M; Ferri, M; Fabiano, C; Giorgiano, F; Tavella, S; Manili, U; Faina, M; Palmieri, V; Zeppilli, P

    2011-06-01

The aim of the study was to compare baseline cognitive performance of female with respect to male amateur boxers. The study population included 28 female amateur boxers. Fifty-six male boxers, matched for age, employment and competitive level to the female athletes, formed the control group. All boxers had no history of head concussions (except boxing). Each boxer was requested to: 1) fill in a questionnaire collecting demographic data, level of education, occupational status, boxing record and number of head concussions during boxing; 2) undergo a baseline computerized neuropsychological (NP) test (CogSport) measuring simple and complex reaction times (RT). Females were lighter than male boxers (56±7 vs. 73.1±9.8 kg). No boxing-related variables (number of bouts, knock-outs, etc.) correlated with NP scores. Female and male Olympic-style boxers have no (or minimal) differences in baseline cognitive performance. Further research with larger series of female boxers is required to confirm these findings.

  12. Informed baseline subtraction of proteomic mass spectrometry data aided by a novel sliding window algorithm.

    Science.gov (United States)

    Stanford, Tyman E; Bagley, Christopher J; Solomon, Patty J

    2016-01-01

Proteomic matrix-assisted laser desorption/ionisation (MALDI) linear time-of-flight (TOF) mass spectrometry (MS) may be used to produce protein profiles from biological samples with the aim of discovering biomarkers for disease. However, the raw protein profiles suffer from several sources of bias or systematic variation which need to be removed via pre-processing before meaningful downstream analysis of the data can be undertaken. Baseline subtraction, an early pre-processing step that removes the non-peptide signal from the spectra, is complicated by the following: (i) each spectrum has, on average, wider peaks for peptides with higher mass-to-charge ratios (m/z), and (ii) the time-consuming and error-prone trial-and-error process for optimising the baseline subtraction input arguments. With reference to the aforementioned complications, we present an automated pipeline that includes (i) a novel 'continuous' line segment algorithm that efficiently operates over data with a transformed m/z-axis to remove the relationship between peptide mass and peak width, and (ii) an input-free algorithm to estimate peak widths on the transformed m/z scale. The automated baseline subtraction method was deployed on six publicly available proteomic MS datasets using six different m/z-axis transformations. Optimality of the automated baseline subtraction pipeline was assessed quantitatively using the mean absolute scaled error (MASE) when compared to a gold-standard baseline subtracted signal. Several of the transformations investigated were able to reduce, if not entirely remove, the peak width and peak location relationship resulting in near-optimal baseline subtraction using the automated pipeline. The proposed novel 'continuous' line segment algorithm is shown to far outperform naive sliding window algorithms with regard to the computational time required. The improvement in computational time was at least four-fold on real MALDI TOF-MS data and at least an order of magnitude
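For orientation, the naive sliding-window comparator that the paper's 'continuous' line segment algorithm outperforms can be sketched as follows: estimate the baseline at each point as the minimum intensity within a fixed window, then subtract it. The window size and synthetic spectrum below are invented for illustration:

```python
import numpy as np

def sliding_window_baseline(intensity, half_width=50):
    """Naive sliding-window baseline subtraction for a mass spectrum: at each
    point, take the minimum intensity within a fixed window as the baseline,
    then subtract it. O(n * window) time; the paper's 'continuous' line
    segment algorithm is a much faster alternative."""
    n = len(intensity)
    baseline = np.empty(n)
    for i in range(n):
        lo, hi = max(0, i - half_width), min(n, i + half_width + 1)
        baseline[i] = intensity[lo:hi].min()
    return intensity - baseline

# Synthetic spectrum: smooth decaying baseline plus two narrow Gaussian peaks
mz = np.linspace(0, 1, 2000)
base = 100 * np.exp(-3 * mz)
peaks = 50 * np.exp(-((mz - 0.3) / 0.005) ** 2) + 30 * np.exp(-((mz - 0.7) / 0.005) ** 2)
corrected = sliding_window_baseline(base + peaks)
```

A fixed window also illustrates complication (i): because real peak widths grow with m/z, no single `half_width` suits the whole spectrum, which is why the paper transforms the m/z axis first.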

  13. Baseline neurocognitive testing in sports-related concussions: the importance of a prior night's sleep.

    Science.gov (United States)

    McClure, D Jake; Zuckerman, Scott L; Kutscher, Scott J; Gregory, Andrew J; Solomon, Gary S

    2014-02-01

The management of sports-related concussions (SRCs) utilizes serial neurocognitive assessments and self-reported symptom inventories to assess recovery and safety for return to play (RTP). Because postconcussive RTP goals include symptom resolution and a return to neurocognitive baseline levels, clinical decisions rest in part on understanding modifiers of this baseline. Several studies have reported age and sex to influence baseline neurocognitive performance, but few have assessed the potential effect of sleep. We chose to investigate the effect of reported sleep duration on baseline Immediate Post-Concussion Assessment and Cognitive Testing (ImPACT) performance and the number of patient-reported symptoms. We hypothesized that athletes receiving less sleep before baseline testing would perform worse on neurocognitive metrics and report more symptoms. Cross-sectional study; Level of evidence, 3. We retrospectively reviewed 3686 nonconcussed athletes (2371 male, 1315 female; 3305 high school, 381 college) with baseline symptom and ImPACT neurocognitive scores. Patients were stratified into 3 groups based on self-reported sleep duration the night before testing: (1) short (fewer than 7 hours), (2) intermediate, and (3) long. A multivariate ANCOVA (MANCOVA) assessed the effect of sleep duration on baseline ImPACT performance. A univariate ANCOVA was performed to investigate the influence of sleep on total self-reported symptoms. When controlling for age and sex as covariates, the MANCOVA revealed significant group differences on ImPACT reaction time, verbal memory, and visual memory scores but not visual-motor (processing) speed scores. An ANCOVA also revealed significant group differences in total reported symptoms. For baseline symptoms and ImPACT scores, subsequent pairwise comparisons revealed these associations to be most significant when comparing the short and intermediate sleep groups. Our results indicate that athletes sleeping fewer than 7 hours before baseline testing perform worse on 3 of 4 ImPACT scores and report more symptoms. Because SRC management and RTP

  14. Development and validation of a prognostic model to predict death in patients with traumatic bleeding, and evaluation of the effect of tranexamic acid on mortality according to baseline risk: a secondary analysis of a randomised controlled trial.

    Science.gov (United States)

    Perel, P; Prieto-Merino, D; Shakur, H; Roberts, I

    2013-06-01

Severe bleeding accounts for about one-third of in-hospital trauma deaths. Patients with a high baseline risk of death have the most to gain from the use of life-saving treatments. An accurate and user-friendly prognostic model to predict mortality in bleeding trauma patients could assist doctors and paramedics in pre-hospital triage and could shorten the time to diagnostic and life-saving procedures such as surgery and tranexamic acid (TXA). The aim of the study was to develop and validate a prognostic model for early mortality in patients with traumatic bleeding and to examine whether or not the effect of TXA on the risk of death and thrombotic events in bleeding adult trauma patients varies according to baseline risk. Multivariable logistic regression and risk-stratified analysis of a large international cohort of trauma patients. Two hundred and seventy-four hospitals in 40 high-, medium- and low-income countries. We derived prognostic models in a large placebo-controlled trial of the effects of early administration of a short course of TXA [Clinical Randomisation of an Antifibrinolytic in Significant Haemorrhage (CRASH-2) trial]. The trial included 20,127 trauma patients with, or at risk of, significant bleeding, within 8 hours of injury. We externally validated the model on 14,220 selected trauma patients from the Trauma Audit and Research Network (TARN), which included mainly patients from the UK. We examined the effect of TXA on all-cause mortality, death due to bleeding and thrombotic events (fatal and non-fatal myocardial infarction, stroke, deep-vein thrombosis and pulmonary embolism) within risk strata in the CRASH-2 trial data set and we estimated the proportion of premature deaths averted by applying the odds ratio (OR) from the CRASH-2 trial to each of the risk strata in TARN. For the stratified analysis according to baseline risk we considered the intervention TXA (1 g over 10 minutes followed by 1 g over 8 hours) or matching placebo. For the

  15. FAQs about Baseline Testing among Young Athletes

    Science.gov (United States)

    ... a similar exam conducted by a health care professional during the season if an athlete has a suspected concussion. Baseline testing generally takes place during the pre-season—ideally prior to the first practice. It is important to note that some baseline ...

  16. 75 FR 74706 - Notice of Baseline Filings

    Science.gov (United States)

    2010-12-01

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Notice of Baseline Filings November 24, 2010. Centana Intrastate Pipeline, LLC. Docket No. PR10-84-001. Centana Intrastate Pipeline, LLC... applicants listed above submitted a revised baseline filing of their Statement of Operating Conditions for...

  17. Baseline NS5A resistance associated substitutions may impair DAA response in real-world hepatitis C patients.

    Science.gov (United States)

    Carrasco, Itzíar; Arias, Ana; Benítez-Gutiérrez, Laura; Lledó, Gemma; Requena, Silvia; Cuesta, Miriam; Cuervas-Mons, Valentín; de Mendoza, Carmen

    2018-03-01

Oral DAAs have demonstrated high efficacy as treatment of hepatitis C. However, the presence of resistance-associated substitutions (RAS) at baseline has occasionally been associated with impaired treatment response. Herein, we examined the impact of baseline RAS at the HCV NS5A gene region on treatment response in a real-life setting. All hepatitis C patients treated with DAAs including NS5A inhibitors at our institution were retrospectively examined. The virus NS5A gene was analyzed using population sequencing at baseline and after 24 weeks of completing therapy in all patients that failed. All changes recorded at positions 28, 29, 30, 31, 32, 58, 62, 92, and 93 were considered. A total of 166 patients were analyzed. HCV genotypes were as follows: G1a (31.9%), G1b (48.2%), G3 (10.2%), and G4 (9.6%). Overall, 69 (41.6%) patients were coinfected with HIV and 46.7% had advanced liver fibrosis (Metavir F3-F4). Sixty (36.1%) patients had at least one RAS at baseline, including M28A/G/T (5), Q30X (12), L31I/F/M/V (6), T58P/S (25), Q/E62D (1), A92K (7), and Y93C/H (15). Overall, 4.8% had two or more RAS, being more frequent in G4 (12.5%) followed by G1b (6.3%) and G1a (1.9%). Of 10 (6%) patients who failed DAA therapy, five had baseline NS5A RAS. No association was found for specific baseline RAS, although changes at position 30 were more frequent in failures than cures (22.2% vs 6.4%, P = 0.074). Moreover, the presence of two or more RAS at baseline was more frequent in failures (HR: 7.2; P = 0.029). Upon failure, six patients showed emerging RAS, including Q30C/H/R (3), L31M (1), and Y93C/H (2). Baseline NS5A RAS are frequently seen in DAA-naïve HCV patients. Two or more baseline NS5A RAS were found in nearly 5% and were significantly associated with DAA failure. Therefore, baseline NS5A testing should be considered when HCV treatment is planned with NS5A inhibitors. © 2017 Wiley Periodicals, Inc.
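Scanning a patient's NS5A sequence for substitutions at the positions this study screened (28, 29, 30, 31, 32, 58, 62, 92, 93) is a simple per-position comparison against a reference. The sketch below uses toy single-letter amino-acid strings; real analyses work from population-sequencing consensus calls against a genotype-matched reference:

```python
RAS_POSITIONS = (28, 29, 30, 31, 32, 58, 62, 92, 93)

def find_ras(reference, sample, positions=RAS_POSITIONS):
    """Report amino-acid substitutions at the NS5A positions screened in the
    study, in standard RefPosAlt notation (e.g. 'Q30R'). Sequences are
    1-indexed amino-acid strings covering the NS5A region."""
    subs = []
    for pos in sorted(positions):
        ref_aa, samp_aa = reference[pos - 1], sample[pos - 1]
        if ref_aa != samp_aa:
            subs.append(f"{ref_aa}{pos}{samp_aa}")
    return subs

# Toy 93-residue sequences differing at positions 30 and 93 (illustration only)
ref = "M" * 93
samp = ref[:29] + "R" + ref[30:92] + "H"
print(find_ras(ref, samp))  # ['M30R', 'M93H']
```

Counting patients for whom `len(find_ras(...)) >= 2` would reproduce the study's "two or more baseline RAS" stratum.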

  18. Conceptualizing a Dynamic Fall Risk Model Including Intrinsic Risks and Exposures.

    Science.gov (United States)

    Klenk, Jochen; Becker, Clemens; Palumbo, Pierpaolo; Schwickert, Lars; Rapp, Kilan; Helbostad, Jorunn L; Todd, Chris; Lord, Stephen R; Kerse, Ngaire

    2017-11-01

Falls are a major cause of injury and disability in older people, leading to serious health and social consequences including fractures, poor quality of life, loss of independence, and institutionalization. To design and provide adequate prevention measures, accurate understanding and identification of a person's individual fall risk is important. However, to date, the performance of fall risk models is weak compared with models estimating, for example, cardiovascular risk. This deficiency may result from 2 factors. First, current models consider risk factors to be stable for each person, not changing over time, an assumption that does not reflect real-life experience. Second, current models do not consider the interplay of individual exposure, including type of activity (eg, walking, undertaking transfers) and the environmental risks (eg, lighting, floor conditions) in which the activity is performed. Therefore, we posit a dynamic fall risk model consisting of intrinsic risk factors that vary over time and exposure (activity in context). eHealth sensor technology (eg, smartphones) begins to enable the continuous measurement of both the above factors. We illustrate our model with examples of real-world falls from the FARSEEING database. This dynamic framework for fall risk adds important aspects that may improve understanding of fall mechanisms, fall risk models, and the development of fall prevention interventions. Copyright © 2017 AMDA – The Society for Post-Acute and Long-Term Care Medicine. Published by Elsevier Inc. All rights reserved.
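The abstract does not give the model's mathematical form; one minimal way to instantiate "intrinsic risk that varies over time, combined with exposure (activity in context)" is a multiplicative hazard, sketched below with entirely hypothetical scores and lookup tables:

```python
def fall_hazard(intrinsic, activity_hazard, environment_hazard):
    """Toy instantiation of a dynamic fall risk model: instantaneous fall
    hazard as the product of a time-varying intrinsic risk score and the
    hazards of the current activity and environment. All scales and values
    are illustrative assumptions, not taken from the paper."""
    return intrinsic * activity_hazard * environment_hazard

# Hypothetical exposure lookup tables (invented values)
ACTIVITY = {"sitting": 0.1, "walking": 1.0, "transfer": 2.5}
ENVIRONMENT = {"well_lit_dry": 1.0, "dim_wet": 3.0}

# Same person, same intrinsic score: the hazard changes with exposure
low = fall_hazard(0.4, ACTIVITY["sitting"], ENVIRONMENT["well_lit_dry"])
high = fall_hazard(0.4, ACTIVITY["transfer"], ENVIRONMENT["dim_wet"])
```

The point of the sketch is structural: holding intrinsic risk fixed, the modeled hazard still varies across the day as activity and environment change, which static fall risk models cannot express.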

  19. Geochemical baseline studies of soil in Finland

    Science.gov (United States)

    Pihlaja, Jouni

    2017-04-01

Soil element concentrations vary considerably across regions of Finland, mostly because of differences in bedrock type, which are reflected in soil quality. The Geological Survey of Finland (GTK) is carrying out geochemical baseline studies in Finland. In the present phase, the research focuses on urban areas and mine environments. The information can, for example, be used to determine the need for soil remediation, to assess environmental impacts, or to measure the natural state of soil in industrial areas or mine districts. The field work is done by taking soil samples, typically at a depth of 0-10 cm. Sampling sites are chosen to represent the areas most vulnerable to human impacts from possible toxic soil element contents: playgrounds, day-care centers, schools, parks and residential areas. In the mine districts, the samples are taken from areas located outside the zones affected by airborne dust. Element contents of the soil samples are then analyzed with ICP-AES and ICP-MS, and Hg with CV-AAS. The results of the geochemical baseline studies are published in the Finnish national geochemical baseline database (TAPIR). The geochemical baseline map service is free for all users via an internet browser. Through this map service it is possible to calculate regional soil baseline values using the geochemical data stored in the map service database. Baseline data for 17 elements in total are provided in the map service, which can be viewed on the GTK's web pages (http://gtkdata.gtk.fi/Tapir/indexEN.html).
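Regional baseline values of the kind the TAPIR service computes are typically summary statistics of the measured concentrations in an area. The sketch below uses one common convention (median plus an upper-percentile baseline value); the exact statistic TAPIR uses may differ, and the nickel concentrations are invented:

```python
import numpy as np

def regional_baseline(concentrations, upper=0.975):
    """Illustrative regional geochemical baseline: the median and an upper
    baseline value (here the 97.5th percentile) of element concentrations
    measured in a region. The statistic choice is an assumption."""
    c = np.asarray(concentrations, dtype=float)
    return float(np.median(c)), float(np.quantile(c, upper))

# Hypothetical Ni concentrations (mg/kg) from soil samples in one urban area
ni = [12.0, 8.5, 15.2, 9.8, 11.1, 30.4, 10.7, 13.9]
median, upper_baseline = regional_baseline(ni)
```

A measured value above the upper baseline would then flag a site for closer inspection, e.g. for possible remediation need.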

  20. Camera Trajectory from Wide Baseline Images

    Science.gov (United States)

    Havlena, M.; Torii, A.; Pajdla, T.

    2008-09-01

    Camera trajectory estimation, closely related to structure-from-motion computation, is one of the fundamental tasks in computer vision. Reliable camera trajectory estimation plays an important role in 3D reconstruction, self-localization, and object recognition. Several issues are essential for reliable camera trajectory estimation, for instance the choice of the camera and its geometric projection model, camera calibration, image feature detection and description, and robust 3D structure computation. Most approaches rely on classical perspective cameras because of the simplicity of their projection models and the ease of their calibration. However, classical perspective cameras offer only a limited field of view, so occlusions and sharp camera turns may cause consecutive frames to look completely different when the baseline becomes longer. This makes image feature matching very difficult or impossible, and camera trajectory estimation fails under such conditions. These problems can be avoided if omnidirectional cameras, e.g. a fish-eye lens converter, are used. The hardware we use in practice is a Nikon FC-E9 mounted via a mechanical adaptor onto a Kyocera Finecam M410R digital camera. The Nikon FC-E9 is a megapixel omnidirectional add-on converter with a 180° view angle which provides images of photographic quality. The Kyocera Finecam M410R delivers 2272×1704 images at 3 frames per second. The resulting combination yields a circular view of diameter 1600 pixels in the image. Since consecutive frames of the omnidirectional camera often share a common region in 3D space, image feature matching is often feasible. On the other hand, the calibration of these cameras is non-trivial and is crucial for the accuracy of the resulting 3D reconstruction. We calibrate omnidirectional cameras off-line using the state-of-the-art technique and Mičušík's two-parameter model, which links the radius of the image point r to the

  1. Intrafractional baseline drift during free breathing breast cancer radiation therapy.

    Science.gov (United States)

    Jensen, Christer Andre; Acosta Roa, Ana María; Lund, Jo-Åsmund; Frengen, Jomar

    2017-06-01

    Intrafraction motion in breast cancer radiation therapy (BCRT) has not yet been thoroughly described in the literature. It has been observed that baseline drift occurs as part of the intrafraction motion. This study aims to measure baseline drift and its incidence in free-breathing BCRT patients using an in-house developed laser system for tracking the position of the sternum. Baseline drift was monitored in 20 right-sided breast cancer patients receiving free-breathing 3D-conformal RT using the laser system, which measures one-dimensional distance in the AP direction. A total of 357 patient respiratory traces from treatment sessions were logged and analysed. Baseline drift was compared to the patient positioning error measured from in-field portal imaging. The mean overall baseline drift at the end of treatment sessions was -1.3 mm for the patient population. Relatively small baseline drift was observed during the first fraction; however, it was clearly detected from the second fraction onwards. Over 90% of the baseline drift occurred during the first 3 min of each treatment session. The baseline drift rate for the population was -0.5 ± 0.2 mm/min in the posterior direction during the first minute after localization. Only 4% of the treatment sessions had a baseline drift of 5 mm or larger at 5 min, all towards the posterior direction. Mean baseline drift in the posterior direction in free-breathing BCRT was observed in 18 of 20 patients over all treatment sessions. This study shows that there is substantial baseline drift in free-breathing BCRT patients: little during the first treatment session, but markedly present in the rest of the sessions. Intrafraction motion due to baseline drift should be accounted for in margin calculations.
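    The quoted drift rate (-0.5 ± 0.2 mm/min during the first minute) suggests a simple estimator: fit a straight line to the AP-position trace over the first minute after localization. The sketch below is an assumed reconstruction on a synthetic trace, not the study's actual analysis pipeline, which would also need to separate respiratory motion from the drift:

```python
import numpy as np

def drift_rate_mm_per_min(t_s, ap_mm, window_s=60.0):
    """Estimate the baseline drift rate (mm/min) of an AP-position trace
    by fitting a straight line to the first `window_s` seconds.
    Illustrative only; assumes drift dominates the linear trend."""
    t = np.asarray(t_s)
    y = np.asarray(ap_mm)
    mask = t <= window_s
    slope_mm_per_s = np.polyfit(t[mask], y[mask], 1)[0]
    return slope_mm_per_s * 60.0

# synthetic trace: 0.25 Hz breathing (2 mm amplitude) on a -0.5 mm/min drift
t = np.arange(1200) * 0.1                      # 120 s sampled at 10 Hz
ap = 2.0 * np.cos(2 * np.pi * 0.25 * t) - (0.5 / 60.0) * t
print(drift_rate_mm_per_min(t, ap))            # close to the synthetic -0.5 mm/min
```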

  2. Baseline muscle mass is a poor predictor of functional overload-induced gain in the mouse model

    Directory of Open Access Journals (Sweden)

    Audrius Kilikevicius

    2016-11-01

    Genetic background contributes substantially to individual variability in muscle mass. Muscle hypertrophy in response to resistance training can also vary extensively. However, it is less clear if muscle mass at baseline is predictive of the hypertrophic response. The aim of this study was to examine the effect of genetic background on variability in muscle mass at baseline and in the adaptive response of the mouse fast- and slow-twitch muscles to overload. Males of eight laboratory mouse strains were studied: C57BL/6J (B6, n=17), BALB/cByJ (n=7), DBA/2J (D2, n=12), B6.A-(rs3676616-D10Utsw1)/Kjn (B6.A, n=9), C57BL/6J-Chr10A/J/NaJ (B6.A10, n=8), BEH+/+ (n=11), BEH (n=12) and DUHi (n=12). Compensatory growth of the soleus and plantaris muscles was triggered by a 4-week overload induced by unilateral ablation of synergists. Muscle weight in the control leg (baseline) varied from 5.2±0.7 mg (soleus) and 11.4±1.3 mg (plantaris) in D2 mice to 18.0±1.7 mg (soleus) in DUHi and 43.7±2.6 mg (plantaris) in BEH (p<0.001 for both muscles). In addition, the soleus in the B6.A10 strain was ~40% larger (p<0.001) than in B6. Functional overload increased muscle weight; however, the extent of the gain was strain-dependent for both soleus (p<0.01) and plantaris (p<0.02), even after accounting for the baseline differences. For the soleus muscle, the BEH strain emerged as the least responsive, with a 1.3-fold increase, compared to a 1.7-fold gain in the most responsive D2 strain, and there was no difference in the gain between the B6.A10 and B6 strains. The BEH strain also appeared the least responsive in the gain of the plantaris, 1.3-fold, compared to a ~1.5-fold gain in the remaining strains. We conclude that variation in muscle mass at baseline is not a reliable predictor of that in the overload-induced gain. This suggests that a different set of genes influence variability in muscle mass acquired in the process of normal development, growth and maintenance, and in the process of adaptive

  3. 75 FR 57268 - Notice of Baseline Filings

    Science.gov (United States)

    2010-09-20

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. PR10-103-000; Docket No. PR10-104-000; Docket No. PR10-105- 000 (Not Consolidated)] Notice of Baseline Filings September 13..., 2010, and September 10, 2010, respectively the applicants listed above submitted their baseline filing...

  4. [Factor structure validity of the social capital scale used at baseline in the ELSA-Brasil study].

    Science.gov (United States)

    Souto, Ester Paiva; Vasconcelos, Ana Glória Godoi; Chor, Dora; Reichenheim, Michael E; Griep, Rosane Härter

    2016-07-21

    This study aims to analyze the factor structure of the Brazilian version of the Resource Generator (RG) scale, using baseline data from the Brazilian Longitudinal Health Study in Adults (ELSA-Brasil). Cross-validation was performed in three random subsamples. Exploratory factor analysis using exploratory structural equation models was conducted in the first two subsamples to diagnose the factor structure, and confirmatory factor analysis was used in the third to corroborate the model defined by the exploratory analyses. Based on the 31 initial items, the model with the best fit included 25 items distributed across three dimensions. They all presented satisfactory convergent validity (values greater than 0.50 for the extracted variance) and precision (values greater than 0.70 for compound reliability). All factor correlations were below 0.85, indicating full discriminative factor validity. The RG scale presents acceptable psychometric properties and can be used in populations with similar characteristics.
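    The convergent validity and precision criteria quoted above (extracted variance > 0.50, compound reliability > 0.70) are standard functions of the standardized factor loadings. A minimal sketch of both computations, with made-up loadings (the study's own loadings are not given in the abstract):

```python
def convergent_validity(loadings):
    """Average variance extracted (AVE) and composite reliability (CR)
    from standardized factor loadings:
      AVE = mean of squared loadings
      CR  = (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances)
    Thresholds used in the study: AVE > 0.50, CR > 0.70."""
    s = sum(loadings)
    sq = sum(l * l for l in loadings)
    errors = len(loadings) - sq          # error variance = 1 - loading^2 per item
    ave = sq / len(loadings)
    cr = s * s / (s * s + errors)
    return ave, cr

# illustrative loadings for one dimension
ave, cr = convergent_validity([0.78, 0.71, 0.83, 0.66, 0.74])
print(round(ave, 3), round(cr, 3))  # 0.557 0.862
```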

  5. Predicting clinical concussion measures at baseline based on motivation and academic profile.

    Science.gov (United States)

    Trinidad, Katrina J; Schmidt, Julianne D; Register-Mihalik, Johna K; Groff, Diane; Goto, Shiho; Guskiewicz, Kevin M

    2013-11-01

    The purpose of this study was to predict baseline neurocognitive and postural control performance using a measure of motivation, high school grade point average (hsGPA), and Scholastic Aptitude Test (SAT) score. Cross-sectional. Clinical research center. Eighty-eight National Collegiate Athletic Association Division I incoming student-athletes (freshman and transfers). Participants completed baseline clinical concussion measures, including a neurocognitive test battery (CNS Vital Signs), a balance assessment [Sensory Organization Test (SOT)], and motivation testing (Rey Dot Counting). Participants granted permission to access hsGPA and SAT total score. Standard scores for each CNS Vital Signs domain and SOT composite score. Baseline motivation, hsGPA, and SAT explained a small percentage of the variance of complex attention (11%), processing speed (12%), and composite SOT score (20%). Motivation, hsGPA, and total SAT score do not explain a significant amount of the variance in neurocognitive and postural control measures but may still be valuable to consider when interpreting neurocognitive and postural control measures.

  6. Contact allergy to preservatives : ESSCA* results with the baseline series, 2009-2012

    NARCIS (Netherlands)

    Gimenez-Arnau, A. M.; Deza, G.; Bauer, A.; Johnston, G. A.; Mahler, V.; Schuttelaar, M. -L.; Sanchez-Perez, J.; Silvestre, J. F.; Wilkinson, M.; Uter, W.

    Background: Allergic contact dermatitis caused by biocides is common and causes significant patient morbidity. Objective: To describe the current frequency and pattern of patch test reactivity to biocide allergens included in the baseline series of most European countries. Methods: Data collected by the

  7. Effects of MTHFR and MS gene polymorphisms on baseline blood pressure and Benazepril effectiveness in Chinese hypertensive patients.

    Science.gov (United States)

    Jiang, S; Yu, Y; Venners, S A; Zhang, Y; Xing, H; Wang, X; Xu, X

    2011-03-01

    The development of essential hypertension (EH) and inter-individual differences in response to antihypertensive treatment may partly result from genetic heterogeneity. In this study, we conducted an investigation of the combined effects of 5, 10-methylenetetrahydrofolate reductase (MTHFR) C677T and methionine synthase (MS) A2756G polymorphisms on baseline blood pressure (BP) and BP response to antihypertensive Benazepril treatment in 823 Chinese hypertensive patients with a fixed daily dosage of 10 mg for 15 consecutive days. When MTHFR C677T and MS A2756G polymorphisms were modelled together with adjustment for important covariates, only MTHFR C677T was associated with baseline systolic BP (SBP) (β (s.e.)=2.84 (1.10), P=0.0096) or baseline diastolic BP (DBP) (β (s.e.)=2.19 (0.65), P=0.0008). Modelled together with adjustment for important covariates, MTHFR C677T and MS A2756G polymorphisms were both independently associated with increased DBP response (baseline minus post-treatment) to Benazepril treatment (C677T: β (s.e.)=1.58 (0.76), P=0.038; A2756G: β (s.e.)=2.14 (0.89), P=0.016). Neither polymorphism was associated with SBP response to Benazepril treatment. There were no significant interactions or effect modification between MTHFR C677T and MS A2756G gene polymorphisms in models of baseline SBP, baseline DBP or DBP response to Benazepril treatment. Our results suggest that the effects of MTHFR C677T and MS A2756G gene polymorphisms may have pivotal roles in the aetiology of EH and BP response to Benazepril treatment.
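    The reported β (s.e.) values come from regression models of blood pressure on genotype with covariate adjustment. A hedged sketch of the general idea, additive 0/1/2 allele coding in ordinary least squares on synthetic data (the effect sizes, the single covariate, and all numbers below are invented for illustration, not the study's specification):

```python
import numpy as np

def genotype_effect(bp, allele_count, covariate):
    """OLS estimate (beta, SE) of an additive genotype effect on blood
    pressure, adjusting for one covariate. Sketch of the model family
    behind reported "beta (s.e.)" values."""
    X = np.column_stack([np.ones(len(bp)), allele_count, covariate])
    beta, rss, rank, _ = np.linalg.lstsq(X, bp, rcond=None)
    sigma2 = rss[0] / (len(bp) - X.shape[1])           # residual variance
    se = np.sqrt(sigma2 * np.linalg.inv(X.T @ X).diagonal())
    return beta[1], se[1]

rng = np.random.default_rng(0)
n = 800
g = rng.integers(0, 3, n).astype(float)     # minor-allele copies (0/1/2)
age = rng.normal(50.0, 10.0, n)             # invented covariate
bp = 120.0 + 2.5 * g + 0.3 * age + rng.normal(0.0, 5.0, n)
b, se = genotype_effect(bp, g, age)
print(round(b, 2), round(se, 2))            # beta should land near the true 2.5
```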

  8. 75 FR 65010 - Notice of Baseline Filings

    Science.gov (United States)

    2010-10-21

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. PR11-1-000; Docket No. PR11-2-000; Docket No. PR11-3-000] Notice of Baseline Filings October 14, 2010. Cranberry Pipeline Docket..., 2010, respectively the applicants listed above submitted their baseline filing of its Statement of...

  9. Baseline tests for arc melter vitrification of INEL buried wastes. Volume 1: Facility description and summary data report

    International Nuclear Information System (INIS)

    Oden, L.L.; O'Connor, W.K.; Turner, P.C.; Soelberg, N.R.; Anderson, G.L.

    1993-01-01

    This report presents field results and raw data from the Buried Waste Integrated Demonstration (BWID) Arc Melter Vitrification Project Phase 1 baseline test series conducted by the Idaho National Engineering Laboratory (INEL) in cooperation with the U.S. Bureau of Mines (USBM). The baseline test series was conducted using the electric arc melter facility at the USBM Albany Research Center in Albany, Oregon. Five different surrogate waste feed mixtures were tested that simulated thermally-oxidized, buried, TRU-contaminated, mixed wastes and soils present at the INEL. The USBM Arc Furnace Integrated Waste Processing Test Facility includes a continuous feed system, the arc melting furnace, an offgas control system, and utilities. The melter is a sealed, 3-phase alternating current (ac) furnace approximately 2 m high and 1.3 m wide. The furnace has a capacity of 1 metric ton of steel and can process as much as 1,500 lb/h of soil-type waste materials. The surrogate feed materials included five mixtures designed to simulate incinerated TRU-contaminated buried waste materials mixed with INEL soil. Process samples, melter system operations data and offgas composition data were obtained during the baseline tests to evaluate the melter performance and meet test objectives. Samples and data gathered during this program included (a) automatically and manually logged melter systems operations data, (b) process samples of slag, metal and fume solids, and (c) offgas composition, temperature, velocity, flowrate, moisture content, particulate loading and metals content. This report consists of 2 volumes: Volume I summarizes the baseline test operations. It includes an executive summary, system and facility description, review of the surrogate waste mixtures, and a description of the baseline test activities, measurements, and sample collection. Volume II contains the raw test data and sample analyses from samples collected during the baseline tests

  10. Impact of baseline BMI and weight change in CCTG adjuvant breast cancer trials.

    Science.gov (United States)

    Yerushalmi, R; Dong, B; Chapman, J W; Goss, P E; Pollak, M N; Burnell, M J; Levine, M N; Bramwell, V H C; Pritchard, K I; Whelan, T J; Ingle, J N; Shepherd, L E; Parulekar, W R; Han, L; Ding, K; Gelmon, K A

    2017-07-01

    We hypothesized that increased baseline BMI and BMI change would negatively impact clinical outcomes with adjuvant breast cancer systemic therapy. Data from chemotherapy trials MA.5 and MA.21; endocrine therapy MA.12, MA.14 and MA.27; and trastuzumab HERA/MA.24 were analyzed. The primary objective was to examine the effect of BMI change on breast cancer-free interval (BCFI) landmarked at 5 years; secondary objectives included BMI changes at 1 and 3 years; BMI changes on disease-specific survival (DSS) and overall survival (OS); and effects of baseline BMI. Stratified analyses included trial therapy and composite trial stratification factors. In pre-/peri-/early post-menopausal chemotherapy trials (N = 2793), baseline BMI did not impact any endpoint and increased BMI from baseline did not significantly affect BCFI (P = 0.85) after 5 years although it was associated with worse BCFI (P = 0.03) and DSS (P = 0.07) after 1 year. BMI increase by 3 and 5 years was associated with better DSS (P = 0.01; 0.01) and OS (P = 0.003; 0.05). In pre-menopausal endocrine therapy trial MA.12 (N = 672), patients with higher baseline BMI had worse BCFI (P = 0.02) after 1 year, worse DSS (P = 0.05; 0.004) after 1 and 5 years and worse OS (P = 0.01) after 5 years. Increased BMI did not impact BCFI (P = 0.90) after 5 years, although it was associated with worse BCFI (P = 0.01) after 1 year. In post-menopausal endocrine therapy trials MA.14 and MA.27 (N = 8236), baseline BMI did not significantly impact outcome for any endpoint. BMI change did not impact BCFI or DSS after 1 or 3 years, although a mean increased BMI of 0.3 was associated with better OS (P = 0.02) after 1 year. With the administration of trastuzumab (N = 1395) baseline BMI and BMI change did not significantly impact outcomes. Higher baseline BMI and BMI increases negatively affected outcomes only in pre-/peri-/early post-menopausal trial patients. Otherwise, BMI

  11. Sandia National Laboratories/New Mexico Environmental Baseline update--Revision 1.0

    International Nuclear Information System (INIS)

    1996-07-01

    This report provides a baseline update supplying the background information necessary for personnel to prepare clear and concise NEPA documentation. The environment of the Sandia National Laboratories is described in this document, including the ecology, meteorology, climatology, seismology, emissions, cultural resources and land use, visual resources, noise pollution, transportation, and socioeconomics.

  12. Sandia National Laboratories/New Mexico Environmental Baseline update--Revision 1.0

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-07-01

    This report provides a baseline update supplying the background information necessary for personnel to prepare clear and concise NEPA documentation. The environment of the Sandia National Laboratories is described in this document, including the ecology, meteorology, climatology, seismology, emissions, cultural resources and land use, visual resources, noise pollution, transportation, and socioeconomics.

  13. 76 FR 8725 - Notice of Baseline Filings

    Science.gov (United States)

    2011-02-15

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Notice of Baseline Filings Enstor Grama Ridge Storage and Docket No. PR10-97-002. Transportation, L.L.C.. EasTrans, LLC Docket No. PR10-30-001... revised baseline filing of their Statement of Operating Conditions for services provided under section 311...

  14. 76 FR 5797 - Notice of Baseline Filings

    Science.gov (United States)

    2011-02-02

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. PR10-114-001; Docket No. PR10-129-001; Docket No. PR10-131- 001; Docket No. PR10-68-002 Not Consolidated] Notice of Baseline... applicants listed above submitted a revised baseline filing of their Statement of Operating Conditions for...

  15. Hydrogeology baseline study Aurora Mine

    International Nuclear Information System (INIS)

    1996-01-01

    A baseline hydrogeologic study was conducted in the area of Syncrude's proposed Aurora Mine in order to develop a conceptual regional hydrogeologic model for the area that could be used to understand groundwater flow conditions. Geologic information was obtained from over 2,000 coreholes and from data obtained between 1980 and 1996 regarding water level for the basal aquifer. A 3-D numerical groundwater flow model was developed to provide quantitative estimates of the potential environmental impacts of the proposed mining operations on the groundwater flow system. The information was presented in the context of a regional study area which encompassed much of the Athabasca Oil Sands Region, and a local study area which was defined by the lowlands of the Muskeg River Basin. Characteristics of the topography, hydrology, climate, geology, and hydrogeology of the region are described. The conclusion is that groundwater flow in the aquifer occurs mostly in a westerly direction beneath the Aurora Mine towards its inferred discharge location along the Athabasca River. Baseflow in the Muskeg River is mostly related to discharge from shallow surficial aquifers. Water in the river under baseflow conditions was fresh, of calcium-carbonate type, with very little indication of mineralization associated with deeper groundwater in the Aurora Mine area. 44 refs., 5 tabs., 31 figs

  16. The lagRST Model: A Turbulence Model for Non-Equilibrium Flows

    Science.gov (United States)

    Lillard, Randolph P.; Oliver, A. Brandon; Olsen, Michael E.; Blaisdell, Gregory A.; Lyrintzis, Anastasios S.

    2011-01-01

    This study presents a new class of turbulence model designed for wall-bounded, high Reynolds number flows with separation. The model addresses deficiencies seen in the modeling of non-equilibrium turbulent flows. These flows generally have variable adverse pressure gradients which cause the turbulent quantities to react at a finite rate to changes in the mean flow quantities. This "lag" in the response of the turbulent quantities cannot be modeled by most standard turbulence models, which are designed to model equilibrium turbulent boundary layers. The model presented uses a standard 2-equation model as the baseline for turbulent equilibrium calculations, but adds transport equations to account directly for non-equilibrium effects in the Reynolds Stress Tensor (RST) that are seen in large pressure gradients involving shock waves and separation. Comparisons are made to several standard turbulence modeling validation cases, including an incompressible boundary layer (both neutral and adverse pressure gradients), an incompressible mixing layer, and a transonic bump flow. In addition, a hypersonic shock wave/turbulent boundary layer interaction (SWTBLI) with separation is assessed along with a transonic capsule flow. Results show a substantial improvement over the baseline models for transonic separated flows. The results are mixed for the SWTBLI flows assessed: separation predictions are not as good as the baseline models, but the overprediction of the peak heat flux downstream of the reattachment shock that plagues many models is reduced.
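    The core idea of a "lag" model, turbulent quantities relaxing toward their equilibrium values at a finite rate, can be sketched as a first-order relaxation equation. The constant a0 and the forward-Euler discretization below are illustrative assumptions, not the lagRST formulation itself:

```python
import numpy as np

def lagged_response(phi_eq, dt, a0=0.35):
    """First-order lag: the modeled quantity phi relaxes toward its
    equilibrium value phi_eq at a finite rate,
        dphi/dt = a0 * (phi_eq - phi) / T,
    with the turbulent time scale T folded into dt for this sketch."""
    phi = np.empty_like(phi_eq)
    phi[0] = phi_eq[0]
    for i in range(1, len(phi_eq)):
        phi[i] = phi[i - 1] + a0 * dt * (phi_eq[i] - phi[i - 1])
    return phi

# a step change in the "mean flow" equilibrium: the lagged quantity follows gradually
eq = np.concatenate([np.zeros(50), np.ones(150)])
resp = lagged_response(eq, dt=0.5)
```

Equilibrium models would jump to the new value instantly; the lag formulation reproduces the delayed adjustment seen in flows with rapidly varying pressure gradients.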

  17. Precision Search for Muon Antineutrino Disappearance Oscillations Using a Dual Baseline Technique

    Energy Technology Data Exchange (ETDEWEB)

    Cheng, Gary Li [Columbia Univ., New York, NY (United States)

    2013-01-01

    A search for short-baseline muon antineutrino disappearance with the SciBooNE and MiniBooNE experiments at Fermi National Accelerator Laboratory in Batavia, Illinois is presented. Short-baseline muon antineutrino disappearance measurements help constrain sterile neutrino models. The two detectors observe muon antineutrinos from the same beam, therefore the combined analysis of their data sets serves to partially constrain some of the flux and cross section uncertainties. A likelihood ratio method was used to set a 90% confidence level upper limit on muon antineutrino disappearance that dramatically improves upon prior sterile neutrino oscillation limits in the Δm² = 0.1-100 eV² region.
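    For context, the disappearance being limited is governed by the standard two-flavor survival probability, and a likelihood-ratio fit compares Poisson expectations of the two hypotheses bin by bin. A schematic sketch of both pieces (toy binning; not the actual SciBooNE/MiniBooNE likelihood, which includes correlated flux and cross-section uncertainties):

```python
import math

def survival_prob(L_km, E_gev, sin2_2theta, dm2_ev2):
    """Two-flavor muon antineutrino survival probability:
    P = 1 - sin^2(2*theta) * sin^2(1.27 * dm2 * L / E)."""
    return 1.0 - sin2_2theta * math.sin(1.27 * dm2_ev2 * L_km / E_gev) ** 2

def minus_2_delta_lnl(observed, mu_null, mu_osc):
    """Poisson -2*Delta(ln L) between a no-disappearance expectation and an
    oscillation expectation over energy bins (toy test statistic)."""
    return sum(2.0 * ((m1 - m0) + n * math.log(m0 / m1))
               for n, m0, m1 in zip(observed, mu_null, mu_osc))

# toy check: identical data and expectations give a test statistic of zero
print(minus_2_delta_lnl([100, 80], [100.0, 80.0], [100.0, 80.0]))  # 0.0
```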

  18. Antiepileptic drug behavioral side effects and baseline hyperactivity in children and adolescents with new onset epilepsy.

    Science.gov (United States)

    Guilfoyle, Shanna M; Follansbee-Junger, Katherine; Smith, Aimee W; Combs, Angela; Ollier, Shannon; Hater, Brooke; Modi, Avani C

    2018-01-01

    To examine baseline psychological functioning and antiepileptic drug (AED) behavioral side effects in new onset epilepsy and determine, by age, whether baseline psychological functioning predicts AED behavioral side effects 1 month following AED initiation. A retrospective chart review was conducted between July 2011 and December 2014 that included youths with new onset epilepsy. As part of routine interdisciplinary care, caregivers completed the Behavior Assessment System for Children, 2nd Edition: Parent Rating Scale to report on baseline psychological functioning at the diagnostic visit and the Pediatric Epilepsy Side Effects Questionnaire to identify AED behavioral side effects at the 1-month follow-up clinic visit following AED initiation. Children (age = 2-11 years) and adolescents (age = 12-18 years) were examined separately. A total of 380 youths with new onset epilepsy (mean age = 8.9 ± 4.3 years; 83.4% Caucasian; 34.8% focal epilepsy, 41.1% generalized epilepsy, 23.7% unclassified epilepsy) were included. Seventy percent of youths had at-risk or clinically elevated baseline psychological symptoms. Children had significantly greater AED behavioral side effects (M = 25.08 ± 26.36) compared to adolescents (M = 12.36 ± 17.73), regardless of AED. Valproic acid demonstrated significantly greater behavioral side effects compared to all other AEDs, with the exception of levetiracetam. Higher hyperactivity/impulsivity at baseline significantly predicted higher AED behavioral side effects 1 month after AED initiation in both age groups. Younger children seem to be more prone to experience behavioral side effects, and these are likely to be higher if youths with epilepsy have baseline hyperactivity/impulsivity. Baseline psychological screening, specifically hyperactivity, can be used as a precision medicine tool for AED selection. Wiley Periodicals, Inc. © 2017 International League Against Epilepsy.

  19. Comparison of CFD Predictions of the TCA Baseline

    Science.gov (United States)

    Cappuccio, Gelsomina

    1999-01-01

    The computational fluid dynamics (CFD) comparisons presented here are compared to each other and to wind tunnel (WT) data on the baseline TCA. Some of the CFD computations were done prior to the tests and others later. Only force data (CL vs CD) from CFD are presented as part of this report. The WT data come from testing of the baseline TCA in the Langley Unitary Plan Wind Tunnel (UPWT), Test Section #2. Two sets of wind tunnel data are presented: one from test 1671 of model 2a (flapped wing) and the other from test 1679 of model 2b (solid wing). Most of the plots show only one run from each of the WT tests per configuration, but many repeat runs were taken during the tests. The WT repeat runs showed an uncertainty in the drag of +/- 0.5 count; at times the uncertainty was better, +/- 0.25 count. Test 1671 data consisted of forces and pressures measured on model 2a. The wing had cutouts for installing various leading- and trailing-edge flaps at lower Mach numbers. The internal duct of the nacelles was not designed and fabricated as defined in the outer mold lines (OML) iges file: it was fabricated such that a linear transition occurs from inlet to exhaust, whereas the iges definition has a constant-area internal duct that quickly transitions from the inlet to the exhaust cross-sectional shape. The nacelle internal duct was fabricated this way to save time and money; the variation in cross-sectional area is less than 1% from the iges definition. The nacelles were also installed with and without fairings, where fairings are defined as the build-up of the nacelles on the upper wing surface so that the nacelles poke through the upper surface as defined in the OML iges file. Test 1679 data consisted of forces measured on models 2a and 2b. The wing for model 2b was a solid wing. The nacelles were built the same way as for model 2a, except for the nacelle base pressure installation. The nacelles were only

  20. Grand unified models including extra Z bosons

    International Nuclear Information System (INIS)

    Li Tiezhong

    1989-01-01

    Grand unified theories (GUTs) based on simple Lie groups that include extra Z bosons are discussed. Under the author's hypothesis there are only the SU(5+m), SO(6+4n) and E6 groups. A general discussion of SU(5+m) is given, and then SU(6) and SU(7) are considered. In SU(6) the 15+6*+6* fermion representations are used, which differ from others in fermion content, Yukawa couplings and breaking scales. A concept of clans of particles, which are not families, is suggested. These clans consist of extra Z bosons and the corresponding fermions at that scale. All of the fermions in the clans are down quarks, except for the standard-model clan, which consists of the Z boson and 15 fermions; therefore, the spectrum of the hadrons composed of these down quarks differs from the presently known hadrons.

  1. Modeling of cylindrical surrounding gate MOSFETs including the fringing field effects

    International Nuclear Information System (INIS)

    Gupta, Santosh K.; Baishya, Srimanta

    2013-01-01

    A physically based analytical model for surface potential and threshold voltage including the fringing gate capacitances in cylindrical surround gate (CSG) MOSFETs has been developed. Based on this a subthreshold drain current model has also been derived. This model first computes the charge induced in the drain/source region due to the fringing capacitances and considers an effective charge distribution in the cylindrically extended source/drain region for the development of a simple and compact model. The fringing gate capacitances taken into account are outer fringe capacitance, inner fringe capacitance, overlap capacitance, and sidewall capacitance. The model has been verified with the data extracted from 3D TCAD simulations of CSG MOSFETs and was found to be working satisfactorily. (semiconductor devices)
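    The elementary building block of such capacitance models is the coaxial capacitor formula for a cylindrical gate stack. A minimal sketch of just that one formula (the geometry values are illustrative; the paper's fringing-capacitance expressions are more involved):

```python
import math

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def coax_capacitance(eps_r, length_m, r_inner_m, r_outer_m):
    """Capacitance of a cylindrical (coaxial) dielectric layer,
    C = 2*pi*eps0*eps_r*L / ln(r_outer / r_inner): the gate-oxide term a
    cylindrical surrounding-gate MOSFET model starts from, before the
    fringing components are added."""
    return (2.0 * math.pi * EPS0 * eps_r * length_m
            / math.log(r_outer_m / r_inner_m))

# e.g. a 30 nm gate over SiO2 (eps_r = 3.9), silicon radius 10 nm, oxide out to 12 nm
c_ox = coax_capacitance(3.9, 30e-9, 10e-9, 12e-9)
print(c_ox)  # tens of attofarads
```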

  2. A novel baseline correction method using convex optimization framework in laser-induced breakdown spectroscopy quantitative analysis

    Science.gov (United States)

    Yi, Cancan; Lv, Yong; Xiao, Han; Ke, Ke; Yu, Xun

    2017-12-01

    For the laser-induced breakdown spectroscopy (LIBS) quantitative analysis technique, baseline correction is an essential part of LIBS data preprocessing. Baseline drift is a widespread phenomenon, generated by fluctuation of the laser energy, inhomogeneity of sample surfaces and background noise, and it has aroused the interest of many researchers. Most prevalent algorithms need key parameters to be preset, such as a suitable spline function or the fitting order, and thus lack adaptability. Based on the characteristics of LIBS signals, namely the sparsity of spectral peaks and the low-pass-filtered nature of the baseline, a novel baseline correction and spectral data denoising method is studied in this paper. The proposed technique uses a convex optimization scheme to form a non-parametric baseline correction model. Meanwhile, an asymmetric penalty function is used to enhance the signal-to-noise ratio (SNR) of the LIBS signal and improve reconstruction precision. Furthermore, an efficient iterative algorithm is applied to the optimization process, so as to ensure convergence. To validate the proposed method, the concentrations of Chromium (Cr), Manganese (Mn) and Nickel (Ni) in 23 certified high-alloy steel samples are assessed using quantitative models with Partial Least Squares (PLS) and Support Vector Machine (SVM). Because it requires no prior knowledge of sample composition and no mathematical hypothesis, the method proposed in this paper has better accuracy in quantitative analysis than other methods, and fully reflects its adaptive ability.
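    The convex model with an asymmetric penalty described above is closely related to classical asymmetric least squares (ALS) baseline estimation (Eilers & Boelens), which keeps the baseline smooth via a second-difference penalty and down-weights points above the baseline. A sketch of that related method on a toy spectrum, not the paper's exact algorithm:

```python
import numpy as np
from scipy import sparse
from scipy.sparse.linalg import spsolve

def als_baseline(y, lam=1e5, p=0.01, n_iter=10):
    """Asymmetric least squares baseline: iteratively minimize
    sum_i w_i*(y_i - z_i)^2 + lam*sum((second difference of z)^2),
    with small weights p above the current baseline (peaks) and
    large weights 1-p below it."""
    n = len(y)
    D = sparse.diags([1, -2, 1], [0, 1, 2], shape=(n - 2, n))
    DtD = lam * (D.T @ D)
    w = np.ones(n)
    for _ in range(n_iter):
        W = sparse.diags(w)
        z = spsolve((W + DtD).tocsc(), w * y)
        w = np.where(y > z, p, 1.0 - p)
    return z

# toy spectrum: slowly varying baseline plus two sharp emission peaks
x = np.linspace(0.0, 1.0, 400)
baseline = 2.0 + 1.5 * x
y = (baseline
     + 10.0 * np.exp(-((x - 0.3) / 0.01) ** 2)
     + 6.0 * np.exp(-((x - 0.7) / 0.01) ** 2))
z = als_baseline(y)
print(float(np.abs(z - baseline).mean()))  # small mean residual
```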

  3. 75 FR 70732 - Notice of Baseline Filings

    Science.gov (United States)

    2010-11-18

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. PR11-71-000; Docket No. PR11-72-000; Docket No. PR11-73- 000] Notice of Baseline Filings November 10, 2010. Docket No. PR11-71-000..., 2010, the applicants listed above submitted their baseline filing of their Statement of Operating...

  4. CONSTRAINING THE STRUCTURE OF SAGITTARIUS A*'s ACCRETION FLOW WITH MILLIMETER VERY LONG BASELINE INTERFEROMETRY CLOSURE PHASES

    Energy Technology Data Exchange (ETDEWEB)

    Broderick, Avery E [Canadian Institute for Theoretical Astrophysics, 60 St. George Street, Toronto, ON M5S 3H8 (Canada); Fish, Vincent L; Doeleman, Sheperd S [Massachusetts Institute of Technology, Haystack Observatory, Route 40, Westford, MA 01886 (United States); Loeb, Abraham [Institute for Theory and Computation, Harvard University, Center for Astrophysics, 60 Garden Street, Cambridge, MA 02138 (United States)

    2011-09-01

    Millimeter wave very long baseline interferometry (mm-VLBI) provides access to the emission region surrounding Sagittarius A* (Sgr A*), the supermassive black hole at the center of the Milky Way, on sub-horizon scales. Recently, a closure phase of 0° ± 40° was reported on a triangle of Earth-sized baselines (SMT-CARMA-JCMT), representing a new constraint upon the structure and orientation of the emission region, independent from those provided by the previously measured 1.3 mm-VLBI visibility amplitudes alone. Here, we compare this to the closure phases associated with a class of physically motivated, radiatively inefficient accretion flow models and present predictions for future mm-VLBI experiments with the developing Event Horizon Telescope (EHT). We find that the accretion flow models are capable of producing a wide variety of closure phases on the SMT-CARMA-JCMT triangle and thus not all models are consistent with the recent observations. However, those models that reproduce the 1.3 mm-VLBI visibility amplitudes overwhelmingly have SMT-CARMA-JCMT closure phases between ±30°, and are therefore broadly consistent with all current mm-VLBI observations. Improving station sensitivity by factors of a few, achievable by increases in bandwidth and phasing together multiple antennas at individual sites, should result in physically relevant additional constraints upon the model parameters and eliminate the current 180° ambiguity on the source orientation. When additional stations are included, closure phases of order 45°-90° are typical. In all cases, the EHT will be able to measure these with sufficient precision to produce dramatic improvements in the constraints upon the spin of Sgr A*.
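    The closure-phase constraint exploited above rests on a simple identity: station-based phase errors cancel when visibility phases are summed around a baseline triangle. A small numpy sketch (the visibility phases are made-up numbers, not Sgr A* data):

```python
import numpy as np

def closure_phase(v12, v23, v31):
    """Closure phase (degrees) on a baseline triangle from complex visibilities."""
    return np.degrees(np.angle(v12 * v23 * v31))

# "True" source visibilities on a three-station triangle (illustrative phases).
v12, v23, v31 = np.exp(1j * np.deg2rad([20.0, -35.0, 25.0]))

# Corrupt each visibility with random station-based phase errors phi_i:
rng = np.random.default_rng(0)
phi = rng.uniform(0, 2 * np.pi, 3)
c12 = v12 * np.exp(1j * (phi[0] - phi[1]))
c23 = v23 * np.exp(1j * (phi[1] - phi[2]))
c31 = v31 * np.exp(1j * (phi[2] - phi[0]))

# Station errors cancel around the loop, so the closure phase is preserved:
print(closure_phase(c12, c23, c31))  # ≈ 10 deg = 20 - 35 + 25
```

    This cancellation is why closure phase is a robust observable even when individual station phases are corrupted by the atmosphere or instrumentation.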

  5. Baseline Response Levels Are a Nuisance in Infant Contingency Learning

    Science.gov (United States)

    Millar, W. S.; Weir, Catherine

    2015-01-01

    The impact of differences in level of baseline responding on contingency learning in the first year was examined by considering the response acquisition of infants classified into baseline response quartiles. Whereas the three lower baseline groups showed the predicted increment in responding to a contingency, the highest baseline responders did…

  6. Baseline status and dose to the penile bulb predict impotence 1 year after radiotherapy for prostate cancer

    Energy Technology Data Exchange (ETDEWEB)

    Cozzarini, Cesare; Badenchini, Fabio [San Raffaele Scientific Institute, Radiotherapy, Milano (Italy); Rancati, Tiziana [Fondazione IRCCS Istituto Nazionale dei Tumori, Prostate Cancer Program, Milan (Italy); Palorini, Federica; Improta, Ilaria; Fiorino, Claudio [San Raffaele Scientific Institute, Medical Physics, Milan (Italy); Avuzzi, Barbara [Fondazione IRCCS Istituto Nazionale dei Tumori, Radiation Oncology 1, Milan (Italy); Degli Esposti, Claudio [Ospedale Bellaria, Radiotherapy, Bologna (Italy); Girelli, Giuseppe [Ospedale ASL9, Radiotherapy, Ivrea (Italy); Vavassori, Vittorio [Cliniche Gavazzeni-Humanitas, Radiotherapy, Bergamo (Italy); Valdagni, Riccardo [Fondazione IRCCS Istituto Nazionale dei Tumori, Prostate Cancer Program, Milan (Italy); Fondazione IRCCS Istituto Nazionale dei Tumori, Radiation Oncology 1, Milan (Italy)

    2016-05-15

    To assess the predictors of the onset of impotence 1 year after radiotherapy for prostate cancer. In a multi-centric prospective study, the International Index of Erectile Function (IIEF) questionnaire-based potency of 91 hormone-naive and potent patients (IIEF1-5 > 11 before radiotherapy) was assessed. At the time of this analysis, information on potency 1 year after treatment was available for 62 of 91 patients (42 treated with hypofractionation: 2.35-2.65 Gy/fraction, 70-74.2 Gy; 20 with conventional fractionation: 74-78 Gy). Prospectively collected individual information and the maximum and mean doses to the penile bulb (D_max/D_mean) were available; the corresponding 2 Gy-equivalent values (EQD2_max/EQD2_mean) were also considered. Predictors of 1-year impotency were assessed through uni- and multivariable backward logistic regression; the best cut-off values discriminating between potent and impotent patients were assessed by ROC analyses. The discriminative power of the models and goodness of fit were measured by AUC analysis and the Hosmer-Lemeshow (H&L) test. At 1-year follow-up, 26 of 62 patients (42%) had become impotent. The only predictive variables were baseline IIEF1-5 values (best cut-off: baseline IIEF1-5 ≥ 19), D_max ≥ 68.5 Gy, and EQD2_max ≥ 74.2 Gy. The risk of 1-year impotence may be predicted by a two-variable model including baseline IIEF1-5 (OR: 0.80, p = 0.003) and EQD2_max ≥ 74.2 Gy (OR: 4.1, p = 0.022). The AUC of the model was 0.77 (95% CI: 0.64-0.87, p = 0.0007; H&L: p = 0.62). The 1-year risk of impotency after high-dose radiotherapy in potent men depends on the EQD2_max to the penile bulb and on baseline IIEF1-5 values. A significant reduction in the risk may be expected mainly when sparing the bulb in patients with no/mild baseline impotency (IIEF1-5 > 17). (orig.)

  7. IPCC Socio-Economic Baseline Dataset

    Data.gov (United States)

    National Aeronautics and Space Administration — The Intergovernmental Panel on Climate Change (IPCC) Socio-Economic Baseline Dataset consists of population, human development, economic, water resources, land...

  8. A roller chain drive model including contact with guide-bars

    DEFF Research Database (Denmark)

    Pedersen, Sine Leergaard; Hansen, John Michael; Ambrósio, J. A. C.

    2004-01-01

    A model of a roller chain drive is developed and applied to the simulation and analysis of roller chain drives of large marine diesel engines. The model includes the impact with guide-bars that are the motion delimiter components on the chain strands between the sprockets. The main components...... and the sprocket centre, i.e. a constraint is added when such distance is less than the pitch radius. The unilateral kinematic constraint is removed when its associated constraint reaction force, applied on the roller, is in the direction of the root of the sprocket teeth. In order to improve the numerical...

  9. Baseline methodologies for clean development mechanism projects

    Energy Technology Data Exchange (ETDEWEB)

    Lee, M.K. (ed.); Shrestha, R.M.; Sharma, S.; Timilsina, G.R.; Kumar, S.

    2005-11-15

    The Kyoto Protocol and the Clean Development Mechanism (CDM) came into force on 16th February 2005 with its ratification by Russia. The increasing momentum of this process is reflected in more than 100 projects having been submitted to the CDM Executive Board (CDM-EB) for approval of the baselines and monitoring methodologies, which is the first step in developing and implementing CDM projects. A CDM project should result in a net decrease of GHG emissions below any level that would have resulted from other activities implemented in the absence of that CDM project. The 'baseline' defines the GHG emissions of activities that would have been implemented in the absence of a CDM project. The baseline methodology is the process/algorithm for establishing that baseline. The baseline and the baseline methodology are thus the most critical elements of any CDM project towards meeting the important criteria of CDM, namely that a CDM project should result in 'real, measurable, and long term benefits related to the mitigation of climate change'. This guidebook is produced within the framework of the United Nations Environment Programme (UNEP) facilitated 'Capacity Development for the Clean Development Mechanism (CD4CDM)' project. This document is published as part of the project's effort to develop guidebooks covering important issues such as project finance, sustainability impacts, legal framework and institutional framework. These materials are aimed at helping stakeholders better understand the CDM and are believed to eventually contribute to maximizing the effect of the CDM in achieving the ultimate goal of the UNFCCC and its Kyoto Protocol. This guidebook should be read in conjunction with the information provided in the two other guidebooks entitled 'Clean Development Mechanism: Introduction to the CDM' and 'CDM Information and Guidebook' developed under the CD4CDM project. (BA)

  11. A constitutive model for the forces of a magnetic bearing including eddy currents

    Science.gov (United States)

    Taylor, D. L.; Hebbale, K. V.

    1993-01-01

    A multiple-magnet bearing can be developed from N individual electromagnets. The constitutive relationships for a single magnet in such a bearing are presented. Analytical expressions are developed for a magnet with poles arranged circumferentially. Maxwell's field equations are used, so the model easily includes the effects of eddy currents induced by the rotation of the journal. Eddy currents must be included in any dynamic model because they are the only speed-dependent parameter and may lead to a critical speed for the bearing. The model is applicable to bearings using attraction or repulsion.
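    As context for the constitutive relations discussed above, the quasi-static attraction force of a single electromagnet follows the textbook relation F = mu0 * A * (N i)^2 / (4 g^2); the sketch below uses this simplified form without the paper's speed-dependent eddy-current terms, with illustrative numbers:

```python
import math

MU0 = 4e-7 * math.pi  # vacuum permeability (H/m)

def magnet_force(n_turns, current, gap, pole_area):
    """Quasi-static attraction force (N) of one electromagnet,
    F = mu0 * A * (N i)^2 / (4 g^2).

    This is the standard eddy-current-free constitutive relation; the
    paper's model adds rotation-induced (speed-dependent) eddy-current
    effects on top of it.
    """
    return MU0 * pole_area * (n_turns * current) ** 2 / (4.0 * gap ** 2)

# Example: 200 turns, 2 A, 0.5 mm air gap, 4 cm^2 pole face.
print(round(magnet_force(200, 2.0, 5e-4, 4e-4), 1))  # → 80.4
```

    Note the inverse-square dependence on the air gap: halving the gap quadruples the force, which is why magnetic bearings require active control to be stable.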

  12. Baseline budgeting for continuous improvement.

    Science.gov (United States)

    Kilty, G L

    1999-05-01

    This article is designed to introduce the techniques used to convert traditionally maintained department budgets to baseline budgets. This entails identifying key activities, evaluating for value-added, and implementing continuous improvement opportunities. Baseline Budgeting for Continuous Improvement was created as a result of a newly named company president's request to implement zero-based budgeting. The president was frustrated with the mind-set of the organization, namely, "Next year's budget should be 10 to 15 percent more than this year's spending." Zero-based budgeting was not the answer, but combining the principles of activity-based costing and the Just-in-Time philosophy of eliminating waste and continuous improvement did provide a solution to the problem.

  13. Background differences in baseline and stimulated MMP levels influence abdominal aortic aneurysm susceptibility

    Science.gov (United States)

    Dale, Matthew A.; Ruhlman, Melissa K.; Zhao, Shijia; Meisinger, Trevor; Gu, Linxia; Swier, Vicki J.; Agrawal, Devendra K.; Greiner, Timothy C.; Carson, Jeffrey S.; Baxter, B. Timothy; Xiong, Wanfen

    2015-01-01

    Objective: Evidence has demonstrated profound influence of genetic background on cardiovascular phenotypes. Murine models in Marfan syndrome (MFS) have shown that genetic background-related variations affect thoracic aortic aneurysm formation, rupture, and lifespan of mice. MFS mice with C57Bl/6 genetic background are less susceptible to aneurysm formation compared to the 129/SvEv genetic background. In this study, we hypothesize that susceptibility to abdominal aortic aneurysm (AAA) will be increased in 129/SvEv mice versus C57Bl/6 mice. We tested this hypothesis by assessing differences in aneurysm size, tissue properties, immune response, and MMP expression. Methods: Mice of C57Bl/6 or 129/SvEv background underwent AAA induction by periaortic application of CaCl2. Baseline aortic diameters, tissue properties and MMP levels were measured. After aneurysm induction, diameters, MMP expression, and immune response (macrophage infiltration and bone marrow transplantation) were measured. Results: Aneurysms were larger in 129/SvEv mice than C57Bl/6 mice (83.0% ± 13.6 increase compared to 57.8% ± 6.4). The aorta was stiffer in the 129/SvEv mice compared to C57Bl/6 mice (952.5 kPa ± 93.6 versus 621.4 kPa ± 84.2). Baseline MMP-2 and post-aneurysm MMP-2 and -9 levels were higher in 129/SvEv aortas compared to C57Bl/6 aortas. Elastic lamella disruption/fragmentation and macrophage infiltration were increased in 129/SvEv mice. Myelogenous cell reversal by bone marrow transplantation did not affect aneurysm size. Conclusions: These data demonstrate that 129/SvEv mice are more susceptible to AAA compared to C57Bl/6 mice. Intrinsic properties of the aorta between the two strains of mice, including baseline expression of MMP-2, influence susceptibility to AAA. PMID:26546710

  14. Safe and sensible preprocessing and baseline correction of pupil-size data.

    Science.gov (United States)

    Mathôt, Sebastiaan; Fabius, Jasper; Van Heusden, Elle; Van der Stigchel, Stefan

    2018-02-01

    Measurement of pupil size (pupillometry) has recently gained renewed interest from psychologists, but there is little agreement on how pupil-size data is best analyzed. Here we focus on one aspect of pupillometric analyses: baseline correction, i.e., analyzing changes in pupil size relative to a baseline period. Baseline correction is useful in experiments that investigate the effect of some experimental manipulation on pupil size. In such experiments, baseline correction improves statistical power by taking into account random fluctuations in pupil size over time. However, we show that baseline correction can also distort data if unrealistically small pupil sizes are recorded during the baseline period, which can easily occur due to eye blinks, data loss, or other distortions. Divisive baseline correction (corrected pupil size = pupil size/baseline) is affected more strongly by such distortions than subtractive baseline correction (corrected pupil size = pupil size - baseline). We discuss the role of baseline correction as a part of preprocessing of pupillometric data, and make five recommendations: (1) before baseline correction, perform data preprocessing to mark missing and invalid data, but assume that some distortions will remain in the data; (2) use subtractive baseline correction; (3) visually compare your corrected and uncorrected data; (4) be wary of pupil-size effects that emerge faster than the latency of the pupillary response allows (within ±220 ms after the manipulation that induces the effect); and (5) remove trials on which baseline pupil size is unrealistically small (indicative of blinks and other distortions).
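    Recommendations (2) and (5) above, subtractive correction and removal of trials with unrealistically small baselines, can be sketched in a few lines of Python (the 1.5 mm cutoff and the use of the median over the baseline window are illustrative assumptions, not values from the paper):

```python
import numpy as np

def baseline_correct(trials, baseline_window, min_baseline=1.5):
    """Subtractive baseline correction for pupil traces.

    trials: (n_trials, n_samples) array of pupil sizes (e.g. mm);
    baseline_window: slice over the pre-stimulus samples.
    Trials whose baseline is unrealistically small (eye blinks, data
    loss) are dropped rather than corrected, since such distortions
    would otherwise propagate into every corrected sample of the trial.
    """
    baselines = np.nanmedian(trials[:, baseline_window], axis=1)
    keep = baselines >= min_baseline                    # recommendation (5)
    corrected = trials[keep] - baselines[keep, None]    # recommendation (2)
    return corrected, keep
```

    Subtraction rather than division keeps a distorted baseline from rescaling the whole trial, which is why the paper finds divisive correction more fragile.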

  15. Including policy and management in socio-hydrology models: initial conceptualizations

    Science.gov (United States)

    Hermans, Leon; Korbee, Dorien

    2017-04-01

    Socio-hydrology studies the interactions in coupled human-water systems. So far, the use of dynamic models that capture the direct feedback between societal and hydrological systems has been dominant. What has not yet been included with any particular emphasis, is the policy or management layer, which is a central element in for instance integrated water resources management (IWRM) or adaptive delta management (ADM). Studying the direct interactions between human-water systems generates knowledges that eventually helps influence these interactions in ways that may ensure better outcomes - for society and for the health and sustainability of water systems. This influence sometimes occurs through spontaneous emergence, uncoordinated by societal agents - private sector, citizens, consumers, water users. However, the term 'management' in IWRM and ADM also implies an additional coordinated attempt through various public actors. This contribution is a call to include the policy and management dimension more prominently into the research focus of the socio-hydrology field, and offers first conceptual variables that should be considered in attempts to include this policy or management layer in socio-hydrology models. This is done by drawing on existing frameworks to study policy processes throughout both planning and implementation phases. These include frameworks such as the advocacy coalition framework, collective learning and policy arrangements, which all emphasis longer-term dynamics and feedbacks between actor coalitions in strategic planning and implementation processes. A case about longter-term dynamics in the management of the Haringvliet in the Netherlands is used to illustrate the paper.

  16. Esophageal Baseline Impedance Reflects Mucosal Integrity and Predicts Symptomatic Outcome With Proton Pump Inhibitor Treatment.

    Science.gov (United States)

    Xie, Chenxi; Sifrim, Daniel; Li, Yuwen; Chen, Minhu; Xiao, Yinglian

    2018-01-30

    Esophageal baseline impedance, which is decreased in gastroesophageal reflux disease (GERD) patients, is related to the severity of acid reflux and the integrity of the esophageal mucosa. The study aims to compare baseline impedance and dilated intercellular spaces (DIS) in patients with typical reflux symptoms, and to evaluate the correlation of baseline impedance with DIS, esophageal acid exposure, and the efficacy of proton pump inhibitor (PPI) treatment. Ninety-two patients and 10 healthy controls were included in the study. Erosive esophagitis (EE) was defined by esophageal mucosal erosion under upper endoscopy. Patients without mucosal erosion were divided into groups with pathologic acid reflux (non-erosive reflux disease [NERD]) or with hypersensitive esophagus. Biopsies of the esophageal mucosa were taken 2-4 cm above the gastroesophageal junction Z-line during upper endoscopy for DIS measurement. All patients received esomeprazole 20 mg twice-daily treatment for 8 weeks, and the efficacy of esomeprazole was evaluated among all patients. The intercellular spaces were dilated in both EE and NERD patients. Baseline impedance was decreased in both EE and NERD patients, and was negatively correlated with the acid exposure time (r = -0.527) and with DIS (r = -0.230). "Baseline impedance > 1764 Ω" was an independent predictor of PPI failure (OR, 11.9; 95% CI, 2.4-58.9). Lower baseline impedance was observed in patients with mucosal erosion or pathological acid reflux; the baseline impedance reflected mucosal integrity and was more sensitive to esophageal acid exposure. Patients with high impedance might not benefit from PPI treatment.

  17. Thermoelectric Generators for Automotive Waste Heat Recovery Systems Part I: Numerical Modeling and Baseline Model Analysis

    Science.gov (United States)

    Kumar, Sumeet; Heister, Stephen D.; Xu, Xianfan; Salvador, James R.; Meisner, Gregory P.

    2013-04-01

    A numerical model has been developed to simulate coupled thermal and electrical energy transfer processes in a thermoelectric generator (TEG) designed for automotive waste heat recovery systems. This model is capable of computing the overall heat transferred, the electrical power output, and the associated pressure drop for given inlet conditions of the exhaust gas and the available TEG volume. Multiple-filled skutterudites and conventional bismuth telluride are considered for thermoelectric modules (TEMs) for conversion of waste heat from exhaust into usable electrical power. Heat transfer between the hot exhaust gas and the hot side of the TEMs is enhanced with the use of a plate-fin heat exchanger integrated within the TEG and using liquid coolant on the cold side. The TEG is discretized along the exhaust flow direction using a finite-volume method. Each control volume is modeled as a thermal resistance network which consists of integrated submodels including a heat exchanger and a thermoelectric device. The pressure drop along the TEG is calculated using standard pressure loss correlations and viscous drag models. The model is validated to preserve global energy balances and is applied to analyze a prototype TEG with data provided by General Motors. Detailed results are provided for local and global heat transfer and electric power generation. In the companion paper, the model is then applied to consider various TEG topologies using skutterudite and bismuth telluride TEMs.
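    The finite-volume march described above can be caricatured in a few lines: each control volume is a thermal resistance network from hot gas through the TEM to the coolant, and the gas cools as heat is extracted. All parameter values below are illustrative assumptions, not the prototype data provided by General Motors, and the matched-load power formula is a deliberate simplification of a full TEM model:

```python
# Minimal finite-volume march along the exhaust duct. Each control volume
# (CV) is a series thermal resistance network:
#   gas -> fin/hot side (R_gas) -> TEM (R_tem) -> cold plate/coolant (R_cold)
n_cv = 20                         # control volumes along the flow direction
m_dot, cp = 0.03, 1100.0          # exhaust mass flow (kg/s), heat capacity (J/kg/K)
T_gas, T_cool = 550.0, 90.0       # gas inlet and coolant temperatures (deg C)
R_gas, R_tem, R_cold = 0.08, 0.25, 0.02  # thermal resistances per CV (K/W)
alpha, R_int = 0.02, 0.5          # module Seebeck voltage (V/K), resistance (ohm)

P_total = 0.0
for _ in range(n_cv):
    Q = (T_gas - T_cool) / (R_gas + R_tem + R_cold)  # heat through this CV (W)
    dT_tem = Q * R_tem                               # temperature drop across the TEM (K)
    P_total += (alpha * dT_tem) ** 2 / (4 * R_int)   # matched-load electrical power (W)
    T_gas -= Q / (m_dot * cp)                        # gas cools downstream

print(f"power {P_total:.0f} W, exhaust outlet {T_gas:.0f} deg C")
```

    The march captures the key coupling in the paper's model: upstream heat extraction lowers the gas temperature seen by downstream modules, so per-module power falls along the duct.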

  18. Performance Measurement Baseline Change Request

    Data.gov (United States)

    Social Security Administration — The Performance Measurement Baseline Change Request template is used to document changes to scope, cost, schedule, or operational performance metrics for SSA's Major...

  19. A hydrodynamic model for granular material flows including segregation effects

    Science.gov (United States)

    Gilberg, Dominik; Klar, Axel; Steiner, Konrad

    2017-06-01

    The simulation of granular flows including segregation effects in large industrial processes using particle methods is accurate but very time-consuming. To overcome the long computation times, a macroscopic model is a natural choice. Therefore, we couple a mixture-theory-based segregation model to a hydrodynamic model of Navier-Stokes type describing the flow behavior of the granular material. The granular flow model is a hybrid model derived from kinetic theory and a soil-mechanical approach, covering the regime of fast dilute flow as well as slow dense flow, where the density of the granular material is close to the maximum packing density. Originally, the segregation model was formulated by Thornton and Gray for idealized avalanches. It is modified and adapted to be in the preferred form for the coupling. In the final coupled model, the segregation process depends on the local state of the granular system; conversely, the granular system changes as differently mixed regions of the granular material differ, for example, in packing density. The modeling focuses on dry granular flows of two particle types differing only in size, but it can easily be extended to arbitrary granular mixtures of different particle sizes and densities. To solve the coupled system, a finite-volume approach is used. To test the model, the rotational mixing of small and large particles in a tumbler is simulated.

  20. The relationship between psychological distress and baseline sports-related concussion testing.

    Science.gov (United States)

    Bailey, Christopher M; Samples, Hillary L; Broshek, Donna K; Freeman, Jason R; Barth, Jeffrey T

    2010-07-01

    This study examined the effect of psychological distress on neurocognitive performance measured during baseline concussion testing. Archival data were utilized to examine correlations between personality testing and computerized baseline concussion testing. Significantly correlated personality measures were entered into linear regression analyses, predicting baseline concussion testing performance. Suicidal ideation was examined categorically. Athletes underwent testing and screening at a university athletic training facility. Participants included 47 collegiate football players 17 to 19 years old, the majority of whom were in their first year of college. Participants were administered the Concussion Resolution Index (CRI), an internet-based neurocognitive test designed to monitor and manage both at-risk and concussed athletes. Participants took the Personality Assessment Inventory (PAI), a self-administered inventory designed to measure clinical syndromes, treatment considerations, and interpersonal style. Scales and subscales from the PAI were utilized to determine the influence psychological distress had on the CRI indices: simple reaction time, complex reaction time, and processing speed. Analyses revealed several significant correlations among aspects of somatic concern, depression, anxiety, substance abuse, and suicidal ideation and CRI performance, each with at least a moderate effect. When entered into a linear regression, the block of combined psychological symptoms accounted for a significant amount of baseline CRI performance, with moderate to large effects (r = 0.23-0.30). When examined categorically, participants with suicidal ideation showed significantly slower simple reaction time and complex reaction time, with a similar trend on processing speed. Given the possibility of obscured concussion deficits after injury, implications for premature return to play, and the need to target psychological distress outright, these findings heighten the clinical…

  1. Mathematical Model of Thyristor Inverter Including a Series-parallel Resonant Circuit

    OpenAIRE

    Miroslaw Luft; Elzbieta Szychta

    2008-01-01

    The article presents a mathematical model of a thyristor inverter including a series-parallel resonant circuit with the aid of the state variable method. Maple procedures are used to compute current and voltage waveforms in the inverter.

  2. Sex Differences and Self-Reported Attention Problems During Baseline Concussion Testing.

    Science.gov (United States)

    Brooks, Brian L; Iverson, Grant L; Atkins, Joseph E; Zafonte, Ross; Berkner, Paul D

    2016-01-01

    Amateur athletic programs often use computerized cognitive testing as part of their concussion management programs. There is evidence that athletes with preexisting attention problems will have worse cognitive performance and more symptoms at baseline testing. The purpose of this study was to examine whether attention problems affect assessments differently for male and female athletes. Participants were drawn from a database that included 6,840 adolescents from Maine who completed Immediate Postconcussion Assessment and Cognitive Testing (ImPACT) at baseline (primary outcome measure). The final sample included 249 boys and 100 girls with self-reported attention problems. Each participant was individually matched for sex, age, number of past concussions, and sport to a control participant (249 boys, 100 girls). Boys with attention problems had worse reaction time than boys without attention problems. Girls with attention problems had worse visual-motor speed than girls without attention problems. Boys with attention problems reported more total symptoms, including more cognitive-sensory and sleep-arousal symptoms, compared with boys without attention problems. Girls with attention problems reported more cognitive-sensory, sleep-arousal, and affective symptoms than girls without attention problems. When considering the assessment, management, and outcome from concussions in adolescent athletes, it is important to consider both sex and preinjury attention problems regarding cognitive test results and symptom reporting.

  4. An Automated Baseline Correction Method Based on Iterative Morphological Operations.

    Science.gov (United States)

    Chen, Yunliang; Dai, Liankui

    2018-05-01

    Raman spectra usually suffer from baseline drift caused by fluorescence or other factors. Baseline correction is therefore a necessary and crucial step that must be performed before subsequent processing and analysis of Raman spectra. An automated baseline correction method based on iterative morphological operations is proposed in this work. The method first determines the structuring element adaptively and then gradually removes the spectral peaks during iteration to obtain an estimated baseline. Experiments on simulated data and real-world Raman data show that the proposed method is accurate, fast, and flexible for handling different kinds of baselines in various practical situations. Comparison with several state-of-the-art baseline correction methods demonstrates its advantages in terms of accuracy, adaptability, and flexibility. Although only Raman spectra are investigated in this paper, the proposed method should also be applicable to baseline correction of other analytical instrumental signals, such as IR spectra and chromatograms.
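    A heavily simplified sketch of baseline estimation by iterated morphological opening: the paper determines the structuring element adaptively, whereas here the element simply grows each pass (an assumption of this sketch), peeling off progressively wider peaks while leaving the smooth baseline intact:

```python
import numpy as np
from scipy.ndimage import grey_opening

def morph_baseline(y, width=5, n_iter=6):
    """Estimate a baseline by iterated morphological opening.

    Opening (erosion then dilation) with a flat structuring element
    removes peaks narrower than the element. Growing the element each
    pass removes progressively wider peaks; the growth cap keeps the
    element much shorter than the record so a sloped baseline is not
    flattened.
    """
    z = np.asarray(y, dtype=float).copy()
    for _ in range(n_iter):
        z = grey_opening(z, size=width)
        width = min(2 * width + 1, max(3, len(z) // 6))
    return z
```

    Because opening never raises a sample above its original value, the estimate stays at or below the spectrum everywhere, which is exactly the behavior a baseline should have under positive peaks.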

  5. Accurate SHAPE-directed RNA secondary structure modeling, including pseudoknots.

    Science.gov (United States)

    Hajdin, Christine E; Bellaousov, Stanislav; Huggins, Wayne; Leonard, Christopher W; Mathews, David H; Weeks, Kevin M

    2013-04-02

    A pseudoknot forms in an RNA when nucleotides in a loop pair with a region outside the helices that close the loop. Pseudoknots occur relatively rarely in RNA but are highly overrepresented in functionally critical motifs in large catalytic RNAs, in riboswitches, and in regulatory elements of viruses. Pseudoknots are usually excluded from RNA structure prediction algorithms. When included, these pairings are difficult to model accurately, especially in large RNAs, because allowing this structure dramatically increases the number of possible incorrect folds and because it is difficult to search the fold space for an optimal structure. We have developed a concise secondary structure modeling approach that combines SHAPE (selective 2'-hydroxyl acylation analyzed by primer extension) experimental chemical probing information and a simple, but robust, energy model for the entropic cost of single pseudoknot formation. Structures are predicted with iterative refinement, using a dynamic programming algorithm. This melded experimental and thermodynamic energy function predicted the secondary structures and the pseudoknots for a set of 21 challenging RNAs of known structure ranging in size from 34 to 530 nt. On average, 93% of known base pairs were predicted, and all pseudoknots in well-folded RNAs were identified.
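    The melded experimental/thermodynamic energy function builds on a SHAPE pseudo-free-energy term of the Deigan form, ΔG_SHAPE(i) = m·ln(reactivity_i + 1) + b, added for each paired nucleotide; the parameter values below are the commonly cited defaults, assumed here rather than taken from this paper:

```python
import math

def shape_pseudo_energy(reactivity, m=2.6, b=-0.8):
    """Pseudo-free-energy change (kcal/mol) added to a paired nucleotide,
    Delta G_SHAPE = m * ln(reactivity + 1) + b.

    Deigan-style parameters; m = 2.6 and b = -0.8 kcal/mol are commonly
    used defaults (an assumption of this sketch). Low SHAPE reactivity
    indicates a conformationally constrained, likely paired nucleotide,
    so it yields a negative term favoring pairing; high reactivity
    penalizes pairing. Negative reactivities mean "no data".
    """
    if reactivity < 0:
        return 0.0
    return m * math.log(reactivity + 1.0) + b

print(round(shape_pseudo_energy(0.0), 2))  # → -0.8 (favors pairing)
print(round(shape_pseudo_energy(2.0), 2))  # → 2.06 (penalizes pairing)
```

    In a SHAPE-directed folder these per-nucleotide terms are simply added to the thermodynamic energy of each candidate structure before the dynamic programming search.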

  6. Progress Towards an LES Wall Model Including Unresolved Roughness

    Science.gov (United States)

    Craft, Kyle; Redman, Andrew; Aikens, Kurt

    2015-11-01

    Wall models used in large eddy simulations (LES) are often based on theories for hydraulically smooth walls. While this is reasonable for many applications, there are also many where the impact of surface roughness is important. A previously developed wall model has been used primarily for jet engine aeroacoustics. However, jet simulations have not accurately captured thick initial shear layers found in some experimental data. This may partly be due to nozzle wall roughness used in the experiments to promote turbulent boundary layers. As a result, the wall model is extended to include the effects of unresolved wall roughness through appropriate alterations to the log-law. The methodology is tested for incompressible flat plate boundary layers with different surface roughness. Correct trends are noted for the impact of surface roughness on the velocity profile. However, velocity deficit profiles and the Reynolds stresses do not collapse as well as expected. Possible reasons for the discrepancies as well as future work will be presented. This work used the Extreme Science and Engineering Discovery Environment (XSEDE), which is supported by National Science Foundation grant number ACI-1053575. Computational resources on TACC Stampede were provided under XSEDE allocation ENG150001.

  7. Kinetic models of gene expression including non-coding RNAs

    Energy Technology Data Exchange (ETDEWEB)

    Zhdanov, Vladimir P., E-mail: zhdanov@catalysis.r

    2011-03-15

    In cells, genes are transcribed into mRNAs, and the latter are translated into proteins. Due to the feedbacks between these processes, the kinetics of gene expression may be complex even in the simplest genetic networks. The corresponding models have already been reviewed in the literature. A new avenue in this field is related to the recognition that the conventional scenario of gene expression is fully applicable only to prokaryotes whose genomes consist of tightly packed protein-coding sequences. In eukaryotic cells, in contrast, such sequences are relatively rare, and the rest of the genome includes numerous transcript units representing non-coding RNAs (ncRNAs). During the past decade, it has become clear that such RNAs play a crucial role in gene expression and accordingly influence a multitude of cellular processes both in the normal state and during diseases. The numerous biological functions of ncRNAs are based primarily on their abilities to silence genes via pairing with a target mRNA and subsequently preventing its translation or facilitating degradation of the mRNA-ncRNA complex. Many other abilities of ncRNAs have been discovered as well. Our review is focused on the available kinetic models describing the mRNA, ncRNA and protein interplay. In particular, we systematically present the simplest models without kinetic feedbacks, models containing feedbacks and predicting bistability and oscillations in simple genetic networks, and models describing the effect of ncRNAs on complex genetic networks. Mathematically, the presentation is based primarily on temporal mean-field kinetic equations. The stochastic and spatio-temporal effects are also briefly discussed.
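As an illustration of the temporal mean-field kinetic equations the review surveys, here is a minimal sketch (all rate constants are assumed for illustration, not taken from the review) of an mRNA-ncRNA-protein interplay in which the ncRNA silences the mRNA through pairing and joint degradation:

```python
def simulate(k_m=1.0, k_s=0.8, k_p=2.0, d_m=0.1, d_s=0.1, d_p=0.05,
             k_a=0.5, t_end=200.0, dt=0.01):
    """Euler integration of mean-field kinetics.

    m: mRNA, s: ncRNA, p: protein. The ncRNA pairs with the mRNA at rate
    k_a*m*s, removing both (silencing); protein is translated from free mRNA.
    """
    m = s = p = 0.0
    for _ in range(int(t_end / dt)):
        pairing = k_a * m * s
        dm = k_m - d_m * m - pairing
        ds = k_s - d_s * s - pairing
        dp = k_p * m - d_p * p
        m += dm * dt
        s += ds * dt
        p += dp * dt
    return m, s, p
```

Setting k_a = 0 switches the silencing off, in which case the mRNA settles at its unregulated steady state k_m/d_m; with silencing on, both the mRNA and protein levels are suppressed.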

  8. Structure of a traditional baseline data system

    Energy Technology Data Exchange (ETDEWEB)

    1976-12-01

    Research was conducted to determine whether appropriate data exist for the development of a comprehensive statistical baseline data system on the human environment in the Athabasca oil sands region of Alberta. The existing data sources pertinent to the target area were first reviewed and discussed. Criteria were selected to assist the evaluation of data, including type of data collected, source, degree of detail, geographic identification, accessibility, and time frame. These criteria allowed assessing whether the data would be amenable to geographically-coded, continuous monitoring systems. It was found that the Statistics Canada Census provided the most detail, the most complete coverage of the target area, the smallest statistical areas, the greatest consistency in data and data collection, and the most regular collection. The local agency collection efforts were generally oriented toward specific goals and the data intended primarily for intra-agency use. The smallest statistical units in these efforts may be too large to be of value to a common small-area system, and data collection agencies did not generally use coterminous boundaries. Recommendations were made to give primary consideration to Statistics Canada data in the initial development of the baseline data system. Further development of such a system depends on the adoption by local agencies of a common small-area system for data collection. 38 refs., 6 figs.

  9. Fragrance mix II in the baseline series contributes significantly to detection of fragrance allergy

    DEFF Research Database (Denmark)

    Heisterberg, Maria S Vølund; Andersen, Klaus E.; Avnstorp, Christian

    2010-01-01

Background: Fragrance mix II (FM II) is a relatively new screening marker for fragrance contact allergy. It was introduced into the patch test baseline series in Denmark in 2005 and contains six different fragrance chemicals that are commonly present in cosmetic products and are known allergens. Aim: To investigate the diagnostic contribution of including FM II in the baseline series by comparing it with other screening markers of fragrance allergy: fragrance mix I (FM I), Myroxylon pereirae and hydroxyisohexyl 3-cyclohexene carboxaldehyde (HICC). Method: Retrospective study of 12 302 patients consecutively...

  10. Mathematical Model of Thyristor Inverter Including a Series-parallel Resonant Circuit

    Directory of Open Access Journals (Sweden)

    Miroslaw Luft

    2008-01-01

The article presents a mathematical model of a thyristor inverter including a series-parallel resonant circuit, developed with the aid of the state variable method. Maple procedures are used to compute current and voltage waveforms in the inverter.
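As a hedged illustration of the state variable method applied to a series-parallel resonant circuit (component values and the idealized square-wave bridge voltage are assumptions, not the article's parameters), the inductor current and the two capacitor voltages can be taken as states and integrated numerically:

```python
import math

def simulate_lcc(Vdc=100.0, f_sw=20e3, L=100e-6, Cs=1e-6, Cp=1e-6, R=50.0,
                 t_end=1e-3, dt=1e-8):
    """State-variable model of a series-parallel (LCC) resonant circuit.

    States: i_L (series inductor current), v_Cs (series capacitor voltage),
    v_Cp (parallel capacitor voltage across the load R). The thyristor
    bridge is idealized as a square-wave source at the switching frequency.
    """
    iL = vCs = vCp = 0.0
    out = []
    for n in range(int(t_end / dt)):
        t = n * dt
        vin = Vdc if (t * f_sw) % 1.0 < 0.5 else -Vdc  # square wave
        diL = (vin - vCs - vCp) / L
        dvCs = iL / Cs
        dvCp = (iL - vCp / R) / Cp
        iL += diL * dt
        vCs += dvCs * dt
        vCp += dvCp * dt
        out.append((t, iL, vCp))
    return out
```

A symbolic tool such as the article's Maple procedures would derive the same state equations analytically; the forward-Euler loop above simply traces the resulting current and voltage waveforms.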

  11. Rationing with baselines

    DEFF Research Database (Denmark)

    Hougaard, Jens Leth; Moreno-Ternero, Juan D.; Østerdal, Lars Peter Raahave

    2013-01-01

We introduce a new operator for general rationing problems in which, besides conflicting claims, individual baselines play an important role in the rationing process. The operator builds on ideas of composition, which are frequent not only in rationing but also in related problems such as bargaining, choice, and queuing. We characterize the operator and show how it preserves some standard axioms in the literature on rationing. We also relate it to recent contributions in that literature.

  12. Markers for context-responsiveness: Client baseline interpersonal problems moderate the efficacy of two psychotherapies for generalized anxiety disorder.

    Science.gov (United States)

    Gomez Penedo, Juan Martin; Constantino, Michael J; Coyne, Alice E; Westra, Henny A; Antony, Martin M

    2017-10-01

To follow up a randomized clinical trial that compared the acute and long-term efficacy of 15 sessions of cognitive-behavioral therapy (CBT) versus CBT integrated with motivational interviewing (MI) for severe generalized anxiety disorder (GAD; Westra, Constantino, & Antony, 2016), we (a) characterized the sample's baseline interpersonal problems, and (b) analyzed the role of several theory-relevant problems as moderators of the comparative treatment effects on outcome. We first compared clients' (N = 85) baseline interpersonal problems profile to a general clinical sample. We next conducted piecewise, 2-level growth models to analyze the interactive effects of treatment condition and the hypothesized interpersonal problem indices of nonassertiveness (ranging from low to high), exploitability (ranging from low to high on this specific combination of nonassertiveness and friendliness), and overall agency (ranging from more problems of being too submissive to more problems of being too domineering, including friendly or hostile variants) on acute and follow-up worry reduction. Finally, we conducted hierarchical generalized linear models to examine these interactive effects on the likelihood of achieving clinically meaningful worry reduction across follow-up. As expected, the GAD clients evidenced more nonassertive and exploitable interpersonal problems than the general clinical sample. Also as predicted, clients with more problematic nonassertiveness and low overall agency in their relationships had greater follow-up worry reduction in MI-CBT versus CBT, including to a clinically significant degree for the agency by treatment interaction. GAD-specific interpersonal problems can serve as contextual markers for integrative treatment selection and planning. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  13. Including sugar cane in the agro-ecosystem model ORCHIDEE-STICS

    Science.gov (United States)

    Valade, A.; Vuichard, N.; Ciais, P.; Viovy, N.

    2010-12-01

    With 4 million ha currently grown for ethanol in Brazil only, approximately half the global bioethanol production in 2005 (Smeets 2008), and a devoted land area expected to expand globally in the years to come, sugar cane is at the heart of the biofuel debate. Indeed, ethanol made from biomass is currently the most widespread option for alternative transportation fuels. It was originally promoted as a carbon neutral energy resource that could bring energy independence to countries and local opportunities to farmers, until attention was drawn to its environmental and socio-economical drawbacks. It is still not clear to which extent it is a solution or a contributor to climate change mitigation. Dynamic Global Vegetation models can help address these issues and quantify the potential impacts of biofuels on ecosystems at scales ranging from on-site to global. The global agro-ecosystem model ORCHIDEE describes water, carbon and energy exchanges at the soil-atmosphere interface for a limited number of natural and agricultural vegetation types. In order to integrate agricultural management to the simulations and to capture more accurately the specificity of crops' phenology, ORCHIDEE has been coupled with the agronomical model STICS. The resulting crop-oriented vegetation model ORCHIDEE-STICS has been used so far to simulate temperate crops such as wheat, corn and soybean. As a generic ecosystem model, each grid cell can include several vegetation types with their own phenology and management practices, making it suitable to spatial simulations. Here, ORCHIDEE-STICS is altered to include sugar cane as a new agricultural Plant functional Type, implemented and parametrized using the STICS approach. An on-site calibration and validation is then performed based on biomass and flux chamber measurements in several sites in Australia and variables such as LAI, dry weight, heat fluxes and respiration are used to evaluate the ability of the model to simulate the specific

  14. Mercury baseline levels in Flemish soils (Belgium)

    International Nuclear Information System (INIS)

    Tack, Filip M.G.; Vanhaesebroeck, Thomas; Verloo, Marc G.; Van Rompaey, Kurt; Ranst, Eric van

    2005-01-01

It is important to establish contaminant levels that are normally present in soils to provide baseline data for pollution studies. Mercury is a toxic element of concern. This study was aimed at assessing baseline mercury levels in soils in Flanders. In a previous study, mercury contents in soils in Oost-Vlaanderen were found to be significantly above levels reported elsewhere. For the current study, observations were extended over two more provinces, West-Vlaanderen and Antwerpen. Ranges of soil Hg contents were distinctly higher in the province Oost-Vlaanderen (interquartile range from 0.09 to 0.43 mg/kg) than in the other provinces (interquartile ranges from 0.07 to 0.13 and 0.07 to 0.15 mg/kg for West-Vlaanderen and Antwerpen, respectively). The standard threshold method was applied to separate soils containing baseline levels of Hg from the data. Baseline concentrations for Hg were characterised by a median of 0.10 mg Hg/kg dry soil, an interquartile range from 0.07 to 0.14 mg/kg and a 90th percentile value of 0.30 mg/kg. The influence of soil properties such as clay and organic carbon contents, and pH on baseline Hg concentrations was not important. Maps of the spatial distribution of Hg levels showed that the province Oost-Vlaanderen exhibited zones with systematically higher Hg soil contents. This may be related to the former presence of many small-scale industries employing mercury in that region. - Increased mercury levels may reflect human activity.
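The descriptive summary used above for baseline concentrations (median, interquartile range, 90th percentile) can be computed with Python's standard library; a minimal sketch:

```python
import statistics

def baseline_summary(values):
    """Median, interquartile range, and 90th percentile of a sample,
    as used to characterize baseline Hg concentrations (mg/kg dry soil)."""
    pct = statistics.quantiles(values, n=100, method="inclusive")
    return {
        "median": pct[49],          # 50th percentile
        "iqr": (pct[24], pct[74]),  # 25th to 75th percentile
        "p90": pct[89],             # 90th percentile
    }
```

With `n=100`, `statistics.quantiles` returns the 99 percentile cut points, so `pct[k-1]` is the k-th percentile; `method="inclusive"` treats the sample min and max as the population extremes.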

  15. Transuranic waste baseline inventory report. Revision No. 3

    International Nuclear Information System (INIS)

    1996-06-01

The Transuranic Waste Baseline Inventory Report (TWBIR) establishes a methodology for grouping wastes of similar physical and chemical properties from across the U.S. Department of Energy (DOE) transuranic (TRU) waste system into a series of "waste profiles" that can be used as the basis for waste form discussions with regulatory agencies. The purpose of Revisions 0 and 1 of this report was to provide data to be included in the Sandia National Laboratories/New Mexico (SNL/NM) performance assessment (PA) processes for the Waste Isolation Pilot Plant (WIPP). Revision 2 of the document expanded the original purpose and was also intended to support the WIPP Land Withdrawal Act (LWA) requirement for providing the total DOE TRU waste inventory. The document included a chapter and an appendix that discussed the total DOE TRU waste inventory, including nondefense, commercial, polychlorinated biphenyls (PCB)-contaminated, and buried (predominately pre-1970) TRU wastes that are not planned to be disposed of at WIPP.

  16. The CMS Computing Model

    International Nuclear Information System (INIS)

    Bonacorsi, D.

    2007-01-01

The CMS experiment at LHC has developed a baseline Computing Model addressing the needs of a computing system capable of operating in the first years of LHC running. It is focused on a data model with heavy streaming at the raw data level based on trigger, and on the achievement of the maximum flexibility in the use of distributed computing resources. The CMS distributed Computing Model includes a Tier-0 centre at CERN, a CMS Analysis Facility at CERN, several Tier-1 centres located at large regional computing centres, and many Tier-2 centres worldwide. The workflows have been identified, along with a baseline architecture for the data management infrastructure. This model is also being tested in Grid Service Challenges of increasing complexity, coordinated with the Worldwide LHC Computing Grid community.

  17. Baseline results of the first healthy schools evaluation among a community of young, Irish, urban disadvantaged children and a comparison of outcomes with international norms.

    Science.gov (United States)

Comiskey, Catherine M; O'Sullivan, Karin; Quirke, Mary B; Wynne, Ciara; Hollywood, Eleanor; McGilloway, Sinead

    2012-11-01

    In 2008, the Irish Government initiated a pilot Healthy Schools Programme based on the World Health Organization Health Promoting Schools Model among children attending schools officially designated as urban and disadvantaged. We present here the first results on physical and emotional health and the relationship between childhood depression and demographic and socioeconomic factors. The Healthy Schools Programme evaluation was a 3-year longitudinal outcome study among urban disadvantaged children aged 4 to 12 years. Physical and psychological health outcomes were measured using validated, international instruments at baseline. Outcomes at baseline were compared with international norms and where differences were found, results were statistically modeled to determine factors predicting poor outcomes. A total of 552 children responded at baseline, representing over 50% of all eligible children available to participate from 7 schools. Findings at baseline revealed that in general, children did not differ significantly from international norms. However, detailed analysis of the childhood depression scores revealed that in order of importance, psychological well-being, the school environment, social support, and peer relations and age were statistically significant predictors of increased childhood depression in children under 12 years of age. Future health and well-being studies in schools among urban disadvantaged children need to broaden their scope to include measures of depression in children under 12 years of age and be cognisant of the impact of the school environment on the mental and emotional health of the very young. © 2012, American School Health Association.

  18. Development of a Soil Organic Carbon Baseline for Otjozondjupa, Namibia

    OpenAIRE

    Nijbroek, R.; Kempen, B.; Mutua, J.; Soderstrom, M.; Piikki, K.; Hengari, S.; Andreas, A.

    2017-01-01

    Land Degradation Neutrality (LDN) has been piloted in 14 countries and will be scaled up to over 120 countries. As a LDN pilot country, Namibia developed sub-national LDN baselines in Otjozondjupa Region. In addition to the three LDN indicators (soil organic carbon, land productivity and land cover change), Namibia also regards bush encroachment as an important form of land degradation. We collected 219 soil profiles and used Random Forest modelling to develop the soil organic carbon stock ba...

  19. Baseline geochemistry of soil and bedrock Tshirege Member of the Bandelier Tuff at MDA-P

    International Nuclear Information System (INIS)

    Warren, R.G.; McDonald, E.V.; Ryti, R.T.

    1997-08-01

    This report provides baseline geochemistry for soils (including fill), and for bedrock within three specific areas that are planned for use in the remediation of Material Disposal Area P (MDA-P) at Technical Area 16 (TA-16). The baseline chemistry includes leachable element concentrations for both soils and bedrock and total element concentrations for all soil samples and for two selected bedrock samples. MDA-P operated from the early 1950s to 1984 as a landfill for rubble and debris generated by the burning of high explosives (HE) at the TA-16 Burning Ground, HE-contaminated equipment and material, barium nitrate sand, building materials, and trash. The aim of this report is to establish causes for recognizable chemical differences between the background and baseline data sets. In many cases, the authors conclude that recognizable differences represent natural enrichments. In other cases, differences are best attributed to analytical problems. But most importantly, the comparison of background and baseline geochemistry demonstrates significant contamination for several elements not only at the two remedial sites near the TA-16 Burning Ground, but also within the entire region of the background study. This contamination is highly localized very near to the surface in soil and fill, and probably also in bedrock; consequently, upper tolerance limits (UTLs) calculated as upper 95% confidence limits of the 95th percentile are of little value and thus are not provided. This report instead provides basic statistical summaries and graphical comparisons for background and baseline samples to guide strategies for remediation of the three sites to be used in the restoration of MDA-P

  20. Association of baseline bleeding pattern on amenorrhea with levonorgestrel intrauterine system use.

    Science.gov (United States)

    Mejia, Manuela; McNicholas, Colleen; Madden, Tessa; Peipert, Jeffrey F

    2016-11-01

This study aims to evaluate the effect of baseline bleeding patterns on rates of amenorrhea reported at 12 months in levonorgestrel (LNG) 52 mg intrauterine system (IUS) users. We also assessed the effect of baseline bleeding patterns at 3 and 6 months postinsertion. In this secondary analysis of the Contraceptive CHOICE Project, we included participants who had an LNG-IUS inserted within 1 month of enrollment and continued use for 12 months. Using 12-month telephone survey data, we defined amenorrhea at 12 months of use as no bleeding or spotting during the previous 6 months. We used chi-square tests and multivariable logistic regression to assess the association of baseline bleeding pattern with amenorrhea while controlling for confounding variables. Of 1802 continuous 12-month LNG-IUS users, amenorrhea was reported by 4.9%, 14.8% and 15.4% of participants at 3, 6 and 12 months, respectively. Participants with light baseline bleeding or short duration of flow reported higher rates of amenorrhea at 3 and 6 months postinsertion, and participants who reported heavy baseline bleeding were less likely to report amenorrhea at 12 months than those who reported moderate bleeding (adjusted OR, 0.36; 95% CI, 0.16-0.69). Women with heavier menstrual bleeding are less likely than women with moderate flow to report amenorrhea following 12 months of LNG-IUS use. Baseline heavy menstrual flow reduces the likelihood of amenorrhea with LNG-IUS use, information that could impact contraceptive counseling. Anticipatory counseling can improve method satisfaction and continuation, an important strategy to continue to reduce unintended pregnancy and abortion rates. Copyright © 2016 Elsevier Inc. All rights reserved.

  1. 324 Building Baseline Radiological Characterization

    Energy Technology Data Exchange (ETDEWEB)

    R.J. Reeder, J.C. Cooper

    2010-06-24

    This report documents the analysis of radiological data collected as part of the characterization study performed in 1998. The study was performed to create a baseline of the radiological conditions in the 324 Building.

  2. Digital baseline estimation method for multi-channel pulse height analyzing

    International Nuclear Information System (INIS)

    Xiao Wuyun; Wei Yixiang; Ai Xianyun

    2005-01-01

    The basic features of digital baseline estimation for multi-channel pulse height analysis are introduced. The weight-function of minimum-noise baseline filter is deduced with functional variational calculus. The frequency response of this filter is also deduced with Fourier transformation, and the influence of parameters on amplitude frequency response characteristics is discussed. With MATLAB software, the noise voltage signal from the charge sensitive preamplifier is simulated, and the processing effect of minimum-noise digital baseline estimation is verified. According to the results of this research, digital baseline estimation method can estimate baseline optimally, and it is very suitable to be used in digital multi-channel pulse height analysis. (authors)
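A minimal numerical sketch of the idea (in Python rather than MATLAB; signal parameters are assumed for illustration): the abstract's minimum-noise weight function is derived variationally, but for white, uncorrelated noise the minimum-variance baseline estimate over pre-trigger samples reduces to a uniform weighted average, which is then subtracted from the pulse amplitude:

```python
import random

def baseline_estimate(samples, weights=None):
    """Weighted baseline estimate from pre-trigger samples.

    For white noise, the minimum-variance weight function reduces to
    uniform weights, the default here; correlated noise would call for
    the variationally derived weights discussed in the paper.
    """
    if weights is None:
        weights = [1.0 / len(samples)] * len(samples)
    return sum(w * s for w, s in zip(weights, samples))

# Simulated preamplifier output (assumed values): flat baseline plus white
# noise, followed by a pulse whose height must be baseline-corrected.
random.seed(1)
true_baseline = 50.0
pre_trigger = [true_baseline + random.gauss(0.0, 2.0) for _ in range(256)]
estimate = baseline_estimate(pre_trigger)
corrected_height = (true_baseline + 500.0) - estimate  # peak minus baseline
```

Averaging 256 samples reduces the baseline noise by a factor of 16, which is the sense in which digital baseline estimation improves multi-channel pulse height resolution.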

  3. Clarifying the use of aggregated exposures in multilevel models: self-included vs. self-excluded measures.

    Directory of Open Access Journals (Sweden)

    Etsuji Suzuki

Multilevel analyses are ideally suited to assessing the effects of ecological (higher-level) and individual (lower-level) exposure variables simultaneously. In applying such analyses to measures of ecologies in epidemiological studies, individual variables are usually aggregated into the higher-level unit. Typically, the aggregated measure includes the responses of every individual belonging to that group (i.e., it constitutes a self-included measure). More recently, researchers have developed an aggregate measure which excludes the response of the individual to whom the aggregate measure is linked (i.e., a self-excluded measure). In this study, we clarify the substantive and technical properties of these two measures when they are used as exposures in multilevel models. Although the differences between the two aggregated measures are mathematically subtle, distinguishing between them is important in terms of the specific scientific questions to be addressed. We then show how these measures can be used in two distinct types of multilevel models (self-included and self-excluded models) and interpret the parameters in each model by imposing hypothetical interventions. The concept is tested on empirical data of workplace social capital and employees' systolic blood pressure. Researchers assume group-level interventions when using a self-included model, and individual-level interventions when using a self-excluded model. Analytical re-parameterizations of these two models highlight their differences in parameter interpretation. Cluster-mean-centered self-included models enable researchers to decompose the collective effect into its within- and between-group components. The benefit of the cluster-mean centering procedure is further discussed in terms of hypothetical interventions. When investigating the potential roles of aggregated variables, researchers should carefully explore which type of model, self-included or self-excluded, is suitable for a given situation.
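The two aggregation schemes can be made concrete in a few lines (an illustrative sketch; names are hypothetical):

```python
def self_included_mean(group_values, i):
    """Group aggregate that includes individual i's own response."""
    return sum(group_values) / len(group_values)

def self_excluded_mean(group_values, i):
    """Group aggregate linked to individual i but excluding i's response."""
    return (sum(group_values) - group_values[i]) / (len(group_values) - 1)
```

The mathematically subtle link between the two is the identity self_excluded = (n * self_included - x_i) / (n - 1), which is why the distinction matters substantively (which intervention is imagined) more than numerically in large groups.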

  4. Meta-Analysis of the Relation of Baseline Right Ventricular Function to Response to Cardiac Resynchronization Therapy.

    Science.gov (United States)

    Sharma, Abhishek; Bax, Jerome J; Vallakati, Ajay; Goel, Sunny; Lavie, Carl J; Kassotis, John; Mukherjee, Debabrata; Einstein, Andrew; Warrier, Nikhil; Lazar, Jason M

    2016-04-15

Right ventricular (RV) dysfunction has been associated with adverse clinical outcomes in patients with heart failure (HF). Cardiac resynchronization therapy (CRT) improves left ventricular (LV) size and function in patients with markedly abnormal electrocardiogram QRS duration. However, the relation of baseline RV function to response to CRT has not been well described. In this study, we aimed to investigate the relation of baseline RV function to response to CRT as assessed by change in LV ejection fraction (EF). A systematic search of studies published from 1966 to May 31, 2015 was conducted using PubMed, CINAHL, Cochrane CENTRAL, and the Web of Science databases. Studies were included if they reported (1) parameters of baseline RV function (tricuspid annular plane systolic excursion [TAPSE], RVEF, RV basal strain, or RV fractional area change [FAC]) and (2) LVEF before and after CRT. Random-effects metaregression was used to evaluate the effect of baseline RV function parameters on change in LVEF. Sixteen studies (n = 1,764) were selected for the final analysis. Random-effects metaregression analysis showed no significant association between the magnitude of the difference in EF before and after CRT and baseline TAPSE (β = 0.005, p = 0.989); baseline RVEF (β = 0.270, p = 0.493); baseline RVFAC (β = -0.367, p = 0.06); or baseline basal strain (β = -0.342, p = 0.462) after a mean follow-up period of 10.5 months. In conclusion, baseline RV function as assessed by TAPSE, FAC, basal strain, or RVEF does not determine response to CRT as assessed by change in LVEF. Copyright © 2016 Elsevier Inc. All rights reserved.

  5. Integrated Baseline System (IBS) Version 2.0: Utilities Guide

    Energy Technology Data Exchange (ETDEWEB)

    Burford, M.J.; Downing, T.R.; Williams, J.R. [Pacific Northwest Lab., Richland, WA (United States); Bower, J.C. [Bower Software Services, Kennewick, WA (United States)

    1994-03-01

The Integrated Baseline System (IBS) is an emergency management planning and analysis tool being developed under the direction of the US Army Nuclear and Chemical Agency. This Utilities Guide explains how you can use the IBS utility programs to manage and manipulate various kinds of IBS data. These programs include utilities for creating, editing, and displaying maps and other data that are referenced to geographic location. The intended audience for this document is chiefly data managers, but also system managers and some emergency management planners and analysts.

  6. Stack Characterization in CryoSat Level1b SAR/SARin Baseline C

    Science.gov (United States)

    Scagliola, Michele; Fornari, Marco; Di Giacinto, Andrea; Bouffard, Jerome; Féménias, Pierre; Parrinello, Tommaso

    2015-04-01

CryoSat was launched on the 8th April 2010 and is the first European ice mission dedicated to the monitoring of precise changes in the thickness of polar ice sheets and floating sea ice. CryoSat is the first altimetry mission operating in SAR mode and it carries an innovative radar altimeter called the Synthetic Aperture Interferometric Altimeter (SIRAL), which transmits pulses at a high pulse repetition frequency, making the received echoes phase coherent and suitable for azimuth processing. The current CryoSat IPF (Instrument Processing Facility), Baseline B, was released into operation in February 2012. After more than 2 years of development, the release into operations of Baseline C is expected in the first half of 2015. It is worth recalling here that the CryoSat SAR/SARin IPF1 generates 20 Hz waveforms at an approximately equally spaced set of ground locations on the Earth surface, i.e. surface samples, and that a surface sample gathers a collection of single-look echoes coming from the processed bursts during the time of visibility. Thus, for a given surface sample, the stack can be defined as the collection of all the single-look echoes pointing to the current surface sample, after applying all the necessary range corrections. The L1B product contains the power average of all the single-look echoes in the stack: the multi-looked L1B waveform. This reduces the data volume, while removing some information contained in the single looks that is useful for characterizing the surface and modelling the L1B waveform. To recover such information, a set of parameters has been added to the L1B product: the stack characterization, or beam behaviour parameters. The stack characterization, already included in previous Baselines, has been reviewed and expanded in Baseline C. This poster describes all the stack characterization parameters, detailing what they represent and how they have been computed. In detail, these parameters can be summarized as: - Stack

  7. The dynamics of integrate-and-fire: mean versus variance modulations and dependence on baseline parameters.

    Science.gov (United States)

    Pressley, Joanna; Troyer, Todd W

    2011-05-01

    The leaky integrate-and-fire (LIF) is the simplest neuron model that captures the essential properties of neuronal signaling. Yet common intuitions are inadequate to explain basic properties of LIF responses to sinusoidal modulations of the input. Here we examine responses to low and moderate frequency modulations of both the mean and variance of the input current and quantify how these responses depend on baseline parameters. Across parameters, responses to modulations in the mean current are low pass, approaching zero in the limit of high frequencies. For very low baseline firing rates, the response cutoff frequency matches that expected from membrane integration. However, the cutoff shows a rapid, supralinear increase with firing rate, with a steeper increase in the case of lower noise. For modulations of the input variance, the gain at high frequency remains finite. Here, we show that the low-frequency responses depend strongly on baseline parameters and derive an analytic condition specifying the parameters at which responses switch from being dominated by low versus high frequencies. Additionally, we show that the resonant responses for variance modulations have properties not expected for common oscillatory resonances: they peak at frequencies higher than the baseline firing rate and persist when oscillatory spiking is disrupted by high noise. Finally, the responses to mean and variance modulations are shown to have a complementary dependence on baseline parameters at higher frequencies, resulting in responses to modulations of Poisson input rates that are independent of baseline input statistics.
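The dependence of the firing response on baseline parameters can be explored numerically. Below is a minimal, hedged sketch (all parameter values are assumed, not taken from the paper) of an Euler-Maruyama LIF simulation whose mean input is sinusoidally modulated; comparing runs with different baseline means illustrates the dependence on baseline drive:

```python
import math
import random

def lif_spike_times(mean, amp, freq, sigma=0.5, tau=0.02, v_th=1.0,
                    v_reset=0.0, t_end=10.0, dt=1e-4, seed=0):
    """Euler-Maruyama simulation of a leaky integrate-and-fire neuron
    whose mean input current is sinusoidally modulated at `freq` (Hz),
    with white-noise input of amplitude `sigma`."""
    rng = random.Random(seed)
    v, spikes = v_reset, []
    for n in range(int(t_end / dt)):
        t = n * dt
        drive = mean + amp * math.sin(2.0 * math.pi * freq * t)
        v += dt * (drive - v) / tau + sigma * math.sqrt(dt / tau) * rng.gauss(0.0, 1.0)
        if v >= v_th:           # threshold crossing: spike and reset
            spikes.append(t)
            v = v_reset
    return spikes
```

A suprathreshold baseline mean drives regular firing, while a subthreshold mean produces sparser, noise-driven spikes, the two regimes in which the response cutoff behaves so differently in the abstract.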

  8. Determinants of rapid weight gain during infancy: baseline results from the NOURISH randomised controlled trial

    Directory of Open Access Journals (Sweden)

    Mihrshahi Seema

    2011-11-01

Background: Rapid weight gain in infancy is an important predictor of obesity in later childhood. Our aim was to determine which modifiable variables are associated with rapid weight gain in early life. Methods: Subjects were healthy infants enrolled in NOURISH, a randomised controlled trial evaluating an intervention to promote positive early feeding practices. This analysis used the birth and baseline data for NOURISH. Birthweight was collected from hospital records, and infants were also weighed at baseline assessment, when they were aged 4-7 months, before randomisation. Infant feeding practices and demographic variables were collected from the mother using a self-administered questionnaire. Rapid weight gain was defined as an increase in weight-for-age Z-score (using WHO standards) above 0.67 SD from birth to baseline assessment, which is interpreted clinically as crossing centile lines on a growth chart. Variables associated with rapid weight gain were evaluated using a multivariable logistic regression model. Results: Complete data were available for 612 infants (88% of the total sample recruited) with a mean (SD) age of 4.3 (1.0) months at baseline assessment. After adjusting for mother's age, smoking in pregnancy, BMI and education, and infant birthweight, age, gender and introduction of solid foods, the only two modifiable factors associated with rapid weight gain that attained statistical significance were formula feeding [OR = 1.72 (95% CI 1.01-2.94), P = 0.047] and feeding on schedule [OR = 2.29 (95% CI 1.14-4.61), P = 0.020]. Male gender and lower birthweight were non-modifiable factors associated with rapid weight gain. Conclusions: This analysis supports the contention that there is an association between formula feeding, feeding to schedule and weight gain in the first months of life. Mechanisms may include the actual content of formula milk (e.g. higher protein intake) or differences in feeding styles, such as feeding to schedule.
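The rapid-weight-gain criterion used in this record is easy to operationalize. A hedged sketch follows: the real WHO standards compute Z-scores from age- and sex-specific LMS (Box-Cox) reference tables, whereas this illustration assumes a simplified normal reference with hypothetical median and SD values:

```python
def weight_for_age_z(weight_kg, ref_median_kg, ref_sd_kg):
    """Weight-for-age Z-score under a simplified normal reference.
    (The actual WHO standards use age- and sex-specific LMS parameters.)"""
    return (weight_kg - ref_median_kg) / ref_sd_kg

def rapid_weight_gain(z_birth, z_baseline, threshold=0.67):
    """NOURISH definition: a rise in weight-for-age Z-score of more than
    0.67 SD from birth to baseline, i.e. upward crossing of a centile band."""
    return (z_baseline - z_birth) > threshold
```

For example, an infant moving from Z = -0.5 at birth to Z = +0.5 at the 4-7 month assessment has gained 1.0 SD and would be classified as a rapid gainer.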

  9. Simple suggestions for including vertical physics in oil spill models

    International Nuclear Information System (INIS)

    D'Asaro, Eric; University of Washington, Seattle, WA

    2001-01-01

    Current models of oil spills include no vertical physics. They neglect the effect of vertical water motions on the transport and concentration of floating oil. Some simple ways to introduce vertical physics are suggested here. The major suggestion is to routinely measure the density stratification of the upper ocean during oil spills in order to develop a database on the effect of stratification. (Author)

  10. First Grade Baseline Evaluation

    Science.gov (United States)

    Center for Innovation in Assessment (NJ1), 2013

    2013-01-01

    The First Grade Baseline Evaluation is an optional tool that can be used at the beginning of the school year to help teachers get to know the reading and language skills of each student. The evaluation is composed of seven screenings. Teachers may use the entire evaluation or choose to use those individual screenings that they find most beneficial…

  11. A histopathological score on baseline biopsies from elderly donors predicts outcome 1 year after renal transplantation

    DEFF Research Database (Denmark)

    Toft, Birgitte G; Federspiel, Birgitte H; Sørensen, Søren S

    2012-01-01

    Kidneys from elderly deceased patients and otherwise marginal donors may be considered for transplantation, and a pretransplantation histopathological score for prediction of postoperative outcome is warranted. In a retrospective design, 29 baseline renal needle biopsies from elderly deceased donors ... wall thickness of arteries and/or arterioles. Nineteen renal baseline biopsies from 15 donors (age: 64 ± 10 years) were included and, following consensus, the histopathological score was 4.3 ± 2.1 (intraclass correlation coefficient: 0.81; confidence interval: 0.66-0.92). The donor organs were used ... Danish donors a histopathological score on baseline renal needle biopsies, with at least ten glomeruli and one artery present, predicts graft function 1 year after transplantation.

  12. Baseline Motivation Type as a Predictor of Dropout in a Healthy Eating Text Messaging Program.

    Science.gov (United States)

    Coa, Kisha; Patrick, Heather

    2016-09-29

    Growing evidence suggests that text messaging programs are effective in facilitating health behavior change. However, high dropout rates limit the potential effectiveness of these programs. This paper describes patterns of early dropout in the HealthyYou text (HYTxt) program, with a focus on the impact of baseline motivation quality on dropout, as characterized by Self-Determination Theory (SDT). This analysis included 193 users of HYTxt, a diet and physical activity text messaging intervention developed by the US National Cancer Institute. Descriptive statistics were computed, and logistic regression models were run to examine the association between baseline motivation type and early program dropout. Overall, 43.0% (83/193) of users dropped out of the program; of these, 65.1% (54/83; 28.0% of all users) did so within the first 2 weeks. Users with higher autonomous motivation had significantly lower odds of dropping out within the first 2 weeks. A one unit increase in autonomous motivation was associated with lower odds (odds ratio 0.44, 95% CI 0.24-0.81) of early dropout, which persisted after adjusting for level of controlled motivation. Applying SDT-based strategies to enhance autonomous motivation might reduce early dropout rates, which can improve program exposure and effectiveness.
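The reported effect, "a one unit increase in autonomous motivation was associated with lower odds (odds ratio 0.44, 95% CI 0.24-0.81)", is a standard transformation of a logistic-regression coefficient. A sketch with a hypothetical coefficient and standard error chosen to land near those values:

```python
import math

# Hedged sketch: convert a logistic-regression coefficient (log-odds) into
# an odds ratio with a 95% Wald confidence interval. beta and se here are
# hypothetical, not the fitted HYTxt values.

def or_with_ci(beta, se, z=1.96):
    return math.exp(beta), math.exp(beta - z * se), math.exp(beta + z * se)

odds, lo, hi = or_with_ci(beta=-0.82, se=0.31)
print(f"OR = {odds:.2f}, 95% CI {lo:.2f}-{hi:.2f}")  # OR = 0.44, 95% CI 0.24-0.81
```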

  13. Esophageal acid exposure decreases intraluminal baseline impedance levels

    NARCIS (Netherlands)

    Kessing, Boudewijn F.; Bredenoord, Albert J.; Weijenborg, Pim W.; Hemmink, Gerrit J. M.; Loots, Clara M.; Smout, A. J. P. M.

    2011-01-01

    Intraluminal baseline impedance levels are determined by the conductivity of the esophageal wall and can be decreased in gastroesophageal reflux disease (GERD) patients. The aim of this study was to investigate the baseline impedance in GERD patients, on and off proton pump inhibitor (PPI), and in

  14. Measuring cognitive change with ImPACT: the aggregate baseline approach.

    Science.gov (United States)

    Bruce, Jared M; Echemendia, Ruben J; Meeuwisse, Willem; Hutchison, Michael G; Aubry, Mark; Comper, Paul

    2017-11-01

    The Immediate Post-Concussion Assessment and Cognitive Test (ImPACT) is commonly used to assess baseline and post-injury cognition among athletes in North America. Despite this, several studies have questioned the reliability of ImPACT when given at intervals employed in clinical practice. Poor test-retest reliability reduces test sensitivity to cognitive decline, increasing the likelihood that concussed athletes will be returned to play prematurely. We recently showed that the reliability of ImPACT can be increased when using a new composite structure and the aggregate of two baselines to predict subsequent performance. The purpose of the present study was to confirm our previous findings and determine whether the addition of a third baseline would further increase the test-retest reliability of ImPACT. Data from 97 English speaking professional hockey players who had received at least 4 ImPACT baseline evaluations were extracted from a National Hockey League Concussion Program database. Linear regression was used to determine whether each of the first three testing sessions accounted for unique variance in the fourth testing session. Results confirmed that the aggregate baseline approach improves the psychometric properties of ImPACT, with most indices demonstrating adequate or better test-retest reliability for clinical use. The aggregate baseline approach provides a modest clinical benefit when recent baselines are available - and a more substantial benefit when compared to approaches that obtain baseline measures only once during the course of a multi-year playing career. Pending confirmation in diverse samples, neuropsychologists are encouraged to use the aggregate baseline approach to best quantify cognitive change following sports concussion.
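The statistical intuition behind the aggregate baseline approach can be shown with simulated scores (not ImPACT data): averaging two noisy baselines shrinks the measurement error of the estimated true score by roughly 1/sqrt(2), which is what improves test-retest reliability.

```python
import random
import statistics

# Simulated illustration with invented numbers, not ImPACT norms: each
# baseline is the athlete's true score plus test-retest noise; aggregating
# two baselines reduces the error SD by about a factor of sqrt(2).

random.seed(1)
TRUE_SCORE, NOISE_SD = 100.0, 5.0

def baseline():
    return TRUE_SCORE + random.gauss(0, NOISE_SD)

single = [baseline() for _ in range(10_000)]
aggregate = [(baseline() + baseline()) / 2 for _ in range(10_000)]

print(round(statistics.stdev(single), 1))     # close to 5.0
print(round(statistics.stdev(aggregate), 1))  # close to 5/sqrt(2), about 3.5
```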

  15. In-Space Manufacturing Baseline Property Development

    Science.gov (United States)

    Stockman, Tom; Schneider, Judith; Prater, Tracie; Bean, Quincy; Werkheiser, Nicki

    2016-01-01

    The In-Space Manufacturing (ISM) project at NASA Marshall Space Flight Center currently operates a 3D FDM (fused deposition modeling) printer onboard the International Space Station. In order to enable utilization of this capability by designers, the project needs to establish characteristic material properties for materials produced using the process. This is difficult for additive manufacturing since standards and specifications do not yet exist for these technologies. Due to the limited availability of crew time, there are limitations on sample size, which in turn limits the application of traditional design-allowables approaches to developing a materials property database for designers. In this study, various approaches to the development of material databases were evaluated for use by designers of space systems who wish to leverage in-space manufacturing capabilities. This study focuses on alternative statistical techniques for baseline property development to support in-space manufacturing.

  16. Gravity sensing using Very Long Baseline Atom Interferometry

    Science.gov (United States)

    Schlippert, D.; Wodey, E.; Meiners, C.; Tell, D.; Schubert, C.; Ertmer, W.; Rasel, E. M.

    2017-12-01

    Very Long Baseline Atom Interferometry (VLBAI) has applications in high-accuracy absolute gravimetry, gravity-gradiometry, and for tests of fundamental physics. Thanks to the quadratic scaling of the phase shift with increasing free evolution time, extending the baseline of atomic gravimeters from tens of centimeters to meters puts resolutions of 10⁻¹³ g and beyond in reach. We present the design and progress of key elements of the VLBAI test stand: a dual-species source of Rb and Yb, a high-performance two-layer magnetic shield, and an active vibration isolation system allowing for unprecedented stability of the mirror acting as an inertial reference. We envisage a vibration-limited short-term sensitivity to gravitational acceleration of 1×10⁻⁸ m s⁻² Hz⁻¹/² and up to a factor of 25 improvement when including additional correlation with a broadband seismometer. Here, the supreme long-term stability of atomic gravity sensors opens the route towards competition with superconducting gravimeters. The operation of VLBAI as a differential dual-species gravimeter using ultracold mixtures of Yb and Rb atoms enables quantum tests of the universality of free fall (UFF) at an unprecedented level of <10⁻¹³, potentially surpassing the best experiments to date.
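The quadratic scaling of the phase shift with free evolution time mentioned in this entry can be made concrete. A sketch with illustrative values, not the VLBAI design numbers:

```python
import math

# Sketch: in a Mach-Zehnder atom interferometer the gravitational phase is
# dphi = k_eff * g * T**2, and a longer baseline L allows a longer free
# evolution time (T ~ sqrt(2L/g) for atoms in free fall). Values below are
# illustrative assumptions, not the VLBAI specification.

g = 9.81                        # m/s^2
k_eff = 4 * math.pi / 780e-9    # two-photon wavenumber near the Rb D2 line, 1/m

def gravity_phase(L):
    T = math.sqrt(2 * L / g)    # free-fall time over baseline L, in s
    return k_eff * g * T ** 2   # accumulated interferometer phase, in rad

ratio = gravity_phase(10.0) / gravity_phase(0.1)
print(round(ratio))  # 100: a 100x longer baseline gives 100x the phase
```

Since T grows as sqrt(L), the T² phase term grows linearly in L, which is why the meters-long baseline pays off.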

  17. The baseline pressure of intracranial pressure (ICP) sensors can be altered by electrostatic discharges.

    Science.gov (United States)

    Eide, Per K; Bakken, André

    2011-08-22

    The monitoring of intracranial pressure (ICP) has a crucial role in the surveillance of patients with brain injury. During long-term monitoring of ICP, we have seen spontaneous shifts in baseline pressure (ICP sensor zero point), which are of technical and not physiological origin. The aim of the present study was to explore whether or not baseline pressures of ICP sensors can be affected by electrostatic discharges (ESDs), when ESDs are delivered at clinically relevant magnitudes. We performed bench-testing of a set of commercial ICP sensors. In our experimental setup, the ICP sensor was placed in a container with 0.9% NaCl solution. A test person was charged to 0.5-10 kV, and then delivered ESDs to the sensor by touching a metal rod that was located in the container. The continuous pressure signals were recorded before/after the ESDs, and the pressure readings were stored digitally using a computerized system. A total of 57 sensors were tested, including 25 Codman ICP sensors and 32 Raumedic sensors. When charging the test person in the range 0.5-10 kV, typically ESDs in the range 0.5-5 kV peak pulse were delivered to the ICP sensor. Alterations in baseline pressure ≥ 2 mmHg were seen in 24 of 25 (96%) Codman sensors and in 17 of 32 (53%) Raumedic sensors. Lasting changes in baseline pressure > 10 mmHg, which in the clinical setting would affect patient management, were seen frequently for both sensor types. The changes in baseline pressure were characterized by either sudden shifts or gradual drifts. The baseline pressures of commercial solid ICP sensors can be altered by ESDs at discharge magnitudes that are clinically relevant. Shifts in baseline pressure change the ICP levels visualised to the physician on the monitor screen, and thereby report wrong ICP values, which likely represents a severe risk to the patient.

  18. STATUS OF THE US LONG BASELINE NEUTRINO EXPERIMENT STUDY.

    Energy Technology Data Exchange (ETDEWEB)

    BISHAI,M.

    2006-09-21

    The US Long Baseline Neutrino Experiment Study was commissioned jointly by Brookhaven National Laboratory and Fermi National Accelerator Laboratory to investigate the potential for future U.S. based long baseline neutrino oscillation experiments beyond the currently planned program. The Study focused on MW-class conventional neutrino beams that can be produced at Fermilab or BNL. The experimental baselines are based on two possible detector locations: (1) off-axis to the existing Fermilab NuMI beamline at baselines of 700 to 810 km and (2) NSF's proposed future Deep Underground Science and Engineering Laboratory (DUSEL) at baselines greater than 1000 km. Two detector technologies are considered: a megaton class Water Cherenkov detector deployed deep underground at a DUSEL site, or a 100 kT Liquid Argon Time-Projection Chamber (TPC) deployed on the surface at any of the proposed sites. The physics sensitivities of the proposed experiments are summarized. We find that conventional horn-focused wide-band neutrino beam options from Fermilab or BNL aimed at a massive detector with a baseline of > 1000 km have the best sensitivity to CP violation and the neutrino mass hierarchy for values of the mixing angle θ13 down to 2.2°.

  19. High Rates of Baseline Drug Resistance and Virologic Failure Among ART-naive HIV-infected Children in Mali.

    Science.gov (United States)

    Crowell, Claudia S; Maiga, Almoustapha I; Sylla, Mariam; Taiwo, Babafemi; Kone, Niaboula; Oron, Assaf P; Murphy, Robert L; Marcelin, Anne-Geneviève; Traore, Ban; Fofana, Djeneba B; Peytavin, Gilles; Chadwick, Ellen G

    2017-11-01

    Limited data exist on drug resistance and antiretroviral treatment (ART) outcomes in HIV-1-infected children in West Africa. We determined the prevalence of baseline resistance and correlates of virologic failure (VF) in a cohort of ART-naive HIV-1-infected children baseline (before ART) and at 6 months. Resistance was defined according to the Stanford HIV Genotypic Resistance database. VF was defined as viral load ≥1000 copies/mL after 6 months of ART. Logistic regression was used to evaluate factors associated with VF or death >1 month after enrollment. Post hoc, antiretroviral concentrations were assayed on baseline samples of participants with baseline resistance. One-hundred twenty children with a median age 2.6 years (interquartile range: 1.6-5.0) were included. Eighty-eight percent reported no prevention of mother-to-child transmission exposure. At baseline, 27 (23%), 4 (3%) and none had non-nucleoside reverse transcriptase inhibitor (NNRTI), nucleoside reverse transcriptase inhibitor or protease inhibitor resistance, respectively. Thirty-nine (33%) developed VF and 4 died >1 month post-ART initiation. In multivariable analyses, poor adherence [odds ratio (OR): 6.1, P = 0.001], baseline NNRTI resistance among children receiving NNRTI-based ART (OR: 22.9, P baseline NNRTI resistance (OR: 5.8, P = 0.018) were significantly associated with VF/death. Ten (38%) with baseline resistance had detectable levels of nevirapine or efavirenz at baseline; 7 were currently breastfeeding, but only 2 reported maternal antiretroviral use. Baseline NNRTI resistance was common in children without reported NNRTI exposure and was associated with increased risk of treatment failure. Detectable NNRTI concentrations were present despite few reports of maternal/infant antiretroviral use.

  20. Urbanization and baseline prevalence of genital infections including Candida, Trichomonas, and human papillomavirus and of a disturbed vaginal ecology as established in the Dutch Cervical Screening Program

    NARCIS (Netherlands)

    Boon, ME; Claasen, HHV; Kok, LP

    OBJECTIVE: An overgrowth of coccoid bacilli in the absence of lactobacilli (bacterial vaginosis) is considered a sign of a "disturbed" vaginal ecologic system. The aim of this study was to establish the baseline prevalence of genital infections and of a disturbed vaginal ecologic system and their

  1. Hydromechanical modeling of clay rock including fracture damage

    Science.gov (United States)

    Asahina, D.; Houseworth, J. E.; Birkholzer, J. T.

    2012-12-01

    Argillaceous rock typically acts as a flow barrier, but under certain conditions significant and potentially conductive fractures may be present. Fracture formation is well known to occur in the vicinity of underground excavations in a region known as the excavation disturbed zone. Such problems are of particular importance for low-permeability, mechanically weak rock such as clays and shales because fractures can be relatively transient as a result of fracture self-sealing processes. Perhaps not as well appreciated is the fact that natural fractures can form in argillaceous rock as a result of hydraulic overpressure caused by phenomena such as disequilibrium compaction, changes in tectonic stress, and mineral dehydration. Overpressure conditions can cause hydraulic fracturing if the fluid pressure leads to tensile effective stresses that exceed the tensile strength of the material. Quantitative modeling of this type of process requires coupling between hydrogeologic processes and geomechanical processes including fracture initiation and propagation. Here we present a computational method for three-dimensional, hydromechanically coupled processes including fracture damage. Fractures are represented as discrete features in a fracture network that interact with a porous rock matrix. Fracture configurations are mapped onto an unstructured, three-dimensional, Voronoi grid, which is based on a random set of spatial points. Discrete fracture networks (DFN) are represented by the connections of the edges of Voronoi cells. This methodology has the advantage that fractures can be more easily introduced in response to coupled hydro-mechanical processes and generally eliminates several potential issues associated with the geometry of DFN and numerical gridding. A geomechanical and fracture-damage model is developed here using the Rigid-Body-Spring-Network (RBSN) numerical method.
    The hydrogeologic and geomechanical models share the same geometrical information from a 3D Voronoi
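The Voronoi gridding idea in this entry can be illustrated in miniature. A 2D toy (the paper builds a true unstructured 3D Voronoi grid and maps fractures onto cell edges): assigning each lattice point to its nearest random seed is exactly the Voronoi partition, discretized.

```python
import random

# 2D toy of Voronoi gridding from a random set of spatial points: each
# lattice site belongs to the Voronoi cell of its nearest seed.
# Illustrative only; not the paper's 3D RBSN implementation.

random.seed(0)
seeds = [(random.random(), random.random()) for _ in range(5)]

def nearest_seed(x, y):
    """Index of the seed whose Voronoi cell contains (x, y)."""
    return min(range(len(seeds)),
               key=lambda i: (seeds[i][0] - x) ** 2 + (seeds[i][1] - y) ** 2)

n = 20
cells = [[nearest_seed((i + 0.5) / n, (j + 0.5) / n) for i in range(n)]
         for j in range(n)]

used = {c for row in cells for c in row}
print(len(used))  # number of seeds that claim at least one lattice site
```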

  2. Baseline prevalence and longitudinal evolution of non-motor symptoms in early Parkinson's disease: the PPMI cohort.

    Science.gov (United States)

    Simuni, Tanya; Caspell-Garcia, Chelsea; Coffey, Christopher S; Weintraub, Daniel; Mollenhauer, Brit; Lasch, Shirley; Tanner, Caroline M; Jennings, Danna; Kieburtz, Karl; Chahine, Lana M; Marek, Kenneth

    2018-01-01

    To examine the baseline prevalence and longitudinal evolution in non-motor symptoms (NMS) in a prospective cohort of, at baseline, patients with de novo Parkinson's disease (PD) compared with healthy controls (HC). Parkinson's Progression Markers Initiative (PPMI) is a longitudinal, ongoing, controlled study of de novo PD participants and HC. NMS were rated using the Movement Disorder Society Unified Parkinson's Disease Rating Scale (MDS-UPDRS) Part I score and other validated NMS scales at baseline and after 2 years. Biological variables included cerebrospinal fluid (CSF) markers and dopamine transporter imaging. 423 PD subjects and 196 HC were enrolled and followed for 2 years. MDS-UPDRS Part I total mean (SD) scores increased from baseline 5.6 (4.1) to 7.7 (5.0) at year 2 in PD subjects (pbaseline NMS score was associated with female sex (p=0.008), higher baseline MDS-UPDRS Part II scores (pbaseline. There was no association with the dose or class of dopaminergic therapy. This study of NMS in early PD identified clinical and biological variables associated with both baseline burden and predictors of progression. The association of a greater longitudinal increase in NMS with lower baseline Aβ1-42 level is an important finding that will have to be replicated in other cohorts. ClinicalTrials.gov identifier: NCT01141023. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  3. Direct-phase-variable model of a synchronous reluctance motor including all slot and winding harmonics

    International Nuclear Information System (INIS)

    Obe, Emeka S.; Binder, A.

    2011-01-01

    A detailed model in direct-phase variables of a synchronous reluctance motor operating at mains voltage and frequency is presented. The model includes the stator and rotor slot openings, the actual winding layout and the reluctance rotor geometry. Hence, all mmf and permeance harmonics are taken into account. It is seen that non-negligible harmonics introduced by slots are present in the inductances computed by the winding function procedure. These harmonics are usually ignored in d-q models. The machine performance is simulated in the stator reference frame to depict the difference between this new direct-phase model including all harmonics and the conventional rotor reference frame d-q model. Saturation is included by using a polynomial fitting the variation of d-axis inductance with stator current obtained by finite-element software FEMAG DC (registered) . The detailed phase-variable model can yield torque pulsations comparable to those obtained from finite elements while the d-q model cannot.
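The saturation treatment described here, a polynomial fit of d-axis inductance against stator current, amounts to evaluating a fitted polynomial at each time step. A sketch with invented coefficients (the paper fits its polynomial to FEMAG DC finite-element results):

```python
# Hypothetical illustration of polynomial saturation modeling: Ld(i) is a
# fitted polynomial evaluated by Horner's rule. Coefficients are invented,
# not the paper's FEMAG DC fit.

def horner(coeffs, x):
    """Evaluate a polynomial given highest-order-first coefficients."""
    acc = 0.0
    for c in coeffs:
        acc = acc * x + c
    return acc

# Ld(i) = 0.12 - 2e-4*i - 1e-5*i^2 (henry), falling as the iron saturates
ld_coeffs = [-1.0e-5, -2.0e-4, 0.12]

print(round(horner(ld_coeffs, 0.0), 4))   # 0.12 H at zero current
print(round(horner(ld_coeffs, 20.0), 4))  # 0.112 H under saturation
```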

  4. Links between early baseline cortisol, attachment classification, and problem behaviors: A test of differential susceptibility versus diathesis-stress.

    Science.gov (United States)

    Fong, Michelle C; Measelle, Jeffrey; Conradt, Elisabeth; Ablow, Jennifer C

    2017-02-01

    The purpose of the current study was to predict concurrent levels of problem behaviors from young children's baseline cortisol and attachment classification, a proxy for the quality of caregiving experienced. In a sample of 58 children living at or below the federal poverty threshold, children's baseline cortisol levels, attachment classification, and problem behaviors were assessed at 17 months of age. We hypothesized that an interaction between baseline cortisol and attachment classification would predict problem behaviors above and beyond any main effects of baseline cortisol and attachment. However, based on limited prior research, we did not predict whether this interaction would be more consistent with diathesis-stress or differential susceptibility models. Consistent with diathesis-stress theory, the results indicated no significant differences in problem behavior levels among children with high baseline cortisol. In contrast, children with low baseline cortisol had the highest level of problem behaviors in the context of a disorganized attachment relationship. However, in the context of a secure attachment relationship, children with low baseline cortisol looked no different, with respect to problem behavior levels, than children with high cortisol levels. These findings have substantive implications for the socioemotional development of children reared in poverty. Copyright © 2017 Elsevier Inc. All rights reserved.

  5. Baseline descriptions for LWR spent fuel storage, handling, and transportation

    Energy Technology Data Exchange (ETDEWEB)

    Moyer, J.W.; Sonnier, C.S.

    1978-04-01

    Baseline descriptions for the storage, handling, and transportation of reactor spent fuel are provided. The storage modes described include light water reactor (LWR) pools, away-from-reactor basins, dry surface storage, reprocessing-facility interim storage pools, and deep geologic storage. Land and water transportation are also discussed. This work was sponsored by the Department of Energy/Office of Safeguards and Security as part of the Sandia Laboratories Fixed Facility Physical Protection Program. 45 figs, 4 tables.

  6. Baseline descriptions for LWR spent fuel storage, handling, and transportation

    International Nuclear Information System (INIS)

    Moyer, J.W.; Sonnier, C.S.

    1978-04-01

    Baseline descriptions for the storage, handling, and transportation of reactor spent fuel are provided. The storage modes described include light water reactor (LWR) pools, away-from-reactor basins, dry surface storage, reprocessing-facility interim storage pools, and deep geologic storage. Land and water transportation are also discussed. This work was sponsored by the Department of Energy/Office of Safeguards and Security as part of the Sandia Laboratories Fixed Facility Physical Protection Program. 45 figs, 4 tables

  7. Sport and team differences on baseline measures of sport-related concussion.

    Science.gov (United States)

    Zimmer, Adam; Piecora, Kyle; Schuster, Danielle; Webbe, Frank

    2013-01-01

    With the advent of the National Collegiate Athletic Association's (NCAA's) mandating the presence and practice of concussion-management plans in collegiate athletic programs, institutions will consider potential approaches for concussion management, including both baseline and normative comparison approaches. To examine sport and team differences in baseline performance on a computer-based neurocognitive measure and 2 standard sideline measures of cognition and balance and to determine the potential effect of premorbid factors sex and height on baseline performance. Cross-sectional study. University laboratory. A total of 437 NCAA Division II student-athletes (males = 273, females = 164; age = 19.61 ± 1.64 years, height = 69.89 ± 4.04 inches [177.52 ± 10.26 cm]) were recruited during mandatory preseason testing conducted in a concussion-management program. The computerized Concussion Resolution Index (CRI), the Standardized Assessment of Concussion (Form A; SAC), and the Balance Error Scoring System (BESS). Players on the men's basketball team tended to perform worse on the baseline measures, whereas soccer players tended to perform better. We found a difference in total BESS scores between these sports (P = .002). We saw a difference between sports on the hard-surface portion of the BESS (F6,347 = 3.33, P = .003, ηp(2) = 0.05). No sport, team, or sex differences were found with SAC scores (P > .05). We noted differences between sports and teams in the CRI indices, with basketball, particularly the men's team, performing worse than soccer (P sport differences, height was a covariate for the team (F1,385 = 5.109, P = .02, ηp(2) = 0.013) and sport (F1,326 = 11.212, P = .001, ηp(2) = 0.033) analyses, but the interaction of sex and sport on CRI indices was not significant in any test (P > .05). Given that differences in neurocognitive functioning and performance among sports and teams exist, the comparison of posttraumatic and baseline assessment may lead to more

  8. Future Long-Baseline Neutrino Facilities and Detectors

    Directory of Open Access Journals (Sweden)

    Milind Diwan

    2013-01-01

    Full Text Available We review the ongoing effort in the US, Japan, and Europe of the scientific community to study the location and the detector performance of the next-generation long-baseline neutrino facility. For many decades, research on the properties of neutrinos and the use of neutrinos to study the fundamental building blocks of matter has unveiled new, unexpected laws of nature. Results of neutrino experiments have triggered a tremendous amount of development in theory: theories beyond the standard model or at least extensions of it and development of the standard solar model and modeling of supernova explosions as well as the development of theories to explain the matter-antimatter asymmetry in the universe. Neutrino physics is one of the most dynamic and exciting fields of research in fundamental particle physics and astrophysics. The next-generation neutrino detector will address two aspects: fundamental properties of the neutrino like mass hierarchy, mixing angles, and the CP phase, and low-energy neutrino astronomy with solar, atmospheric, and supernova neutrinos. Such a new detector naturally allows for major improvements in the search for nucleon decay. A next-generation neutrino observatory needs a huge, megaton scale detector which in turn has to be installed in a new, international underground laboratory, capable of hosting such a huge detector.

  9. Future Long-Baseline Neutrino Facilities and Detectors

    Energy Technology Data Exchange (ETDEWEB)

    Diwan, Milind [Brookhaven; Edgecock, Rob [Huddersfield U.; Hasegawa, Takuya [KEK, Tsukuba; Patzak, Thomas [APC, Paris; Shiozawa, Masato [Kamioka Observ.; Strait, Jim [Fermilab

    2013-01-01

    We review the ongoing effort in the US, Japan, and Europe of the scientific community to study the location and the detector performance of the next-generation long-baseline neutrino facility. For many decades, research on the properties of neutrinos and the use of neutrinos to study the fundamental building blocks of matter has unveiled new, unexpected laws of nature. Results of neutrino experiments have triggered a tremendous amount of development in theory: theories beyond the standard model or at least extensions of it and development of the standard solar model and modeling of supernova explosions as well as the development of theories to explain the matter-antimatter asymmetry in the universe. Neutrino physics is one of the most dynamic and exciting fields of research in fundamental particle physics and astrophysics. The next-generation neutrino detector will address two aspects: fundamental properties of the neutrino like mass hierarchy, mixing angles, and the CP phase, and low-energy neutrino astronomy with solar, atmospheric, and supernova neutrinos. Such a new detector naturally allows for major improvements in the search for nucleon decay. A next-generation neutrino observatory needs a huge, megaton scale detector which in turn has to be installed in a new, international underground laboratory, capable of hosting such a huge detector.

  10. Baseline restoration technique based on symmetrical zero-area trapezoidal pulse shaper

    Energy Technology Data Exchange (ETDEWEB)

    Zeng, Guoqiang, E-mail: 24829500@qq.com [Key Laboratory of Applied Nuclear Techniques in Geosciences Sichuan, Chengdu University of Technology, Chengdu 610059 (China); Yang, Jian, E-mail: 22105653@qq.com [Key Laboratory of Applied Nuclear Techniques in Geosciences Sichuan, Chengdu University of Technology, Chengdu 610059 (China); Hu, Tianyu; Ge, Liangquan [Key Laboratory of Applied Nuclear Techniques in Geosciences Sichuan, Chengdu University of Technology, Chengdu 610059 (China); Ouyang, Xiaoping [Northwest Institute of Nuclear Technology, Xi’an 710024,China (China); Zhang, Qingxian; Gu, Yi [Key Laboratory of Applied Nuclear Techniques in Geosciences Sichuan, Chengdu University of Technology, Chengdu 610059 (China)

    2017-06-21

    Since the baseline of a unipolar pulse shaper has a direct-current (DC) offset and drift, an additional baseline estimator is needed to obtain baseline values in real time. Bipolar zero-area (BZA) pulse shapers can be used for baseline restoration, but they cannot restrain baseline drift because of their asymmetrical shape. In this study, three trapezoids are synthesized into a symmetrical zero-area (SZA) shape, which can remove the DC offset and restrain the baseline drift. This baseline restoration technique can be easily implemented in digital pulse processing (DPP) systems based on a recursive algorithm. To validate the approach, the characteristic X-rays of iron were detected using a Si-PIN diode detector. Compared with traditional trapezoidal pulse shapers, the SZA trapezoidal pulse shaper improved the energy resolution from 237 eV to 216 eV for the 6.403 keV Kα peak.
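The zero-area construction can be sketched directly: one positive trapezoid flanked by two negative trapezoids of half the amplitude gives a symmetric weighting function with zero net area, so a constant DC baseline cancels in the convolution. Sample counts and amplitudes below are illustrative; the paper's contribution is a recursive form suitable for real-time DPP hardware.

```python
# Sketch of a symmetrical zero-area (SZA) shape assembled from three
# trapezoids, with illustrative dimensions.

def trapezoid(rise, flat, amplitude=1.0):
    """Discrete trapezoid: linear rise, flat top, linear fall."""
    up = [amplitude * (k + 1) / rise for k in range(rise)]
    return up + [amplitude] * flat + up[::-1]

pos = trapezoid(4, 6, 1.0)    # central positive trapezoid
neg = trapezoid(4, 6, -0.5)   # flanking negative trapezoids, half amplitude
sza = neg + pos + neg         # symmetric concatenation

print(abs(sum(sza)) < 1e-12)  # True: zero total area, so a DC offset cancels
print(sza == sza[::-1])       # True: symmetric, which suppresses slow drift
```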

  11. A probabilistic scenario approach for developing improved Reduced Emissions from Deforestation and Degradation (REDD+) baselines

    Directory of Open Access Journals (Sweden)

    Malika Virah-Sawmy

    2015-07-01

    By generating robust probabilistic baseline scenarios, exponential smoothing models can facilitate the effectiveness of REDD+ payments, support a more efficient allocation of scarce conservation resources, and improve our understanding of effective forest conservation investments, also beyond REDD+.
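A point-forecast core for such a baseline can be sketched with Holt's linear exponential smoothing; the paper wraps forecasts like this in probabilistic scenarios, and the series and parameters below are hypothetical.

```python
# Hedged sketch: Holt's linear exponential smoothing as the point-forecast
# core of a deforestation baseline. The forest-cover series and smoothing
# parameters are invented for illustration.

def holt_forecast(series, alpha=0.5, beta=0.3, horizon=3):
    level, trend = series[0], series[1] - series[0]
    for y in series[1:]:
        prev_level = level
        level = alpha * y + (1 - alpha) * (level + trend)     # update level
        trend = beta * (level - prev_level) + (1 - beta) * trend  # update trend
    return [level + (h + 1) * trend for h in range(horizon)]

cover = [100.0, 98.0, 96.5, 94.8, 93.0]   # hypothetical forest-cover index
forecast = holt_forecast(cover)
print([round(f, 1) for f in forecast])    # a declining counterfactual path
```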

  12. Scanner baseliner monitoring and control in high volume manufacturing

    Science.gov (United States)

    Samudrala, Pavan; Chung, Woong Jae; Aung, Nyan; Subramany, Lokesh; Gao, Haiyong; Gomez, Juan-Manuel

    2016-03-01

    We analyze performance of different customized models on baseliner overlay data and demonstrate the reduction in overlay residuals by ~10%. Smart Sampling sets were assessed and compared with the full wafer measurements. We found that performance of the grid can still be maintained by going to one-third of total sampling points, while reducing metrology time by 60%. We also demonstrate the feasibility of achieving time to time matching using scanner fleet manager and thus identify the tool drifts even when the tool monitoring controls are within spec limits. We also explore the scanner feedback constant variation with illumination sources.

  13. High Baseline Postconcussion Symptom Scores and Concussion Outcomes in Athletes.

    Science.gov (United States)

    Custer, Aimee; Sufrinko, Alicia; Elbin, R J; Covassin, Tracey; Collins, Micky; Kontos, Anthony

    2016-02-01

    Some healthy athletes report high levels of baseline concussion symptoms, which may be attributable to several factors (eg, illness, personality, somaticizing). However, the role of baseline symptoms in outcomes after sport-related concussion (SRC) has not been empirically examined. To determine if athletes with high symptom scores at baseline performed worse than athletes without baseline symptoms on neurocognitive testing after SRC. Cohort study. High school and collegiate athletic programs. A total of 670 high school and collegiate athletes participated in the study. Participants were divided into groups with either no baseline symptoms (Postconcussion Symptom Scale [PCSS] score = 0, n = 247) or a high level of baseline symptoms (PCSS score > 18 [top 10% of sample], n = 68). Participants were evaluated at baseline and 2 to 7 days after SRC with the Immediate Post-concussion Assessment and Cognitive Test and PCSS. Outcome measures were Immediate Post-concussion Assessment and Cognitive Test composite scores (verbal memory, visual memory, visual motor processing speed, and reaction time) and total symptom score on the PCSS. The groups were compared using repeated-measures analyses of variance with Bonferroni correction to assess interactions between group and time for symptoms and neurocognitive impairment. The no-symptoms group represented 38% of the original sample, whereas the high-symptoms group represented 11% of the sample. The high-symptoms group experienced a larger decline from preinjury to postinjury than the no-symptoms group in verbal (P = .03) and visual memory (P = .05). However, total concussion-symptom scores increased from preinjury to postinjury for the no-symptoms group (P = .001) but remained stable for the high-symptoms group. Reported baseline symptoms may help identify athletes at risk for worse outcomes after SRC. Clinicians should examine baseline symptom levels to better identify patients for earlier referral and treatment for their

  14. Forest Structure Characterization Using Jpl's UAVSAR Multi-Baseline Polarimetric SAR Interferometry and Tomography

    Science.gov (United States)

    Neumann, Maxim; Hensley, Scott; Lavalle, Marco; Ahmed, Razi

    2013-01-01

    This paper concerns forest remote sensing using JPL's multi-baseline polarimetric interferometric UAVSAR data. It presents exemplary results and analyzes the possibilities and limitations of using SAR Tomography and Polarimetric SAR Interferometry (PolInSAR) techniques for the estimation of forest structure. Performance and error indicators for the applicability and reliability of the used multi-baseline (MB) multi-temporal (MT) PolInSAR random volume over ground (RVoG) model are discussed. Experimental results are presented based on JPL's L-band repeat-pass polarimetric interferometric UAVSAR data over temperate and tropical forest biomes in the Harvard Forest, Massachusetts, and in the La Amistad Park, Panama and Costa Rica. The results are partially compared with ground field measurements and with air-borne LVIS lidar data.
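    The volume-only coherence at the heart of RVoG-type inversions can be sketched numerically. The snippet below integrates the standard exponentially attenuated volume over height with a midpoint rule; the height, extinction, vertical wavenumber, and incidence angle are illustrative assumptions, not values from the UAVSAR campaigns.

```python
import numpy as np

def rvog_volume_coherence(hv, extinction, kz, theta, n=1000):
    """Volume-only complex coherence of the RVoG model via a
    midpoint-rule integral (illustrative parameter values)."""
    dz = hv / n
    z = (np.arange(n) + 0.5) * dz
    att = np.exp(2.0 * extinction * z / np.cos(theta))  # two-way attenuation profile
    num = np.sum(att * np.exp(1j * kz * z)) * dz
    den = np.sum(att) * dz
    return num / den

gamma_v = rvog_volume_coherence(hv=20.0, extinction=0.1,
                                kz=0.1, theta=np.radians(35.0))
```

    The magnitude of the coherence decreases and its phase center rises as the volume thickens, which is what a height inversion exploits.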

  15. Baseline effects on carbon footprints of biofuels: The case of wood

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, Eric, E-mail: johnsonatlantic@gmail.com [Atlantic Consulting, 8136 Gattikon (Switzerland); Tschudi, Daniel [ETH, Berghaldenstrasse 46, 8800 Thalwil (Switzerland)

    2012-11-15

    As biofuel usage has boomed over the past decade, so has research and regulatory interest in its carbon accounting. This paper examines one aspect of that carbon accounting: the baseline, i.e. the reference case against which other conditions or changes can be compared. A literature search and analysis identified four baseline types: no baseline; reference point; marginal fossil fuel; and biomass opportunity cost. The fourth one, biomass opportunity cost, is defined in more detail, because this is not done elsewhere in the literature. The four baselines are then applied to the carbon footprint of a wood-fired power plant. The footprint of the resulting wood-fired electricity varies dramatically, according to the type of baseline. Baseline type is also found to be the footprint's most significant sensitivity. Other significant sensitivities are: efficiency of the power plant; the growth (or re-growth) rate of the forest that supplies the wood; and the residue fraction of the wood. Length of the policy horizon is also an important factor in determining the footprint. The paper concludes that because of their significance and variability, baseline choices should be made very explicit in biofuel carbon footprints. - Highlights: ► Four baseline types for biofuel footprinting are identified. ► One type, 'biomass opportunity cost', is defined mathematically and graphically. ► Choice of baseline can dramatically affect the footprint result. ► The 'no baseline' approach is not acceptable. ► Choice between the other three baselines depends on the question being addressed.

  16. Automated baseline change detection phase I. Final report

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-12-01

    The Automated Baseline Change Detection (ABCD) project is supported by the DOE Morgantown Energy Technology Center (METC) as part of its ER&WM cross-cutting technology program in robotics. Phase 1 of the Automated Baseline Change Detection project is summarized in this topical report. The primary objective of this project is to apply robotic and optical sensor technology to the operational inspection of mixed toxic and radioactive waste stored in barrels, using Automated Baseline Change Detection (ABCD), based on image subtraction. Absolute change detection is based on detecting any visible physical changes, regardless of cause, between a current inspection image of a barrel and an archived baseline image of the same barrel. Thus, in addition to rust, the ABCD system can also detect corrosion, leaks, dents, and bulges. The ABCD approach and method rely on precise camera positioning and repositioning relative to the barrel and on feature recognition in images. In support of this primary objective, there are secondary objectives to determine DOE operational inspection requirements and DOE system fielding requirements.
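    The image-subtraction core of absolute change detection can be sketched as follows. This is a toy illustration only: the threshold and image sizes are assumed, and the real ABCD system additionally depends on precise camera repositioning and feature recognition to keep the images aligned.

```python
import numpy as np

def detect_changes(baseline_img, current_img, threshold=30):
    """Absolute change detection by image subtraction (a sketch;
    assumes aligned grayscale images and an assumed threshold)."""
    diff = np.abs(current_img.astype(np.int16) - baseline_img.astype(np.int16))
    return diff > threshold  # boolean change map

baseline_img = np.zeros((64, 64), dtype=np.uint8)
current_img = baseline_img.copy()
current_img[10:20, 10:20] = 200  # simulated surface change (e.g., rust patch)
change_map = detect_changes(baseline_img, current_img)
print(int(change_map.sum()))  # 100 changed pixels
```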

  17. Automated baseline change detection phase I. Final report

    International Nuclear Information System (INIS)

    1995-12-01

    The Automated Baseline Change Detection (ABCD) project is supported by the DOE Morgantown Energy Technology Center (METC) as part of its ER&WM cross-cutting technology program in robotics. Phase 1 of the Automated Baseline Change Detection project is summarized in this topical report. The primary objective of this project is to apply robotic and optical sensor technology to the operational inspection of mixed toxic and radioactive waste stored in barrels, using Automated Baseline Change Detection (ABCD), based on image subtraction. Absolute change detection is based on detecting any visible physical changes, regardless of cause, between a current inspection image of a barrel and an archived baseline image of the same barrel. Thus, in addition to rust, the ABCD system can also detect corrosion, leaks, dents, and bulges. The ABCD approach and method rely on precise camera positioning and repositioning relative to the barrel and on feature recognition in images. In support of this primary objective, there are secondary objectives to determine DOE operational inspection requirements and DOE system fielding requirements

  18. Integrated model of port oil piping transportation system safety including operating environment threats

    Directory of Open Access Journals (Sweden)

    Kołowrocki Krzysztof

    2017-06-01

    The paper presents an integrated general model of a complex technical system, linking its multistate safety model with the model of its operation process, including operating-environment threats, and allowing the system's safety structures and its components' safety parameters to vary across operation states. Under the assumption that the system has an exponential safety function, the safety characteristics of the port oil piping transportation system are determined.

  19. Integrated model of port oil piping transportation system safety including operating environment threats

    OpenAIRE

    Kołowrocki, Krzysztof; Kuligowska, Ewa; Soszyńska-Budny, Joanna

    2017-01-01

    The paper presents an integrated general model of a complex technical system, linking its multistate safety model with the model of its operation process, including operating-environment threats, and allowing the system's safety structures and its components' safety parameters to vary across operation states. Under the assumption that the system has an exponential safety function, the safety characteristics of the port oil piping transportation system are determined.
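    The exponential safety function mentioned in the abstract can be sketched with a toy series-system calculation; the component rates below are hypothetical, not the paper's estimates, and the single-operation-state model here ignores the multistate structure.

```python
import math

def series_safety(rates, t):
    """Safety function of a series system whose components have
    exponential safety functions: S(t) = exp(-(sum of rates) * t).
    Rates are hypothetical, not the paper's estimates."""
    return math.exp(-sum(rates) * t)

def mean_safe_time(rates):
    # Mean time in the safe state for the exponential model.
    return 1.0 / sum(rates)

rates = [0.05, 0.02, 0.01]  # per year, illustrative
s5 = series_safety(rates, 5.0)   # probability of staying safe for 5 years
mst = mean_safe_time(rates)      # 12.5 years
```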

  20. Vegetation Parameter Extraction Using Dual Baseline Polarimetric SAR Interferometry Data

    Science.gov (United States)

    Zhang, H.; Wang, C.; Chen, X.; Tang, Y.

    2009-04-01

    For vegetation parameter inversion, single-baseline polarimetric SAR interferometry (PolInSAR) techniques, such as the three-stage method and the ESPRIT algorithm, are limited when the observed data have a minimal ground-to-volume amplitude ratio, which affects the estimation of the effective phase center of the vegetation canopy or the surface and thus leads to underestimated vegetation height. To remove this limitation of single-baseline inversion techniques to some extent, data from another baseline are added to the vegetation parameter estimation in this paper, and a dual-baseline PolInSAR technique for extracting vegetation parameters is investigated and improved to reduce the bias of the vegetation parameter estimates. Finally, simulated and real data are used to validate the dual-baseline technique.

  1. Relationship of Baseline Hemoglobin Level with Serum Ferritin, Postphlebotomy Hemoglobin Changes, and Phlebotomy Requirements among HFE C282Y Homozygotes

    Directory of Open Access Journals (Sweden)

    Seyed Ali Mousavi

    2015-01-01

    Objectives. We aimed to examine whether baseline hemoglobin levels in C282Y-homozygous patients are related to the degree of serum ferritin (SF) elevation and whether patients with different baseline hemoglobin have different phlebotomy requirements. Methods. A total of 196 patients (124 males and 72 females) who had undergone therapeutic phlebotomy and had SF and both pre- and posttreatment hemoglobin values were included in the study. Results. Bivariate correlation analysis suggested that baseline SF explains approximately 6 to 7% of the variation in baseline hemoglobin. The results also showed that males who had higher (≥150 g/L) baseline hemoglobin levels had a significantly greater reduction in their posttreatment hemoglobin despite requiring fewer phlebotomies to achieve iron depletion than those who had lower (<150 g/L) baseline hemoglobin, regardless of whether baseline SF was below or above 1000 µg/L. There were no significant differences between hemoglobin subgroups regarding baseline and treatment characteristics, except for transferrin saturation between male subgroups with SF above 1000 µg/L. Similar differences were observed when females with higher (≥138 g/L) baseline hemoglobin were compared with those with lower (<138 g/L) baseline hemoglobin. Conclusion. Dividing C282Y-homozygous patients into just two subgroups according to the degree of baseline SF elevation may obscure important subgroup variations.

  2. Relationship of Baseline Hemoglobin Level with Serum Ferritin, Postphlebotomy Hemoglobin Changes, and Phlebotomy Requirements among HFE C282Y Homozygotes

    Science.gov (United States)

    Mousavi, Seyed Ali; Mahmood, Faiza; Aandahl, Astrid; Knutsen, Teresa Risopatron; Llohn, Abid Hussain

    2015-01-01

    Objectives. We aimed to examine whether baseline hemoglobin levels in C282Y-homozygous patients are related to the degree of serum ferritin (SF) elevation and whether patients with different baseline hemoglobin have different phlebotomy requirements. Methods. A total of 196 patients (124 males and 72 females) who had undergone therapeutic phlebotomy and had SF and both pre- and posttreatment hemoglobin values were included in the study. Results. Bivariate correlation analysis suggested that baseline SF explains approximately 6 to 7% of the variation in baseline hemoglobin. The results also showed that males who had higher (≥150 g/L) baseline hemoglobin levels had a significantly greater reduction in their posttreatment hemoglobin despite requiring fewer phlebotomies to achieve iron depletion than those who had lower (<150 g/L) baseline hemoglobin, regardless of whether baseline SF was below or above 1000 µg/L. There were no significant differences between hemoglobin subgroups regarding baseline and treatment characteristics, except for transferrin saturation between male subgroups with SF above 1000 µg/L. Similar differences were observed when females with higher (≥138 g/L) baseline hemoglobin were compared with those with lower (<138 g/L) baseline hemoglobin. Conclusion. Dividing C282Y-homozygous patients into just two subgroups according to the degree of baseline SF elevation may obscure important subgroup variations. PMID:26380265

  3. Predicting Coronary Artery Aneurysms in Kawasaki Disease at a North American Center: An Assessment of Baseline z Scores.

    Science.gov (United States)

    Son, Mary Beth F; Gauvreau, Kimberlee; Kim, Susan; Tang, Alexander; Dedeoglu, Fatma; Fulton, David R; Lo, Mindy S; Baker, Annette L; Sundel, Robert P; Newburger, Jane W

    2017-05-31

    Accurate risk prediction of coronary artery aneurysms (CAAs) in North American children with Kawasaki disease remains a clinical challenge. We sought to determine the predictive utility of baseline coronary dimensions adjusted for body surface area ( z scores) for future CAAs in Kawasaki disease and explored the extent to which addition of established Japanese risk scores to baseline coronary artery z scores improved discrimination for CAA development. We explored the relationships of CAA with baseline z scores; with Kobayashi, Sano, Egami, and Harada risk scores; and with the combination of baseline z scores and risk scores. We defined CAA as a maximum z score (zMax) ≥2.5 of the left anterior descending or right coronary artery at 4 to 8 weeks of illness. Of 261 patients, 77 patients (29%) had a baseline zMax ≥2.0. CAAs occurred in 15 patients (6%). CAAs were strongly associated with baseline zMax ≥2.0 versus <2.0. Baseline zMax ≥2.0 had a C statistic of 0.77, good sensitivity (80%), and excellent negative predictive value (98%). None of the risk scores alone had adequate discrimination. When high-risk status per the Japanese risk scores was added to models containing baseline zMax ≥2.0, none were significantly better than baseline zMax ≥2.0 alone. In a North American center, baseline zMax ≥2.0 in children with Kawasaki disease demonstrated high predictive utility for later development of CAA. Future studies should validate the utility of our findings. © 2017 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley.

  4. The outlook for natural gas markets in the GRI baseline projection

    International Nuclear Information System (INIS)

    Holtberg, P.D.

    1990-01-01

    Gas Research Institute is an independent, not-for-profit organization that plans, manages, and develops financing for a cooperative research and development program for the mutual benefit of the natural gas industry and its customers. The research program consists of over 500 active research projects in natural gas supply and end use, and in gas industry operations, as well as related basic research. This paper summarizes the U.S. natural gas demand and supply outlook projected in a preliminary version of the 1991 edition of the GRI Baseline Projection of U.S. Energy Supply and Demand. The projection used for this paper is from an early run of the GRI modeling structure. As such, it is subject to substantial revision before the Baseline Projection is finalized. The paper presents a projection of natural gas demand in the major end-use sectors and the slate of supply sources expected to meet that demand over the period from 1989 to 2010

  5. Information architecture. Volume 2, Part 1: Baseline analysis summary

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-12-01

    The Department of Energy (DOE) Information Architecture, Volume 2, Baseline Analysis, is a collaborative and logical next-step effort in the processes required to produce a Departmentwide information architecture. The baseline analysis serves a diverse audience of program management and technical personnel and provides an organized way to examine the Department's existing or de facto information architecture. A companion document to Volume 1, The Foundations, it furnishes the rationale for establishing a Departmentwide information architecture. This volume, consisting of the Baseline Analysis Summary (part 1), Baseline Analysis (part 2), and Reference Data (part 3), is of interest to readers who wish to understand how the Department's current information architecture technologies are employed. The analysis identifies how and where current technologies support business areas, programs, sites, and corporate systems.

  6. Streamline Your Project: A Lifecycle Model.

    Science.gov (United States)

    Viren, John

    2000-01-01

    Discusses one approach to project organization providing a baseline lifecycle model for multimedia/CBT development. This variation of the standard four-phase model of Analysis, Design, Development, and Implementation includes a Pre-Analysis phase, called Definition, and a Post-Implementation phase, known as Maintenance. Each phase is described.…

  7. Environmental Assessment/Baseline Survey to Establish New Drop Zone (DZ) in Cadiz, Ohio

    Science.gov (United States)

    2009-03-01

    The 911 AW has a Military Airspace Collision Avoidance (MACA) plan. The 911 AW flight safety would revise their existing MACA plan to include activities at the new drop zone. The MACA includes placing a

  8. Sensitivity of amounts and distribution of tropical forest carbon credits depending on baseline rules

    International Nuclear Information System (INIS)

    Griscom, Bronson; Shoch, David; Stanley, Bill; Cortez, Rane; Virgilio, Nicole

    2009-01-01

    One of the largest sources of global greenhouse gas emissions can be addressed through conservation of tropical forests by channeling funds to developing countries at a cost-savings for developed countries. However, questions remain to be resolved in negotiating a system for including reduced emissions from deforestation and forest degradation (REDD) in a post-Kyoto climate treaty. The approach to determine national baselines, or reference levels, for quantifying REDD has emerged as central to negotiations over a REDD mechanism in a post-Kyoto policy framework. The baseline approach is critical to the success of a REDD mechanism because it affects the quantity, credibility, and equity of credits generated from efforts to reduce forest carbon emissions. We compared outcomes of seven proposed baseline approaches as a function of country circumstances, using a retrospective analysis of FAO-FRA data on forest carbon emissions from deforestation. Depending upon the baseline approach used, the total credited emissions avoided ranged over two orders of magnitude for the same quantity of actual emissions reductions. There was also a wide range in the relative distribution of credits generated among the five country types we identified. Outcomes were especially variable for countries with high remaining forest and low rates of deforestation (HFLD). We suggest that the most credible approaches measure emissions avoided with respect to a business-as-usual baseline scenario linked to historic emissions data, and allow limited adjustments based on forest carbon stocks.
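    The sensitivity of credited emissions to the baseline approach can be illustrated with a toy calculation; all emission figures below are hypothetical, and only two of the seven compared approaches (a historical average and a simple linear trend) are sketched.

```python
def credited_emissions(baseline, actual):
    """Credits equal emissions avoided relative to the baseline (floored at zero)."""
    return max(baseline - actual, 0.0)

# Hypothetical national emissions series (MtCO2/yr), illustrative only.
historic = [100.0, 105.0, 110.0, 108.0, 112.0]
actual_now = 95.0

# Two candidate baseline approaches applied to the same history:
historical_average = sum(historic) / len(historic)
linear_trend = historic[-1] + (historic[-1] - historic[0]) / (len(historic) - 1)

credit_avg = credited_emissions(historical_average, actual_now)    # 12.0
credit_trend = credited_emissions(linear_trend, actual_now)        # 20.0
```

    Even in this tiny example, the same actual reduction earns different credit under the two baselines, which is the effect the paper quantifies across country types.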

  9. The predictive value of the baseline Oswestry Disability Index in lumbar disc arthroplasty.

    Science.gov (United States)

    Deutsch, Harel

    2010-06-01

    The goal of the study was to determine patient factors predictive of good outcome after lumbar disc arthroplasty. Specifically, the paper examines the relationship of the preoperative Oswestry Disability Index (ODI) to patient outcome at 1 year. The study is a retrospective review of 20 patients undergoing a 1-level lumbar disc arthroplasty at the author's institution between 2004 and 2008. All data were collected prospectively. Data included the ODI, visual analog scale scores, and patient demographics. All patients underwent a 1-level disc arthroplasty at L4-5 or L5-S1. The patients were divided into 2 groups based on their baseline ODI. Patients with an ODI between 38 and 59 demonstrated better outcomes with lumbar disc arthroplasty. Only 1 (20%) of 5 patients with a baseline ODI higher than 60 reported a good outcome. In contrast, 13 (87%) of 15 patients with an ODI between 38 and 59 showed a good outcome (p = 0.03). The negative predictive value of using ODI > 60 is 60% in patients who are determined to be candidates for lumbar arthroplasty. Lumbar arthroplasty is very effective in some patients. Other patients do not improve after surgery. The baseline ODI results are predictive of outcome in patients selected for lumbar disc arthroplasty. A baseline ODI > 60 is predictive of poor outcome. A high ODI may be indicative of psychosocial overlay.

  10. A study of man made radioactivity baseline in dietary materials

    International Nuclear Information System (INIS)

    de la Paz, L.; Estacio, J.; Palattao, M.V.; Anden, A.

    1986-01-01

    This paper describes the radioactivity baseline from literature data coming from various countries where data are available. 1979-1985 were chosen as the baseline years for the following: milk (fresh and powdered), meat and meat products, cereals, fruits, coffee and tea, fish and vegetables. Pre- and post-Chernobyl baseline data are given. (ELC). 21 figs; 17 refs

  11. Attention enhances contrast appearance via increased input baseline of neural responses.

    Science.gov (United States)

    Cutrone, Elizabeth K; Heeger, David J; Carrasco, Marisa

    2014-12-30

    Covert spatial attention increases the perceived contrast of stimuli at attended locations, presumably via enhancement of visual neural responses. However, the relation between perceived contrast and the underlying neural responses has not been characterized. In this study, we systematically varied stimulus contrast, using a two-alternative, forced-choice comparison task to probe the effect of attention on appearance across the contrast range. We modeled performance in the task as a function of underlying neural contrast-response functions. Fitting this model to the observed data revealed that an increased input baseline in the neural responses accounted for the enhancement of apparent contrast with spatial attention. © 2014 ARVO.
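    The "increased input baseline" account can be sketched with a Naka-Rushton contrast-response function, a common parameterization of neural contrast responses; the specific parameter values here are assumptions for illustration, not the study's fitted values.

```python
def response(c, input_baseline=0.0, r_max=1.0, c50=0.3, n=2.0):
    """Naka-Rushton contrast-response function with an additive
    input baseline (parameter values assumed for illustration)."""
    x = c + input_baseline
    return r_max * x**n / (x**n + c50**n)

# Attention is modeled as a larger input baseline at the attended location,
# raising the response to the same physical contrast.
contrast = 0.4
unattended = response(contrast, input_baseline=0.0)
attended = response(contrast, input_baseline=0.05)
```

    In a comparison task, the attended stimulus's larger response would be matched only by a physically higher contrast at the unattended location, producing the boost in apparent contrast.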

  12. Baseline effects on carbon footprints of biofuels: The case of wood

    International Nuclear Information System (INIS)

    Johnson, Eric; Tschudi, Daniel

    2012-01-01

    As biofuel usage has boomed over the past decade, so has research and regulatory interest in its carbon accounting. This paper examines one aspect of that carbon accounting: the baseline, i.e. the reference case against which other conditions or changes can be compared. A literature search and analysis identified four baseline types: no baseline; reference point; marginal fossil fuel; and biomass opportunity cost. The fourth one, biomass opportunity cost, is defined in more detail, because this is not done elsewhere in the literature. The four baselines are then applied to the carbon footprint of a wood-fired power plant. The footprint of the resulting wood-fired electricity varies dramatically, according to the type of baseline. Baseline type is also found to be the footprint's most significant sensitivity. Other significant sensitivities are: efficiency of the power plant; the growth (or re-growth) rate of the forest that supplies the wood; and the residue fraction of the wood. Length of the policy horizon is also an important factor in determining the footprint. The paper concludes that because of their significance and variability, baseline choices should be made very explicit in biofuel carbon footprints. - Highlights: ► Four baseline types for biofuel footprinting are identified. ► One type, ‘biomass opportunity cost’, is defined mathematically and graphically. ► Choice of baseline can dramatically affect the footprint result. ► The ‘no baseline’ approach is not acceptable. ► Choice between the other three baselines depends on the question being addressed.

  13. Enhanced UWB Radio Channel Model for Short-Range Communication Scenarios Including User Dynamics

    DEFF Research Database (Denmark)

    Kovacs, Istvan Zsolt; Nguyen, Tuan Hung; Eggers, Patrick Claus F.

    2005-01-01

    channel model represents an enhancement of the existing IEEE 802.15.3a/4a PAN channel model, where antenna and user-proximity effects are not included. Our investigations showed that significant variations of the received wideband power and time-delay signal clustering are possible due to the human body...

  14. Aggregated Demand Modelling Including Distributed Generation, Storage and Demand Response

    OpenAIRE

    Marzooghi, Hesamoddin; Hill, David J.; Verbic, Gregor

    2014-01-01

    It is anticipated that penetration of renewable energy sources (RESs) in power systems will increase further in the next decades mainly due to environmental issues. In the long term of several decades, which we refer to in terms of the future grid (FG), balancing between supply and demand will become dependent on demand actions including demand response (DR) and energy storage. So far, FG feasibility studies have not considered these new demand-side developments for modelling future demand. I...

  15. Long baseline neutrino oscillation experiments

    International Nuclear Information System (INIS)

    Gallagher, H.

    2006-01-01

    In this paper I will review briefly the experimental results which established the existence of neutrino mixing, the current generation of long baseline accelerator experiments, and the prospects for the future. In particular I will focus on the recent analysis of the MINOS experiment. (author)

  16. Baseline composition of solar energetic particles

    International Nuclear Information System (INIS)

    Meyer, J.

    1985-01-01

    We analyze all existing spacecraft observations of the highly variable heavy element composition of solar energetic particles (SEP) during non-³He-rich events. All data show the imprint of an ever-present basic composition pattern (dubbed "mass-unbiased baseline" SEP composition) that differs from the photospheric composition by a simple bias related to first ionization potential (FIP). In each particular observation, this mass-unbiased baseline composition is being distorted by an additional bias, which is always a monotonic function of mass (or Z). This latter bias varies in amplitude and even sign from observation to observation. To first order, it seems related to differences in the A/Z* ratio between elements (Z* = mean effective charge)

  17. Prevalence of Invalid Performance on Baseline Testing for Sport-Related Concussion by Age and Validity Indicator.

    Science.gov (United States)

    Abeare, Christopher A; Messa, Isabelle; Zuccato, Brandon G; Merker, Bradley; Erdodi, Laszlo

    2018-03-12

    Estimated base rates of invalid performance on baseline testing (base rates of failure) for the management of sport-related concussion range from 6.1% to 40.0%, depending on the validity indicator used. The instability of this key measure represents a challenge in the clinical interpretation of test results that could undermine the utility of baseline testing. To determine the prevalence of invalid performance on baseline testing and to assess whether the prevalence varies as a function of age and validity indicator. This retrospective, cross-sectional study included data collected between January 1, 2012, and December 31, 2016, from a clinical referral center in the Midwestern United States. Participants included 7897 consecutively tested, equivalently proportioned male and female athletes aged 10 to 21 years, who completed baseline neurocognitive testing for the purpose of concussion management. Baseline assessment was conducted with the Immediate Postconcussion Assessment and Cognitive Testing (ImPACT), a computerized neurocognitive test designed for assessment of concussion. Base rates of failure on published ImPACT validity indicators were compared within and across age groups. Hypotheses were developed after data collection but prior to analyses. Of the 7897 study participants, 4086 (51.7%) were male, mean (SD) age was 14.71 (1.78) years, 7820 (99.0%) were primarily English speaking, and the mean (SD) educational level was 8.79 (1.68) years. The base rate of failure ranged from 6.4% to 47.6% across individual indicators. Most of the sample (55.7%) failed at least 1 of 4 validity indicators. The base rate of failure varied considerably across age groups (117 of 140 [83.6%] for those aged 10 years to 14 of 48 [29.2%] for those aged 21 years), representing a risk ratio of 2.86 (95% CI, 2.60-3.16; P < .001). The base rate of failure varied as a function of the validity indicator and the age of the examinee. The strong age association, with 3 of 4 participants aged 10 to 12 years failing validity indicators, suggests that the

  18. The LIFE Cognition Study: design and baseline characteristics

    Science.gov (United States)

    Sink, Kaycee M; Espeland, Mark A; Rushing, Julia; Castro, Cynthia M; Church, Timothy S; Cohen, Ronald; Gill, Thomas M; Henkin, Leora; Jennings, Janine M; Kerwin, Diana R; Manini, Todd M; Myers, Valerie; Pahor, Marco; Reid, Kieran F; Woolard, Nancy; Rapp, Stephen R; Williamson, Jeff D

    2014-01-01

    Observational studies have shown beneficial relationships between exercise and cognitive function. Some clinical trials have also demonstrated improvements in cognitive function in response to moderate–high intensity aerobic exercise; however, these have been limited by relatively small sample sizes and short durations. The Lifestyle Interventions and Independence for Elders (LIFE) Study is the largest and longest randomized controlled clinical trial of physical activity with cognitive outcomes, in older sedentary adults at increased risk for incident mobility disability. One LIFE Study objective is to evaluate the effects of a structured physical activity program on changes in cognitive function and incident all-cause mild cognitive impairment or dementia. Here, we present the design and baseline cognitive data. At baseline, participants completed the modified Mini Mental Status Examination, Hopkins Verbal Learning Test, Digit Symbol Coding, Modified Rey–Osterrieth Complex Figure, and a computerized battery, selected to be sensitive to changes in speed of processing and executive functioning. During follow up, participants completed the same battery, along with the Category Fluency for Animals, Boston Naming, and Trail Making tests. The description of the mild cognitive impairment/dementia adjudication process is presented here. Participants with worse baseline Short Physical Performance Battery scores (prespecified at ≤7) had significantly lower median cognitive test scores compared with those having scores of 8 or 9 with modified Mini Mental Status Examination score of 91 versus (vs) 93, Hopkins Verbal Learning Test delayed recall score of 7.4 vs 7.9, and Digit Symbol Coding score of 45 vs 48, respectively (all P<0.001). The LIFE Study will contribute important information on the effects of a structured physical activity program on cognitive outcomes in sedentary older adults at particular risk for mobility impairment. In addition to its importance in the

  19. Idaho National Laboratory’s Greenhouse Gas FY08 Baseline

    Energy Technology Data Exchange (ETDEWEB)

    Jennifer D. Morton

    2011-06-01

    A greenhouse gas (GHG) inventory is a systematic attempt to account for the production and release of certain gases generated by an institution from various emission sources. The gases of interest are those identified by climate science as contributors to anthropogenic global climate change. This document presents an inventory of GHGs generated during fiscal year (FY) 2008 by Idaho National Laboratory (INL), a Department of Energy (DOE)-sponsored entity located in southeastern Idaho. Concern about the environmental impact of GHGs has grown in recent years, and this concern, together with a desire to decrease harmful environmental impacts, is reason enough to calculate a baseline estimate of total GHGs generated at INL. Additionally, INL wishes to see how its emissions compare with those of similar institutions, including other DOE national laboratories. Executive Order 13514 requires that federal agencies and institutions document future reductions in GHG emissions, and such documentation will require a baseline against which reductions can be measured. INL's FY08 GHG inventory was calculated according to methodologies identified in federal GHG guidance documents, using operational control boundaries. It measures emissions in three Scopes: (1) emissions produced directly by INL through stationary or mobile combustion and fugitive releases; (2) INL's share of the emissions generated by entities from which it purchased electrical power; and (3) indirect or shared emissions generated by outsourced activities that benefit INL (they occur outside INL's organizational boundaries but are a consequence of INL's activities). The inventory found that INL generated a total of 113,049 MT of CO2-equivalent emissions during FY08. The following conclusions were drawn from the individual contributors to INL's baseline GHG inventory: (1) Electricity (including the associated transmission and

  20. Low-dose budesonide treatment reduces severe asthma-related events in patients with infrequent asthma symptoms at baseline

    DEFF Research Database (Denmark)

    Reddel, H. K.; Busse, W. W.; Pedersen, Søren

    2015-01-01

    symptoms, evidence is lacking for the benefit of ICS and the safety of bronchodilator-only treatment. We investigated asthma outcomes by baseline symptom frequency in a post-hoc analysis of the multinational inhaled Steroid Treatment As Regular Therapy in early asthma (START) study. METHODS: Patients aged 4......-66 years with recent-onset mild asthma (11 years] or 200 μg [patients aged 2 symptom days/week; further divided into 0-1, >1-2 symptom days/week). RESULTS: Overall, 7138 patients were included (budesonide, n=3577; placebo, n=3561). At baseline, symptom frequency was 0-1 symptom days/week for 2184 (30...... even in patients with the lowest baseline asthma symptom frequency (0-1 days/week).

  1. Revised CDM baseline study on fuel use and manure management at household level

    Energy Technology Data Exchange (ETDEWEB)

    Buysman, E.; Bryan, S.; Pino, M.

    2010-05-15

    This report presents the revised version of the original CDM baseline study conducted in 2006. The original study was conducted under the authority of the National Biogas Program (NBP) to study the potential GHG mitigation resulting from the adoption of domestic biodigesters. In early June 2006, a survey of 300 randomly selected households with the technical potential for a biodigester was conducted in the NBP's six targeted provinces (Kampong Cham, Svay Rieng, Prey Veng, Kampong Speu, Takeo and Kandal) in southeast Cambodia. The revised baseline study includes two additional provinces, Kampot and Kampong Chhnang. The survey showed that a significant proportion of the households have no access to basic sanitation and often have health problems. They consume mainly wood as cooking fuel, the majority use inefficient cooking stoves, and the main lighting fuel is kerosene. GHG emissions were calculated for each type of Animal Waste Management System (AWMS) and for the baseline fuel consumption. The main methodology used is the GS-VER biodigester methodology, together with the IPCC 2006 guidelines, to estimate ex-ante the baseline emissions, project emissions, and emission reductions. Emissions from wood burning are considered only when the wood originates from a non-renewable source; the NRB analysis determined a non-renewable biomass share of 70.7% for both collected and purchased wood. Total GHG emission is calculated by combining AWMS and wood-fuel emissions. The annual baseline and project emissions were estimated at 5.38 tCO2eq and 0.46 tCO2eq, respectively, per average household; the emission reduction (ER) is therefore 4.92 tCO2eq/household/year.
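
    The per-household arithmetic in the record reduces to subtracting project emissions from baseline emissions. A minimal sketch of that calculation, using the figures reported above (the function name is illustrative, not from the study):

```python
def emission_reduction(baseline_tco2eq: float, project_tco2eq: float) -> float:
    """Annual emission reduction (ER) per household, in tCO2-equivalent."""
    return baseline_tco2eq - project_tco2eq

# Figures reported in the revised Cambodian baseline study:
er = emission_reduction(baseline_tco2eq=5.38, project_tco2eq=0.46)
print(round(er, 2))  # 4.92 tCO2eq/household/year
```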

  2. Magical properties of a 2540 km baseline superbeam experiment

    International Nuclear Information System (INIS)

    Raut, Sushant K.; Singh, Ravi Shanker; Uma Sankar, S.

    2011-01-01

    Lack of any information on the CP-violating phase δ_CP weakens our ability to determine the neutrino mass hierarchy. The magic baseline of 7500 km was proposed to overcome this problem; however, to obtain large enough fluxes at this very long baseline, one needs new techniques for generating high-intensity neutrino beams. In this Letter, we highlight the magical properties of a 2540 km baseline. At such a baseline, using a narrow-band neutrino superbeam whose no-oscillation event rate peaks around an energy of 3.5 GeV, we can determine the neutrino mass hierarchy independently of the CP phase. For sin²2θ₁₃ ≥ 0.05, a very modest exposure of 10 kiloton-years is sufficient to determine the hierarchy. For 0.02 ≤ sin²2θ₁₃ ≤ 0.05, an exposure of about 100 kiloton-years is needed.

  3. Tiotropium improves lung function, exacerbation rate, and asthma control, independent of baseline characteristics including age, degree of airway obstruction, and allergic status

    DEFF Research Database (Denmark)

    Kerstjens, Huib A M; Moroni-Zentgraf, Petra; Tashkin, Donald P

    2016-01-01

    BACKGROUND: Many patients with asthma remain symptomatic despite treatment with inhaled corticosteroids (ICS) with or without long-acting β2-agonists (LABAs). Tiotropium add-on to ICS plus a LABA has been shown to improve lung function and reduce exacerbation risk in patients with symptomatic asthma. OBJECTIVE: To determine whether the efficacy of tiotropium add-on therapy is dependent on patients' baseline characteristics. METHODS: Two randomized, double-blind, parallel-group, twin trials (NCT00772538 and NCT00776984) of once-daily tiotropium Respimat® 5 μg add-on to ICS plus a LABA were performed in parallel in patients with severe symptomatic asthma. Exploratory subgroup analyses of peak forced expiratory volume in 1 s (FEV1), trough FEV1, time to first severe exacerbation, time to first episode of asthma worsening, and seven-question Asthma Control Questionnaire responder rate were......

  4. Background selection as baseline for nucleotide variation across the Drosophila genome.

    Directory of Open Access Journals (Sweden)

    Josep M Comeron

    2014-06-01

    Full Text Available The constant removal of deleterious mutations by natural selection causes a reduction in neutral diversity and efficacy of selection at genetically linked sites (a process called Background Selection, BGS. Population genetic studies, however, often ignore BGS effects when investigating demographic events or the presence of other types of selection. To obtain a more realistic evolutionary expectation that incorporates the unavoidable consequences of deleterious mutations, we generated high-resolution landscapes of variation across the Drosophila melanogaster genome under a BGS scenario independent of polymorphism data. We find that BGS plays a significant role in shaping levels of variation across the entire genome, including long introns and intergenic regions distant from annotated genes. We also find that a very large percentage of the observed variation in diversity across autosomes can be explained by BGS alone, up to 70% across individual chromosome arms at 100-kb scale, thus indicating that BGS predictions can be used as baseline to infer additional types of selection and demographic events. This approach allows detecting several outlier regions with signal of recent adaptive events and selective sweeps. The use of a BGS baseline, however, is particularly appropriate to investigate the presence of balancing selection and our study exposes numerous genomic regions with the predicted signature of higher polymorphism than expected when a BGS context is taken into account. Importantly, we show that these conclusions are robust to the mutation and selection parameters of the BGS model. Finally, analyses of protein evolution together with previous comparisons of genetic maps between Drosophila species, suggest temporally variable recombination landscapes and, thus, local BGS effects that may differ between extant and past phases. Because genome-wide BGS and temporal changes in linkage effects can skew approaches to estimate demographic and

  5. Baseline design/economics for advanced Fischer-Tropsch technology. Quarterly report, January--March 1992

    Energy Technology Data Exchange (ETDEWEB)

    1992-09-01

    The objectives of the study are to: develop a baseline design for indirect liquefaction using advanced Fischer-Tropsch (F-T) technology; prepare the capital and operating costs for the baseline design; and develop a process flow sheet simulation (PFS) model. This report summarizes the activities completed during the period December 23, 1991 through March 15, 1992. In Task 1, Baseline Design and Alternates, the following activities related to the tradeoff studies were completed: approach and basis; oxygen purity; F-T reactor pressure; wax yield; autothermal reformer; hydrocarbon (C{sub 3}/C{sub 4}) recovery; and hydrogen recovery. In Task 3, Engineering Design Criteria, activities were initiated to support the process tradeoff studies in Task 1 and to develop the environmental strategy for the Illinois site. The work completed to date consists of the development of the F-T reactor yield correlation from the Mobil data and a brief review of the environmental strategy prepared for the same site in the direct liquefaction baseline study. Some work has also been done in establishing site-related criteria, in establishing the maximum vessel diameter for train sizing, and in coping with the low H{sub 2}/CO ratio from the Shell gasifier. In Task 7, Project Management and Administration, the following activities were completed: the subcontract agreement between Amoco and Bechtel was negotiated; a first technical progress meeting was held at the Bechtel office in February; and the final Project Management Plan was approved by PETC and issued in March 1992.

  6. Forecasting Sensorimotor Adaptability from Baseline Inter-Trial Correlations

    Science.gov (United States)

    Beaton, K. H.; Bloomberg, J. J.

    2014-01-01

    One of the greatest challenges surrounding adaptation to the spaceflight environment is the large variability in symptoms, and corresponding functional impairments, from one crewmember to the next. This renders preflight training and countermeasure development difficult, as a "one-size-fits-all" approach is inappropriate. Therefore, it would be highly advantageous to know ahead of time which crewmembers might have more difficulty adjusting to the novel g-levels inherent to spaceflight. Such knowledge could guide individually customized countermeasures, which would enable more efficient use of crew time, both preflight and inflight, and provide better outcomes. The primary goal of this project is to look for a baseline performance metric that can forecast sensorimotor adaptability without exposure to an adaptive stimulus. We propose a novel hypothesis that considers baseline inter-trial correlations, the trial-to-trial fluctuations in motor performance, as a predictor of individual sensorimotor adaptive capabilities. To date, a strong relationship has been found between baseline inter-trial correlations and adaptability in two oculomotor systems. For this project, we will explore an analogous predictive mechanism in the locomotion system. METHODS: Baseline Inter-trial Correlations: Inter-trial correlations specify the relationships among repeated trials of a given task that transpire as a consequence of correcting for previous performance errors over multiple timescales. We can quantify the strength of inter-trial correlations by measuring the decay of the autocorrelation function (ACF), which describes how rapidly information from past trials is "forgotten." Processes whose ACFs decay more slowly exhibit longer-term inter-trial correlations (longer memory processes), while processes whose ACFs decay more rapidly exhibit shorter-term inter-trial correlations (shorter memory processes). Longer-term correlations reflect low-frequency activity, which is more easily
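
    The ACF-decay metric described in the abstract can be sketched as follows: compute the sample autocorrelation of a series of trial performance measures and compare how quickly it decays. This is a generic illustration, not the project's actual analysis pipeline; the simulated series and parameters are purely illustrative.

```python
import numpy as np

def autocorrelation(x, max_lag):
    """Sample autocorrelation function (ACF) of a 1-D series of trial
    performance measures, for lags 0..max_lag."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    var = np.dot(x, x)
    return np.array([np.dot(x[:len(x) - k], x[k:]) / var
                     for k in range(max_lag + 1)])

# An AR(1)-like process (longer memory) decays more slowly than
# white noise (shorter memory), mimicking the two regimes described.
rng = np.random.default_rng(0)
noise = rng.standard_normal(500)
ar1 = np.empty(500)
ar1[0] = noise[0]
for t in range(1, 500):
    ar1[t] = 0.9 * ar1[t - 1] + noise[t]

acf_noise = autocorrelation(noise, 10)
acf_ar1 = autocorrelation(ar1, 10)
print(acf_ar1[5] > acf_noise[5])  # longer-memory process retains more correlation
```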

  7. S5-4: Formal Modeling of Affordance in Human-Included Systems

    Directory of Open Access Journals (Sweden)

    Namhun Kim

    2012-10-01

    Full Text Available Although it is necessary to model, analyze, and control human-included systems, doing so has been considered a challenging problem because of the critical role humans play in complex systems and because of humans' capability to execute unanticipated actions, both beneficial and detrimental. Thus, to provide systematic approaches to modeling human actions as part of system behavior, a formal modeling framework is presented for human-involved systems in which humans play a controlling role based on their perceptual information. The theory of affordance provides definitions of human actions and their associated properties; Finite State Automata (FSA)-based modeling is capable of mapping nondeterministic humans into computable components in the system representation. In this talk, we investigate the role of perception in human actions during system operation and examine the representation of perceptual elements in the affordance-based modeling formalism. The proposed framework is expected to capture the natural ways in which humans participate in the system as part of its operation. A human-machine cooperative manufacturing system control example and a human-agent simulation example are introduced for illustrative purposes at the end of the presentation.
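
    The pairing of affordance theory with FSA-based modeling can be illustrated with a toy automaton in which a transition fires only when both a human action and its perceptual precondition (the affordance) are present. All states, actions, and predicates below are invented for illustration; this is not the formalism from the talk.

```python
# Toy affordance-gated finite state automaton (illustrative only):
# a transition on action `a` is enabled in state `s` only if the
# corresponding perceptual precondition (affordance) currently holds.
class AffordanceFSA:
    def __init__(self, transitions, affordances, start):
        self.transitions = transitions  # (state, action) -> next state
        self.affordances = affordances  # (state, action) -> percept predicate
        self.state = start

    def step(self, action, percept):
        key = (self.state, action)
        if key in self.transitions and self.affordances[key](percept):
            self.state = self.transitions[key]
            return True
        return False  # action is not afforded in the current state

fsa = AffordanceFSA(
    transitions={("idle", "pick"): "holding", ("holding", "place"): "idle"},
    affordances={("idle", "pick"): lambda p: p["part_visible"],
                 ("holding", "place"): lambda p: p["fixture_clear"]},
    start="idle",
)
fsa.step("pick", {"part_visible": True})  # afforded: idle -> holding
print(fsa.state)  # holding
```

    Nondeterministic human behavior enters through the percept argument: the same action succeeds or fails depending on what the human currently perceives, which is what makes the composed system analyzable as an FSA.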

  8. 2016 Annual Technology Baseline (ATB)

    Energy Technology Data Exchange (ETDEWEB)

    Cole, Wesley; Kurup, Parthiv; Hand, Maureen; Feldman, David; Sigrin, Benjamin; Lantz, Eric; Stehly, Tyler; Augustine, Chad; Turchi, Craig; O' Connor, Patrick; Waldoch, Connor

    2016-09-01

    Consistent cost and performance data for various electricity generation technologies can be difficult to find and may change frequently for certain technologies. With the Annual Technology Baseline (ATB), the National Renewable Energy Laboratory provides an organized and centralized dataset that was reviewed by internal and external experts. It uses the best information from the Department of Energy laboratory's renewable energy analysts and Energy Information Administration information for conventional technologies. The ATB will be updated annually in order to provide an up-to-date repository of current and future cost and performance data. Going forward, we plan to revise and refine the values using the best available information. The ATB includes both a presentation with notes (PDF) and an associated Excel workbook. The ATB covers the following electricity generation technologies: land-based wind; offshore wind; utility-scale solar PV; concentrating solar power; geothermal power; hydropower plants (upgrades to existing facilities, powering non-powered dams, and new stream-reach development); conventional coal; coal with carbon capture and sequestration; integrated gasification combined-cycle coal; natural gas combustion turbines; natural gas combined cycle; conventional biopower; and nuclear.

  9. Modeling of the Direct Current Generator Including the Magnetic Saturation and Temperature Effects

    Directory of Open Access Journals (Sweden)

    Alfonso J. Mercado-Samur

    2013-11-01

    Full Text Available In this paper, the inclusion of the temperature effect on the field resistance in the direct current generator model DC1A, which is valid for stability studies, is proposed. First, the linear generator model is presented; then the effects of magnetic saturation and of the change in resistance due to the temperature produced by the field current are included. Experimental results are compared with model simulations to validate the model, yielding a direct current generator model that better represents the machine. Visual comparison between simulations and experimental results shows the success of the proposed model, as it presents the lowest error among the compared models. The accuracy of the proposed model is confirmed by a Modified Normalized Sum of Squared Errors index equal to 3.8979%.
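
    The temperature dependence of a copper field winding is conventionally modeled with the linear resistance-temperature law R(T) = R_ref·(1 + α·(T − T_ref)). The abstract does not state which law the authors use, so the sketch below is an assumption; the resistance values and temperatures are illustrative.

```python
def field_resistance(r_ref: float, temp: float, temp_ref: float = 25.0,
                     alpha: float = 0.00393) -> float:
    """Field-winding resistance (ohms) at temperature `temp` (deg C),
    assuming the linear model for copper:
    R(T) = R_ref * (1 + alpha * (T - T_ref)),
    with alpha = 0.00393 1/degC, the standard coefficient for copper."""
    return r_ref * (1.0 + alpha * (temp - temp_ref))

# Field current heats an (illustrative) 100-ohm winding from 25 to 75 degC:
print(round(field_resistance(100.0, 75.0), 2))  # 119.65 ohms
```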

  10. Operationalizing clean development mechanism baselines: A case study of China's electrical sector

    Science.gov (United States)

    Steenhof, Paul A.

    The global carbon market is rapidly developing as the first commitment period of the Kyoto Protocol draws closer and Parties to the Protocol with greenhouse gas (GHG) emission reduction targets seek alternative ways to reduce their emissions. The Protocol includes the Clean Development Mechanism (CDM), a tool that encourages project-based investments in developing nations that will lead to an additional reduction in emissions. Because of its economic size and rate of growth, its technological characteristics, and its reliance on coal, China contains a large proportion of the global CDM potential. As China's economy modernizes, more technologies and processes require electricity, and demand for this energy source is accelerating rapidly. The relatively inefficient technology used to generate electricity in China thus gives the electrical sector substantial GHG emission reduction opportunities under the CDM. To ensure the credibility of the CDM in leading to a reduction in GHG emissions, it is important that the baseline method used in the CDM approval process be scientifically sound and accessible, both for others to use and for evaluation purposes. Three different methods for assessing CDM baselines and environmental additionality are investigated in the context of China's electrical sector: a method based on a historical perspective of the electrical sector (factor decomposition), a method structured upon a current perspective (operating and build margins), and a simulation of the future (dispatch analysis). 
Assessing future emission levels for China's electrical sector is a very challenging task given the complexity of the system, its dynamics, and that it is heavily influenced by internal and external forces, but of the different baseline methods investigated, dispatch modelling is best suited for the Chinese context as it is able to consider the important regional and temporal dimensions of its economy and its future development

  11. Estimating baseline risks from biouptake and food ingestion at a contaminated site

    International Nuclear Information System (INIS)

    MacDonell, M.; Woytowich, K.; Blunt, D.; Picel, M.

    1993-01-01

    Biouptake of contaminants and subsequent human exposure via food ingestion represent a public concern at many contaminated sites. Site-specific measurements from plant and animal studies are usually quite limited, so this exposure pathway is often modeled to assess the potential for adverse health effects. A modeling tool was applied to evaluate baseline risks at a contaminated site in Missouri, and the results were used to confirm that ingestion of fish and game animals from the site area does not pose a human health threat. Results were also used to support the development of cleanup criteria for site soil.

  12. The effect of endoscopic fundoplication and proton pump inhibitors on baseline impedance and heartburn severity in GERD patients.

    Science.gov (United States)

    Rinsma, N F; Farré, R; Bouvy, N D; Masclee, A A M; Conchillo, J M

    2015-02-01

    Antireflux therapy may lead to recovery of impaired mucosal integrity in gastro-esophageal reflux disease (GERD) patients, as reflected by an increase in baseline impedance. The study objective was to evaluate the effect of endoscopic fundoplication and proton pump inhibitor (PPI) therapy on baseline impedance and heartburn severity in GERD patients. Forty-seven GERD patients randomized to endoscopic fundoplication (n = 32) or PPI therapy (n = 15), and 29 healthy controls, were included. Before randomization and 6 months after treatment, baseline impedance was obtained during 24-h pH-impedance monitoring. Heartburn severity was evaluated using the GERD-HRQL questionnaire. Before treatment, baseline impedance in GERD patients was lower than in healthy controls (p heartburn severity indicates that other factors may contribute to heartburn perception in GERD. © 2014 John Wiley & Sons Ltd.

  13. BioModels: expanding horizons to include more modelling approaches and formats.

    Science.gov (United States)

    Glont, Mihai; Nguyen, Tung V N; Graesslin, Martin; Hälke, Robert; Ali, Raza; Schramm, Jochen; Wimalaratne, Sarala M; Kothamachu, Varun B; Rodriguez, Nicolas; Swat, Maciej J; Eils, Jurgen; Eils, Roland; Laibe, Camille; Malik-Sheriff, Rahuman S; Chelliah, Vijayalakshmi; Le Novère, Nicolas; Hermjakob, Henning

    2018-01-04

    BioModels serves as a central repository of mathematical models representing biological processes. It offers a platform to make mathematical models easily shareable across the systems modelling community, thereby supporting model reuse. To facilitate hosting a broader range of model formats derived from diverse modelling approaches and tools, a new infrastructure for BioModels has been developed that is available at http://www.ebi.ac.uk/biomodels. This new system allows submitting and sharing of a wide range of models with improved support for formats other than SBML. It also offers a version-control backed environment in which authors and curators can work collaboratively to curate models. This article summarises the features available in the current system and discusses the potential benefit they offer to the users over the previous system. In summary, the new portal broadens the scope of models accepted in BioModels and supports collaborative model curation which is crucial for model reproducibility and sharing. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  14. Probing neutrino oscillations jointly in long and very long baseline experiments

    International Nuclear Information System (INIS)

    Wang, Y.F.; Whisnant, K.; Young Binglin; Xiong Zhaohua; Yang Jinmin

    2002-01-01

    We examine the prospects of making a joint analysis of neutrino oscillations at two baselines with neutrino superbeams. Assuming narrow-band superbeams and a 100 kiloton water Cherenkov calorimeter, we calculate the event rates and sensitivities to the matter effect, the signs of the neutrino mass differences, the CP phase, and the mixing angle θ₁₃. Taking into account all possible experimental errors under general consideration, we explore the optimum cases of a narrow-band beam to measure the matter effect and the CP violation effect at all baselines up to 3000 km. We then focus on two specific baselines, a long baseline of 300 km and a very long baseline of 2100 km, and analyze their joint capabilities. We find that the joint analysis can offer extra leverage to resolve some of the ambiguities that are associated with measurement at a single baseline

  15. Clinical outcomes of linezolid and vancomycin in patients with nosocomial pneumonia caused by methicillin-resistant Staphylococcus aureus stratified by baseline renal function: a retrospective, cohort analysis.

    Science.gov (United States)

    Liu, Ping; Capitano, Blair; Stein, Amy; El-Solh, Ali A

    2017-05-22

    The primary objective of this study is to assess whether baseline renal function impacts treatment outcomes of linezolid and vancomycin (with a dose-optimized regimen) for methicillin-resistant Staphylococcus aureus (MRSA) pneumonia. We conducted a retrospective cohort analysis of data generated from a prospective, randomized, controlled clinical trial (NCT00084266). The analysis included 405 patients with culture-proven MRSA pneumonia. Baseline renal function was stratified based on creatinine clearance. Clinical and microbiological success rates and the presence of nephrotoxicity were assessed at the end of treatment (EOT) and end of study (EOS). Multivariate logistic regression analyses of baseline patient characteristics, including treatment, were performed to identify independent predictors of efficacy. Vancomycin concentrations were analyzed using a nonlinear mixed-effects modeling approach. The relationships between vancomycin exposures, pharmacokinetic-pharmacodynamic indices (trough concentration, area under the curve over a 24-h interval [AUC0-24], and AUC0-24/MIC) and efficacy/nephrotoxicity were assessed in MRSA pneumonia patients using univariate logistic regression or Cox proportional hazards regression analysis. After controlling for use of vasoactive agents, choice of antibiotic therapy, and bacteremia, baseline renal function was not correlated with clinical and microbiological success in MRSA pneumonia at either end of treatment or end of study for both treatment groups. No positive association was identified between vancomycin exposures and efficacy in these patients. Higher vancomycin exposures were correlated with an increased risk of nephrotoxicity (e.g., hazard ratio [95% confidence interval] for a 5 μg/ml increase in trough concentration: 1.42 [1.10, 1.82]). In non-dialysis patients, baseline renal function did not impact the differences in efficacy or nephrotoxicity with treatment of linezolid versus vancomycin in MRSA

  16. A thermal conductivity model for nanofluids including effect of the temperature-dependent interfacial layer

    International Nuclear Information System (INIS)

    Sitprasert, Chatcharin; Dechaumphai, Pramote; Juntasaro, Varangrat

    2009-01-01

    The interfacial layer of nanoparticles has recently been shown to have an effect on the thermal conductivity of nanofluids. There is, however, still no thermal conductivity model that includes the effects of temperature and nanoparticle size variations on the thickness, and consequently on the thermal conductivity, of the interfacial layer. In the present work, the stationary model developed by Leong et al. (J Nanopart Res 8:245-254, 2006) is initially modified to include the thermal dispersion effect due to the Brownian motion of nanoparticles; this model is called the Leong et al. dynamic model. However, the Leong et al. dynamic model over-predicts the thermal conductivity of nanofluids in the case of a flowing fluid. This suggests that the enhancement in the thermal conductivity of flowing nanofluids due to the increase in temperature does not come from the thermal dispersion effect; it is more likely that the enhancement in heat transfer of flowing nanofluids comes from the temperature-dependent interfacial layer effect. Therefore, the Leong et al. stationary model is again modified to include the effect of temperature variation on the thermal conductivity of the interfacial layer for different sizes of nanoparticles. This present model is then evaluated and compared with the other thermal conductivity models for turbulent convective heat transfer in nanofluids along a uniformly heated tube. The results show that the present model is more general than the other models in the sense that it can predict both the temperature and the volume fraction dependence of the thermal conductivity of nanofluids for both non-flowing and flowing fluids. Also, it is found to be more accurate than the other models due to the inclusion of the effect of the temperature-dependent interfacial layer. In conclusion, the present model can accurately predict the changes in thermal conductivity of nanofluids due to the changes in volume fraction and temperature for

  17. Energy reconstruction in the long-baseline neutrino experiment.

    Science.gov (United States)

    Mosel, U; Lalakulich, O; Gallmeister, K

    2014-04-18

    The Long-Baseline Neutrino Experiment aims at measuring fundamental physical parameters to high precision and exploring physics beyond the standard model. Nuclear targets introduce complications towards that aim. We investigate the uncertainties in the energy reconstruction, based on quasielastic scattering relations, due to nuclear effects. The reconstructed event distributions as a function of energy tend to be smeared out and shifted by several hundred MeV in their oscillatory structure if standard event selection is used. We show that a more restrictive experimental event selection offers the possibility to reach the accuracy needed for a determination of the mass ordering and the CP-violating phase. Quasielastic-based energy reconstruction could thus be a viable alternative to calorimetric reconstruction, also at higher energies.
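
    The quasielastic scattering relation behind this kind of reconstruction is conventionally the two-body kinematic formula that infers the neutrino energy from the outgoing muon's energy and angle, assuming scattering off a neutron at rest. A sketch of that standard formula (the binding-energy value is an illustrative choice, not from the paper):

```python
import math

M_N, M_P, M_MU = 0.93957, 0.93827, 0.10566  # neutron, proton, muon masses (GeV)

def e_nu_qe(e_mu: float, cos_theta: float, e_bind: float = 0.025) -> float:
    """Reconstructed neutrino energy (GeV) from charged-current quasielastic
    kinematics: scattering off a neutron at rest with an effective binding
    energy `e_bind`, given the muon energy and scattering angle."""
    m_eff = M_N - e_bind
    p_mu = math.sqrt(e_mu**2 - M_MU**2)
    num = 2.0 * m_eff * e_mu - (m_eff**2 + M_MU**2 - M_P**2)
    den = 2.0 * (m_eff - e_mu + p_mu * cos_theta)
    return num / den

# A 1 GeV muon at a forward angle reconstructs to roughly 1.15 GeV:
print(round(e_nu_qe(1.0, 0.9), 3))
```

    The nuclear effects discussed in the record (Fermi motion, non-QE events misidentified as QE) are precisely what smear and shift the energies this formula returns.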

  18. Carbon emission coefficient of power consumption in India: baseline determination from the demand side

    International Nuclear Information System (INIS)

    Nag, Barnali; Parikh, J.K.

    2005-01-01

    Substantial investments are expected in the Indian power sector under the flexibility mechanisms (CDM/JI) laid down in Article 12 of the Kyoto Protocol. In this context, it is important to evolve a detailed framework for baseline construction in the power sector so as to incorporate the major factors that would affect the baseline values directly or indirectly. It is also important to establish carbon coefficients for electricity generation to help define accurate project boundaries for numerous electricity conservation and DSM schemes. The objective of this paper is to provide (i) time series estimates of indirect carbon emissions per unit of power consumption (which can also be thought of as the emission coefficient of power consumption) and (ii) baseline emissions for the power sector until 2015. Annual time series data on the Indian electricity generating industry for 1974-1998 have been used to develop emission projections to 2015. The impacts of generation mix, fuel efficiency, transmission and distribution losses, and auxiliary consumption are studied in a Divisia decomposition framework, and their possible future impacts on baseline emissions are studied through three scenarios of growth in power consumption. The study also estimates and projects the carbon emission coefficient per unit of final consumption of electricity, which can be used for cost-benefit analysis of the emission reduction potential of several electricity-conserving technologies and for benchmarking policy models.
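
    A Divisia decomposition of this kind is commonly computed in its additive LMDI form, where each factor's contribution to the change in emissions is weighted by the logarithmic mean of total emissions and the contributions sum exactly to the total change. The sketch below uses invented factor values purely for illustration, not the paper's data:

```python
import math

def log_mean(a: float, b: float) -> float:
    """Logarithmic mean, the LMDI weight."""
    return a if a == b else (a - b) / (math.log(a) - math.log(b))

def lmdi_effects(factors_0, factors_t):
    """Additive LMDI decomposition of C = x1 * x2 * ... * xn.
    Returns per-factor contributions that sum exactly to C_t - C_0."""
    c0 = math.prod(factors_0)
    ct = math.prod(factors_t)
    w = log_mean(ct, c0)
    return [w * math.log(xt / x0) for x0, xt in zip(factors_0, factors_t)]

# Illustrative identity: emissions = activity * fuel intensity * emission factor
effects = lmdi_effects((100.0, 0.8, 0.5), (140.0, 0.7, 0.5))
total_change = 140.0 * 0.7 * 0.5 - 100.0 * 0.8 * 0.5
print(abs(sum(effects) - total_change) < 1e-9)  # contributions are additive
```

    The exact additivity (no residual term) is the property that makes the LMDI variant of Divisia decomposition attractive for attributing baseline emission changes to generation mix, efficiency, and losses.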

  19. Effect of tidal cycle and food intake on the baseline plasma corticosterone rhythm in intertidally foraging marine iguanas.

    Science.gov (United States)

    Woodley, Sarah K; Painter, Danika L; Moore, Michael C; Wikelski, Martin; Romero, L Michael

    2003-06-15

    In most species, plasma levels of baseline glucocorticoids such as corticosterone (B) have a circadian rhythm. This rhythm can be entrained by both photoperiod and food intake and is related to aspects of energy intake and metabolism. Marine iguanas (Amblyrhynchus cristatus) offer a unique opportunity to better understand the relative importance of the light:dark cycle versus food intake in influencing the rhythm in baseline B in a natural system. Compared to other species, food intake is not as strictly determined by the phase of the light:dark cycle. Animals feed in the intertidal zone so feeding activity is heavily influenced by the tidal cycle. We measured baseline plasma B levels in free-living iguanas over several 24-h periods that varied in the timing of low tide/foraging activity. We found that baseline B levels were higher during the day relative to night. However, when low tide occurred during the day, baseline B levels dropped coincident with the timing of low tide. Whether the baseline B rhythm (including the drop during foraging) is an endogenous rhythm with a circatidal component, or is simply a result of feeding and associated physiological changes needs to be tested. Together, these data suggest that the baseline B rhythm in marine iguanas is influenced by the tidal cycle/food intake as well as the light:dark cycle.

  20. Improving Ambiguity Resolution for Medium Baselines Using Combined GPS and BDS Dual/Triple-Frequency Observations.

    Science.gov (United States)

    Gao, Wang; Gao, Chengfa; Pan, Shuguo; Wang, Denghui; Deng, Jiadong

    2015-10-30

    The regional constellation of the BeiDou navigation satellite system (BDS) has been providing continuous positioning, navigation and timing services since 27 December 2012, covering China and the surrounding area. Real-time kinematic (RTK) positioning with combined BDS and GPS observations is therefore feasible. Moreover, all BDS satellites transmit triple-frequency signals. Exploiting pseudorange and carrier observations from multiple systems and multiple frequencies is expected to be of much benefit for ambiguity resolution (AR). We propose an integrated AR strategy for medium baselines using combined GPS and BDS dual/triple-frequency observations. In this method, the extra-wide-lane (EWL) ambiguities of the triple-frequency system, i.e., BDS, are determined first. Then the dual-frequency WL ambiguities of BDS and GPS are resolved with the geometry-based model using the BDS ambiguity-fixed EWL observations. After that, the basic (i.e., L1/L2 or B1/B2) ambiguities of BDS and GPS are estimated together with the so-called ionosphere-constrained model, where the ambiguity-fixed WL observations are added to enhance the model strength. During both WL and basic AR, a partial ambiguity fixing (PAF) strategy is adopted to weaken the negative influence of newly risen or low-elevation satellites. Experiments were conducted in which GPS/BDS dual/triple-frequency data were collected in Nanjing and Zhengzhou, China, with baseline distances varying from about 28.6 to 51.9 km. The results indicate that, compared to the single triple-frequency BDS system, the combined system can significantly enhance the AR model strength and thus improve AR performance for medium baselines, with a 75.7% reduction of initialization time on average. Besides, more accurate and stable positioning results can also be derived by using the combined GPS/BDS system.
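
    As a hedged illustration of why WL ambiguities are comparatively easy to fix: the Melbourne-Wübbena combination (a standard tool for this purpose, though not necessarily the exact estimator used in the paper) cancels geometry, clocks, and first-order ionosphere, leaving the wide-lane ambiguity scaled by a long wavelength (about 86 cm for GPS L1/L2):

```python
C = 299_792_458.0              # speed of light, m/s
F1, F2 = 1575.42e6, 1227.60e6  # GPS L1/L2 frequencies, Hz
LAM_WL = C / (F1 - F2)         # wide-lane wavelength, ~0.862 m

def mw_widelane(l1_m, l2_m, p1_m, p2_m):
    """Melbourne-Wübbena combination (all observables in metres).
    Returns the float wide-lane ambiguity estimate in cycles."""
    phase_wl = (F1 * l1_m - F2 * l2_m) / (F1 - F2)  # wide-lane phase combination
    code_nl = (F1 * p1_m + F2 * p2_m) / (F1 + F2)   # narrow-lane code combination
    return (phase_wl - code_nl) / LAM_WL

# synthetic noise-free observables: a 20,000 km range plus integer ambiguities
rho = 20_000_000.0
n1, n2 = 123_456, 123_400            # true integer ambiguities on L1 and L2
l1 = rho + (C / F1) * n1
l2 = rho + (C / F2) * n2
n_wl = round(mw_widelane(l1, l2, rho, rho))  # recovers n1 - n2 = 56
```

    Because LAM_WL is roughly four times the L1 wavelength, code noise of a few decimetres still rounds to the correct integer, which is why the cascaded EWL-to-WL-to-basic strategy fixes the long-wavelength ambiguities first.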

  1. Importance of baseline specification in evaluating conservation interventions and achieving no net loss of biodiversity.

    Science.gov (United States)

    Bull, J W; Gordon, A; Law, E A; Suttle, K B; Milner-Gulland, E J

    2014-06-01

    There is an urgent need to improve the evaluation of conservation interventions. This requires specifying an objective and a frame of reference from which to measure performance. Reference frames can be baselines (i.e., known biodiversity at a fixed point in history) or counterfactuals (i.e., a scenario that would have occurred without the intervention). Biodiversity offsets are interventions with the objective of no net loss of biodiversity (NNL). We used biodiversity offsets to analyze the effects of the choice of reference frame on whether interventions met stated objectives. We developed 2 models to investigate the implications of setting different frames of reference in regions subject to various biodiversity trends and anthropogenic impacts. First, a general analytic model evaluated offsets against a range of baseline and counterfactual specifications. Second, a simulation model then replicated these results with a complex real world case study: native grassland offsets in Melbourne, Australia. Both models showed that achieving NNL depended upon the interaction between reference frame and background biodiversity trends. With a baseline, offsets were less likely to achieve NNL where biodiversity was decreasing than where biodiversity was stable or increasing. With a no-development counterfactual, however, NNL was achievable only where biodiversity was declining. Otherwise, preventing development was better for biodiversity. Uncertainty about compliance was a stronger determinant of success than uncertainty in underlying biodiversity trends. When only development and offset locations were considered, offsets sometimes resulted in NNL, but not across an entire region. Choice of reference frame determined feasibility and effort required to attain objectives when designing and evaluating biodiversity offset schemes. We argue the choice is thus of fundamental importance for conservation policy. Our results shed light on situations in which biodiversity offsets may
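
    The interaction the models identify can be reproduced in a toy calculation (all numbers illustrative, not from the paper): with a declining background trend, a modest offset gain fails NNL against a fixed historical baseline but passes against a no-development counterfactual:

```python
def biodiversity_outcome(b0, trend, years, offset_gain):
    """Toy projection: linear background trend plus the net gain from an offset."""
    return b0 + trend * years + offset_gain

def meets_nnl(outcome, reference):
    """No net loss: the outcome must not fall below the chosen reference."""
    return outcome >= reference

b0, years, gain = 100.0, 10, 5.0
with_offset = biodiversity_outcome(b0, -1.0, years, gain)  # 95.0
counterfactual = b0 + (-1.0) * years                       # 90.0, no intervention

vs_baseline = meets_nnl(with_offset, b0)              # False: short of historical state
vs_counterfactual = meets_nnl(with_offset, counterfactual)  # True: better than no action
```

    Flipping the trend to zero or positive reverses the pattern, which is the paper's central point: the same intervention can pass or fail NNL depending only on the frame of reference.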

  2. Effects of Baseline Selection on Magnetocardiography: P-Q and T-P Intervals

    International Nuclear Information System (INIS)

    Lim, Hyun Kyoon; Kwon, Hyuk Chan; Kim, Tae En; Lee, Yong Ho; Kim, Jin Mok; Kim, In Seon; Kim, Ki Woong; Park, Yong Ki

    2007-01-01

    The baseline selection is the first and most important step in analyzing magnetocardiography (MCG) parameters. There is no difficulty in selecting the baseline between the P- and Q-wave peaks (P-Q interval) of MCG waves recorded from healthy subjects, because the P-Q intervals of healthy subjects do not vary much. However, patients with ischemic heart disease often show an unstable P-Q interval, which does not seem appropriate for the baseline. In this case, the T-P interval is recommended as an alternative baseline. However, there has been no study on the difference made by the baseline selection. In this study, we examined the effect of different baseline selections. MCG data were analyzed from twenty healthy subjects and twenty-one patients whose baselines were alternatively selected in the T-P interval because of their inappropriate P-Q intervals. A paired t-test was used to compare the two sets of data. Fifteen parameters derived from the R-wave peak, the T-wave peak, and the period from T_max/3 to T_max were compared for the different baseline selections. As a result, most parameters did not show significant differences (p>0.05), with only a few exceptions. Therefore, there will be no significant difference whichever of the two intervals is selected for the MCG baseline. However, for consistent analysis, the P-Q interval is strongly recommended for baseline correction.
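
    Baseline correction of this kind amounts to subtracting, channel by channel, the mean amplitude over the chosen reference interval (P-Q or T-P). A minimal sketch; the sample indices and amplitudes are hypothetical, not MCG data from the study:

```python
def baseline_correct(signal, start, stop):
    """Subtract the mean over samples [start, stop) -- the chosen baseline interval."""
    offset = sum(signal[start:stop]) / (stop - start)
    return [s - offset for s in signal]

# hypothetical single-channel trace with a 2.0-unit offset; samples 0-4 play the
# role of the P-Q interval, sample 5 the role of an R-wave peak
trace = [2.0, 2.1, 1.9, 2.0, 2.0, 7.5, 3.0, 2.2]
corrected = baseline_correct(trace, 0, 5)
```

    After correction the reference interval averages to zero, so peak-derived parameters are measured relative to the chosen baseline, which is exactly why the choice of interval could, in principle, shift them.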

  3. Baseline Risk Assessment Supporting Closure at Waste Management Area C at the Hanford Site Washington

    International Nuclear Information System (INIS)

    Singleton, Kristin M.

    2015-01-01

    The Office of River Protection under the U.S. Department of Energy is pursuing closure of the Single-Shell Tank (SST) Waste Management Area (WMA) C under the requirements of the Hanford Federal Facility Agreement and Consent Order (HFFACO). A baseline risk assessment (BRA) of current conditions is based on available characterization data and information collected at WMA C. The baseline risk assessment is being developed as a part of a Resource Conservation and Recovery Act (RCRA) Facility Investigation (RFI)/Corrective Measures Study (CMS) at WMA C that is mandatory under Comprehensive Environmental Response, Compensation, and Liability Act and RCRA corrective action. The RFI/CMS is needed to identify and evaluate the hazardous chemical and radiological contamination in the vadose zone from past releases of waste from WMA C. WMA C will be under Federal ownership and control for the foreseeable future, and managed as an industrial area with restricted access and various institutional controls. The exposure scenarios evaluated under these conditions include Model Toxics Control Act (MTCA) Method C, industrial worker, maintenance and surveillance worker, construction worker, and trespasser scenarios. The BRA evaluates several unrestricted land use scenarios (residential all-pathway, MTCA Method B, and Tribal) to provide additional information for risk management. Analytical results from 13 shallow zone (0 to 15 ft. below ground surface) sampling locations were collected to evaluate human health impacts at WMA C. In addition, soil analytical data were screened against background concentrations and ecological soil screening levels to determine if soil concentrations have the potential to adversely affect ecological receptors. Analytical data from 12 groundwater monitoring wells were evaluated between 2004 and 2013. A screening of groundwater monitoring data against background concentrations and Federal maximum concentration levels was used to determine vadose zone

  4. Baseline Risk Assessment Supporting Closure at Waste Management Area C at the Hanford Site Washington

    Energy Technology Data Exchange (ETDEWEB)

    Singleton, Kristin M. [Washington River Protection Solutions LLC, Richland, WA (United States)

    2015-01-07

    The Office of River Protection under the U.S. Department of Energy is pursuing closure of the Single-Shell Tank (SST) Waste Management Area (WMA) C under the requirements of the Hanford Federal Facility Agreement and Consent Order (HFFACO). A baseline risk assessment (BRA) of current conditions is based on available characterization data and information collected at WMA C. The baseline risk assessment is being developed as a part of a Resource Conservation and Recovery Act (RCRA) Facility Investigation (RFI)/Corrective Measures Study (CMS) at WMA C that is mandatory under Comprehensive Environmental Response, Compensation, and Liability Act and RCRA corrective action. The RFI/CMS is needed to identify and evaluate the hazardous chemical and radiological contamination in the vadose zone from past releases of waste from WMA C. WMA C will be under Federal ownership and control for the foreseeable future, and managed as an industrial area with restricted access and various institutional controls. The exposure scenarios evaluated under these conditions include Model Toxics Control Act (MTCA) Method C, industrial worker, maintenance and surveillance worker, construction worker, and trespasser scenarios. The BRA evaluates several unrestricted land use scenarios (residential all-pathway, MTCA Method B, and Tribal) to provide additional information for risk management. Analytical results from 13 shallow zone (0 to 15 ft. below ground surface) sampling locations were collected to evaluate human health impacts at WMA C. In addition, soil analytical data were screened against background concentrations and ecological soil screening levels to determine if soil concentrations have the potential to adversely affect ecological receptors. Analytical data from 12 groundwater monitoring wells were evaluated between 2004 and 2013. A screening of groundwater monitoring data against background concentrations and Federal maximum concentration levels was used to determine vadose zone

  5. Baseline mean deviation and rates of visual field change in treated glaucoma patients.

    Science.gov (United States)

    Forchheimer, I; de Moraes, C G; Teng, C C; Folgar, F; Tello, C; Ritch, R; Liebmann, J M

    2011-05-01

    To evaluate the relationships between baseline visual field (VF) mean deviation (MD) and subsequent progression in treated glaucoma. Records of patients seen in a glaucoma practice between 1999 and 2009 were reviewed. Patients with glaucomatous optic neuropathy, baseline VF damage, and ≥8 SITA-standard 24-2 VF were included. Patients were divided into tertiles based upon baseline MD. Automated pointwise linear regression determined global and localized rates (decibels (dB) per year) of change. Progression was defined when two or more adjacent test locations in the same hemifield showed a sensitivity decline at a rate of >1.0 dB per year (P<0.01). The proportion of progressing eyes was similar across tertiles (P>0.50), and global rates of VF change of progressing eyes were -1.3±1.2, -1.01±0.7, and -0.9±0.5 dB/year (P=0.09, analysis of variance). Within these groups, intraocular pressure (IOP) in stable vs progressing eyes was 15.5±3.3 vs 17.0±3.1 mm Hg; the differences did not remain significant in univariate (P>0.50) and multivariate (P=0.26) analyses adjusting for differences in follow-up IOP. After correcting for differences in IOP in treated glaucoma patients, we did not find a relationship between the rate of VF change (dB per year) and the severity of the baseline VF MD. This finding may have been due to more aggressive IOP lowering in eyes with more severe disease. Eyes with lower IOP progressed less frequently across the spectrum of VF loss.
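
    The pointwise linear regression underlying the progression criterion fits, at each test location, sensitivity (dB) against time (years), and flags a location whose slope falls below -1.0 dB/year at the required significance. A sketch of the slope fit only, with hypothetical exam data (the significance test is omitted for brevity):

```python
def pointwise_slope(years, sens_db):
    """Ordinary least-squares slope of sensitivity (dB) versus time (years)."""
    n = len(years)
    my, ms = sum(years) / n, sum(sens_db) / n
    num = sum((y - my) * (s - ms) for y, s in zip(years, sens_db))
    den = sum((y - my) ** 2 for y in years)
    return num / den

exams = [0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5]  # 8 fields over 3.5 years
declining = [28.0 - 1.5 * t for t in exams]       # location losing 1.5 dB/year
stable = [28.0 for _ in exams]                    # location with no change

flagged = pointwise_slope(exams, declining) < -1.0  # meets the rate criterion
```

    In the study, an eye is labeled progressing only when two or more adjacent flagged locations lie in the same hemifield, which suppresses isolated noisy points.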

  6. Baseline asthma burden, comorbidities, and biomarkers in omalizumab-treated patients in PROSPERO.

    Science.gov (United States)

    Chipps, Bradley E; Zeiger, Robert S; Luskin, Allan T; Busse, William W; Trzaskoma, Benjamin L; Antonova, Evgeniya N; Pazwash, Hooman; Limb, Susan L; Solari, Paul G; Griffin, Noelle M; Casale, Thomas B

    2017-12-01

    Patients included in clinical trials do not necessarily reflect the real-world population. To understand the characteristics, including disease and comorbidity burden, of patients with asthma receiving omalizumab in a real-world setting. The Prospective Observational Study to Evaluate Predictors of Clinical Effectiveness in Response to Omalizumab (PROSPERO) was a US-based, multicenter, single-arm, and prospective study. Patients (≥12 years of age) with allergic asthma initiating omalizumab treatment based on physician-assessed need were included and followed for 12 months. Exacerbations, health care use, adverse events, and Asthma Control Test (ACT) scores were assessed monthly. Biomarkers (blood eosinophils, fractional exhaled nitric oxide, and periostin) were evaluated and patient-reported outcomes (Asthma Quality of Life Questionnaire for 12 Years and Older [AQLQ+12] and Work Productivity and Activity Impairment: Asthma questionnaire [WPAI:Asthma]) were completed at baseline and months 6 and 12. The Mini Rhinoconjunctivitis Quality of Life Questionnaire (MiniRQLQ) was completed at baseline and 12 months. Most of the 806 enrollees (91.4%) were adults (mean age 47.3 years, SD 17.4), white (70.3%), and female (63.5%). Allergic comorbidity was frequently reported (84.2%), as were hypertension (35.5%) and depression (22.1%). In the 12 months before study entry, 22.1% of patients reported at least 1 asthma-related hospitalization, 60.7% reported at least 2 exacerbations, and 83.3% reported ACT scores no higher than 19 (uncontrolled asthma). Most patients had low biomarker levels based on prespecified cut-points. Baseline mean patient-reported outcome scores were 4.0 (SD 1.4) for AQLQ+12, 2.7 (SD 1.4) for MiniRQLQ, and 47.7 (SD 28.9) for WPAI:Asthma percentage of activity impairment and 33.5 (SD 28.7) for percentage of overall work impairment. The population initiating omalizumab in PROSPERO reported poorly controlled asthma and a substantial disease burden. Clinical

  7. Data-Driven Baseline Estimation of Residential Buildings for Demand Response

    Directory of Open Access Journals (Sweden)

    Saehong Park

    2015-09-01

    Full Text Available The advent of advanced metering infrastructure (AMI) generates a large volume of data related to energy service. This paper exploits a data mining approach for customer baseline load (CBL) estimation in demand response (DR) management. CBL plays a significant role in the measurement and verification process, which quantifies the amount of demand reduction and authenticates the performance. The proposed data-driven baseline modeling is based on unsupervised learning techniques. Specifically, we leverage both the self-organizing map (SOM) and K-means clustering for accurate estimation. This two-level approach efficiently reduces the large data set into representative weight vectors in the SOM, and these weight vectors are then clustered by K-means to find the load pattern that would be similar to the potential load pattern of the DR event day. To verify the proposed method, we conduct nationwide-scale experiments in which the residential consumption of three major cities is monitored by smart meters. Our evaluation compares the proposed solution with various types of day-matching techniques, showing that our approach outperforms the existing methods by up to a 68.5% lower error rate.
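
    A stripped-down version of the second-level step — K-means over daily load profiles, then picking the cluster that matches the event day — can be sketched in plain Python. The SOM pre-reduction and real AMI data are omitted; the 4-point "daily profiles" below are hypothetical:

```python
def dist2(a, b):
    """Squared Euclidean distance between two equal-length profiles."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def kmeans(profiles, k, iters=20):
    """Minimal deterministic K-means (the first k profiles seed the centroids)."""
    centroids = [list(p) for p in profiles[:k]]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in profiles:
            groups[min(range(k), key=lambda j: dist2(p, centroids[j]))].append(p)
        for j, g in enumerate(groups):
            if g:
                centroids[j] = [sum(col) / len(g) for col in zip(*g)]
    return centroids

# hypothetical daily profiles (kW at 4 times of day) with two usage patterns
days = [[1.0, 1.1, 5.0, 5.2], [5.0, 5.1, 1.0, 1.1],
        [0.9, 1.0, 5.1, 5.0], [5.2, 5.0, 0.9, 1.0]]
centroids = kmeans(days, k=2)
# the CBL for a DR day is the centroid most similar to that day's early-hours load
cbl = min(centroids, key=lambda c: dist2(c[:2], [1.0, 1.05]))
```

    The estimated baseline is then the matched cluster's centroid rather than a simple average of recent days, which is what the paper compares against day-matching baselines.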

  8. Updated global 3+1 analysis of short-baseline neutrino oscillations

    Science.gov (United States)

    Gariazzo, S.; Giunti, C.; Laveder, M.; Li, Y. F.

    2017-06-01

    We present the results of an updated fit of short-baseline neutrino oscillation data in the framework of 3+1 active-sterile neutrino mixing. We first consider ν_e and ν̄_e disappearance in the light of the Gallium and reactor anomalies. We discuss the implications of the recent measurement of the reactor ν̄_e spectrum in the NEOS experiment, which shifts the allowed regions of the parameter space towards smaller values of |U_e4|². The β-decay constraints of the Mainz and Troitsk experiments allow us to limit the oscillation length between about 2 cm and 7 m at 3σ for neutrinos with an energy of 1 MeV. The corresponding oscillations can be discovered in a model-independent way in ongoing reactor and source experiments by measuring ν_e and ν̄_e disappearance as a function of distance. We then consider the global fit of the data on short-baseline ν_μ → ν_e (and ν̄_μ → ν̄_e) transitions in the light of the LSND anomaly, taking into account the constraints from ν_e and ν_μ (and antineutrino) disappearance experiments, including the recent data of the MINOS and IceCube experiments. The combination of the NEOS constraints on |U_e4|² and the MINOS and IceCube constraints on |U_μ4|² leads to an unacceptable appearance-disappearance tension, which becomes tolerable only in a pragmatic fit that neglects the MiniBooNE low-energy anomaly. The minimization of the global χ² in the space of the four mixing parameters Δm²_41, |U_e4|², |U_μ4|², and |U_τ4|² leads to three allowed regions with narrow Δm²_41 widths at Δm²_41 ≈ 1.7 (best fit), 1.3 (at 2σ), and 2.4 (at 3σ) eV². The effective amplitude of short-baseline ν_μ → ν_e oscillations is limited to 0.00048 ≲ sin²2ϑ_eμ ≲ 0.0020 at 3σ. The restrictions of the allowed regions of the mixing parameters with respect to our previous global fits are mainly due to the NEOS constraints. We present a comparison of the
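
    In the 3+1 framework the effective short-baseline appearance probability takes the standard two-flavour form, with amplitude sin²2ϑ_eμ = 4|U_e4|²|U_μ4|². A small numerical sketch; the parameter values are merely illustrative of the quoted ranges:

```python
import math

def p_sbl_appearance(sin2_2theta_emu, dm41_sq_ev2, L_m, E_MeV):
    """Effective 3+1 short-baseline nu_mu -> nu_e appearance probability.
    L in metres, E in MeV, so 1.267 * dm2[eV^2] * L[m] / E[MeV] is dimensionless."""
    phase = 1.267 * dm41_sq_ev2 * L_m / E_MeV
    return sin2_2theta_emu * math.sin(phase) ** 2

# illustrative values near the quoted best fit: dm2 = 1.7 eV^2, amplitude 1.3e-3
amp, dm2 = 1.3e-3, 1.7
L_max = (math.pi / 2) / (1.267 * dm2)  # first oscillation maximum for E = 1 MeV
p_max = p_sbl_appearance(amp, dm2, L_max, 1.0)
```

    At the oscillation maximum the probability equals the amplitude itself, which is why the quoted 3σ range on sin²2ϑ_eμ directly bounds the attainable appearance signal.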

  9. The IUGS/IAGC Task Group on Global Geochemical Baselines

    Science.gov (United States)

    Smith, David B.; Wang, Xueqiu; Reeder, Shaun; Demetriades, Alecos

    2012-01-01

    The Task Group on Global Geochemical Baselines, operating under the auspices of both the International Union of Geological Sciences (IUGS) and the International Association of Geochemistry (IAGC), has the long-term goal of establishing a global geochemical database to document the concentration and distribution of chemical elements in the Earth’s surface or near-surface environment. The database and accompanying element distribution maps represent a geochemical baseline against which future human-induced or natural changes to the chemistry of the land surface may be recognized and quantified. In order to accomplish this long-term goal, the activities of the Task Group include: (1) developing partnerships with countries conducting broad-scale geochemical mapping studies; (2) providing consultation and training in the form of workshops and short courses; (3) organizing periodic international symposia to foster communication among the geochemical mapping community; (4) developing criteria for certifying those projects whose data are acceptable in a global geochemical database; (5) acting as a repository for data collected by those projects meeting the criteria for standardization; (6) preparing complete metadata for the certified projects; and (7) preparing, ultimately, a global geochemical database. This paper summarizes the history and accomplishments of the Task Group since its first predecessor project was established in 1988.

  10. Prioritizing sites for conservation based on similarity to historical baselines and feasibility of protection.

    Science.gov (United States)

    Popejoy, Traci; Randklev, Charles R; Neeson, Thomas M; Vaughn, Caryn C

    2018-05-08

    The shifting baseline syndrome concept advocates for the use of historical knowledge to inform conservation baselines, but does not address the feasibility of restoring sites to those baselines. In many regions, conservation feasibility varies among sites due to differences in resource availability, statutory power, and land-owner participation. We use zooarchaeological records to identify a historical baseline of the freshwater mussel community's composition before Euro-American influence at a river-reach scale. We evaluate how the community reference position and the feasibility of conservation might enable identification of sites where conservation actions would preserve historically representative communities and be likely to succeed. We first present a conceptual model that incorporates community information and landscape factors to link the best conservation areas to potential cost and conservation benefits. Using fuzzy ordination, we identify modern mussel beds that are most like the historical baseline. We then quantify the housing density and land use near each reach to estimate feasibility of habitat restoration. Using our conceptual framework, we identify reaches that have high conservation value (i.e., reaches that contain the best mussel beds) and where restoration actions would be most likely to succeed. Reaches above Lake Belton in central Texas, U.S.A. were most similar in species composition and relative abundance to zooarchaeological sites. A subset of these mussel beds occurred in locations where conservation actions appear to be most feasible. This study demonstrates how to use zooarchaeological data (biodiversity data often readily available) and estimates of conservation feasibility to inform conservation priorities at a local spatial scale.
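
    The similarity ranking can be sketched with the Bray-Curtis measure, a common choice for abundance data. The paper used fuzzy ordination; Bray-Curtis is a stand-in here, and the communities and feasibility scores are hypothetical:

```python
def bray_curtis(x, y):
    """Bray-Curtis dissimilarity between two abundance vectors (0 = identical)."""
    num = sum(abs(a - b) for a, b in zip(x, y))
    den = sum(a + b for a, b in zip(x, y))
    return num / den if den else 0.0

# hypothetical mussel abundances per species: historical baseline and two modern beds
baseline = [30, 25, 20, 10]
bed_a = [28, 24, 18, 9]   # composition close to the zooarchaeological record
bed_b = [5, 2, 60, 1]     # strongly shifted community

# rank candidate reaches: similarity to baseline weighted by restoration feasibility
feasibility = {"bed_a": 0.8, "bed_b": 0.9}  # hypothetical scores in [0, 1]
score_a = (1 - bray_curtis(baseline, bed_a)) * feasibility["bed_a"]
score_b = (1 - bray_curtis(baseline, bed_b)) * feasibility["bed_b"]
```

    Even with a lower feasibility score, the bed resembling the historical baseline ranks higher, mirroring the paper's strategy of combining community similarity with landscape feasibility.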

  11. 40 CFR 80.915 - How are the baseline toxics value and baseline toxics volume determined?

    Science.gov (United States)

    2010-07-01

    ... baseline toxics value if it can determine an applicable toxics value for every batch of gasoline produced... of gasoline batch i produced or imported between January 1, 1998 and December 31, 2000, inclusive. i = Individual batch of gasoline produced or imported between January 1, 1998 and December 31, 2000, inclusive. n...
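
    The regulatory text excerpted above builds the baseline from every batch of gasoline produced or imported in 1998-2000; the computation it describes is, in essence, a volume-weighted average of per-batch toxics values. A hedged sketch of that averaging step only (batch figures invented, and this is not the regulation's full procedure):

```python
def baseline_toxics_value(batches):
    """Volume-weighted average toxics value over batches i = 1..n.
    Each batch is (volume_gallons, toxics_value); a sketch of the averaging
    implied by 40 CFR 80.915, not the complete regulatory computation."""
    total_vol = sum(v for v, _ in batches)
    return sum(v * t for v, t in batches) / total_vol

# hypothetical batches produced between 1998-01-01 and 2000-12-31
batches = [(1_000_000, 95.0), (2_000_000, 110.0), (500_000, 80.0)]
btv = baseline_toxics_value(batches)
```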

  12. Shifted Baselines Reduce Willingness to Pay for Conservation

    Directory of Open Access Journals (Sweden)

    Loren McClenachan

    2018-02-01

    Full Text Available A loss of memory of past environmental degradation has resulted in shifted baselines, which may result in conservation and restoration goals that are less ambitious than if stakeholders had a full knowledge of ecosystem potential. However, the link between perception of baseline states and support for conservation planning has not been tested empirically. Here, we investigate how perceptions of change in coral reef ecosystems affect stakeholders' willingness to pay (WTP) for the establishment of protected areas. Coral reefs are experiencing rapid, global change that is observable by the public, and therefore provide an ideal ecosystem to test links between beliefs about baseline states and willingness to support conservation. Our survey respondents perceived change to coral reef communities across six variables: coral abundance, fish abundance, fish diversity, fish size, sedimentation, and water pollution. Respondents who accurately perceived declines in reef health had significantly higher WTP for protected areas (US $256.80 vs. $102.50 per year), suggesting that shifted baselines may reduce engagement with conservation efforts. If WTP translates to engagement, this suggests that goals for restoration and recovery are likely to be more ambitious if the public is aware of long-term change. Therefore, communicating the scope and depth of environmental problems is essential in engaging the public in conservation.

  13. Modeling of Temperature-Dependent Noise in Silicon Nanowire FETs including Self-Heating Effects

    Directory of Open Access Journals (Sweden)

    P. Anandan

    2014-01-01

    Full Text Available Silicon nanowires are leading the CMOS era towards the downsizing limit, and by their nature they effectively suppress short-channel effects. Accurate modeling of thermal noise in nanowires is crucial for RF applications of emerging nano-CMOS technologies. In this work, a temperature-dependent noise model for silicon nanowires including self-heating effects has been derived, and its effects on device parameters have been observed. The power spectral density as a function of thermal resistance shows significant improvement as the channel length decreases. The effects of thermal noise including self-heating of the device are explored. Moreover, a significant reduction in noise with respect to channel thermal resistance, gate length, and biasing is analyzed.
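
    The coupling such a model captures can be sketched with the textbook channel thermal-noise PSD, S_id = 4·k_B·T·γ·g_d0, where self-heating raises the lattice temperature through the thermal resistance, T = T_amb + R_th·P_diss. This is a generic illustration with hypothetical values, not the paper's nanowire-specific model:

```python
K_B = 1.380649e-23  # Boltzmann constant, J/K

def channel_noise_psd(g_d0, t_kelvin, gamma=2.0 / 3.0):
    """Drain-current thermal-noise PSD, S_id = 4 k_B T gamma g_d0 (A^2/Hz)."""
    return 4.0 * K_B * gamma * t_kelvin * g_d0

def with_self_heating(g_d0, t_amb, r_th, p_diss, gamma=2.0 / 3.0):
    """Same PSD, with the channel temperature raised by dissipated power."""
    return channel_noise_psd(g_d0, t_amb + r_th * p_diss, gamma)

g_d0, t_amb = 1e-3, 300.0  # 1 mS conductance, room temperature
r_th, p_diss = 5e4, 1e-3   # 50 kK/W thermal resistance, 1 mW dissipated
s_cold = channel_noise_psd(g_d0, t_amb)
s_hot = with_self_heating(g_d0, t_amb, r_th, p_diss)  # channel runs 50 K hotter
```

    The noise PSD scales linearly with channel temperature, so any geometry or bias change that raises R_th·P_diss feeds directly into the noise floor, which is the dependence the paper analyzes.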

  14. FINE-SCALE STRUCTURE OF THE QUASAR 3C 279 MEASURED WITH 1.3 mm VERY LONG BASELINE INTERFEROMETRY

    Energy Technology Data Exchange (ETDEWEB)

    Lu Rusen; Fish, Vincent L.; Doeleman, Sheperd S.; Crew, Geoffrey; Cappallo, Roger J. [Massachusetts Institute of Technology, Haystack Observatory, Route 40, Westford, MA 01886 (United States); Akiyama, Kazunori; Honma, Mareki [National Astronomical Observatory of Japan, Osawa 2-21-1, Mitaka, Tokyo 181-8588 (Japan); Algaba, Juan C.; Ho, Paul T. P.; Inoue, Makoto [Institute of Astronomy and Astrophysics, Academia Sinica, P.O. Box 23-141, Taipei 10617, Taiwan, R.O.C. (China); Bower, Geoffrey C.; Dexter, Matt [Department of Astronomy, Radio Astronomy Laboratory, University of California Berkeley, 601 Campbell, Berkeley, CA 94720-3411 (United States); Brinkerink, Christiaan [Department of Astrophysics, IMAPP, Radboud University Nijmegen, P.O. Box 9010, 6500-GL Nijmegen (Netherlands); Chamberlin, Richard [Caltech Submillimeter Observatory, 111 Nowelo Street, Hilo, HI 96720 (United States); Freund, Robert [Arizona Radio Observatory, Steward Observatory, University of Arizona, 933 North Cherry Avenue, Tucson, AZ 85721-0065 (United States); Friberg, Per [James Clerk Maxwell Telescope, Joint Astronomy Centre, 660 North A'ohoku Place, University Park, Hilo, HI 96720 (United States); Gurwell, Mark A. [Harvard-Smithsonian Center for Astrophysics, 60 Garden Street, Cambridge, MA 02138 (United States); Jorstad, Svetlana G. [Institute for Astrophysical Research, Boston University, Boston, MA 02215 (United States); Krichbaum, Thomas P. [Max-Planck-Institut für Radioastronomie, Auf dem Hügel 69, D-53121 Bonn (Germany); Loinard, Laurent, E-mail: rslu@haystack.mit.edu [Centro de Radioastronomía y Astrofísica, Universidad Nacional Autónoma de México, 58089 Morelia, Michoacán (Mexico); and others

    2013-07-20

    We report results from five day very long baseline interferometry observations of the well-known quasar 3C 279 at 1.3 mm (230 GHz) in 2011. The measured nonzero closure phases on triangles including stations in Arizona, California, and Hawaii indicate that the source structure is spatially resolved. We find an unusual inner jet direction at scales of ∼1 pc extending along the northwest-southeast direction (P.A. = 127° ± 3°), as opposed to other (previously) reported measurements on scales of a few parsecs showing inner jet direction extending to the southwest. The 1.3 mm structure corresponds closely with that observed in the central region of quasi-simultaneous super-resolution Very Long Baseline Array images at 7 mm. The closure phase changed significantly on the last day when compared with the rest of observations, indicating that the inner jet structure may be variable on daily timescales. The observed new direction of the inner jet shows inconsistency with the prediction of a class of jet precession models. Our observations indicate a brightness temperature of ∼8 × 10^10 K in the 1.3 mm core, much lower than that at centimeter wavelengths. Observations with better uv coverage and sensitivity in the coming years will allow the discrimination between different structure models and will provide direct images of the inner regions of the jet with 20-30 μas (5-7 light months) resolution.

  15. FINE-SCALE STRUCTURE OF THE QUASAR 3C 279 MEASURED WITH 1.3 mm VERY LONG BASELINE INTERFEROMETRY

    International Nuclear Information System (INIS)

    Lu Rusen; Fish, Vincent L.; Doeleman, Sheperd S.; Crew, Geoffrey; Cappallo, Roger J.; Akiyama, Kazunori; Honma, Mareki; Algaba, Juan C.; Ho, Paul T. P.; Inoue, Makoto; Bower, Geoffrey C.; Dexter, Matt; Brinkerink, Christiaan; Chamberlin, Richard; Freund, Robert; Friberg, Per; Gurwell, Mark A.; Jorstad, Svetlana G.; Krichbaum, Thomas P.; Loinard, Laurent

    2013-01-01

    We report results from five day very long baseline interferometry observations of the well-known quasar 3C 279 at 1.3 mm (230 GHz) in 2011. The measured nonzero closure phases on triangles including stations in Arizona, California, and Hawaii indicate that the source structure is spatially resolved. We find an unusual inner jet direction at scales of ∼1 pc extending along the northwest-southeast direction (P.A. = 127° ± 3°), as opposed to other (previously) reported measurements on scales of a few parsecs showing inner jet direction extending to the southwest. The 1.3 mm structure corresponds closely with that observed in the central region of quasi-simultaneous super-resolution Very Long Baseline Array images at 7 mm. The closure phase changed significantly on the last day when compared with the rest of observations, indicating that the inner jet structure may be variable on daily timescales. The observed new direction of the inner jet shows inconsistency with the prediction of a class of jet precession models. Our observations indicate a brightness temperature of ∼8 × 10^10 K in the 1.3 mm core, much lower than that at centimeter wavelengths. Observations with better uv coverage and sensitivity in the coming years will allow the discrimination between different structure models and will provide direct images of the inner regions of the jet with 20-30 μas (5-7 light months) resolution.
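
    The closure phases referred to above are the station-error-free observable of VLBI: summing baseline phases around a triangle of stations cancels any station-based phase error (atmosphere, clocks), so a nonzero closure phase genuinely reflects source structure. A small numeric sketch with hypothetical phases:

```python
import math

def closure_phase(phi_12, phi_23, phi_31):
    """Sum of baseline phases around a triangle, wrapped to (-pi, pi]."""
    s = phi_12 + phi_23 + phi_31
    return math.atan2(math.sin(s), math.cos(s))

# hypothetical true visibility phases on the three baselines (radians)
true_12, true_23, true_31 = 0.4, -1.1, 0.9
# station-based errors enter each baseline phase as e_i - e_j
e1, e2, e3 = 2.0, -0.7, 1.3
meas_12 = true_12 + (e1 - e2)
meas_23 = true_23 + (e2 - e3)
meas_31 = true_31 + (e3 - e1)

# the errors cancel around the loop: measured closure equals true closure
cp_meas = closure_phase(meas_12, meas_23, meas_31)
cp_true = closure_phase(true_12, true_23, true_31)
```

    A point source would give zero closure phase on every triangle, so the nonzero values measured on the Arizona-California-Hawaii triangles are direct evidence that 3C 279 is resolved at 1.3 mm.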

  16. Baseline Projection Data Book: GRI baseline projection of U.S. Energy Supply and Demand to 2010. 1992 Edition. Volume 1 and Volume 2

    International Nuclear Information System (INIS)

    Holtberg, P.D.; Woods, T.J.; Lihn, M.L.; Koklauner, A.K.

    1992-01-01

    The 1992 Baseline Projection Data Book provides backup data in tabular form for the 1992 GRI Baseline Projection of U.S. Energy Supply and Demand to 2010. Summary tables and data for the residential, commercial, industrial, electric utility, and transportation sectors are presented in the volume.

  17. Seepage Model for PA Including Drift Collapse

    International Nuclear Information System (INIS)

    Li, G.; Tsang, C.

    2000-01-01

    The purpose of this Analysis/Model Report (AMR) is to document the predictions and analysis performed using the Seepage Model for Performance Assessment (PA) and the Disturbed Drift Seepage Submodel for both the Topopah Spring middle nonlithophysal and lower lithophysal lithostratigraphic units at Yucca Mountain. These results will be used by PA to develop the probability distribution of water seepage into waste-emplacement drifts at Yucca Mountain, Nevada, as part of the evaluation of the long-term performance of the potential repository. This AMR is in accordance with the ''Technical Work Plan for Unsaturated Zone (UZ) Flow and Transport Process Model Report'' (CRWMS M&O 2000 [153447]). This purpose is accomplished by performing numerical simulations with stochastic representations of hydrological properties, using the Seepage Model for PA, and evaluating the effects of an alternative drift geometry representing a partially collapsed drift using the Disturbed Drift Seepage Submodel. Seepage of water into waste-emplacement drifts is considered one of the principal factors having the greatest impact on the long-term safety of the repository system (CRWMS M&O 2000 [153225], Table 4-1). This AMR supports the analysis and simulation that are used by PA to develop the probability distribution of water seepage into drifts, and is therefore a model of primary (Level 1) importance (AP-3.15Q, ''Managing Technical Product Inputs''). The intended purpose of the Seepage Model for PA is to support: (1) PA; (2) Abstraction of Drift-Scale Seepage; and (3) the Unsaturated Zone (UZ) Flow and Transport Process Model Report (PMR). Seepage into drifts is evaluated by applying numerical models with stochastic representations of hydrological properties and performing flow simulations with multiple realizations of the permeability field around the drift.
The Seepage Model for PA uses the distribution of permeabilities derived from air injection testing in niches and in the cross drift to
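The stochastic procedure described here (multiple realizations of a heterogeneous permeability field, each checked for seepage) can be sketched as a toy Monte Carlo loop. Everything below is illustrative only: the lognormal field, the single-threshold seepage criterion, and all parameter values are hypothetical stand-ins, not the AMR's actual models.

```python
import random

def seepage_probability(n_realizations=1000, n_cells=50,
                        mu=0.0, sigma=1.0, threshold=2.5, seed=42):
    """Toy stand-in for a stochastic seepage analysis: draw a lognormal
    permeability multiplier for each cell of a drift crown and call a
    realization 'seeping' if any cell exceeds a (hypothetical)
    capillary-barrier threshold."""
    rng = random.Random(seed)
    seeping = 0
    for _ in range(n_realizations):
        field = [rng.lognormvariate(mu, sigma) for _ in range(n_cells)]
        if max(field) > threshold:
            seeping += 1
    return seeping / n_realizations

p = seepage_probability()
assert 0.0 <= p <= 1.0
```

The probability-distribution output of the real model is then built from many such realizations; here a single scalar frequency stands in for it.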

  18. Seepage Model for PA Including Drift Collapse

    Energy Technology Data Exchange (ETDEWEB)

    G. Li; C. Tsang

    2000-12-20

    The purpose of this Analysis/Model Report (AMR) is to document the predictions and analysis performed using the Seepage Model for Performance Assessment (PA) and the Disturbed Drift Seepage Submodel for both the Topopah Spring middle nonlithophysal and lower lithophysal lithostratigraphic units at Yucca Mountain. These results will be used by PA to develop the probability distribution of water seepage into waste-emplacement drifts at Yucca Mountain, Nevada, as part of the evaluation of the long-term performance of the potential repository. This AMR is in accordance with the ''Technical Work Plan for Unsaturated Zone (UZ) Flow and Transport Process Model Report'' (CRWMS M&O 2000 [153447]). This purpose is accomplished by performing numerical simulations with stochastic representations of hydrological properties, using the Seepage Model for PA, and evaluating the effects of an alternative drift geometry representing a partially collapsed drift using the Disturbed Drift Seepage Submodel. Seepage of water into waste-emplacement drifts is considered one of the principal factors having the greatest impact on the long-term safety of the repository system (CRWMS M&O 2000 [153225], Table 4-1). This AMR supports the analysis and simulation that are used by PA to develop the probability distribution of water seepage into drifts, and is therefore a model of primary (Level 1) importance (AP-3.15Q, ''Managing Technical Product Inputs''). The intended purpose of the Seepage Model for PA is to support: (1) PA; (2) Abstraction of Drift-Scale Seepage; and (3) the Unsaturated Zone (UZ) Flow and Transport Process Model Report (PMR). Seepage into drifts is evaluated by applying numerical models with stochastic representations of hydrological properties and performing flow simulations with multiple realizations of the permeability field around the drift. The Seepage Model for PA uses the distribution of permeabilities derived from air injection testing in

  19. Laboratory-based validation of the baseline sensors of the ITER diagnostic residual gas analyzer

    International Nuclear Information System (INIS)

    Klepper, C.C.; Biewer, T.M.; Marcus, C.; Graves, V.B.; Andrew, P.; Hughes, S.; Gardner, W.L.

    2017-01-01

    The divertor-specific ITER Diagnostic Residual Gas Analyzer (DRGA) will provide essential information relating to DT fusion plasma performance. This includes pulse-resolving measurements of the fuel isotopic mix reaching the pumping ducts, as well as the concentration of the helium generated as the ash of the fusion reaction. In the present baseline design, the cluster of sensors attached to this diagnostic's differentially pumped analysis chamber assembly includes a radiation-compatible version of a commercial quadrupole mass spectrometer, as well as an optical gas analyzer using a plasma-based light excitation source. This paper reports on a laboratory study intended to validate the performance of this sensor cluster, with emphasis on the detection limit of the isotopic measurement. This validation study was carried out in a laboratory set-up that closely prototyped the analysis chamber assembly configuration of the baseline design. This includes an ITER-specific placement of the optical gas measurement downstream from the first turbine of the chamber's turbo-molecular pump to provide sufficient light emission while preserving the gas dynamics conditions that allow for ~1 s response time from the sensor cluster [1].

  20. Laboratory-based validation of the baseline sensors of the ITER diagnostic residual gas analyzer

    Science.gov (United States)

    Klepper, C. C.; Biewer, T. M.; Marcus, C.; Andrew, P.; Gardner, W. L.; Graves, V. B.; Hughes, S.

    2017-10-01

    The divertor-specific ITER Diagnostic Residual Gas Analyzer (DRGA) will provide essential information relating to DT fusion plasma performance. This includes pulse-resolving measurements of the fuel isotopic mix reaching the pumping ducts, as well as the concentration of the helium generated as the ash of the fusion reaction. In the present baseline design, the cluster of sensors attached to this diagnostic's differentially pumped analysis chamber assembly includes a radiation-compatible version of a commercial quadrupole mass spectrometer, as well as an optical gas analyzer using a plasma-based light excitation source. This paper reports on a laboratory study intended to validate the performance of this sensor cluster, with emphasis on the detection limit of the isotopic measurement. This validation study was carried out in a laboratory set-up that closely prototyped the analysis chamber assembly configuration of the baseline design. This includes an ITER-specific placement of the optical gas measurement downstream from the first turbine of the chamber's turbo-molecular pump to provide sufficient light emission while preserving the gas dynamics conditions that allow for ~1 s response time from the sensor cluster [1].

  1. Laboratory-based validation of the baseline sensors of the ITER diagnostic residual gas analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Biewer, Theodore M. [ORNL; Marcus, Chris [ORNL; Klepper, C Christopher [ORNL; Andrew, Philip [ITER Organization, Cadarache, France; Gardner, W. L. [United States ITER Project Office; Graves, Van B. [ORNL; Hughes, Shaun [ITER Organization, Saint Paul Lez Durance, France

    2017-10-01

    The divertor-specific ITER Diagnostic Residual Gas Analyzer (DRGA) will provide essential information relating to DT fusion plasma performance. This includes pulse-resolving measurements of the fuel isotopic mix reaching the pumping ducts, as well as the concentration of the helium generated as the ash of the fusion reaction. In the present baseline design, the cluster of sensors attached to this diagnostic's differentially pumped analysis chamber assembly includes a radiation-compatible version of a commercial quadrupole mass spectrometer, as well as an optical gas analyzer using a plasma-based light excitation source. This paper reports on a laboratory study intended to validate the performance of this sensor cluster, with emphasis on the detection limit of the isotopic measurement. This validation study was carried out in a laboratory set-up that closely prototyped the analysis chamber assembly configuration of the baseline design. This includes an ITER-specific placement of the optical gas measurement downstream from the first turbine of the chamber's turbo-molecular pump to provide sufficient light emission while preserving the gas dynamics conditions that allow for ~1 s response time from the sensor cluster [1].

  2. Particle-based modeling of heterogeneous chemical kinetics including mass transfer.

    Science.gov (United States)

    Sengar, A; Kuipers, J A M; van Santen, Rutger A; Padding, J T

    2017-08-01

    Connecting the macroscopic world of continuous fields to the microscopic world of discrete molecular events is important for understanding several phenomena occurring at physical boundaries of systems. An important example is heterogeneous catalysis, where reactions take place at active surfaces, but the effective reaction rates are determined by transport limitations in the bulk fluid and reaction limitations on the catalyst surface. In this work we study the macro-micro connection in a model heterogeneous catalytic reactor by means of stochastic rotation dynamics. The model is able to resolve the convective and diffusive interplay between participating species, while including adsorption, desorption, and reaction processes on the catalytic surface. Here we apply the simulation methodology to a simple straight microchannel with a catalytic strip. Dimensionless Damköhler numbers are used to comment on the spatial concentration profiles of reactants and products near the catalyst strip and in the bulk. We end the discussion with an outlook on more complicated geometries and increasingly complex reactions.
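The dimensionless Damköhler numbers mentioned above compare reaction rates against transport rates; a minimal sketch of the second Damköhler number for a surface reaction in a microchannel, with all parameter values invented for illustration:

```python
# Illustrative Damköhler-number estimate for a catalytic strip in a
# microchannel (all parameter values are hypothetical).
def damkohler_II(k_s, L, D):
    """Second Damköhler number: surface reaction velocity k_s times a
    characteristic length L, divided by the diffusion coefficient D."""
    return k_s * L / D

k_s = 1e-4   # surface reaction velocity [m/s]
L   = 1e-4   # channel height [m]
D   = 1e-9   # species diffusion coefficient [m^2/s]

Da = damkohler_II(k_s, L, D)
regime = "transport-limited" if Da > 1 else "reaction-limited"
print(Da, regime)  # Da ~ 10 -> transport-limited
```

Da >> 1 means diffusion to the strip is the bottleneck, which is exactly the situation in which the bulk concentration profiles discussed above become depleted near the catalyst.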

  3. Particle-based modeling of heterogeneous chemical kinetics including mass transfer

    Science.gov (United States)

    Sengar, A.; Kuipers, J. A. M.; van Santen, Rutger A.; Padding, J. T.

    2017-08-01

    Connecting the macroscopic world of continuous fields to the microscopic world of discrete molecular events is important for understanding several phenomena occurring at physical boundaries of systems. An important example is heterogeneous catalysis, where reactions take place at active surfaces, but the effective reaction rates are determined by transport limitations in the bulk fluid and reaction limitations on the catalyst surface. In this work we study the macro-micro connection in a model heterogeneous catalytic reactor by means of stochastic rotation dynamics. The model is able to resolve the convective and diffusive interplay between participating species, while including adsorption, desorption, and reaction processes on the catalytic surface. Here we apply the simulation methodology to a simple straight microchannel with a catalytic strip. Dimensionless Damköhler numbers are used to comment on the spatial concentration profiles of reactants and products near the catalyst strip and in the bulk. We end the discussion with an outlook on more complicated geometries and increasingly complex reactions.

  4. The Effect of Pretest Exercise on Baseline Computerized Neurocognitive Test Scores.

    Science.gov (United States)

    Pawlukiewicz, Alec; Yengo-Kahn, Aaron M; Solomon, Gary

    2017-10-01

    Baseline neurocognitive assessment plays a critical role in return-to-play decision making following sport-related concussions. Prior studies have assessed the effect of a variety of modifying factors on neurocognitive baseline test scores. However, relatively little investigation has been conducted regarding the effect of pretest exercise on baseline testing. The aim of our investigation was to determine the effect of pretest exercise on baseline Immediate Post-Concussion Assessment and Cognitive Testing (ImPACT) scores in adolescent and young adult athletes. We hypothesized that athletes undergoing self-reported strenuous exercise within 3 hours of baseline testing would perform more poorly on neurocognitive metrics and would report a greater number of symptoms than those who had not completed such exercise. Cross-sectional study; Level of evidence, 3. The ImPACT records of 18,245 adolescent and young adult athletes were retrospectively analyzed. After application of inclusion and exclusion criteria, participants were dichotomized into groups based on a positive (n = 664) or negative (n = 6609) self-reported history of strenuous exercise within 3 hours of the baseline test. Participants with a positive history of exercise were then randomly matched, based on age, sex, education level, concussion history, and hours of sleep prior to testing, on a 1:2 basis with individuals who had reported no pretest exercise. The baseline ImPACT composite scores of the 2 groups were then compared. Significant differences were observed for the ImPACT composite scores of verbal memory, visual memory, reaction time, and impulse control as well as for the total symptom score. No significant between-group difference was detected for the visual motor composite score. Furthermore, pretest exercise was associated with a significant increase in the overall frequency of invalid test results. Our results suggest a statistically significant difference in ImPACT composite scores between
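The 1:2 matched-group construction described above (each exercised participant paired with two non-exercised participants agreeing on stratification variables) can be sketched as follows. The record fields and matching keys are hypothetical, and the study's exact procedure may differ:

```python
import random

def match_1_to_2(exposed, controls, keys, seed=0):
    """Randomly match each exposed record to two control records that
    agree on every stratification key (e.g. age, sex). Sketch only."""
    rng = random.Random(seed)
    pool = {}
    for c in controls:
        pool.setdefault(tuple(c[k] for k in keys), []).append(c)
    matched = []
    for e in exposed:
        candidates = pool.get(tuple(e[k] for k in keys), [])
        if len(candidates) >= 2:
            picks = rng.sample(candidates, 2)
            for p in picks:
                candidates.remove(p)   # match controls without replacement
            matched.append((e, picks))
    return matched

exposed  = [{"id": 1, "age": 16, "sex": "M"}]
controls = [{"id": n, "age": 16, "sex": "M"} for n in range(2, 6)]
pairs = match_1_to_2(exposed, controls, keys=("age", "sex"))
assert len(pairs) == 1 and len(pairs[0][1]) == 2
```

Exposed participants with no two eligible controls are simply dropped here; a real analysis would report how many matches failed.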

  5. Autonomous and controlled motivation for eating disorders treatment: baseline predictors and relationship to treatment outcome.

    Science.gov (United States)

    Carter, Jacqueline C; Kelly, Allison C

    2015-03-01

    This study aimed to identify baseline predictors of autonomous and controlled motivation for treatment (ACMT) in a transdiagnostic eating disorder sample, and to examine whether ACMT at baseline predicted change in eating disorder psychopathology during treatment. Participants were 97 individuals who met DSM-IV-TR criteria for an eating disorder and were admitted to a specialized intensive treatment programme. Self-report measures of eating disorder psychopathology, ACMT, and various psychosocial variables were completed at the start of treatment. A subset of these measures was completed again after 3, 6, 9, and 12 weeks of treatment. Multiple regression analyses showed that baseline autonomous motivation was higher among patients who reported more self-compassion and more received social support, whereas the only baseline predictor of controlled motivation was shame. Multilevel modelling revealed that higher baseline autonomous motivation predicted faster decreases in global eating disorder psychopathology, whereas the level of controlled motivation at baseline did not. The current findings suggest that developing interventions designed to foster autonomous motivation specifically and employing autonomy supportive strategies may be important to improving eating disorders treatment outcome. The findings of this study suggest that developing motivational interventions that focus specifically on enhancing autonomous motivation for change may be important for promoting eating disorder recovery. Our results lend support for the use of autonomy supportive strategies to strengthen personally meaningful reasons to achieve freely chosen change goals in order to enhance treatment for eating disorders. One study limitation is that there were no follow-up assessments beyond the 12-week study and we therefore do not know whether the relationships that we observed persisted after treatment. 
Another limitation is that this was a correlational study and it is therefore important

  6. Energy and emission scenarios for China in the 21st century - exploration of baseline development and mitigation options

    International Nuclear Information System (INIS)

    Vuuren, Detlef van; Zhou Fengqi; Vries, Bert de; Jiang Kejun; Graveland, Cor; Li Yun

    2003-01-01

    In this paper, we have used the simulation model IMAGE/TIMER to develop a set of energy and emission scenarios for China between 1995 and 2100, based on the global baseline scenarios published by IPCC. The purpose of the study was to explore possible baseline developments and available options to mitigate emissions. The two main baseline scenarios of the study differ, among other things, in the openness of the Chinese economy and in economic growth, but both indicate a rapid growth in carbon emissions (2.0% and 2.6% per year in the 2000-2050 period). The baseline scenario analysis also shows that an orientation toward environmental sustainability can not only reduce other environmental pressures but also lower carbon emissions. In the mitigation analysis, a large number of options has been evaluated in terms of impacts on investments, user costs, fuel import costs and emissions. It is found that a large potential exists to mitigate carbon emissions in China, notably in the form of energy efficiency improvement (with large co-benefits) and measures in the electricity sector. Combining all options considered, it appears to be possible to reduce emissions compared to the baseline scenarios by 50%
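The quoted baseline growth rates compound over the 2000-2050 period; a quick sketch of what they imply for cumulative emissions growth:

```python
# Compounding the two baseline growth rates (2.0%/yr and 2.6%/yr)
# over the 2000-2050 period quoted in the abstract.
def growth_multiplier(rate, years):
    return (1 + rate) ** years

low  = growth_multiplier(0.020, 50)   # ~2.7x 2000-level emissions by 2050
high = growth_multiplier(0.026, 50)   # ~3.6x 2000-level emissions by 2050
print(round(low, 2), round(high, 2))  # → 2.69 3.61
```

A seemingly small 0.6-point difference in annual growth rate thus compounds into roughly a full extra multiple of emissions by 2050.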

  7. Influences of Mental Illness, Current Psychological State, and Concussion History on Baseline Concussion Assessment Performance.

    Science.gov (United States)

    Weber, Michelle L; Dean, John-Henry L; Hoffman, Nicole L; Broglio, Steven P; McCrea, Michael; McAllister, Thomas W; Schmidt, Julianne D; Hoy, April Reed; Hazzard, Joseph B; Kelly, Louise A; Ortega, Justus D; Port, Nicholas; Putukian, Margot; Langford, T Dianne; Tierney, Ryan; Campbell, Darren E; McGinty, Gerald; O'Donnell, Patrick; Svoboda, Steven J; DiFiori, John P; Giza, Christopher C; Benjamin, Holly J; Buckley, Thomas; Kaminski, Thomas W; Clugston, James R; Feigenbaum, Luis A; Eckner, James T; Guskiewicz, Kevin; Mihalik, Jason P; Miles, Jessica Dysart; Anderson, Scott; Master, Christina L; Collins, Micky; Kontos, Anthony P; Bazarian, Jeffrey J; Chrisman, Sara P D; Brooks, Allison; Duma, Stefan; Bullers, Christopher Todd; Miles, Christopher M; Dykhuizen, Brian H

    2018-04-01

    A student-athlete's mental state, including history of trait anxiety and depression, or current psychological state may affect baseline concussion assessment performance. (1) To determine if mental illness (anxiety, depression, anxiety with depression) influences baseline scores, (2) to determine if psychological state correlates with baseline performance, and (3) to determine if history of concussion affects Brief Symptom Inventory-18 (BSI-18) subscores of state anxiety, depression, and somatization. Cross-sectional study; Level of evidence, 3. A sample of 8652 collegiate student-athletes (54.5% males, 45.5% females) participated in the Concussion Assessment, Research and Education (CARE) Consortium. Baseline assessments included a demographic form, a symptom evaluation, Standardized Assessment of Concussion, Balance Error Scoring System, a psychological state assessment (BSI-18), and Immediate Post-concussion Assessment and Cognitive Test. Baseline scores were compared between individuals with a history of anxiety (n = 59), depression (n = 283), and anxiety with depression (n = 68) and individuals without a history of those conditions (n = 8242). Spearman's rho correlations were conducted to assess the relationship between baseline and psychological state subscores (anxiety, depression, somatization) (α = .05). Psychological state subscores were compared between individuals with a self-reported history of concussions (0, 1, 2, 3, 4+) using Kruskal-Wallis tests (α = .05). Student-athletes with anxiety, depression, and anxiety with depression demonstrated higher scores in number of symptoms reported (anxiety, 4.3 ± 4.2; depression, 5.2 ± 4.8; anxiety with depression, 5.4 ± 3.9; no anxiety/depression, 2.5 ± 3.4), symptom severity (anxiety, 8.1 ± 9.8; depression, 10.4 ± 12.4; anxiety with depression, 12.4 ± 10.7; no anxiety/depression, 4.1 ± 6.9), and psychological distress in state anxiety (anxiety, 3.7 ± 4.7; depression, 2.5 ± 3.6; anxiety with
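Spearman's rho, used above to relate baseline performance to psychological state subscores, is simply a Pearson correlation computed on ranks; a minimal pure-Python sketch (no tie handling, unlike production implementations such as scipy.stats.spearmanr):

```python
def _ranks(v):
    """Rank-transform a sequence (1 = smallest). No tie-averaging
    in this sketch."""
    order = sorted(range(len(v)), key=lambda i: v[i])
    r = [0.0] * len(v)
    for rank, i in enumerate(order, start=1):
        r[i] = rank
    return r

def spearman_rho(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks."""
    rx, ry = _ranks(x), _ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5

assert spearman_rho([1, 2, 3, 4], [10, 20, 30, 40]) == 1.0
```

Because only ranks enter the formula, the statistic is robust to the skewed, bounded symptom scores typical of baseline concussion data.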

  8. Baseline development, economic risk, and schedule risk: An integrated approach

    International Nuclear Information System (INIS)

    Tonkinson, J.A.

    1994-01-01

    The economic and schedule risks of Environmental Restoration (ER) projects are commonly analyzed toward the end of the baseline development process. Risk analysis is usually performed as the final element of the scheduling or estimating processes for the purpose of establishing cost and schedule contingency. However, there is an opportunity for earlier assessment of risks, during development of the technical scope and Work Breakdown Structure (WBS). Integrating the processes of risk management and baselining provides for early incorporation of feedback regarding schedule and cost risk into the proposed scope of work. Much of the information necessary to perform risk analysis becomes available during development of the technical baseline, as the scope of work and WBS are being defined. The analysis of risk can actually be initiated early on during development of the technical baseline and continue throughout development of the complete project baseline. Indeed, best business practices suggest that information crucial to the success of a project be analyzed and incorporated into project planning as soon as it is available and usable

  9. Observational constraint on the interacting dark energy models including the Sandage-Loeb test

    Science.gov (United States)

    Zhang, Ming-Jian; Liu, Wen-Biao

    2014-05-01

    Two types of interacting dark energy models are investigated using type Ia supernova (SNIa) data, observational Hubble data (OHD), the cosmic microwave background shift parameter, and the secular Sandage-Loeb (SL) test. In the investigation, we have used two sets of parameter priors including WMAP-9 and Planck 2013. They have shown some interesting differences. We find that the inclusion of the SL test can obviously provide a more stringent constraint on the parameters in both models. For the constant coupling model, the interaction term has been improved to be only a half of the original scale on corresponding errors. Comparing with only SNIa and OHD, we find that the inclusion of the SL test almost reduces the best-fit interaction to zero, which indicates that the higher-redshift observation including the SL test is necessary to track the evolution of the interaction. For the varying coupling model, data with the inclusion of the SL test show that the parameter at C.L. in Planck priors is , where the constant is characteristic for the severity of the coincidence problem. This indicates that the coincidence problem will be less severe. We then reconstruct the interaction, and we find that the best-fit interaction is also negative, similar to the constant coupling model. However, for a high redshift, the interaction generally vanishes at infinity. We also find that the phantom-like dark energy with is favored over the ΛCDM model.
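The Sandage-Loeb test rests on the secular drift of cosmological redshifts, which for a flat ΛCDM background takes the standard form ż = (1 + z)H₀ − H(z); a small sketch with assumed cosmological parameters (Ωm = 0.3, chosen only for illustration):

```python
import math

def redshift_drift(z, omega_m=0.3):
    """Sandage-Loeb signal dz/dt0 = (1+z)*H0 - H(z) for a flat LCDM
    background, expressed in units of H0 (omega_m assumed)."""
    E = math.sqrt(omega_m * (1 + z) ** 3 + (1 - omega_m))  # H(z)/H0
    return (1 + z) - E

# The drift is positive at low z (accelerated expansion) and turns
# negative at high z, which is why the SL test probes higher redshifts
# than SNIa or OHD can reach.
print(redshift_drift(1.0), redshift_drift(4.0))
```

The tiny magnitude of this signal (fractions of cm/s per decade in velocity terms) is what makes it a long-baseline, next-generation measurement.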

  10. MEMLS3&a: Microwave Emission Model of Layered Snowpacks adapted to include backscattering

    Directory of Open Access Journals (Sweden)

    M. Proksch

    2015-08-01

    The Microwave Emission Model of Layered Snowpacks (MEMLS) was originally developed for microwave emission of snowpacks in the frequency range 5–100 GHz. It is based on six-flux theory to describe radiative transfer in snow, including absorption, multiple volume scattering, radiation trapping due to internal reflection, and a combination of coherent and incoherent superposition of reflections between horizontal layer interfaces. Here we introduce MEMLS3&a, an extension of MEMLS which includes a backscatter model for active microwave remote sensing of snow. The reflectivity is decomposed into diffuse and specular components. Slight undulations of the snow surface are taken into account. The treatment of like- and cross-polarization is accomplished by an empirical splitting parameter q. MEMLS3&a (as well as MEMLS) is set up in a way that snow input parameters can be derived by objective measurement methods, which avoids the fitting procedures for the scattering efficiency of snow required by several other models. For the validation of the model we have used a combination of active and passive measurements from the NoSREx (Nordic Snow Radar Experiment) campaign in Sodankylä, Finland. We find a reasonable agreement between the measurements and simulations, subject to uncertainties in hitherto unmeasured input parameters of the backscatter model. The model is written in Matlab and the code is publicly available for download through the following website: http://www.iapmw.unibe.ch/research/projects/snowtools/memls.html.
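The empirical splitting parameter q mentioned above divides scattered power between like- and cross-polarized channels; a minimal sketch of that decomposition (the functional form and numbers here are illustrative only, not MEMLS3&a's actual equations):

```python
# Sketch of a q-style polarization split: a fraction q of the total
# backscattered power goes to the cross-pol channel (vh), the rest to
# the like-pol channel (vv). Values are hypothetical.
def split_polarization(sigma_total, q):
    sigma_cross = q * sigma_total         # sigma_vh
    sigma_like = (1 - q) * sigma_total    # sigma_vv
    return sigma_like, sigma_cross

vv, vh = split_polarization(sigma_total=0.08, q=0.1)
assert abs(vv + vh - 0.08) < 1e-12   # power is conserved by the split
```

Because q is empirical, it is one of the "hitherto unmeasured input parameters" the validation study flags as a source of uncertainty.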

  11. Dipole model analysis of highest precision HERA data, including very low Q²'s

    International Nuclear Information System (INIS)

    Luszczak, A.; Kowalski, H.

    2016-12-01

    We analyse, within a dipole model, the final, inclusive HERA DIS cross section data in the low-x region, using fully correlated errors. We show that these highest precision data are very well described within the dipole model framework starting from Q² values of 3.5 GeV² to the highest values of Q² = 250 GeV². To analyze the saturation effects we evaluated the data including also the very low Q² region, down to 0.35 GeV². The fits including this region show a preference for the saturation ansatz.
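The saturation ansatz referred to here is commonly implemented as a GBW-type dipole cross section; a sketch with approximate published fit-parameter values (used only for illustration, not the values obtained in this analysis):

```python
import math

# Illustrative GBW-type saturation dipole cross section,
#   sigma(x, r) = sigma0 * (1 - exp(-r^2 * Qs^2(x) / 4)),
# with saturation scale Qs^2(x) = (x0 / x)^lam [GeV^2].
# Parameter values are approximate GBW-fit numbers, for illustration.
def dipole_sigma(x, r, sigma0=23.0, x0=3.0e-4, lam=0.29):
    qs2 = (x0 / x) ** lam
    return sigma0 * (1.0 - math.exp(-r * r * qs2 / 4.0))

# Small dipoles are colour-transparent (sigma ~ r^2); large dipoles
# saturate at sigma0, which is the behaviour the low-Q^2 fits probe.
print(dipole_sigma(1e-4, 0.1), dipole_sigma(1e-4, 10.0))
```

Lowering Q² probes larger dipole sizes r, which is why including the 0.35 GeV² region is what exposes a preference for the saturating form.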

  12. Energy Consumption Analysis for Concrete Residences—A Baseline Study in Taiwan

    Directory of Open Access Journals (Sweden)

    Kuo-Liang Lin

    2017-02-01

    Estimating building energy consumption is difficult because it deals with complex interactions among uncertain weather conditions, occupant behaviors, and building characteristics. To facilitate estimation, this study employs a benchmarking methodology to obtain energy baselines for sample buildings. Utilizing a scientific simulation tool, this study attempts to develop energy consumption baselines of two typical concrete residences in Taiwan, subsequently allowing a simplified energy consumption prediction process at an early design stage of building development. Using weather data of three metropolitan cities as testbeds, the annual energy consumption of two types of modern residences is determined through a series of simulation sessions with different building settings. The impacts of key building characteristics, including building insulation, air tightness, orientation, location, and residence type, are carefully investigated. Sample utility bills are then collected to validate the simulated results, resulting in three adjustment parameters for normalization, including ‘number of residents’, ‘total floor area’, and ‘air conditioning comfort level’, for justification of occupant behaviors in different living conditions. Study results not only provide valuable benchmarking data serving as references for performance evaluation of different energy-saving strategies, but also show how effective extended building insulation, enhanced air tightness, and prudent selection of residence location and orientation can be for successful implementation of building sustainability in tropical and subtropical regions.
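The three normalization parameters identified in the study can be applied as a simple energy-use-intensity adjustment when comparing utility bills across households; the functional form below is an assumption for illustration, not the paper's actual normalization:

```python
# Sketch of normalizing a utility bill by the study's three adjustment
# parameters: total floor area, number of residents, and an
# air-conditioning comfort-level factor (functional form assumed).
def normalized_eui(annual_kwh, floor_area_m2, residents, comfort_factor=1.0):
    """Energy-use intensity per m^2 and per resident, scaled by a
    comfort-level factor (>1 for heavier air-conditioning use)."""
    return annual_kwh / (floor_area_m2 * residents * comfort_factor)

eui = normalized_eui(annual_kwh=12000, floor_area_m2=120, residents=4)
print(eui)  # 25.0 kWh per m^2 per resident
```

Normalizing this way lets bills from households of different sizes and habits be compared against a common simulated baseline.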

  13. Combining biological and psychosocial baseline variables did not improve prediction of outcome of a very-low-energy diet in a clinic referral population.

    Science.gov (United States)

    Sumithran, P; Purcell, K; Kuyruk, S; Proietto, J; Prendergast, L A

    2018-02-01

    Consistent, strong predictors of obesity treatment outcomes have not been identified. It has been suggested that broadening the range of predictor variables examined may be valuable. We explored methods to predict outcomes of a very-low-energy diet (VLED)-based programme in a clinically comparable setting, using a wide array of pre-intervention biological and psychosocial participant data. A total of 61 women and 39 men (mean ± standard deviation [SD] body mass index: 39.8 ± 7.3 kg/m²) underwent an 8-week VLED and 12-month follow-up. At baseline, participants underwent a blood test and assessment of psychological, social and behavioural factors previously associated with treatment outcomes. Logistic regression, linear discriminant analysis, decision trees and random forests were used to model outcomes from baseline variables. Of the 100 participants, 88 completed the VLED and 42 attended the Week 60 visit. Overall prediction rates for weight loss of ≥10% at weeks 8 and 60, and attrition at Week 60, using combined data were between 77.8 and 87.6% for logistic regression, and lower for other methods. When logistic regression analyses included only baseline demographic and anthropometric variables, prediction rates were 76.2-86.1%. In this population, considering a wide range of biological and psychosocial data did not improve outcome prediction compared to simply-obtained baseline characteristics. © 2017 World Obesity Federation.
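The logistic-regression prediction rates quoted above amount to scoring each participant's predicted outcome against the observed one; a sketch with invented coefficients and just two baseline variables (the study's actual models used many more predictors):

```python
import math

# Hypothetical fitted logistic model scoring weight-loss success from
# two simply-obtained baseline variables (all coefficients invented).
def predict_success(bmi, age, b0=-1.0, b_bmi=0.05, b_age=-0.02):
    p = 1.0 / (1.0 + math.exp(-(b0 + b_bmi * bmi + b_age * age)))
    return p >= 0.5, p   # predicted label and predicted probability

def prediction_rate(records):
    """Fraction of records whose predicted label matches the outcome —
    the 'prediction rate' reported in studies like the one above."""
    hits = sum(predict_success(r["bmi"], r["age"])[0] == r["success"]
               for r in records)
    return hits / len(records)

records = [{"bmi": 40, "age": 50, "success": True},
           {"bmi": 20, "age": 60, "success": False}]
print(prediction_rate(records))  # 1.0 on this toy pair
```

In practice the rate would be estimated on held-out data (the study used cross-validated logistic regression), not on the fitting sample as in this toy.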

  14. Tank waste remediation system baseline tank waste inventory estimates for fiscal year 1995

    International Nuclear Information System (INIS)

    Shelton, L.W.

    1996-01-01

    A set of tank-by-tank waste inventories is derived from historical waste models, flowsheet records, and analytical data to support the Tank Waste Remediation System flowsheet and retrieval sequence studies. Enabling assumptions and methodologies used to develop the inventories are discussed. These provisional inventories conform to previously established baseline inventories and are meant to serve as an interim basis until standardized inventory estimates are made available.

  15. Influence of Baseline Psychological Health on Muscle Pain During Atorvastatin Treatment.

    Science.gov (United States)

    Zaleski, Amanda L; Taylor, Beth A; Pescatello, Linda S; Dornelas, Ellen A; White, Charles Michael; Thompson, Paul D

    3-Hydroxy-3-methylglutaryl coenzyme A reductase inhibitors (statins) are generally well tolerated, with statin-associated muscle symptoms (SAMS) the most common side effect (~10%) seen in statin users. However, studies and clinical observations indicate that many of the self-reported SAMS appear to be nonspecific (ie, potentially not attributable to statins). Mental health and well-being influence self-perception of pain, so we sought to assess the effect of baseline well-being and depression on the development of muscle pain with 6 months of atorvastatin 80 mg/d (ATORVA) or placebo in healthy, statin-naive adults. The Psychological General Well-being Index (n = 83) and Beck Depression Inventory (n = 55) questionnaires were administered at baseline in participants (aged 59.5 ± 1.2 years) from The Effect of Statins on Skeletal Muscle Function and Performance (STOMP) trial (NCT00609063). Muscle pain (Short-Form McGill Pain Questionnaire [SF-MPQ]), pain that interferes with daily life (Brief Pain Inventory [BPI]), and pain severity (BPI) were then measured before, throughout, and after treatment. At baseline, there were no differences in well-being (Psychological General Well-being Index), depression (Beck Depression Inventory), or pain measures (SF-MPQ and BPI) (P values ≥ .05) between the placebo and ATORVA groups. Baseline well-being correlated negatively with baseline BPI pain severity (r = -0.290, P = .008). Baseline depression correlated with baseline pain (SF-MPQ; r = 0.314, P = .020). Baseline well-being and depression did not predict the change in pain severity or interference after 6 months among the total sample or between groups (P values ≥ .05). Baseline well-being and depression were not significant predictors of pain after 6 months of ATORVA (P values ≥ .05). Thus, they do not appear to increase the risk of SAMS in otherwise healthy adults.

  16. A speech production model including the nasal cavity

    DEFF Research Database (Denmark)

    Olesen, Morten

    In order to obtain articulatory analysis of speech production, the model is improved. The standard model, as used in LPC analysis, to a large extent only models the acoustic properties of the speech signal, as opposed to articulatory modelling of the speech production. In spite of this, the LPC model...... is by far the most widely used model in speech technology....

  17. Baseline prediction of combination therapy outcome in hepatitis C virus 1b infected patients by discriminant analysis using viral and host factors.

    Science.gov (United States)

    Saludes, Verónica; Bracho, Maria Alma; Valero, Oliver; Ardèvol, Mercè; Planas, Ramón; González-Candelas, Fernando; Ausina, Vicente; Martró, Elisa

    2010-11-30

    Current treatment of chronic hepatitis C virus (HCV) infection has limited efficacy (especially among genotype 1 infected patients), is costly, and involves severe side effects. Thus, predicting non-response is of major interest for both patient wellbeing and health care expense. At present, treatment cannot be individualized on the basis of any baseline predictor of response. We aimed to identify pre-treatment clinical and virological parameters associated with treatment failure, as well as to assess whether therapy outcome could be predicted at baseline. Forty-three HCV subtype 1b (HCV-1b) chronically infected patients treated with pegylated-interferon alpha plus ribavirin were retrospectively studied (21 responders and 22 non-responders). Host factors (gender, age, weight, transaminase levels, fibrosis stage, and source of infection) and viral-related factors (viral load, and genetic variability in the E1-E2 and Core regions) were assessed. Logistic regression and discriminant analyses were used to develop predictive models. A "leave-one-out" cross-validation method was used to assess the reliability of the discriminant models. Lower alanine transaminase levels (ALT, p=0.009), a higher number of quasispecies variants in the E1-E2 region (number of haplotypes, nHap_E1-E2) (p=0.003), and the absence of both amino acid arginine at position 70 and leucine at position 91 in the Core region (p=0.039) were significantly associated with treatment failure. Therapy outcome was most accurately predicted by discriminant analysis (90.5% sensitivity and 95.5% specificity; 85.7% sensitivity and 81.8% specificity after cross-validation); the most significant variables included in the predictive model were the Core amino acid pattern, the nHap_E1-E2, and gamma-glutamyl transferase and ALT levels. Discriminant analysis has been shown as a useful tool to predict treatment outcome using baseline HCV genetic variability and host characteristics. The discriminant models obtained in this
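
    The modelling pipeline in this record (a discriminant classifier validated by leave-one-out cross-validation) can be sketched as follows. The data, class sizes, and effect directions below are synthetic stand-ins for the study's predictors (ALT, nHap_E1-E2, Core pattern), not the actual cohort, and the numpy-only classifier is a plain two-class Fisher/LDA discriminant rather than the authors' exact model.

```python
import numpy as np

def lda_fit(X, y):
    """Two-class linear discriminant with pooled within-class covariance."""
    X0, X1 = X[y == 0], X[y == 1]
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    S = (np.cov(X0.T) * (len(X0) - 1) + np.cov(X1.T) * (len(X1) - 1)) / (len(X) - 2)
    w = np.linalg.solve(S, m1 - m0)   # discriminant direction
    c = w @ (m0 + m1) / 2.0           # threshold at the midpoint of the class means
    return w, c

# Synthetic stand-ins for the baseline predictors: ALT (reported lower in
# non-responders), number of E1-E2 haplotypes, and a 0/1 Core-pattern flag.
rng = np.random.default_rng(0)
n = 43
y = np.array([0] * 22 + [1] * 21)     # 0 = non-responder, 1 = responder
X = np.column_stack([
    rng.normal(60.0, 20.0, n) - 15.0 * (y == 0),
    rng.integers(1, 15, n),
    rng.integers(0, 2, n),
]).astype(float)

# "Leave-one-out" cross-validation: predict each patient from the other n-1.
hits = 0
for i in range(n):
    keep = np.arange(n) != i
    w, c = lda_fit(X[keep], y[keep])
    hits += int((X[i] @ w > c) == y[i])
print(f"LOO accuracy: {hits / n:.2f}")
```

    With leave-one-out, each fold's model never sees the held-out patient, which is why cross-validated sensitivity and specificity (85.7%/81.8% in the record) are more trustworthy than the resubstitution figures.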

  18. Baseline Report on HB2320

    Science.gov (United States)

    State Council of Higher Education for Virginia, 2015

    2015-01-01

    Staff provides this baseline report as a summary of its preliminary considerations and initial research in fulfillment of the requirements of HB2320 from the 2015 session of the General Assembly. Codified as § 23-7.4:7, this legislation compels the Education Secretary and the State Council of Higher Education for Virginia (SCHEV) Director, in…

  19. Pediatric Heart Transplantation: Transitioning to Adult Care (TRANSIT): Baseline Findings.

    Science.gov (United States)

    Grady, Kathleen L; Hof, Kathleen Van't; Andrei, Adin-Cristian; Shankel, Tamara; Chinnock, Richard; Miyamoto, Shelley; Ambardekar, Amrut V; Anderson, Allen; Addonizio, Linda; Latif, Farhana; Lefkowitz, Debra; Goldberg, Lee; Hollander, Seth A; Pham, Michael; Weissberg-Benchell, Jill; Cool, Nichole; Yancy, Clyde; Pahl, Elfriede

    2018-02-01

    Young adult solid organ transplant recipients who transfer from pediatric to adult care experience poor outcomes related to decreased adherence to the medical regimen. Our pilot trial for young adults with a heart transplant (HT) who transfer to adult care tests an intervention focused on increasing HT knowledge, self-management and self-advocacy skills, and enhancing support, as compared to usual care. We report baseline findings between groups regarding (1) patient-level outcomes and (2) components of the intervention. From 3/14 to 9/16, 88 subjects were enrolled and randomized to intervention (n = 43) or usual care (n = 45) at six pediatric HT centers. Patient self-report questionnaires and medical records data were collected at baseline, and 3 and 6 months after transfer. For this report, baseline findings (at enrollment and prior to transfer to adult care) were analyzed using Chi-square and t-tests. The level of significance was p < .05. Baseline demographics were similar in the intervention and usual care arms: age 21.3 ± 3.2 vs 21.5 ± 3.3 years and female 44% vs 49%, respectively. At baseline, there were no differences between intervention and usual care for use of tacrolimus (70 vs 62%); tacrolimus level (mean ± SD = 6.5 ± 2.3 ng/ml vs 5.6 ± 2.3 ng/ml); average of the within-patient standard deviation of the baseline mean tacrolimus levels (1.6 vs 1.3); and adherence to the medical regimen [3.6 ± 0.4 vs 3.5 ± 0.5 (1 = hardly ever to 4 = all of the time)], respectively. At baseline, both groups had a modest amount of HT knowledge, were learning self-management and self-advocacy, and perceived they were adequately supported. Baseline findings indicate that transitioning HT recipients lack essential knowledge about HT and have incomplete self-management and self-advocacy skills.

  20. Baseline hematology and serum biochemistry results for Indian leopards (Panthera pardus fusca)

    Directory of Open Access Journals (Sweden)

    Arun Attur Shanmugam

    2017-07-01

    Aim: The aim of the study was to establish the baseline hematology and serum biochemistry values for Indian leopards (Panthera pardus fusca), and to assess the possible variations in these parameters based on age and gender. Materials and Methods: Hemato-biochemical test reports from a total of 83 healthy leopards, carried out as part of routine health evaluation in Bannerghatta Biological Park and Manikdoh Leopard Rescue Center, were used to establish baseline hematology and serum biochemistry parameters for the subspecies. The hematological parameters considered for the analysis included hemoglobin (Hb), packed cell volume, total erythrocyte count (TEC), total leukocyte count (TLC), mean corpuscular volume (MCV), mean corpuscular Hb (MCH), and MCH concentration. The serum biochemistry parameters considered included total protein (TP), albumin, globulin, aspartate aminotransferase, alanine aminotransferase (ALT), blood urea nitrogen, creatinine, triglycerides, calcium, and phosphorus. Results: Even though a few differences were observed in hematologic and biochemistry values between male and female Indian leopards, the differences were not statistically significant. Effects of age, however, were evident in relation to many hematologic and biochemical parameters. Sub-adults had significantly greater values for Hb, TEC, and TLC compared to the adult and geriatric groups, whereas they had significantly lower MCV and MCH. Among serum biochemistry parameters, the sub-adult age group was observed to have significantly lower values for TP and ALT than adult and geriatric leopards. Conclusion: The study provides a comprehensive analysis of hematologic and biochemical parameters for Indian leopards. Baselines established here will permit better captive management of the subspecies, serve as a guide to assess the health and physiological status of the free-ranging leopards, and may contribute valuable information for making

  1. Baseline hematology and serum biochemistry results for Indian leopards (Panthera pardus fusca)

    Science.gov (United States)

    Shanmugam, Arun Attur; Muliya, Sanath Krishna; Deshmukh, Ajay; Suresh, Sujay; Nath, Anukul; Kalaignan, Pa; Venkataravanappa, Manjunath; Jose, Lyju

    2017-01-01

    Aim: The aim of the study was to establish the baseline hematology and serum biochemistry values for Indian leopards (Panthera pardus fusca), and to assess the possible variations in these parameters based on age and gender. Materials and Methods: Hemato-biochemical test reports from a total of 83 healthy leopards, carried out as part of routine health evaluation in Bannerghatta Biological Park and Manikdoh Leopard Rescue Center, were used to establish baseline hematology and serum biochemistry parameters for the subspecies. The hematological parameters considered for the analysis included hemoglobin (Hb), packed cell volume, total erythrocyte count (TEC), total leukocyte count (TLC), mean corpuscular volume (MCV), mean corpuscular Hb (MCH), and MCH concentration. The serum biochemistry parameters considered included total protein (TP), albumin, globulin, aspartate aminotransferase, alanine aminotransferase (ALT), blood urea nitrogen, creatinine, triglycerides, calcium, and phosphorus. Results: Even though a few differences were observed in hematologic and biochemistry values between male and female Indian leopards, the differences were not statistically significant. Effects of age, however, were evident in relation to many hematologic and biochemical parameters. Sub-adults had significantly greater values for Hb, TEC, and TLC compared to the adult and geriatric groups, whereas they had significantly lower MCV and MCH. Among serum biochemistry parameters, the sub-adult age group was observed to have significantly lower values for TP and ALT than adult and geriatric leopards. Conclusion: The study provides a comprehensive analysis of hematologic and biochemical parameters for Indian leopards. Baselines established here will permit better captive management of the subspecies, serve as a guide to assess the health and physiological status of the free-ranging leopards, and may contribute valuable information for making effective

  2. Baseline review of the U.S. LHC Accelerator project

    International Nuclear Information System (INIS)

    1998-02-01

    The Department of Energy (DOE) Review of the U.S. Large Hadron Collider (LHC) Accelerator project was conducted February 23--26, 1998, at the request of Dr. John R. O'Fallon, Director, Division of High Energy Physics, Office of Energy Research, U.S. DOE. This is the first review of the U.S. LHC Accelerator project. Overall, the Committee found that the U.S. LHC Accelerator project effort is off to a good start and that the proposed scope is very conservative for the funding available. The Committee recommends that the project be initially baselined at a total cost of $110 million, with a scheduled completion date of 2005. The U.S. LHC Accelerator project will supply high technology superconducting magnets for the interaction regions (IRs) and the radio frequency (rf) straight section of the LHC intersecting storage rings. In addition, the project provides the cryogenic support interface boxes to service the magnets and radiation absorbers to protect the IR dipoles and the inner triplet quadrupoles. U.S. scientists will provide support in analyzing some of the detailed aspects of accelerator physics in the two rings. The three laboratories participating in this project are Brookhaven National Laboratory, Fermi National Accelerator Laboratory (Fermilab), and Lawrence Berkeley National Laboratory. The Committee was very impressed by the technical capabilities of the U.S. LHC Accelerator project team. Cost estimates for each subsystem of the U.S. LHC Accelerator project were presented to the Review Committee, with a total cost including contingency of $110 million (then-year dollars). The cost estimates were deemed to be conservative. A re-examination of the funding profile, costs, and schedules on a centralized project basis should lead to an increased list of deliverables. The Committee concluded that the proposed scope of U.S. deliverables to CERN can be readily accomplished with the $110 million total cost baseline for the project. The current deliverables should serve as

  3. Benefits of combination of insulin degludec and liraglutide are independent of baseline glycated haemoglobin level and duration of type 2 diabetes

    DEFF Research Database (Denmark)

    Rodbard, Helena W; Buse, John B; Woo, Vincent C

    2016-01-01

    liraglutide, irrespective of baseline HbA1c. In DUAL II, insulin dose and hypoglycaemia rate were similar with IDegLira and IDeg (maximum dose limited to 50 U) independent of baseline HbA1c. The reduction in HbA1c with IDegLira was independent of disease duration and previous insulin dose but varied depending...... of disease progression stage including baseline glycated haemoglobin (HbA1c), disease duration and previous insulin dose. RESULTS: Across four categories of baseline HbA1c (≤7.5-9.0%), HbA1c reductions were significantly greater with IDegLira (1.1-2.5%) compared with IDeg or liraglutide alone in DUAL I...

  4. Including an ocean carbon cycle model into iLOVECLIM (v1.0)

    NARCIS (Netherlands)

    Bouttes, N.; Roche, D.M.V.A.P.; Mariotti, V.; Bopp, L.

    2015-01-01

    The atmospheric carbon dioxide concentration plays a crucial role in the radiative balance and as such has a strong influence on the evolution of climate. Because of the numerous interactions between climate and the carbon cycle, it is necessary to include a model of the carbon cycle within a

  5. High baseline left ventricular and systolic volume may identify patients at risk of chemotherapy-induced cardiotoxicity

    International Nuclear Information System (INIS)

    Atiar Rahman; Alex Gedevanishvili; Seham Ali; Elma G Briscoe; Vani Vijaykumar

    2004-01-01

    Introduction and Methods: Use of chemotherapeutic drugs in the treatment of cancer may lead to serious cardiotoxicity and to post-treatment heart failure. Various strategies have been developed to minimize the risk of cardiotoxicity, including keeping the total dosage given to each patient below a certain 'threshold' value, and monitoring the patient's cardiac function by means of the 'Multiple Gated Acquisition' (MUGA) scan using Technetium-99m. However, even with all these precautions some patients still develop cardiotoxicity, and it is not well known which factors predict deterioration of cardiac function in patients with optimized chemotherapeutic dosages. In this retrospective study we sought to evaluate the predictive value of seven variables (age, sex, baseline LV ejection fraction, LV end diastolic [LVEDV] and end systolic volumes [LVESV], peak diastolic filling rate, preexisting malignancies requiring chemotherapy) in 172 patients (breast carcinoma 86, lymphoma 62, leukemias and others 24) undergoing chemotherapy from 1995 until 2000. There was no cut-off for left ventricular ejection fraction prior to chemotherapy. However, patients were excluded from analysis if they had significant cardiac arrhythmias or received doses higher than considered safe for cardiotoxicity at the beginning of the study. Significant cardiotoxicity was defined as a drop in post-chemotherapy LVEF by >15%. Results: Logistic regression models were used to predict the probability of developing cardiotoxicity as a function of the seven prognostic covariates. The mean age of all patients was 51 ± 13 years. Significant cardiac toxicity was noted in 10 percent of patients. The overall risk estimate for subsequent heart failure after chemotherapy, however, climbed to 18 percent in patients with a presenting LVESV >50 mL. Using a multivariate logistic regression model, older age was noted to be a weak risk factor for cardiac toxicity (confidence interval 0.8-1.2; p 50 mL) appeared to

  6. A range of complex probabilistic models for RNA secondary structure prediction that includes the nearest-neighbor model and more.

    Science.gov (United States)

    Rivas, Elena; Lang, Raymond; Eddy, Sean R

    2012-02-01

    The standard approach for single-sequence RNA secondary structure prediction uses a nearest-neighbor thermodynamic model with several thousand experimentally determined energy parameters. An attractive alternative is to use statistical approaches with parameters estimated from growing databases of structural RNAs. Good results have been reported for discriminative statistical methods using complex nearest-neighbor models, including CONTRAfold, Simfold, and ContextFold. Little work has been reported on generative probabilistic models (stochastic context-free grammars [SCFGs]) of comparable complexity, although probabilistic models are generally easier to train and to use. To explore a range of probabilistic models of increasing complexity, and to directly compare probabilistic, thermodynamic, and discriminative approaches, we created TORNADO, a computational tool that can parse a wide spectrum of RNA grammar architectures (including the standard nearest-neighbor model and more) using a generalized super-grammar that can be parameterized with probabilities, energies, or arbitrary scores. By using TORNADO, we find that probabilistic nearest-neighbor models perform comparably to (but not significantly better than) discriminative methods. We find that complex statistical models are prone to overfitting RNA structure and that evaluations should use structurally nonhomologous training and test data sets. Overfitting has affected at least one published method (ContextFold). The most important barrier to improving statistical approaches for RNA secondary structure prediction is the lack of diversity of well-curated single-sequence RNA secondary structures in current RNA databases.
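
    TORNADO parses a wide spectrum of grammar architectures; as a minimal illustration of the CYK-style dynamic programming that SCFG-based structure prediction generalizes (and emphatically not TORNADO's super-grammar), here is the classic Nussinov base-pair-maximization recursion:

```python
def nussinov(seq, min_loop=3):
    """Maximum number of nested base pairs (Nussinov DP), enforcing a
    minimum hairpin loop of `min_loop` unpaired bases."""
    pairs = {("A", "U"), ("U", "A"), ("G", "C"), ("C", "G"),
             ("G", "U"), ("U", "G")}
    n = len(seq)
    dp = [[0] * n for _ in range(n)]
    for span in range(min_loop + 1, n):
        for i in range(n - span):
            j = i + span
            best = max(dp[i + 1][j], dp[i][j - 1])      # i or j unpaired
            if (seq[i], seq[j]) in pairs:
                best = max(best, dp[i + 1][j - 1] + 1)  # i pairs with j
            for k in range(i + 1, j):                   # bifurcation
                best = max(best, dp[i][k] + dp[k + 1][j])
            dp[i][j] = best
    return dp[0][n - 1]

print(nussinov("GGGAAAUCC"))  # → 3: a GGG...UCC stem over a 3-base loop
```

    An SCFG replaces the max over pair counts with a sum or max over rule probabilities, and the nearest-neighbor grammars the paper studies add stacking and loop context to each recursion case.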

  7. Multi-objective optimization for an automated and simultaneous phase and baseline correction of NMR spectral data

    Science.gov (United States)

    Sawall, Mathias; von Harbou, Erik; Moog, Annekathrin; Behrens, Richard; Schröder, Henning; Simoneau, Joël; Steimers, Ellen; Neymeyr, Klaus

    2018-04-01

    Spectral data preprocessing is an integral and sometimes inevitable part of chemometric analyses. For Nuclear Magnetic Resonance (NMR) spectra a possible first preprocessing step is a phase correction which is applied to the Fourier transformed free induction decay (FID) signal. This preprocessing step can be followed by a separate baseline correction step. Especially if series of high-resolution spectra are considered, then automated and computationally fast preprocessing routines are desirable. A new method is suggested that applies the phase and the baseline corrections simultaneously in an automated form without manual input, which distinguishes this work from other approaches. The underlying multi-objective optimization or Pareto optimization provides improved results compared to consecutively applied correction steps. The optimization process uses an objective function which applies strong penalty constraints and weaker regularization conditions. The new method includes an approach for the detection of zero baseline regions. The baseline correction uses a modified Whittaker smoother. The functionality of the new method is demonstrated for experimental NMR spectra. The results are verified against gravimetric data. The method is compared to alternative preprocessing tools. Additionally, the simultaneous correction method is compared to a consecutive application of the two correction steps.
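
    A minimal sketch of the baseline-correction ingredient, assuming an asymmetric-least-squares weighting scheme around a Whittaker smoother. This is a generic variant, not the authors' exact modified smoother, and it omits the simultaneous phase-correction step:

```python
import numpy as np

def whittaker_baseline(y, lam=1e4, p=0.01, n_iter=10):
    """Asymmetric least-squares baseline via a Whittaker smoother:
    minimize sum_i w_i*(y_i - z_i)^2 + lam*sum_i (d2 z_i)^2, with small
    weights on points above the running baseline (the peaks)."""
    n = len(y)
    D = np.diff(np.eye(n), 2, axis=0)     # second-difference operator
    P = lam * D.T @ D
    w = np.ones(n)
    for _ in range(n_iter):
        z = np.linalg.solve(np.diag(w) + P, w * y)
        w = np.where(y > z, p, 1.0 - p)   # down-weight peak points
    return z

# Synthetic "spectrum": two narrow peaks on a slowly drifting baseline.
x = np.linspace(0.0, 10.0, 400)
baseline = 0.5 + 0.05 * x + 0.2 * np.sin(0.3 * x)
peaks = 1.0 / (1 + ((x - 3) / 0.05) ** 2) + 0.6 / (1 + ((x - 7) / 0.05) ** 2)
y = baseline + peaks
z = whittaker_baseline(y)
print(f"max baseline recovery error: {np.abs(z - baseline).max():.3f}")
```

    The stiffness `lam` and asymmetry `p` are illustrative values; in practice they are tuned to the spectra at hand, which is exactly the kind of manual input the paper's Pareto-optimization approach tries to remove.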

  8. Relation of Anxiety and Depressive Symptoms to Coronary Artery Calcium (from the ELSA-Brasil Baseline Data).

    Science.gov (United States)

    Santos, Itamar S; Bittencourt, Marcio S; Rocco, Priscila T; Pereira, Alexandre C; Barreto, Sandhi M; Brunoni, André R; Goulart, Alessandra C; Blaha, Michael J; Lotufo, Paulo A; Bensenor, Isabela M

    2016-07-15

    Previous studies of the association between symptoms of anxiety or depression and coronary artery calcium (CAC) have produced heterogeneous results. Our aim was to investigate whether psychopathological symptoms were associated with CAC in a cross-sectional analysis of the Brazilian Longitudinal Study of Adult Health (ELSA-Brasil) baseline. We analyzed data from 4,279 ELSA-Brasil subjects (aged 35 to 74 years) from the São Paulo site without previous cardiovascular disease who underwent CAC score assessment at baseline. Prevalent CAC was defined as a CAC score >0. Anxiety and depressive symptoms were assessed using the Clinical Interview Schedule-Revised (CIS-R). We built binary logistic regression models to determine whether CIS-R scores, anxiety, or depression were associated with prevalent CAC. Prevalent CAC was found in 1,211 subjects (28.3%). After adjustment for age and gender, a direct association between CIS-R scores and prevalent CAC was revealed (odds ratio for 1-SD increase: 1.12; 95% confidence interval [CI] 1.04 to 1.22). This association persisted after multivariate adjustment (odds ratio for 1-SD increase 1.11; 95% CI 1.02 to 1.20). No independent associations were found for specific diagnoses of anxiety or depression and prevalent CAC. In post hoc models, a significant interaction term (p = 0.019) suggested a stronger association in older subjects. In conclusion, psychopathological symptoms were directly associated with coronary atherosclerosis in the ELSA-Brasil baseline in adjusted models, and this association seems to be stronger in older subjects. Copyright © 2016 Elsevier Inc. All rights reserved.
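
    The headline statistic above, an odds ratio per 1-SD increase, comes from standardizing the symptom score before the logistic fit and exponentiating its coefficient. A hedged numpy sketch on synthetic data (the slope of 0.11 and the ~28% prevalence are chosen to echo the abstract, not taken from ELSA-Brasil records):

```python
import numpy as np

# Synthetic data: a symptom score whose true log-odds slope per SD is 0.11
# for a binary outcome ("prevalent CAC") with roughly 28% prevalence.
rng = np.random.default_rng(42)
n = 4279
score = rng.gamma(2.0, 5.0, n)
z = (score - score.mean()) / score.std()       # standardize: per-1-SD scale
y = rng.random(n) < 1.0 / (1.0 + np.exp(-(-0.95 + 0.11 * z)))

# Logistic regression by Newton-Raphson on [intercept, standardized score].
X = np.column_stack([np.ones(n), z])
beta = np.zeros(2)
for _ in range(25):
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    grad = X.T @ (y - p)
    hess = X.T @ (X * (p * (1.0 - p))[:, None])
    beta += np.linalg.solve(hess, grad)

print(f"odds ratio per 1-SD increase: {np.exp(beta[1]):.2f}")
```

    Reporting the odds ratio per SD rather than per raw point makes effect sizes comparable across instruments with different scales, which is why the abstract phrases the CIS-R association that way.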

  9. Baseline Design Compliance Matrix for the Rotary Mode Core Sampling System

    International Nuclear Information System (INIS)

    LECHELT, J.A.

    2000-01-01

    The purpose of the design compliance matrix (DCM) is to provide a single-source document of all design requirements associated with the fifteen subsystems that make up the rotary mode core sampling (RMCS) system. It is intended to be the baseline requirements document for the RMCS system and to govern all future design and design verification activities associated with it. This document is the DCM for the RMCS system used on Hanford single-shell radioactive waste storage tanks. This includes the Exhauster System, Rotary Mode Core Sample Trucks, Universal Sampling System, Diesel Generator System, Distribution Trailer, X-Ray Cart System, Breathing Air Compressor, Nitrogen Supply Trailer, Casks and Cask Truck, Service Trailer, Core Sampling Riser Equipment, Core Sampling Support Trucks, Foot Clamp, Ramps and Platforms, and Purged Camera System. Excluded items are tools such as light plants and light stands. Other items, such as the breather inlet filter, are covered by a different design baseline; in that case, the inlet breather filter is covered by the Tank Farms Design Compliance Matrix.

  10. Refitting density dependent relativistic model parameters including Center-of-Mass corrections

    International Nuclear Information System (INIS)

    Avancini, Sidney S.; Marinelli, Jose R.; Carlson, Brett Vern

    2011-01-01

    Relativistic mean field models have become a standard approach for precise nuclear structure calculations. After the seminal work of Serot and Walecka, which introduced a model Lagrangian density where the nucleons interact through the exchange of scalar and vector mesons, several models were obtained through its generalization, including other meson degrees of freedom, non-linear meson interactions, meson-meson interactions, etc. More recently, density dependent coupling constants were incorporated into the Walecka-like models, which are now extensively used. In particular, for these models a connection with density functional theory can be established. Due to the inherent difficulties presented by field theoretical models, only the mean field approximation is used for the solution of these models. In order to calculate finite nuclei properties in the mean field approximation, a reference set has to be fixed and therefore the translational symmetry is violated. It is well known that in such a case spurious effects due to the center-of-mass (COM) motion are present, which are more pronounced for light nuclei. In a previous work we proposed a technique based on the Peierls-Yoccoz projection operator applied to the mean-field relativistic solution, in order to project out spurious COM contributions. In this work we obtain a new fit of the density dependent parameters of a density dependent hadronic model, taking into account the COM corrections. Our fit is obtained taking into account the charge radii and binding energies of ⁴He, ¹⁶O, ⁴⁰Ca, ⁴⁸Ca, ⁵⁶Ni, ⁶⁸Ni, ¹⁰⁰Sn, ¹³²Sn and ²⁰⁸Pb. We show that the nuclear observables calculated using our fit are of a quality comparable to others that can be found in the literature, with the advantage that now a translationally invariant many-body wave function is at our disposal. (author)

  11. A numerical model including PID control of a multizone crystal growth furnace

    Science.gov (United States)

    Panzarella, Charles H.; Kassemi, Mohammad

    1992-01-01

    This paper presents a 2D axisymmetric combined conduction and radiation model of a multizone crystal growth furnace. The model is based on a programmable multizone furnace (PMZF) designed and built at NASA Lewis Research Center for growing high quality semiconductor crystals. A novel feature of this model is a control algorithm which automatically adjusts the power in any number of independently controlled heaters to establish the desired crystal temperatures in the furnace model. The control algorithm eliminates the need for the numerous trial and error runs previously required to obtain the same results. The finite element code, FIDAP, used to develop the furnace model, was modified to directly incorporate the control algorithm. This algorithm, which presently uses PID control, and the associated heat transfer model are briefly discussed. Together, they have been used to predict the heater power distributions for a variety of furnace configurations and desired temperature profiles. Examples are included to demonstrate the effectiveness of the PID-controlled model in establishing isothermal, Bridgman, and other complicated temperature profiles in the sample. Finally, an example is given to show how the algorithm can be used to change the desired profile with time according to a prescribed temperature-time evolution.
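
    As a minimal sketch of the control idea, assuming a single heater driving a first-order thermal zone toward its setpoint (the actual model couples the PID algorithm to FIDAP's conduction-radiation solution across multiple heaters, and the plant constants below are illustrative):

```python
def simulate_pid(setpoint=1200.0, kp=8.0, ki=2.0, kd=0.5,
                 dt=0.1, steps=2000, tau=30.0, gain=0.8, ambient=300.0):
    """PID loop driving a first-order thermal zone toward `setpoint` (K)."""
    T = ambient
    integral = 0.0
    prev_err = setpoint - T
    for _ in range(steps):
        err = setpoint - T
        integral += err * dt
        deriv = (err - prev_err) / dt
        # Heater power from the PID terms; a heater cannot cool, so clamp at 0.
        power = max(0.0, kp * err + ki * integral + kd * deriv)
        # First-order plant: relaxes toward ambient, driven by heater power.
        T += dt * ((ambient - T) + gain * power) / tau
        prev_err = err
    return T

final_T = simulate_pid()
print(f"final zone temperature: {final_T:.1f} K")
```

    In the furnace model, one such loop runs per independently controlled heater, and the "plant" is the full finite element temperature field rather than a single lumped zone.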

  12. Baseline series fragrance markers fail to predict contact allergy.

    Science.gov (United States)

    Mann, Jack; McFadden, John P; White, Jonathan M L; White, Ian R; Banerjee, Piu

    2014-05-01

    Negative patch test results with fragrance allergy markers in the European baseline series do not always predict a negative reaction to individual fragrance substances. To determine the frequencies of positive test reactions to the 26 fragrance substances for which labelling is mandatory in the EU, and how effectively reactions to fragrance markers in the baseline series predict positive reactions to the fragrance substances that are labelled. The records of 1951 eczema patients, routinely tested with the labelled fragrance substances and with an extended European baseline series in 2011 and 2012, were retrospectively reviewed. Two hundred and eighty-one (14.4%) (71.2% females) reacted to one or more allergens from the labelled-fragrance substance series and/or a fragrance marker from the European baseline series. The allergens that were positive with the greatest frequencies were cinnamyl alcohol (48; 2.46%), Evernia furfuracea (44; 2.26%), and isoeugenol (40; 2.05%). Of the 203 patients who reacted to any of the 26 fragrances in the labelled-fragrance substance series, only 117 (57.6%) also reacted to a fragrance marker in the baseline series. One hundred and seven (52.7%) reacted to either fragrance mix I or fragrance mix II, 28 (13.8%) reacted to Myroxylon pereirae, and 13 (6.4%) reacted to hydroxyisohexyl 3-cyclohexene carboxaldehyde. These findings confirm that the standard fragrance markers fail to identify patients with contact allergies to the 26 fragrances. © 2014 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  13. Mobile Robots Path Planning Using the Overall Conflict Resolution and Time Baseline Coordination

    Directory of Open Access Journals (Sweden)

    Yong Ma

    2014-01-01

    This paper aims at resolving the path planning problem in a time-varying environment based on the idea of overall conflict resolution and the algorithm of time baseline coordination. The basic task of the introduced path planning algorithms is the automatic generation of the shortest paths from the defined start poses to their end poses, under numerous constraints, for multiple mobile robots. Building on this, by using overall conflict resolution within the polynomial-based paths, we take into account all the constraints together, including smoothness, motion boundary, kinematics constraints, obstacle avoidance, and safety constraints among robots. A time baseline coordination algorithm is then proposed to process the formulated problem. The foremost strength is that much computation time can be saved with our approach. Numerical simulations verify the effectiveness of our approach.

  14. Safe distance car-following model including backward-looking and its stability analysis

    Science.gov (United States)

    Yang, Da; Jin, Peter Jing; Pu, Yun; Ran, Bin

    2013-03-01

    The focus of this paper is car-following behavior that includes backward-looking, called bi-directional looking car-following behavior for short. This study is motivated by the potential changes in the physical properties of traffic flow caused by the fast-developing intelligent transportation system (ITS), especially the new connected vehicle technology. Existing studies on this topic have focused on general motors (GM) models and optimal velocity (OV) models. The safe distance car-following model (Gipps' model), which is more widely used in practice, has not drawn much attention in the bi-directional looking context. This paper explores the properties of the bi-directional looking extension of Gipps' safe distance model. The stability condition of the proposed model is derived using linear stability theory and is verified using numerical simulations. The impacts of the driver and vehicle characteristics appearing in the proposed model on traffic flow stability are also investigated. It is found that taking the backward-looking effect into account in car-following has three types of effect on traffic flow: stabilizing, destabilizing, and producing non-physical phenomena. This conclusion is more nuanced than the results of studies based on the OV bi-directional looking car-following models. Moreover, drivers who have a smaller reaction time or a larger additional delay, and who assume larger maximum decelerations for the other vehicles, stabilize traffic flow.
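
    For reference, one update of the forward-looking Gipps safe-distance rule that the paper extends. This sketch omits the backward-looking term, and the parameter values (reaction time, accelerations, effective vehicle length) are illustrative:

```python
import math

def gipps_speed(v, v_lead, gap, dt=0.66, a=1.7, b=3.0, b_hat=3.0, V=20.0):
    """One Gipps update: the minimum of the free-flow acceleration speed
    and the safe braking speed given the gap to the leader."""
    v_acc = v + 2.5 * a * dt * (1 - v / V) * math.sqrt(0.025 + v / V)
    v_safe = -b * dt + math.sqrt(
        (b * dt) ** 2 + b * (2.0 * gap - v * dt + v_lead ** 2 / b_hat)
    )
    return max(0.0, min(v_acc, v_safe))

# Follower approaching a leader that brakes to a stop. `length` folds the
# leader's vehicle length and a standstill margin into the gap.
dt, length = 0.66, 6.5
x_f, v_f = 0.0, 15.0
x_l, v_l = 60.0, 15.0
for _ in range(100):
    v_l = max(0.0, v_l - 3.0 * dt)   # leader brakes at 3 m/s^2
    x_l += v_l * dt
    v_f = gipps_speed(v_f, v_l, x_l - length - x_f, dt=dt)
    x_f += v_f * dt
print(f"final gap: {x_l - length - x_f:.2f} m, follower speed: {v_f:.2f} m/s")
```

    The safe-speed branch is built so the follower can always stop behind the leader provided the leader's actual braking never exceeds the assumed `b_hat`; the bi-directional extension studied in the paper adds an analogous constraint with respect to the following vehicle.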

  15. Process industry energy retrofits: the importance of emission baselines for greenhouse gas reductions

    International Nuclear Information System (INIS)

    Aadahl, Anders; Harvey, Simon; Berntsson, Thore

    2004-01-01

    Fuel combustion for heat and/or electric power production is often the largest contributor to greenhouse gas (GHG) emissions from an industrial process plant. Economically feasible options to reduce these emissions include fuel switching and retrofitting the plant's energy system. Process integration methods and tools can be used to evaluate potential retrofit measures. For assessing the GHG emissions reduction potential of the measures considered, it is also necessary to define appropriate GHG emission baselines. This paper presents a systematic GHG emission calculation method for retrofit situations including improved heat exchange, integration of combined heat and power (CHP) units, and combinations of both. The proposed method is applied to five different industrial processes in order to compare the impact of process-specific parameters and energy-market-specific parameters. Regarding potential GHG emission reductions, the results of the study reveal that electricity grid emissions are significantly more important than differences between individual processes. Based on the results of the study, it is suggested that a conservative emission baseline is most appropriate for sustainable investment decisions. Even so, new industrial CHP in the Northern European energy market could play a significant role in the common effort to decrease GHG emissions.

  16. Back-calculating baseline creatinine overestimates prevalence of acute kidney injury with poor sensitivity.

    Science.gov (United States)

    Kork, F; Balzer, F; Krannich, A; Bernardi, M H; Eltzschig, H K; Jankowski, J; Spies, C

    2017-03-01

    Acute kidney injury (AKI) is diagnosed by a 50% increase in creatinine. For patients without a baseline creatinine measurement, guidelines suggest estimating baseline creatinine by back-calculation. The aim of this study was to evaluate different glomerular filtration rate (GFR) equations and different GFR assumptions for back-calculating baseline creatinine as well as the effect on the diagnosis of AKI. The Modification of Diet in Renal Disease, the Chronic Kidney Disease Epidemiology (CKD-EPI) and the Mayo quadratic (MQ) equation were evaluated to estimate baseline creatinine, each under the assumption of either a fixed GFR of 75 mL min⁻¹ 1.73 m⁻² or an age-adjusted GFR. Estimated baseline creatinine, diagnoses and severity stages of AKI based on estimated baseline creatinine were compared to measured baseline creatinine and corresponding diagnoses and severity stages of AKI. The data of 34 690 surgical patients were analysed. Estimating baseline creatinine overestimated baseline creatinine. Diagnosing AKI based on estimated baseline creatinine had only substantial agreement with AKI diagnoses based on measured baseline creatinine [Cohen's κ ranging from 0.66 (95% CI 0.65-0.68) to 0.77 (95% CI 0.76-0.79)] and overestimated AKI prevalence with fair sensitivity [ranging from 74.3% (95% CI 72.3-76.2) to 90.1% (95% CI 88.6-92.1)]. Staging AKI severity based on estimated baseline creatinine had moderate agreement with AKI severity based on measured baseline creatinine [Cohen's κ ranging from 0.43 (95% CI 0.42-0.44) to 0.53 (95% CI 0.51-0.55)]. Diagnosing AKI and staging AKI severity on the basis of estimated baseline creatinine in surgical patients is not feasible. Patients at risk for post-operative AKI should have a pre-operative creatinine measurement to adequately assess post-operative AKI. © 2016 Scandinavian Physiological Society. Published by John Wiley & Sons Ltd.
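    For orientation, the "fixed GFR of 75" back-calculation can be sketched by solving the IDMS-traceable MDRD study equation for creatinine. The fixed GFR value follows the abstract; the function names and the KDIGO-style 1.5× flagging criterion are illustrative assumptions.

```python
def estimated_baseline_creatinine(age, female, black=False, assumed_gfr=75.0):
    """Back-calculate serum creatinine (mg/dL) from the IDMS-traceable MDRD
    study equation assuming a fixed GFR (mL/min/1.73 m^2)."""
    factor = 175.0 * age ** -0.203
    if female:
        factor *= 0.742
    if black:
        factor *= 1.212
    # MDRD: GFR = factor * Scr**-1.154  =>  Scr = (factor / GFR)**(1/1.154)
    return (factor / assumed_gfr) ** (1.0 / 1.154)

def flags_aki(current_creatinine, baseline_creatinine):
    """KDIGO-style criterion: creatinine has risen to >= 1.5x baseline."""
    return current_creatinine >= 1.5 * baseline_creatinine
```

    For a 60-year-old man this yields an estimated baseline near 1.0 mg/dL; the study's point is that comparing post-operative creatinine against such estimates, rather than against a measured baseline, systematically distorts AKI diagnosis and staging.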

  17. COMPARISON OF THREE METHODS TO PROJECT FUTURE BASELINE CARBON EMISSIONS IN TEMPERATE RAINFOREST, CURINANCO, CHILE

    Energy Technology Data Exchange (ETDEWEB)

    Patrick Gonzalez; Antonio Lara; Jorge Gayoso; Eduardo Neira; Patricio Romero; Leonardo Sotomayor

    2005-07-14

    evaluate the three methods to project future baseline carbon emissions. Extrapolation from Landsat change detection uses the observed rate of change to estimate change in the near future. Geomod is a software program that models the geographic distribution of change using a defined rate of change. FRCA is an integrated spatial analysis of forest inventory, biodiversity, and remote sensing that produces estimates of forest biodiversity and forest carbon density, spatial data layers of future probabilities of reforestation and deforestation, and a projection of future baseline forest carbon sequestration and emissions for an ecologically-defined area of analysis. For the period 1999-2012, extrapolation from Landsat change detection estimated a loss of 5000 ha and 520,000 t carbon from closed natural forest; Geomod modeled a loss of 2500 ha and 250,000 t; FRCA projected a loss of 4700 ± 100 ha and 480,000 t (maximum 760,000 t, minimum 220,000 t). Concerning labor time, extrapolation from Landsat required 90 actual days or 120 days normalized to Bachelor's-degree-level wages; Geomod required 240 actual days or 310 normalized days; FRCA required 110 actual days or 170 normalized days. Users experienced difficulties with an MS-DOS version of Geomod before turning to the Idrisi version. For organizations with limited time and financing, extrapolation from Landsat change provides a cost-effective method. Organizations with more time and financing could use FRCA, the only method that calculates the deforestation rate as a dependent variable rather than assuming a deforestation rate as an independent variable.
This research indicates that best practices for the projection of baseline carbon emissions include integration of forest inventory and remote sensing tasks from the beginning of the analysis, definition of an analysis area using ecological characteristics, use of standard and widely used geographic information systems (GIS) software applications, and the use of species

  18. SPheno 3.1: extensions including flavour, CP-phases and models beyond the MSSM

    Science.gov (United States)

    Porod, W.; Staub, F.

    2012-11-01

    We describe recent extensions of the program SPheno, including flavour aspects, CP-phases, R-parity violation and low energy observables. In the case of flavour mixing, all masses of supersymmetric particles are calculated including the complete flavour structure and all possible CP-phases at the 1-loop level. We give details on implemented seesaw models, low energy observables and the corresponding extension of the SUSY Les Houches Accord. Moreover, we comment on the possibilities to include MSSM extensions in SPheno. Catalogue identifier: ADRV_v2_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADRV_v2_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 154062 No. of bytes in distributed program, including test data, etc.: 1336037 Distribution format: tar.gz Programming language: Fortran95. Computer: PC running under Linux, should run in every Unix environment. Operating system: Linux, Unix. Classification: 11.6. Catalogue identifier of previous version: ADRV_v1_0 Journal reference of previous version: Comput. Phys. Comm. 153(2003)275 Does the new version supersede the previous version?: Yes Nature of problem: The first issue is the determination of the masses and couplings of supersymmetric particles in various supersymmetric models, the R-parity conserved MSSM with generation mixing and including CP-violating phases, various seesaw extensions of the MSSM and the MSSM with bilinear R-parity breaking. Low energy data on Standard Model fermion masses, gauge couplings and electroweak gauge boson masses serve as constraints. Radiative corrections from supersymmetric particles to these inputs must be calculated. Theoretical constraints on the soft SUSY breaking parameters from a high scale theory are imposed and the parameters at the electroweak scale are obtained from the

  19. Solid Waste Program technical baseline description

    Energy Technology Data Exchange (ETDEWEB)

    Carlson, A.B.

    1994-07-01

    The system engineering approach has been taken to describe the technical baseline under which the Solid Waste Program is currently operating. The document contains a mission analysis, function analysis, system definition, documentation requirements, facility and project bases, and uncertainties facing the program.

  20. Enhanced battery model including temperature effects

    NARCIS (Netherlands)

    Rosca, B.; Wilkins, S.

    2013-01-01

    Within electric and hybrid vehicles, batteries are used to provide/buffer the energy required for driving. However, battery performance varies throughout the temperature range specific to automotive applications, and as such, models that describe this behaviour are required. This paper presents a

  1. Geodesy by radio interferometry - Determinations of baseline vector, earth rotation, and solid earth tide parameters with the Mark I very long baseline radio interferometry system

    Science.gov (United States)

    Ryan, J. W.; Clark, T. A.; Coates, R. J.; Ma, C.; Wildes, W. T.

    1986-01-01

    Thirty-seven very long baseline radio interferometry experiments performed between 1972 and 1978 are analyzed, and estimates of baseline vectors between six sites, five in the continental United States and one in Europe, are derived. No evidence of significant changes in baseline length is found. For example, with a statistical level of confidence of approximately 85 percent, upper bounds on such changes within the United States ranged from a low of 10 mm/yr for the 850 km baseline between Westford, Massachusetts, and Green Bank, West Virginia, to a high of 90 mm/yr for the nearly 4000 km baseline between Westford and Goldstone, California. Estimates for universal time and for the x component of the position of the earth's pole are obtained. For the last 15 experiments, the only ones employing wideband receivers, the root-mean-square differences between the derived values and the corresponding ones published by the Bureau International de l'Heure are 0.0012 s and 0.018 arc sec, respectively. The average value obtained for the radial Love number for the solid earth is 0.62 ± 0.02 (estimated standard error).

  2. A Score for Risk of Thrombolysis-Associated Hemorrhage Including Pretreatment with Statins

    Directory of Open Access Journals (Sweden)

    Hebun Erdur

    2018-02-01

    Full Text Available Background: Symptomatic intracranial hemorrhage (sICH) after intravenous thrombolysis with recombinant tissue-plasminogen activator (rt-PA) for acute ischemic stroke is associated with a poor functional outcome. We aimed to develop a score assessing risk of sICH including novel putative predictors, namely, pretreatment with statins and severe renal impairment. Methods: We analyzed our local cohort (Berlin) of patients receiving rt-PA for acute ischemic stroke between 2006 and 2016. Outcome was sICH according to ECASS-III criteria. A multiple regression model identified variables associated with sICH, and receiver operating characteristics were calculated for the best discriminatory model for sICH. The model was validated in an independent thrombolysis cohort (Basel). Results: sICH occurred in 53 (4.0%) of 1,336 patients in the derivation cohort. Age, baseline National Institutes of Health Stroke Scale, systolic blood pressure on admission, blood glucose on admission, and prior medication with medium- or high-dose statins were associated with sICH and included in the risk of intracranial hemorrhage score. The validation cohort included 983 patients, of whom 33 (3.4%) had a sICH. The c-statistic for sICH was 0.72 (95% CI 0.66–0.79) in the derivation cohort and 0.69 (95% CI 0.60–0.77) in the independent validation cohort. Inclusion of severe renal impairment did not improve the score. Conclusion: We developed a simple score with fair discriminating capability to predict rt-PA-related sICH by adding prior statin use to known prognostic factors of sICH. This score may help clinicians to identify patients with higher risk of sICH requiring intensive monitoring.
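    The c-statistic reported for the score is simply the probability that a randomly chosen patient with sICH receives a higher predicted risk than a randomly chosen patient without it. A minimal stdlib-only sketch (the data in the test are toy values, not from the study) is:

```python
def c_statistic(outcomes, risks):
    """Concordance (c) statistic: P(risk of a case > risk of a control),
    counting tied risks as 1/2. outcomes: 1 = event (sICH), 0 = no event."""
    cases = [r for y, r in zip(outcomes, risks) if y == 1]
    controls = [r for y, r in zip(outcomes, risks) if y == 0]
    concordant = sum(1.0 if c > k else 0.5 if c == k else 0.0
                     for c in cases for k in controls)
    return concordant / (len(cases) * len(controls))
```

    A value of 0.5 means no discrimination and 1.0 perfect discrimination, so the reported 0.69-0.72 corresponds to the "fair discriminating capability" the authors describe.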

  3. Long-baseline neutrino oscillation experiments

    International Nuclear Information System (INIS)

    Crane, D.; Goodman, M.

    1994-01-01

    There is no unambiguous definition for long baseline neutrino oscillation experiments. The term is generally used for accelerator neutrino oscillation experiments which are sensitive to small values of Δm², and for which the detector is not on the accelerator site. The Snowmass N2L working group met to discuss the issues facing such experiments. The Fermilab Program Advisory Committee adopted several recommendations concerning the Fermilab neutrino program at their Aspen meeting immediately prior to the Snowmass Workshop. This heightened attention to the proposals to use Fermilab for a long-baseline neutrino experiment at the workshop. The plan for a neutrino oscillation program at Brookhaven was also thoroughly discussed. Opportunities at CERN were considered, particularly the use of detectors at the Gran Sasso laboratory. The idea to build a neutrino beam from KEK towards Superkamiokande was not discussed at the Snowmass meeting, but there has been considerable development of this idea since then. Brookhaven and KEK would use low energy neutrino beams, while FNAL and CERN plan to have medium energy beams. This report summarizes a few topics common to LBL proposals and attempts to give a snapshot of where things stand in this fast-developing field.

  4. Models of epidemics: when contact repetition and clustering should be included

    Directory of Open Access Journals (Sweden)

    Scholz Roland W

    2009-06-01

    Full Text Available Abstract Background: The spread of infectious disease is determined by biological factors, e.g. the duration of the infectious period, and social factors, e.g. the arrangement of potentially contagious contacts. Repetitiveness and clustering of contacts are known to be relevant factors influencing the transmission of droplet or contact transmitted diseases. However, we do not yet completely know under what conditions repetitiveness and clustering should be included to model disease spread realistically. Methods: We compare two different types of individual-based models: one assumes random mixing without repetition of contacts, whereas the other assumes that the same contacts repeat day-by-day. The latter exists in two variants, with and without clustering. We systematically test and compare how the total size of an outbreak differs between these model types depending on the key parameters transmission probability, number of contacts per day, duration of the infectious period, different levels of clustering and varying proportions of repetitive contacts. Results: The simulation runs under different parameter constellations provide the following results: the difference between both model types is highest for low numbers of contacts per day and low transmission probabilities. The number of contacts and the transmission probability have a higher influence on this difference than the duration of the infectious period. Relevant differences compared to a purely random mixing model can arise even when only minor parts of the daily contacts are repetitive and clustered. Conclusion: We show that random mixing models provide acceptable estimates of the total outbreak size if the number of contacts per day is high or if the per-contact transmission probability is high, as seen in typical childhood diseases such as measles. In the case of very short infectious periods, for instance, as in Norovirus, models assuming repeating contacts will also behave
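    The comparison described above can be miniaturized as an SIR-style individual-based model in which each day's contacts are either drawn afresh (random mixing) or fixed once and reused (repeated contacts). All parameter values and the ring-shaped repeated-contact structure below are illustrative assumptions, not the paper's configuration.

```python
import random

def outbreak_size(n=200, k=4, p=0.1, inf_days=5, repeat=False, seed=1):
    """Final outbreak size of a toy SIR individual-based model.
    repeat=False: k random contacts drawn each day (random mixing);
    repeat=True:  the same k ring-neighbours every day (repeated contacts)."""
    rng = random.Random(seed)
    ring = {i: [(i + d) % n for d in (-2, -1, 1, 2)] for i in range(n)}
    state = ["S"] * n                  # S, I, or R
    days_left = [0] * n
    state[0], days_left[0] = "I", inf_days
    while "I" in state:
        infectious = [i for i in range(n) if state[i] == "I"]
        for i in infectious:
            # random contacts may include self; harmless, since self is not S
            contacts = ring[i] if repeat else rng.sample(range(n), k)
            for j in contacts:
                if state[j] == "S" and rng.random() < p:
                    state[j], days_left[j] = "I", inf_days
            days_left[i] -= 1
            if days_left[i] == 0:
                state[i] = "R"
    return state.count("R")
```

    With these settings random mixing has a basic reproduction number around k·p·inf_days = 2, so it typically produces much larger outbreaks than the repeated-contact variant, where transmission is throttled by the small fixed neighbourhood, which is the qualitative effect the paper quantifies.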

  5. 40 CFR 80.1285 - How does a refiner apply for a benzene baseline?

    Science.gov (United States)

    2010-07-01

    ... baseline? 80.1285 Section 80.1285 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR... (abt) Program § 80.1285 How does a refiner apply for a benzene baseline? (a) A benzene baseline... credits. (b) For U.S. Postal delivery, the benzene baseline application shall be sent to: Attn: MSAT2...

  6. Expedited technology demonstration project (Revised mixed waste management facility project) Project baseline revision 4.0 and FY98 plan

    International Nuclear Information System (INIS)

    Adamson, M. G.

    1997-01-01

    The re-baseline of the Expedited Technology Demonstration Project (Revised Mixed Waste Facility Project) is designated as Project Baseline Revision 4.0. The last approved baseline was identified as Project Baseline Revision 3.0 and was issued in October 1996. Project Baseline Revision 4.0 does not depart from the formal DOE guidance followed by, and contained in, Revision 3.0. This revised baseline document describes the MSO and Final Forms testing activities that will occur during FY98, the final year of the ETD Project. The cost estimate for work during FY98 continues to be $2.0M as published in Revision 3.0. However, the funds will be all CENRTC rather than the OPEX/CENRTC split previously anticipated. LLNL has waived overhead charges on ETD Project CENRTC funds since the beginning of project activities. By requesting the $2.0M as all CENRTC, a more aggressive approach to staffing and testing can be taken. Due to a cost under-run condition during FY97, procurements were made and work was accomplished, with the knowledge of DOE, in the Feed Preparation and Final Forms areas that were not in the scope of Revision 3.0. Feed preparation activities for FY98 have been expanded to include the drum opening station/enclosure previously deleted

  7. IEA Wind Task 26: Offshore Wind Farm Baseline Documentation

    Energy Technology Data Exchange (ETDEWEB)

    Smart, Gavin [Offshore Renewable Energy Catapult, Blyth, Northumberland (United Kingdom); Smith, Aaron [National Renewable Energy Lab. (NREL), Golden, CO (United States); Warner, Ethan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Sperstad, Iver Bakken [SINTEF Energy Research, Trondheim (Norway); Prinsen, Bob [Ecofys, Utrecht (Netherlands). TKI Wind Op Zee; Lacal-Arantegui, Roberto [European Commission Joint Research Centre (JRC), Brussels (Belgium)

    2016-06-02

    This document has been produced to provide the definition and rationale for the Baseline Offshore Wind Farm established within IEA Wind Task 26--Cost of Wind Energy. The Baseline has been developed to provide a common starting point for country comparisons and sensitivity analysis on key offshore wind cost and value drivers. The baseline project reflects an approximate average of the characteristics of projects installed between 2012 and 2014, with the project life assumed to be 20 years. The baseline wind farm is located 40 kilometres (km) from construction and operations and maintenance (O&M) ports and from export cable landfall. The wind farm consists of 100 4-megawatt (MW) wind turbines mounted on monopile foundations in an average water depth of 25 metres (m), connected by 33-kilovolt (kV) inter-array cables. The arrays are connected to a single offshore substation (33kV/220kV) mounted on a jacket foundation, with the substation connected via a single 220kV export cable to an onshore substation, 10km from landfall. The wind farm employs a port-based O&M strategy using crew-transfer vessels.
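    The baseline wind farm's headline parameters are easy to capture as a structured configuration. The field names below are invented for illustration; the values come from the text above.

```python
# Illustrative encoding of the IEA Wind Task 26 baseline offshore wind farm.
BASELINE_OFFSHORE_WIND_FARM = {
    "project_life_years": 20,
    "distance_to_ports_km": 40,          # construction and O&M ports
    "distance_to_landfall_km": 40,       # export cable landfall
    "turbines": {"count": 100, "rating_mw": 4, "foundation": "monopile"},
    "water_depth_m": 25,
    "inter_array_voltage_kv": 33,
    "offshore_substation": {"count": 1, "voltage": "33kV/220kV",
                            "foundation": "jacket"},
    "export_cable": {"count": 1, "voltage_kv": 220},
    "onshore_substation_from_landfall_km": 10,
    "om_strategy": "port-based, crew-transfer vessels",
}

# Derived figure: total installed capacity of the baseline project
capacity_mw = (BASELINE_OFFSHORE_WIND_FARM["turbines"]["count"]
               * BASELINE_OFFSHORE_WIND_FARM["turbines"]["rating_mw"])
```

    Fixing the baseline in a machine-readable form like this is what makes the Task's country comparisons and sensitivity analyses reproducible: each sensitivity case is a single-field perturbation of the same dictionary.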

  8. Baselining PMU Data to Find Patterns and Anomalies

    Energy Technology Data Exchange (ETDEWEB)

    Amidan, Brett G.; Follum, James D.; Freeman, Kimberly A.; Dagle, Jeffery E.

    2016-10-25

    This paper looks at the application of situational awareness methodologies to power grid data. These methodologies establish baselines that identify typical patterns and atypical behavior in the data. The objectives of the baselining analyses are to provide real-time analytics, the capability to examine historical trends and events, and reliable predictions of the near-future state of the grid. Multivariate algorithms were created to establish normal baseline behavior and then score each moment in time according to its variance from the baseline. Detailed multivariate analytical techniques are described in this paper that produced ways to identify typical patterns and atypical behavior. In this context, atypical behavior is behavior that was not anticipated. Visualizations were also produced to help explain the behavior that was identified mathematically. Examples are shown to help describe how to read and interpret the analyses and visualizations. Preliminary work has been performed on PMU data sets from BPA (Bonneville Power Administration) and EI (Eastern Interconnect). Actual results are not fully shown here because of confidentiality issues. Comparisons between atypical events found mathematically and actual events showed that many of the actual events are also atypical events; however, there are many atypical events that do not correspond to any actual event. Additional work needs to be done to classify the atypical events into actual events, so that the importance of the events can be better understood.
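    One minimal way to realize the "score each moment against a baseline" idea is a per-channel z-score against a baseline window. This is purely illustrative; the paper's actual multivariate algorithms are not described in enough detail here to reproduce.

```python
from statistics import mean, stdev

def baseline_stats(window):
    """Per-channel (mean, std) over a baseline window of multichannel samples,
    e.g. simultaneous PMU measurements at several buses."""
    channels = list(zip(*window))
    return [(mean(c), stdev(c)) for c in channels]

def atypicality(sample, stats):
    """Score a new sample as its largest absolute z-score across channels;
    large scores flag moments that deviate from baseline behavior."""
    return max(abs(x - m) / s for x, (m, s) in zip(sample, stats))
```

    In practice the baseline window would be rolled forward in time and the score thresholded, so that "atypical" always means atypical relative to recent normal operation rather than a fixed historical snapshot.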

  9. Is the relationship between increased knee muscle strength and improved physical function following exercise dependent on baseline physical function status?

    Science.gov (United States)

    Hall, Michelle; Hinman, Rana S; van der Esch, Martin; van der Leeden, Marike; Kasza, Jessica; Wrigley, Tim V; Metcalf, Ben R; Dobson, Fiona; Bennell, Kim L

    2017-12-08

    Clinical guidelines recommend knee muscle strengthening exercises to improve physical function. However, the amount of knee muscle strength increase needed for clinically relevant improvements in physical function is unclear. Understanding how much increase in knee muscle strength is associated with improved physical function could assist clinicians in providing appropriate strength gain targets for their patients in order to optimise outcomes from exercise. The aim of this study was to investigate whether an increase in knee muscle strength is associated with improved self-reported physical function following exercise; and whether the relationship differs according to physical function status at baseline. Data from 100 participants with medial knee osteoarthritis enrolled in a 12-week randomised controlled trial comparing neuromuscular exercise to quadriceps strengthening exercise were pooled. Participants were categorised as having mild, moderate or severe physical dysfunction at baseline using the Western Ontario and McMaster Universities Osteoarthritis Index (WOMAC). Associations between 12-week changes in physical function (dependent variable) and peak isometric knee extensor and flexor strength (independent variables) were evaluated with and without accounting for baseline physical function status and covariates using linear regression models. In covariate-adjusted models without accounting for baseline physical function, every 1-unit (Nm/kg) increase in knee extensor strength was associated with physical function improvement of 17 WOMAC units (95% confidence interval (CI) -29 to -5). When accounting for baseline severity of physical function, every 1-unit increase in knee extensor strength was associated with physical function improvement of 24 WOMAC units (95% CI -42 to -7) in participants with severe physical dysfunction. There were no associations between change in strength and change in physical function in participants with mild or moderate physical

  10. The Ahmed Versus Baerveldt study: design, baseline patient characteristics, and intraoperative complications.

    Science.gov (United States)

    Christakis, Panos G; Tsai, James C; Zurakowski, David; Kalenak, Jeffrey W; Cantor, Louis B; Ahmed, Iqbal I K

    2011-11-01

    To report the design, baseline patient characteristics, and intraoperative complications of the Ahmed Versus Baerveldt (AVB) Study. Multicenter, randomized, clinical trial. Patients were recruited from 7 international clinical sites and treated by 10 surgeons between 2005 and 2009. Inclusion criteria required that patients be at least 18 years of age and have uncontrolled glaucoma refractory to medicinal, laser, and surgical therapy. Eligible patients were randomized to undergo implantation of an Ahmed-FP7 valve (New World Medical, Inc., Rancho Cucamonga, CA) or a Baerveldt-350 implant (Abbott Medical Optics, Inc., Santa Ana, CA) using standardized surgical technique, to be followed for 5 years. The primary outcome measure was failure, defined as intraocular pressure (IOP) out of target range (5-18 mmHg with ≥ 20% reduction from baseline) for 2 consecutive visits after 3 months, vision-threatening complications, additional glaucoma procedures, or loss of light perception. Secondary outcome measures included IOP, medication use, visual acuity, complications, and interventions. A total of 238 patients were enrolled in the study; 124 received the Ahmed-FP7 valve implant and 114 received the Baerveldt-350 implant. The 2 treatment groups did not differ in any baseline characteristics with the exception of sex. The mean age of the study group was 66 ± 16 years, and 55% were women, with a greater proportion in the Baerveldt group (P=0.01). The mean baseline IOP of the study group was 31.4 ± 10.8 on a mean of 3.1 ± 1.0 glaucoma medications. The median Snellen visual acuity was 20/100, mean number of previous laser therapies was 0.9 ± 1.1, and mean number of previous surgeries was 1.7 ± 1.2. Five (4%) patients in the Ahmed group and 4 (4%) patients in the Baerveldt group experienced significant intraoperative complications. Aqueous drainage devices are being increasingly used for glaucoma refractory to conventional treatment, and the AVB Study compares the 2 most

  11. Developing protocols for geochemical baseline studies: An example from the Coles Hill uranium deposit, Virginia, USA

    International Nuclear Information System (INIS)

    Levitan, Denise M.; Schreiber, Madeline E.; Seal, Robert R.; Bodnar, Robert J.; Aylor, Joseph G.

    2014-01-01

    Highlights: • We outline protocols for baseline geochemical surveys of stream sediments and water. • Regression on order statistics was used to handle non-detect data. • U concentrations in stream water near this unmined ore were below regulatory standards. • Concentrations of major and trace elements were correlated with stream discharge. • Methods can be applied to other extraction activities, including hydraulic fracturing. - Abstract: In this study, we determined baseline geochemical conditions in stream sediments and surface waters surrounding an undeveloped uranium deposit. Emphasis was placed on study design, including site selection to encompass geological variability and temporal sampling to encompass hydrological and climatic variability, in addition to statistical methods for baseline data analysis. The concentrations of most elements in stream sediments were above analytical detection limits, making them amenable to standard statistical analysis. In contrast, some trace elements in surface water had concentrations that were below the respective detection limits, making statistical analysis more challenging. We describe and compare statistical methods appropriate for concentrations that are below detection limits (non-detect data) and conclude that regression on order statistics provided the most rigorous analysis of our results, particularly for trace elements. Elevated concentrations of U and deposit-associated elements (e.g. Ba, Pb, and V) were observed in stream sediments and surface waters downstream of the deposit, but concentrations were below regulatory guidelines for the protection of aquatic ecosystems and for drinking water. Analysis of temporal trends indicated that concentrations of major and trace elements were most strongly related to stream discharge. These findings highlight the need for sampling protocols that will identify and evaluate the temporal and spatial variations in a thorough baseline study
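    The regression-on-order-statistics (ROS) approach the authors favour for non-detect data fits a distribution to the detected values' normal plotting positions and imputes the censored values from the fitted line. A simplified stdlib-only sketch (Python ≥ 3.8 for `NormalDist`; it assumes a single shared detection limit, so only the count of non-detects matters, whereas full ROS handles multiple interleaved limits):

```python
import math
from statistics import NormalDist, mean

def ros_impute(detects, n_censored):
    """Simplified regression on order statistics for left-censored data.
    detects: measured concentrations; n_censored: count of non-detects,
    all assumed below every detected value. Returns imputed values."""
    n = len(detects) + n_censored
    obs = sorted(detects)
    nd = NormalDist()
    # Blom plotting positions; detects occupy the ranks above the non-detects
    pp = [(n_censored + i + 1 - 0.375) / (n + 0.25) for i in range(len(obs))]
    z = [nd.inv_cdf(p) for p in pp]
    y = [math.log(v) for v in obs]
    # Least-squares fit y = a + b*z on the detected values (lognormal model)
    zbar, ybar = mean(z), mean(y)
    b = (sum((zi - zbar) * (yi - ybar) for zi, yi in zip(z, y))
         / sum((zi - zbar) ** 2 for zi in z))
    a = ybar - b * zbar
    # Impute the censored observations from the fitted line at their ranks
    pp_c = [(i + 1 - 0.375) / (n + 0.25) for i in range(n_censored)]
    return [math.exp(a + b * nd.inv_cdf(p)) for p in pp_c]
```

    The imputed values then allow ordinary summary statistics (means, percentiles) without the bias introduced by substituting zero or half the detection limit, which is why the authors found ROS the most rigorous option for their trace-element data.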

  12. Underlying topography extraction over forest areas from multi-baseline PolInSAR data

    Science.gov (United States)

    Fu, Haiqiang; Zhu, Jianjun; Wang, Changcheng; Li, Zhiwei

    2017-11-01

    In this paper, the digital elevation model (DEM) for a forest area is extracted from multi-baseline (MB) polarimetric interferometric synthetic aperture radar (PolInSAR) data. On the basis of the random-volume-over-ground (RVoG) model, the weighted complex least-squares adjustment (WCLSA) method is proposed for the ground phase estimation, so that the MB PolInSAR observations can be constrained by a generalized observation function and the observation contribution to the solution can be adjusted by a weighting strategy. A baseline length weighting strategy is then adopted to syncretize the DEMs estimated with the ground phases. The results of the simulated experiment undertaken in this study demonstrate that the WCLSA method is sensitive to the number of redundant observations and can adjust the contributions of the different observations. We also applied the WCLSA method to E-SAR L- and P-band MB PolInSAR data from the Krycklan River catchment in Northern Sweden. The results show that the two extracted DEMs are in close agreement with the Light Detection and Ranging (Lidar) DEM, with root-mean-square errors of 3.54 and 3.16 m. The DEM vertical error is correlated with the terrain slope and ground-cover condition, but not with the forest height.

  13. Early return to baseline range of motion and strength after anterior shoulder instability surgery: a Multicenter Orthopaedic Outcomes Network (MOON) shoulder group cohort study.

    Science.gov (United States)

    Buckwalter V, Joseph A; Wolf, Brian R; Glass, Natalie; Bollier, Matt; Kuhn, John E; Hettrich, Carolyn M

    2018-03-23

    Patients often return to higher-level activities and sports at 4 to 8 months after anterior shoulder stabilization procedures. It is unknown what percentage of patients have regained normal function at this time frame and what factors predict residual deficits in range of motion (ROM) and strength after anterior shoulder instability surgery. Ten participating sites throughout the United States enrolled patients in a prospective cohort study including primary, revision, arthroscopic, and open anterior stabilization procedures. Baseline demographic data and patient outcomes questionnaires were collected with initial physical examination, treatment, surgical findings, and surgical repair details. At the 6-month follow-up visit, ROM and strength measurements were collected and compared with preoperative measurements. There were 348 patients identified who underwent surgical treatment for anterior shoulder instability. Of these, 259 patients (74.0%) returned to baseline, and 89 (26.0%) did not return to baseline shoulder ROM (≥20° loss of ROM) or strength. A higher Beighton score (P = .01) and a greater number of dislocations were associated with failure to return to baseline ROM and strength at early follow-up. No surgical variables were found to influence return to baseline function, including open vs. arthroscopic surgery, primary vs. revision surgery, and number of suture anchors. By 4 to 8 months postoperatively, 76% of patients return to baseline ROM, 98% return to baseline strength, and 74% return to both baseline ROM and strength. An increased number of dislocations and generalized joint laxity were associated with failure to return to baseline ROM and strength at early follow-up after anterior shoulder instability surgery. Copyright © 2018 Journal of Shoulder and Elbow Surgery Board of Trustees. Published by Elsevier Inc. All rights reserved.

  14. Baseline requirements for assessment of mining impact using biological monitoring

    International Nuclear Information System (INIS)

    Humphrey, C.L.; Dostine, P.L.

    1995-01-01

    Biological monitoring programmes for environmental protection should provide for both early detection of possible adverse effects, and assessment of the ecological significance of these effects. Monitoring techniques are required that include responses sensitive to the impact, that can be subjected to rigorous statistical analysis and for which statistical power is high. Such issues in baseline research of 'what and how to measure?' and 'for how long?' have been the focus of a programme being developed to monitor and assess effects of mining operations on the essentially pristine, freshwater ecosystems of the Alligator Rivers Region (ARR) in tropical northern Australia. Application of the BACIP (Before, After, Control, Impact, Paired differences) design, utilizing a form of temporal replication, to univariate (single species) and multivariate (community) data is described. The BACIP design incorporates data from single control and impact sites. We argue for modification of the design for particular studies conducted in streams, to incorporate additional independent control sites from adjacent catchments. Inferential power, by way of (i) more confidently attributing cause to an observed change and (ii) providing information about the ecological significance of the change, will be enhanced using a modified BACIP design. In highly valued environments such as the ARR, monitoring programmes require application of statistical tests with high power to guarantee that an impact no greater than a prescribed amount has gone undetected. A minimum number of baseline years using the BACIP approach would therefore be required in order to achieve some desired level of statistical power. This paper describes the results of power analyses conducted on 2-5 years (depending upon the technique) of baseline data from streams of the ARR and discusses the implications of these results for management. 44 refs., 1 tab., 3 figs
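    The "how many baseline years for a given power" question can be illustrated with a normal-approximation power calculation for a BACIP-style paired comparison, where each year contributes one control-impact difference. This is a generic sketch, not the authors' analysis; the effect size and α below are placeholders.

```python
from statistics import NormalDist

def power_paired_z(effect_sd_units, n, alpha=0.05):
    """Approximate power of a two-sided paired z-test with n paired
    (e.g. yearly control-impact) differences; effect size in SD units."""
    nd = NormalDist()
    z_crit = nd.inv_cdf(1 - alpha / 2)
    return nd.cdf(effect_sd_units * n ** 0.5 - z_crit)

def min_years(effect_sd_units, target_power=0.8, alpha=0.05, max_n=50):
    """Smallest number of baseline years achieving the target power."""
    for n in range(2, max_n + 1):
        if power_paired_z(effect_sd_units, n, alpha) >= target_power:
            return n
    return None
```

    For a one-standard-deviation effect, roughly eight yearly differences are needed for 80% power under this approximation, which illustrates why the authors stress accumulating several baseline years before an impact can be confidently declared absent.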

  15. Prognostic value of baseline seric Syndecan-1 in initially unresectable metastatic colorectal cancer patients: a simple biological score.

    Science.gov (United States)

    Jary, Marine; Lecomte, Thierry; Bouché, Olivier; Kim, Stefano; Dobi, Erion; Queiroz, Lise; Ghiringhelli, Francois; Etienne, Hélène; Léger, Julie; Godet, Yann; Balland, Jérémy; Lakkis, Zaher; Adotevi, Olivier; Bonnetain, Franck; Borg, Christophe; Vernerey, Dewi

    2016-11-15

    In first-line metastatic colorectal cancer (mCRC), baseline prognostic factors allowing death risk and treatment strategy stratification are lacking. The soluble form of Syndecan-1 (CD138) has never been described as a prognostic biomarker in mCRC. We investigated its additional prognostic value for overall survival (OS). mCRC patients with unresectable disease at diagnosis were treated with bevacizumab-based chemotherapy in two independent prospective clinical trials (development set: n = 126, validation set: n = 51; studies NCT00489697 and NCT00544011, respectively). Serum samples were collected at baseline for CD138 measurement. OS determinants were assessed and, based on the final multivariate model, a prognostic score was proposed. Two independent OS prognostic factors were identified: high Lactate Dehydrogenase (LDH) level (p = 0.0066) and high log-CD138 level (p = 0.0190). Dichotomizing CD138 (cutoff: 75 ng/mL) allowed the assessment of a biological prognostic score combining CD138 and LDH values, identifying three risk groups for death (median OS = 38.9, 30.1 and 19.8 months for the low, intermediate and high risk groups, respectively). These results support the prognostic value of baseline soluble CD138 for OS in mCRC patients. A simple biological scoring system is proposed including LDH and CD138 binary status values. © 2016 UICC.
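
    As a toy illustration of the two-factor score the abstract describes, one point can be assigned per adverse factor (high LDH; CD138 at or above the 75 ng/mL cutoff), mapping 0, 1 and 2 points to low, intermediate and high risk. The exact scoring rule is an assumption, not taken from the paper:

```python
def cd138_ldh_risk_group(cd138_ng_ml, ldh_high, cd138_cutoff=75.0):
    """Hypothetical reconstruction of the biological score: one point for
    high LDH, one point for baseline soluble CD138 >= the 75 ng/mL cutoff;
    the point total selects the risk group."""
    points = int(bool(ldh_high)) + int(cd138_ng_ml >= cd138_cutoff)
    return ["low", "intermediate", "high"][points]
```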

  16. 200-BP-5 operable unit Technical Baseline report

    International Nuclear Information System (INIS)

    Jacques, I.D.; Kent, S.K.

    1991-10-01

    This report supports development of a remedial investigation/feasibility study work plan for the 200-BP-5 operable unit. The report summarizes baseline information for waste sites and unplanned release sites located in the 200-BP-5 operable unit. The sites were investigated by the Technical Baseline Section of the Environmental Engineering Group, Westinghouse Hanford Company (Westinghouse Hanford). The investigation consisted of review and evaluation of current and historical Hanford Site reports, drawings, and photographs, and was supplemented with recent inspections of the Hanford Site and employee interviews. No field investigations or sampling were conducted.

  17. Evidence of the shifting baseline syndrome in ethnobotanical research.

    Science.gov (United States)

    Hanazaki, Natalia; Herbst, Dannieli Firme; Marques, Mel Simionato; Vandebroek, Ina

    2013-11-14

    The shifting baseline syndrome is a concept from ecology that can be analyzed in the context of ethnobotanical research. Evidence of shifting baseline syndrome can be found in studies dealing with intracultural variation of knowledge, when knowledge from different generations is compared and combined with information about changes in the environment and/or natural resources. We reviewed 84 studies published between 1993 and 2012 that made comparisons of ethnobotanical knowledge according to different age classes. After analyzing these studies for evidence of the shifting baseline syndrome (lower knowledge levels in younger generations and mention of declining abundance of local natural resources), we searched within these studies for the use of the expressions "cultural erosion", "loss of knowledge", or "acculturation". The studies focused on different groups of plants (e.g. medicinal plants, foods, plants used for general purposes, or the uses of specific important species). More than half of all 84 studies (57%) mentioned a concern towards cultural erosion or knowledge loss; 54% of the studies showed evidence of the shifting baseline syndrome; and 37% of the studies did not provide any evidence of shifting baselines (intergenerational knowledge differences but no information available about the abundance of natural resources). The general perception of knowledge loss among young people when comparing ethnobotanical repertoires among different age groups should be analyzed with caution. Changes in the landscape or in the abundance of plant resources may be associated with changes in ethnobotanical repertoires held by people of different age groups. Also, the relationship between the availability of resources and current plant use practices relies on a complex set of factors. Fluctuations in these variables can cause changes in the reference (baseline) of different generations and consequently be responsible for differences in intergenerational knowledge.

  18. Importance of Baseline Specification in Evaluating Conservation Interventions and Achieving No Net Loss of Biodiversity

    Science.gov (United States)

    Bull, J W; Gordon, A; Law, E A; Suttle, K B; Milner-Gulland, E J

    2014-01-01

    There is an urgent need to improve the evaluation of conservation interventions. This requires specifying an objective and a frame of reference from which to measure performance. Reference frames can be baselines (i.e., known biodiversity at a fixed point in history) or counterfactuals (i.e., a scenario that would have occurred without the intervention). Biodiversity offsets are interventions with the objective of no net loss of biodiversity (NNL). We used biodiversity offsets to analyze the effects of the choice of reference frame on whether interventions met stated objectives. We developed 2 models to investigate the implications of setting different frames of reference in regions subject to various biodiversity trends and anthropogenic impacts. First, a general analytic model evaluated offsets against a range of baseline and counterfactual specifications. Second, a simulation model then replicated these results with a complex real world case study: native grassland offsets in Melbourne, Australia. Both models showed that achieving NNL depended upon the interaction between reference frame and background biodiversity trends. With a baseline, offsets were less likely to achieve NNL where biodiversity was decreasing than where biodiversity was stable or increasing. With a no-development counterfactual, however, NNL was achievable only where biodiversity was declining. Otherwise, preventing development was better for biodiversity. Uncertainty about compliance was a stronger determinant of success than uncertainty in underlying biodiversity trends. When only development and offset locations were considered, offsets sometimes resulted in NNL, but not across an entire region. Choice of reference frame determined feasibility and effort required to attain objectives when designing and evaluating biodiversity offset schemes. We argue the choice is thus of fundamental importance for conservation policy. 
Our results shed light on situations in which biodiversity offsets may
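
    The dependence of no net loss (NNL) on the chosen reference frame can be illustrated with a toy net-outcome calculation. The function name and sign conventions below are illustrative assumptions, not taken from the paper's analytic or simulation models:

```python
def net_outcome(loss_at_development, gain_at_offset, counterfactual_change):
    """Toy net biodiversity outcome against a reference frame. With a fixed
    baseline, counterfactual_change is 0; with a no-development
    counterfactual, it is the (signed) change in biodiversity that would
    have occurred anyway, e.g. -5 for a declining background trend."""
    return gain_at_offset - loss_at_development - counterfactual_change

# NNL is achieved when the net outcome is >= 0; a declining background
# (negative counterfactual_change) makes NNL easier to meet against a
# counterfactual, matching the pattern described in the abstract.
```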

  19. 41 CFR 109-1.5202 - Establishment of a personal property holdings baseline.

    Science.gov (United States)

    2010-07-01

    ... personal property holdings baseline. 109-1.5202 Section 109-1.5202 Public Contracts and Property Management...-1.5202 Establishment of a personal property holdings baseline. (a) If the contractor is a new... baseline or may perform a complete physical inventory of all personal property. This physical inventory is...

  20. Dynamic model of a micro-tubular solid oxide fuel cell stack including an integrated cooling system

    Science.gov (United States)

    Hering, Martin; Brouwer, Jacob; Winkler, Wolfgang

    2017-02-01

    A novel dynamic micro-tubular solid oxide fuel cell (MT-SOFC) and stack model including an integrated cooling system is developed using a quasi three-dimensional, spatially resolved, transient thermodynamic, physical and electrochemical model that accounts for the complex geometrical relations between the cells and cooling-tubes. The modeling approach includes a simplified tubular geometry and stack design including an integrated cooling structure, detailed pressure drop and gas property calculations, the electrical and physical constraints of the stack design that determine the current, as well as control strategies for the temperature. Moreover, an advanced heat transfer balance is included, with detailed radiative heat transfer between the cells and the integrated cooling-tubes, convective heat transfer between the gas flows and the surrounding structures, and conductive heat transfer between the solid structures inside the stack. The detailed model can be used as a design basis for the novel MT-SOFC stack assembly including an integrated cooling system, as well as for the development of a dynamic system control strategy. The evaluated best-case design achieves very high electrical efficiency, from around 75% down to 55% across the entire power density range of 50 to 550 mW/cm2, due to the novel stack design comprising an integrated cooling structure.
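
    The radiative term in such a stack heat balance can be sketched with a gray-body, two-surface exchange relation. This is a simplified placeholder for the model's detailed radiative network, with illustrative parameter names:

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def radiative_exchange(t_hot_k, t_cold_k, area_m2, view_factor, emissivity):
    """Radiative heat flow between a cell surface and a cooling tube,
    q = eps * sigma * A * F * (T_hot^4 - T_cold^4), a two-surface
    gray-body simplification of the stack's radiative heat transfer."""
    return emissivity * SIGMA * area_m2 * view_factor * (
        t_hot_k**4 - t_cold_k**4)
```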

  1. Multilevel Analysis of Multiple-Baseline Data Evaluating Precision Teaching as an Intervention for Improving Fluency in Foundational Reading Skills for at Risk Readers

    Science.gov (United States)

    Brosnan, Julie; Moeyaert, Mariola; Brooks Newsome, Kendra; Healy, Olive; Heyvaert, Mieke; Onghena, Patrick; Van den Noortgate, Wim

    2018-01-01

    In this article, multiple-baseline across participants designs were used to evaluate the impact of a precision teaching (PT) program, within a Tier 2 Response to Intervention framework, targeting fluency in foundational reading skills with at risk kindergarten readers. Thirteen multiple-baseline design experiments that included participation from…

  2. Finite element modeling of contaminant transport in soils including the effect of chemical reactions.

    Science.gov (United States)

    Javadi, A A; Al-Najjar, M M

    2007-05-17

    The movement of chemicals through soils to the groundwater is a major cause of degradation of water resources. In many cases, serious human and stock health implications are associated with this form of pollution. Recent studies have shown that the current models and methods are not able to adequately describe the leaching of nutrients through soils, often underestimating the risk of groundwater contamination by surface-applied chemicals, and overestimating the concentration of resident solutes. Furthermore, the effect of chemical reactions on the fate and transport of contaminants is not included in many of the existing numerical models for contaminant transport. In this paper a numerical model is presented for simulation of the flow of water and air and contaminant transport through unsaturated soils with the main focus being on the effects of chemical reactions. The governing equations of miscible contaminant transport including advection, dispersion-diffusion and adsorption effects together with the effect of chemical reactions are presented. The mathematical framework and the numerical implementation of the model are described in detail. The model is validated by application to a number of test cases from the literature and is then applied to the simulation of a physical model test involving transport of contaminants in a block of soil with particular reference to the effects of chemical reactions. Comparison of the results of the numerical model with the experimental results shows that the model is capable of predicting the effects of chemical reactions with very high accuracy. The importance of consideration of the effects of chemical reactions is highlighted.
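
    The advection-dispersion-reaction balance named in the abstract can be sketched with a single explicit finite-difference step. This is a one-dimensional, single-phase simplification under assumed parameters; the paper's model is a multiphase finite-element formulation with full chemical reaction terms:

```python
import numpy as np

def adr_step(c, v, D, k, R, dx, dt):
    """One explicit step of the 1-D advection-dispersion-reaction equation
    R * dc/dt = -v * dc/dx + D * d2c/dx2 - k*c, with upwind advection
    (valid for v > 0), central dispersion, a first-order reaction sink k,
    and retardation factor R. Boundary nodes are held fixed."""
    adv = -v * (c[1:-1] - c[:-2]) / dx                 # upwind advection
    disp = D * (c[2:] - 2.0 * c[1:-1] + c[:-2]) / dx**2  # dispersion
    cn = c.copy()
    cn[1:-1] += dt / R * (adv + disp - k * c[1:-1])    # reaction sink
    return cn
```

    Stability of the explicit scheme requires small Courant (v*dt/dx) and diffusion (D*dt/dx^2) numbers; an implicit finite-element discretization, as in the paper, avoids that restriction.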

  3. 75 FR 30014 - Consumers Energy Company; Notice of Baseline Filing

    Science.gov (United States)

    2010-05-28

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. PR10-25-000] Consumers Energy Company; Notice of Baseline Filing May 21, 2010. Take notice that on May 17, 2010, Consumers Energy Company (Consumers) submitted a baseline filing of its Statement of Operating Conditions for the...

  4. Baseline mitral regurgitation predicts outcome in patients referred for dobutamine stress echocardiography.

    Science.gov (United States)

    O'Driscoll, Jamie M; Gargallo-Fernandez, Paula; Araco, Marco; Perez-Lopez, Manuel; Sharma, Rajan

    2017-11-01

    A number of parameters recorded during dobutamine stress echocardiography (DSE) are associated with worse outcome. However, the relative importance of baseline mitral regurgitation (MR) is unknown. The aim of this study was to assess the prevalence and associated implications of functional MR with long-term mortality in a large cohort of patients referred for DSE. 6745 patients (mean age 64.9 ± 12.2 years) were studied. Demographic, baseline and peak DSE data were collected. All-cause mortality was retrospectively analyzed. DSE was successfully completed in all patients with no adverse outcomes. MR was present in 1019 (15.1%) patients. During a mean follow-up of 5.1 ± 1.8 years, 1642 (24.3%) patients died; MR was significantly associated with increased all-cause mortality, and adding MR to risk models significantly improved discrimination. MR is associated with all-cause mortality and adds incremental prognostic information among patients referred for DSE. The presence of MR should be taken into account when evaluating the prognostic significance of DSE results.

  5. Including Effects of Water Stress on Dead Organic Matter Decay to a Forest Carbon Model

    Science.gov (United States)

    Kim, H.; Lee, J.; Han, S. H.; Kim, S.; Son, Y.

    2017-12-01

    Decay of dead organic matter is a key process of carbon (C) cycling in forest ecosystems. The change in decay rate depends on temperature sensitivity and moisture conditions. The Forest Biomass and Dead organic matter Carbon (FBDC) model includes a decay sub-model considering temperature sensitivity, yet does not consider moisture conditions as drivers of the decay rate change. This study aimed to improve the FBDC model by including a water stress function in the decay sub-model. Also, soil C sequestration under climate change was simulated with the FBDC model including the water stress function. The water stress functions were determined with data from a decomposition study on Quercus variabilis forests and Pinus densiflora forests of Korea, and adjustment parameters of the functions were determined for both species. The water stress functions were based on the ratio of precipitation to potential evapotranspiration. Including the water stress function increased the explained variances of the decay rate by 19% for the Q. variabilis forests and 7% for the P. densiflora forests, respectively. The increase of the explained variances resulted from large differences in temperature range and precipitation range across the decomposition study plots. During the period of the experiment, the mean annual temperature range was less than 3°C, while the annual precipitation ranged from 720 mm to 1466 mm. Application of the water stress functions to the FBDC model constrained the increasing trend of temperature sensitivity under climate change, and thus increased the model-estimated soil C sequestration (Mg C ha-1) by 6.6 for the Q. variabilis forests and by 3.1 for the P. densiflora forests, respectively. The addition of water stress functions increased the reliability of the decay rate estimation and could contribute to reducing the bias in estimating soil C sequestration under varying moisture conditions.
Acknowledgement: This study was supported by Korea Forest Service (2017044B10-1719-BB01)
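
    A decay rate combining a temperature response with a P/PET-based water stress factor, as described above, might be sketched as follows. The functional forms and parameters (Q10 response, exponent w, reference temperature) are assumptions for illustration, not the FBDC model's actual equations:

```python
def decay_rate(k_ref, temp_c, p_over_pet, q10=2.0, temp_ref=10.0, w=0.5):
    """Illustrative decay-rate model: a Q10 temperature response multiplied
    by a water stress factor based on the ratio of precipitation to
    potential evapotranspiration (P/PET). Stress reduces decay only when
    P/PET < 1 (moisture-limited conditions)."""
    f_temp = q10 ** ((temp_c - temp_ref) / 10.0)
    f_water = min(1.0, p_over_pet) ** w
    return k_ref * f_temp * f_water
```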

  6. Cumulative Effects of Concussion History on Baseline Computerized Neurocognitive Test Scores: Systematic Review and Meta-analysis.

    Science.gov (United States)

    Alsalaheen, Bara; Stockdale, Kayla; Pechumer, Dana; Giessing, Alexander; He, Xuming; Broglio, Steven P

    Context: It is unclear whether individuals with a history of single or multiple clinically recovered concussions exhibit worse cognitive performance on baseline testing compared with individuals with no concussion history. Objective: To analyze the effects of concussion history on baseline neurocognitive performance using a computerized neurocognitive test. Data sources: PubMed, CINAHL, and psycINFO were searched in November 2015. The search was supplemented by a hand search of references. Study selection: Studies were included if participants completed the Immediate Post-concussion Assessment and Cognitive Test (ImPACT) at baseline (ie, preseason) and if performance was stratified by previous history of single or multiple concussions. Study design: Systematic review and meta-analysis. Level of evidence: Level 2. Data extraction: Sample size, demographic characteristics of participants, as well as performance of participants on verbal memory, visual memory, visual-motor processing speed, and reaction time were extracted from each study. Results: A random-effects pooled meta-analysis revealed that, with the exception of worsened visual memory for those with 1 previous concussion (Hedges g = 0.10), no differences were observed between participants with 1 or multiple concussions compared with participants without previous concussions. Conclusion: With the exception of decreased visual memory based on history of 1 concussion, history of 1 or multiple concussions was not associated with worse baseline cognitive performance.
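
    The random-effects pooling mentioned in the results can be sketched with the standard DerSimonian-Laird estimator. The abstract does not state the review's exact computational procedure, so this is a generic sketch of the technique:

```python
import numpy as np

def pooled_effect_dl(g, var):
    """DerSimonian-Laird random-effects pooling of per-study effect sizes
    (e.g. Hedges g). g and var are per-study effects and their variances;
    tau2 is the estimated between-study variance."""
    g, var = np.asarray(g, float), np.asarray(var, float)
    w = 1.0 / var                              # fixed-effect weights
    g_fe = np.sum(w * g) / np.sum(w)           # fixed-effect pooled mean
    q = np.sum(w * (g - g_fe) ** 2)            # Cochran's Q heterogeneity
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(g) - 1)) / c)    # between-study variance
    w_re = 1.0 / (var + tau2)                  # random-effects weights
    return np.sum(w_re * g) / np.sum(w_re)
```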

  7. Waste management project technical baseline description

    International Nuclear Information System (INIS)

    Sederburg, J.P.

    1997-01-01

    A systems engineering approach has been taken to describe the technical baseline under which the Waste Management Project is currently operating. The document contains a mission analysis, function analysis, requirements analysis, interface definitions, alternative analysis, system definition, documentation requirements, implementation definitions, and a discussion of uncertainties facing the Project.

  8. A unitarized meson model including color Coulomb interaction

    International Nuclear Information System (INIS)

    Metzger, Kees.

    1990-01-01

    Ch. 1 gives a general introduction to the problem field of the thesis. It discusses to what extent the internal structure of mesons is understood theoretically and which models exist. It discusses the problem of confinement from a phenomenological point of view and indicates how quark models of mesons may provide insight into this phenomenon. In ch. 2 the formal theory of scattering in a system with confinement is given. It is shown how a coupled-channel (CC) description and the work of other authors fit into this general framework. Explicit examples and arguments are given to support the CC treatment of such a system. In ch. 3 the full coupled-channel model as employed in this thesis is presented. The choices underlying the model are supported on the basis of arguments from the previous chapters and the observed regularities in the experimental data. In this model confinement is described with a mass-dependent harmonic-oscillator potential, and the presence of open (meson-meson) channels plays an essential role. In ch. 4 the unitarized model is applied to light scalar meson resonances. In this regime the contribution of the open channels is considerable. It is demonstrated that the model parameters as used for the description of the pseudo-scalar and vector mesons can be used unchanged for the description of these mesons. Ch. 5 treats the color-Coulomb interaction; there, the effect of the Coulomb interaction is studied in simple models without decay. The results of incorporating the color-Coulomb interaction into the full CC model are given in ch. 6. Ch. 7 discusses the results of the previous chapters and the present status of the model. (author). 182 refs.; 16 figs.; 33 tabs

  9. Baseline Tumor Size Is an Independent Prognostic Factor for Overall Survival in Patients With Melanoma Treated With Pembrolizumab.

    Science.gov (United States)

    Joseph, Richard W; Elassaiss-Schaap, Jeroen; Kefford, Richard F; Hwu, Wen-Jen; Wolchok, Jedd D; Joshua, Anthony Michael; Ribas, Antoni; Hodi, F Stephen; Hamid, Omid; Robert, Caroline; Daud, Adil I; Dronca, Roxana S; Hersey, Peter; Weber, Jeffrey S; Patnaik, Amita; de Alwis, Dinesh P; Perrone, Andrea M; Zhang, Jin; Kang, Soonmo Peter; Ebbinghaus, Scot W; Anderson, Keaven M; Gangadhar, Tara

    2018-04-23

    To assess the association of baseline tumor size (BTS) with other baseline clinical factors and outcomes in pembrolizumab-treated patients with advanced melanoma in KEYNOTE-001 (NCT01295827). BTS was quantified by adding the sum of the longest dimensions of all measurable baseline target lesions. BTS as a dichotomous and continuous variable was evaluated with other baseline factors using logistic regression for objective response rate (ORR) and Cox regression for overall survival (OS). Nominal P values with no multiplicity adjustment describe the strength of observed associations. Per central review by RECIST v1.1, 583 of 655 patients had baseline measurable disease and were included in this post hoc analysis. Median BTS was 10.2 cm (range, 1-89.5). Larger median BTS was associated with Eastern Cooperative Oncology Group performance status 1, elevated lactate dehydrogenase (LDH), stage M1c disease, and liver metastases (with or without any other sites) (all P ≤ 0.001). In univariate analyses, BTS below the median was associated with higher ORR (44% vs 23%). In multivariate analyses, BTS below the median remained an independent prognostic marker of OS; BTS below the median and PD-L1-positive tumors were independently associated with higher ORR and longer OS. BTS is associated with many other baseline clinical factors but is also independently prognostic of survival in pembrolizumab-treated patients with advanced melanoma. Copyright ©2018, American Association for Cancer Research.
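
    The BTS definition above (sum of the longest dimensions of all measurable baseline target lesions, RECIST-style) and the median split are straightforward to sketch. The group labels are illustrative:

```python
def baseline_tumor_size(target_lesion_longest_dims_cm):
    """Baseline tumor size as described in the abstract: the sum of the
    longest dimensions of all measurable baseline target lesions."""
    return sum(target_lesion_longest_dims_cm)

def bts_group(bts_cm, median_cm=10.2):
    """Dichotomize at the reported cohort median BTS of 10.2 cm."""
    return "below median" if bts_cm < median_cm else "at/above median"
```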

  10. Historical baselines of coral cover on tropical reefs as estimated by expert opinion

    Directory of Open Access Journals (Sweden)

    Tyler D. Eddy

    2018-01-01

    Coral reefs are important habitats that represent global marine biodiversity hotspots and provide important benefits to people in many tropical regions. However, coral reefs are becoming increasingly threatened by climate change, overfishing, habitat destruction, and pollution. Historical baselines of coral cover are important to understand how much coral cover has been lost, e.g., to avoid the ‘shifting baseline syndrome’. There are few quantitative observations of coral reef cover prior to the industrial revolution, and therefore baselines of coral reef cover are difficult to estimate. Here, we use expert and ocean-user opinion surveys to estimate baselines of global coral reef cover. The overall mean estimated baseline coral cover was 59% (±19% standard deviation), compared to an average of 58% (±18% standard deviation) estimated by professional scientists. We did not find evidence of the shifting baseline syndrome, whereby respondents who first observed coral reefs more recently report lower estimates of baseline coral cover. These estimates of historical coral reef baseline cover are important for scientists, policy makers, and managers to understand the extent to which coral reefs have become depleted and to set appropriate recovery targets.

  11. Historical baselines of coral cover on tropical reefs as estimated by expert opinion.

    Science.gov (United States)

    Eddy, Tyler D; Cheung, William W L; Bruno, John F

    2018-01-01

    Coral reefs are important habitats that represent global marine biodiversity hotspots and provide important benefits to people in many tropical regions. However, coral reefs are becoming increasingly threatened by climate change, overfishing, habitat destruction, and pollution. Historical baselines of coral cover are important to understand how much coral cover has been lost, e.g., to avoid the 'shifting baseline syndrome'. There are few quantitative observations of coral reef cover prior to the industrial revolution, and therefore baselines of coral reef cover are difficult to estimate. Here, we use expert and ocean-user opinion surveys to estimate baselines of global coral reef cover. The overall mean estimated baseline coral cover was 59% (±19% standard deviation), compared to an average of 58% (±18% standard deviation) estimated by professional scientists. We did not find evidence of the shifting baseline syndrome, whereby respondents who first observed coral reefs more recently report lower estimates of baseline coral cover. These estimates of historical coral reef baseline cover are important for scientists, policy makers, and managers to understand the extent to which coral reefs have become depleted and to set appropriate recovery targets.
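
    The shifting-baseline check implied by the abstract can be sketched as a regression of each respondent's baseline estimate on the year they first observed reefs: a clearly negative slope (more recent observers reporting lower baselines) would indicate the syndrome, while a slope near zero would not. The authors' exact statistical test is not given in the abstract, so this is only an illustrative form:

```python
import numpy as np

def shifting_baseline_slope(first_year_observed, baseline_estimate):
    """Least-squares slope of estimated baseline coral cover (%) against
    the year each respondent first observed coral reefs."""
    slope, _ = np.polyfit(np.asarray(first_year_observed, float),
                          np.asarray(baseline_estimate, float), 1)
    return slope
```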

  12. Soy food frequency questionnaire does not correlate with baseline isoflavone levels in patients with bladder cancer.

    Science.gov (United States)

    Kolesar, Jill M; Pomplun, Marcia; Havighurst, Tom; Stublaski, Jeanne; Wollmer, Barbara; Kim, KyungMann; Tangrea, Joseph A; Parnes, Howard L; House, Margaret G; Gee, Jason; Messing, Edward; Bailey, Howard H

    2015-04-01

    The isoflavone genistein, a natural soy product with receptor tyrosine kinase-inhibiting activity, as well as phytoestrogenic and other potential anticarcinogenic effects, is being studied as an anticancer agent. Since isoflavones are commonly consumed in food products containing soy proteins, a method to control for baseline isoflavone consumption is needed. HPLC was used to evaluate baseline plasma and urine concentrations of isoflavone in fifty-four participants with bladder cancer enrolled in a phase II chemoprevention study of G-2535. The soy food frequency questionnaire was used to assess each participant's baseline soy intake. The association between baseline isoflavone concentrations and intakes for genistein and daidzein was assessed by Spearman's rank correlation coefficient. The majority of participants had no detectable genistein or daidzein in plasma at baseline. The median and range of values were 0 (0-1480) nmol/L for genistein, and 0 (0-1260) nmol/L for daidzein. In urine, the median and range of values were 91.0 (0-9030) nmol/L for genistein and 623 (0-100,000) nmol/L for daidzein. The median and range of weekly estimated genistein intake was 0 (0-236) mg/wk; the median and range of weekly estimated daidzein intake was 0 (0-114) mg/wk. There was no relationship between soy intake as measured by the food frequency questionnaire and baseline isoflavone levels in plasma or urine, and the Spearman's rank correlation coefficients were not significant. The soy food frequency questionnaire did not correlate with plasma or urine concentrations of either isoflavone. Alternative methods for controlling for soy consumption, including measuring plasma and urine concentrations, should be considered in isoflavone chemoprevention trials. © The Author(s) 2014.
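
    Spearman's rank correlation, used in the study to relate questionnaire intake to measured concentrations, is the Pearson correlation of the ranked data (with average ranks for ties). A minimal self-contained sketch:

```python
import numpy as np

def spearman_rho(x, y):
    """Spearman's rank correlation: rank both variables (average ranks for
    ties) and take the Pearson correlation of the ranks."""
    def ranks(v):
        v = np.asarray(v, float)
        order = np.argsort(v, kind="stable")
        r = np.empty(len(v))
        r[order] = np.arange(1, len(v) + 1)
        for val in np.unique(v):        # average ranks over tied values
            mask = v == val
            r[mask] = r[mask].mean()
        return r
    return np.corrcoef(ranks(x), ranks(y))[0, 1]
```

    Because it uses ranks, the statistic captures any monotonic association, which suits the heavily zero-inflated intake and concentration data described above.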

  13. Epidemiological and Clinical Baseline Characteristics as Predictive Biomarkers of Response to Anti-VEGF Treatment in Patients with Neovascular AMD

    Directory of Open Access Journals (Sweden)

    Miltiadis K. Tsilimbaris

    2016-01-01

    Purpose. To review the current literature investigating patient response to anti-vascular endothelial growth factor-A (VEGF) therapy in the treatment of neovascular age-related macular degeneration (nAMD) and to identify baseline characteristics that might predict response. Method. A literature search of the PubMed database was performed, using the keywords: AMD, anti-VEGF, biomarker, optical coherence tomography, treatment outcome, and predictor. The search was limited to articles published from 2006 to date. Exclusion criteria included phase 1 trials, case reports, studies focusing on indications other than nAMD, and oncology. Results. A total of 1467 articles were identified, of which 845 were excluded. Of the 622 remaining references, 47 met all the search criteria and were included in this review. Conclusion. Several baseline characteristics correlated with anti-VEGF treatment response, including best-corrected visual acuity, age, lesion size, and retinal thickness. The majority of factors were associated with disease duration, suggesting that longer disease duration before treatment results in worse treatment outcomes. This highlights the need for early treatment of patients with nAMD to achieve optimal treatment outcomes. Many of the identified baseline characteristics are interconnected and cannot be evaluated in isolation; therefore multivariate analyses will be required to determine any specific relationship with treatment response.

  14. Way to increase the user access at the LCLS baseline

    Energy Technology Data Exchange (ETDEWEB)

    Geloni, Gianluca [European XFEL GmbH, Hamburg]; Kocharyan, Vitali; Saldin, Evgeni [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany)]

    2010-10-15

    Although the LCLS photon beam is meant for a single user, the baseline undulator is long enough to serve two users simultaneously. To this end, we propose a setup composed of two simple elements: an X-ray mirror pair for X-ray beam deflection, and a short (4 m-long) magnetic chicane, which creates an offset for mirror pair installation in the middle of the baseline undulator. The insertable mirror pair can be used for spatial separation of the X-ray beams generated in the first and in the second half of the baseline undulator. The method of deactivating one half and activating another half of the undulator is based on the rapid switching of the FEL amplification process. As proposed elsewhere, using a kicker installed upstream of the LCLS baseline undulator and an already existing corrector in the first half of the undulator, it is possible to rapidly switch the X-ray beam from one user to another, thus providing two active beamlines at any time. We present simulation results dealing with the LCLS baseline, and show that it is possible to generate two saturated SASE X-ray beams in the whole 0.8-8 keV photon energy range in the same baseline undulator. These can be exploited to serve two users. Implementation of the proposed technique does not perturb the baseline mode of operation of the LCLS undulator. Moreover, the magnetic chicane setup is very flexible, and can be used as a self-seeding setup too. We present simulation results for the LCLS baseline undulator with SHAB (second harmonic afterburner) and show that one can produce monochromatic radiation at the 2nd harmonic as well as at the 1st. We describe an efficient way for obtaining multi-user operation at the LCLS hard X-ray FEL. To this end, a photon beam distribution system based on the use of crystals in the Bragg reflection geometry is proposed. 
The reflectivity of crystal deflectors can be switched fast enough by flipping the crystals with piezoelectric devices similar to those for X-ray phase retarders

  15. Single-Phase Bundle Flows Including Macroscopic Turbulence Model

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Seung Jun; Yoon, Han Young [KAERI, Daejeon (Korea, Republic of)]; Yoon, Seok Jong; Cho, Hyoung Kyu [Seoul National University, Seoul (Korea, Republic of)]

    2016-05-15

    When an accident occurs, fluid properties change rapidly and a wide variety of thermal hydraulic phenomena arise; securing mechanistic approaches as much as possible may reduce the uncertainty arising from improper application of experimental models. In this study, the turbulence mixing model, which is well established by experiments in subchannel analysis codes such as VIPRE, COBRA, and MATRA, is replaced by a macroscopic k-e turbulence model derived mathematically. The performance of CUPID with the macroscopic turbulence model is validated against several bundle experiments: the CNEN 4x4 and PNL 7x7 rod bundle tests. In this study, the macroscopic k-e model has been validated for application to subchannel analysis. It has been implemented in the CUPID code and validated against the CNEN 4x4 and PNL 7x7 rod bundle tests. The results showed that the macroscopic k-e turbulence model can reproduce the experiments properly.
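
    The k-e closure underlying the macroscopic model rests on the standard eddy-viscosity relation, sketched below with the usual model constant C_mu = 0.09 (a generic sketch of the closure, not CUPID's implementation, whose details are not given in the abstract):

```python
def turbulent_viscosity(rho, k, eps, c_mu=0.09):
    """Eddy viscosity of the standard k-epsilon model,
    mu_t = rho * C_mu * k^2 / eps, where k is the turbulent kinetic
    energy and eps its dissipation rate."""
    return rho * c_mu * k * k / eps
```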

  16. 77 FR 31841 - Hope Gas, Inc.; Notice of Baseline Filing

    Science.gov (United States)

    2012-05-30

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. PR12-23-001] Hope Gas, Inc.; Notice of Baseline Filing Take notice that on May 16, 2012, Hope Gas, Inc. (Hope Gas) submitted a revised baseline filing of their Statement of Operating Conditions for services provided under Section 311 of the...

  17. 77 FR 26535 - Hope Gas, Inc.; Notice of Baseline Filing

    Science.gov (United States)

    2012-05-04

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. PR12-23-000] Hope Gas, Inc.; Notice of Baseline Filing Take notice that on April 26, 2012, Hope Gas, Inc. (Hope Gas) submitted a baseline filing of their Statement of Operating Conditions for services provided under Section 311 of the...

  18. 75 FR 33799 - EasTrans, LLC; Notice of Baseline Filing

    Science.gov (United States)

    2010-06-15

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. PR10-30-000] EasTrans, LLC; Notice of Baseline Filing June 8, 2010. Take notice that on June 4, 2010, EasTrans, LLC submitted a baseline filing of its Statement of Operating Conditions for services provided under section 311 of the...

  19. A High-Rate, Single-Crystal Model including Phase Transformations, Plastic Slip, and Twinning

    Energy Technology Data Exchange (ETDEWEB)

    Addessio, Francis L. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States). Theoretical Division; Bronkhorst, Curt Allan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States). Theoretical Division; Bolme, Cynthia Anne [Los Alamos National Lab. (LANL), Los Alamos, NM (United States). Explosive Science and Shock Physics Division; Brown, Donald William [Los Alamos National Lab. (LANL), Los Alamos, NM (United States). Materials Science and Technology Division; Cerreta, Ellen Kathleen [Los Alamos National Lab. (LANL), Los Alamos, NM (United States). Materials Science and Technology Division; Lebensohn, Ricardo A. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States). Materials Science and Technology Division; Lookman, Turab [Los Alamos National Lab. (LANL), Los Alamos, NM (United States). Theoretical Division; Luscher, Darby Jon [Los Alamos National Lab. (LANL), Los Alamos, NM (United States). Theoretical Division; Mayeur, Jason Rhea [Los Alamos National Lab. (LANL), Los Alamos, NM (United States). Theoretical Division; Morrow, Benjamin M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States). Materials Science and Technology Division; Rigg, Paulo A. [Washington State Univ., Pullman, WA (United States). Dept. of Physics. Inst. for Shock Physics

    2016-08-09

    An anisotropic, rate-dependent, single-crystal approach for modeling materials under the conditions of high strain rates and pressures is provided. The model includes the effects of large deformations, nonlinear elasticity, phase transformations, and plastic slip and twinning. It is envisioned that the model may be used to examine these coupled effects on the local deformation of materials that are subjected to ballistic impact or explosive loading. The model is formulated using a multiplicative decomposition of the deformation gradient. A plate impact experiment on a multi-crystal sample of titanium was conducted. The particle velocities at the back surface of three crystal orientations relative to the direction of impact were measured. Molecular dynamics simulations were conducted to investigate the details of the high-rate deformation and pursue issues related to the phase transformation for titanium. Simulations using the single-crystal model were conducted and compared to the high-rate experimental data for the impact-loaded single crystals. The model was found to capture the features of the experiments.
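    The multiplicative decomposition named in the abstract is standard in finite-deformation crystal plasticity and can be sketched as follows; the exact ordering of the factors and the treatment of the transformation part in this particular model are not specified in the abstract, so the split shown here is illustrative:

    ```latex
    \mathbf{F} = \mathbf{F}^{e}\,\mathbf{F}^{t}\,\mathbf{F}^{p},
    \qquad
    \mathbf{L}^{p} = \dot{\mathbf{F}}^{p}\,(\mathbf{F}^{p})^{-1}
      = \sum_{\alpha} \dot{\gamma}^{\alpha}\, \mathbf{s}^{\alpha}\otimes\mathbf{m}^{\alpha}
    ```

    where \(\mathbf{F}^{e}\), \(\mathbf{F}^{t}\), and \(\mathbf{F}^{p}\) are the elastic, phase-transformation, and plastic parts of the deformation gradient, and the plastic velocity gradient \(\mathbf{L}^{p}\) sums the shear rates \(\dot{\gamma}^{\alpha}\) over the slip and twin systems with slip direction \(\mathbf{s}^{\alpha}\) and plane normal \(\mathbf{m}^{\alpha}\).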

  20. CryoSat Level1b SAR/SARin BaselineC: Product Format and Algorithm Improvements

    Science.gov (United States)

    Scagliola, Michele; Fornari, Marco; Di Giacinto, Andrea; Bouffard, Jerome; Féménias, Pierre; Parrinello, Tommaso

    2015-04-01

    CryoSat was launched on the 8th April 2010 and is the first European ice mission dedicated to the monitoring of precise changes in the thickness of polar ice sheets and floating sea ice. CryoSat carries an innovative radar altimeter called the Synthetic Aperture Interferometric Altimeter (SIRAL), which transmits pulses at a high pulse repetition frequency, making the received echoes phase coherent and suitable for azimuth processing. This allows a significantly improved along-track resolution to be achieved with respect to traditional pulse-width-limited altimeters. CryoSat is the first altimetry mission operating in SAR mode, and continuous improvements in the Level1 Instrument Processing Facility (IPF1) are being identified, tested and validated in order to improve the quality of the Level1b products. The current IPF, Baseline B, was released into operations in February 2012; a reprocessing campaign followed, covering the data acquired since July 2010. After more than 2 years of development, the operational release of Baseline C is expected in the first half of 2015. Baseline C Level1b products will be distributed in an updated format, including for example the attitude information (roll, pitch and yaw) and, for SAR/SARIn, a waveform length doubled with respect to Baseline B. Moreover, various algorithm improvements have been identified:
    • a datation bias of about -0.5195 ms will be corrected (SAR/SARIn);
    • a range bias of about 0.6730 m will be corrected (SAR/SARIn);
    • a roll bias of 0.1062 deg and a pitch bias of 0.0520 deg will be corrected;
    • surface sample stack weighting will filter out the single-look echoes acquired at the highest look angles, resulting in a sharpening of the 20 Hz waveforms.
    With the operational release of Baseline C, the second CryoSat reprocessing campaign will be initiated, taking benefit of the upgrades implemented in the IPF1 processing chain but also at IPF2 level. The reprocessing campaign will cover the full CryoSat mission starting on 16th July 2010.
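    To make the listed corrections concrete, the following minimal sketch applies the quoted Baseline C bias values to a Level1b-like record. The record layout, field names, and the sign convention (subtracting each bias) are assumptions for illustration only; the real CryoSat product format and the direction in which the IPF applies each correction differ and must be taken from the product handbook.

    ```python
    from dataclasses import dataclass

    # Bias values quoted in the abstract for SAR/SARIn modes.
    DATATION_BIAS_S = -0.5195e-3   # seconds (about -0.5195 ms)
    RANGE_BIAS_M = 0.6730          # metres
    ROLL_BIAS_DEG = 0.1062         # degrees
    PITCH_BIAS_DEG = 0.0520        # degrees

    @dataclass
    class L1bRecord:
        """Hypothetical Level1b measurement record (illustrative fields)."""
        time_s: float          # measurement datation (s)
        window_delay_m: float  # range window delay expressed in metres
        roll_deg: float        # platform roll attitude
        pitch_deg: float       # platform pitch attitude

    def apply_baseline_c_biases(rec: L1bRecord) -> L1bRecord:
        """Return a new record with the quoted biases subtracted out."""
        return L1bRecord(
            time_s=rec.time_s - DATATION_BIAS_S,
            window_delay_m=rec.window_delay_m - RANGE_BIAS_M,
            roll_deg=rec.roll_deg - ROLL_BIAS_DEG,
            pitch_deg=rec.pitch_deg - PITCH_BIAS_DEG,
        )
    ```

    Keeping the bias values as named module-level constants mirrors how such calibration numbers are typically versioned per processing baseline, so a later baseline can swap them without touching the correction logic.
    
    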